Implementation Guides
Counter-UAS Integration
Detect, track, classify, and respond to hostile drones. From RF sensors and radar to autonomous intercept — the full C-UAS pipeline for your platform.
Legal notice: Active drone defeat (jamming, spoofing, kinetic intercept) is heavily regulated. In the US, only specific federal agencies (DoD, DHS, DOJ, DOE, Coast Guard) have legal authority to defeat UAS under 10 U.S.C. §130i and 6 U.S.C. §124n. This guide covers detection and integration architecture. Defeat methods are included for reference within authorized contexts only. Check your jurisdiction.
Jargon Buster
C-UAS / CUAS
Counter-Unmanned Aircraft System. The full system that detects, tracks, identifies, and (optionally) defeats hostile drones. Can be a single sensor or a multi-layered defense.
RF Detection
Listening for radio frequency emissions from a drone's controller, video transmitter, or telemetry link. Most commercial drones transmit on 2.4 GHz, 5.8 GHz, or 900 MHz. RF detection is passive — you're just listening, not transmitting.
Radar
Active detection — sends electromagnetic pulses and listens for reflections. Can detect drones that aren't transmitting RF (autonomous drones, fiber-optic controlled). Harder to implement, more expensive, but catches what RF can't.
EO/IR (Electro-Optical / Infrared)
Camera-based detection. Visible light cameras for day, thermal cameras for night. AI classifies objects in the feed. Good for confirmation and tracking, limited by range and weather.
Acoustic Detection
Listening for drone motor/propeller sound signatures with microphone arrays. Short range (100-500m) but useful for confirming presence and direction. Works where RF is silent (autonomous drones).
RF Jamming
Transmitting noise on the drone's control frequency to break the link between pilot and drone. The drone then follows its failsafe (usually RTH or land). Requires authorization. Affects all devices on that frequency — including your own.
GPS Spoofing
Sending fake GPS signals to the drone, making it think it's somewhere else. Can redirect the drone to a capture zone or force a landing. Highly controlled — affects all GPS receivers in the area.
Kinetic Defeat
Physically destroying or capturing the drone. Nets (from drones or launchers), projectiles, directed energy (lasers, HPM). The "last resort" layer.
SIGINT / Protocol Analysis
Signals Intelligence. Decoding the actual data in the drone's RF emissions — identifying the protocol (DJI Aeroscope, Autel, MAVLink), extracting serial numbers, operator position, flight path. Much more useful than raw RF detection.
Remote ID
FAA-mandated broadcast identification. As of 2024, most commercial drones broadcast operator ID, position, altitude, and serial number over Bluetooth 5 and Wi-Fi NAN (Neighbor Awareness Networking). A free detection layer — but it only catches compliant drones.
The C-UAS Kill Chain
Every C-UAS system follows this pipeline. You don't need all layers — but more layers = fewer gaps.
Detect → Track → Classify → Decide → Act
| Layer | Sensors | Range | Catches | Misses |
| --- | --- | --- | --- | --- |
| RF Detection | SDR, spectrum analyzer, dedicated RF sensors | 1-10 km | Any drone transmitting (controller, video, telemetry) | Autonomous drones, fiber-optic, GPS-only nav |
| Remote ID | Bluetooth/WiFi receiver, smartphone | 300m (BT5) / 1km (WiFi) | Compliant commercial drones | Anything non-compliant, DIY builds, military |
| Radar | AESA, FMCW, phased array | 2-20 km | Any flying object (including birds) | Very small drones (<250g), clutter in urban environments |
| EO/IR Camera | PTZ camera, thermal camera + AI | 500m-5 km | Visual confirmation, classification | Bad weather, night (visible only), very small at distance |
| Acoustic | Microphone arrays | 100-500m | Close-range confirmation, direction | Noisy environments, long range |
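The "more layers = fewer gaps" claim can be made concrete: if each layer has some independent chance of detecting a given drone, the combined probability that at least one layer fires grows quickly. A minimal sketch — the per-layer probabilities below are illustrative numbers, not measured performance:

```python
def combined_detection_probability(layer_probs):
    """Probability that at least one layer detects, assuming
    the layers miss independently of one another."""
    miss = 1.0
    for p in layer_probs:
        miss *= (1.0 - p)
    return 1.0 - miss

# Example: RF 0.7, radar 0.8, EO/IR 0.5 against a transmitting drone
print(round(combined_detection_probability([0.7, 0.8, 0.5]), 3))  # 0.97
```

Independence is optimistic (weather that blinds EO/IR may also raise radar clutter), but the direction holds: three mediocre layers beat one good one.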
RF Detection with SDR
The cheapest entry point — a software-defined radio (SDR) can detect drone RF signatures for under $50.
# ═══ DETECT DRONE RF WITH RTL-SDR ($25) + GNU RADIO ═══
# Install RTL-SDR driver + GNU Radio
sudo apt install rtl-sdr gnuradio gr-osmosdr
# Scan the 2.4 GHz band (most drones) — needs a downconverter or
# a HackRF One (~$300) since RTL-SDR tops out around 1.7 GHz
# With HackRF One — scan 2.4 GHz for drone video/control:
hackrf_sweep -f 2400:2500 -w 500000 -l 32 -g 30 | \
python3 detect_peaks.py
# detect_peaks.py: look for strong signals at known drone freqs
# DJI: 2.400-2.4835 GHz (OcuSync/Lightbridge)
# FPV: 5.650-5.925 GHz (analog/digital video)
# ExpressLRS (ELRS): 2.400 GHz or 868/915 MHz
# Crossfire: 868 MHz (EU) / 915 MHz (US)
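The detect_peaks.py referenced above isn't shown; here is a minimal sketch. It parses hackrf_sweep's CSV output (date, time, hz_low, hz_high, bin width, sample count, then one dB value per bin) from stdin and flags strong signals inside the drone bands listed above. The −40 dB threshold is an assumption — tune it to your antenna and noise floor:

```python
#!/usr/bin/env python3
import sys

DRONE_BANDS = [  # (low_hz, high_hz, label)
    (2_400_000_000, 2_483_500_000, "DJI OcuSync / ELRS 2.4G"),
    (5_650_000_000, 5_925_000_000, "FPV video"),
    (863_000_000, 928_000_000, "ELRS/Crossfire 868/915"),
]
THRESHOLD_DB = -40.0  # illustrative; set well above your measured noise floor

def parse_sweep_line(line):
    """Yield (freq_hz, power_db) for each bin in one hackrf_sweep CSV line."""
    parts = [p.strip() for p in line.split(",")]
    if len(parts) < 7:
        return
    hz_low = float(parts[2])
    bin_width = float(parts[4])
    for i, db in enumerate(parts[6:]):
        yield hz_low + i * bin_width, float(db)

def check_bands(freq_hz, power_db):
    """Return a band label if this bin is a strong hit in a drone band."""
    for low, high, label in DRONE_BANDS:
        if low <= freq_hz <= high and power_db > THRESHOLD_DB:
            return label
    return None

if __name__ == "__main__":
    for line in sys.stdin:
        for freq, db in parse_sweep_line(line):
            label = check_bands(freq, db)
            if label:
                print(f"{freq / 1e6:.1f} MHz  {db:.1f} dB  {label}")
```

A real detector would also require persistence (the same bin hot across several sweeps) before alerting, to reject Wi-Fi and Bluetooth bursts that share these bands.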
# ═══ REMOTE ID RECEIVER (free detection layer) ═══
# Every compliant drone broadcasts ID on Bluetooth 5 + WiFi NAN
# Use any phone with OpenDroneID app, or build a receiver:
# ESP32-based Remote ID receiver (Arduino)
# https://github.com/sxjack/uav_electronic_ids
# or the official ASTM F3411 compatible:
# https://github.com/opendroneid/receiver-android
# Outputs: operator ID, drone serial, lat/lon/alt, speed, heading
# Range: ~300m Bluetooth, ~1km WiFi NAN
# Pipe to TAK via CoT for map display:
# Set type="a-h-A" (hostile air) or "a-u-A" (unknown air)
Radar Integration
Radar catches what RF can't — autonomous drones, fiber-optic controlled drones, and anything not transmitting.
| Product | Type | Range | Approx Price | Integration |
| --- | --- | --- | --- | --- |
| Echodyne EchoGuard | Metamaterial ESA (MESA) | 3 km | $40K-$80K | REST API, NMEA, ASTERIX |
| Echodyne EchoGuard CR | 360° continuous rotation | 3 km (360°) | $80K-$120K | REST API, ASTERIX |
| Robin Radar ELVIRA | FMCW | 5 km | $100K+ | Proprietary API |
| SpotterRF | Compact FMCW | 1.2 km | $15K-$30K | JSON API, SDK |
| Ainstein US-D1 | Radar altimeter (self-protection) | 50m | $300 | Serial/UART, MAVLink |
# ═══ ECHODYNE ECHOGUARD API — TRACK DRONE ON MAP ═══
# EchoGuard exposes tracks via REST API
import time
import requests

def poll_radar(radar_ip="192.168.1.100"):
    resp = requests.get(f"http://{radar_ip}:8080/api/v1/tracks")
    for track in resp.json():
        lat = track['latitude']
        lon = track['longitude']
        alt = track['altitude_msl']
        rcs = track['rcs_dbsm']       # Radar cross section (small = drone)
        vel = track['velocity_mps']
        track_id = track['id']
        # Filter: RCS < -10 dBsm = likely drone (not bird/aircraft)
        if rcs < -10:
            # Send to TAK as an air track (use "a-h-A" once confirmed hostile):
            send_cot(lat, lon, alt, uid=f"radar-{track_id}",
                     type="a-u-A")  # unknown air

# Poll at 1 Hz — the radar updates faster, but the network is the bottleneck
while True:
    poll_radar()
    time.sleep(1)
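The radar polling code above calls a send_cot() helper without defining it. A minimal sketch follows: it builds a Cursor-on-Target event and pushes it over UDP to ATAK's default multicast SA address (239.2.3.1:6969). The `how="m-g"` value and the 30-second stale window are assumptions — adjust for your TAK deployment:

```python
import socket
import uuid
from datetime import datetime, timedelta, timezone

def build_cot(lat, lon, alt, uid=None, cot_type="a-u-A"):
    """Return a minimal CoT <event> XML string for one track update."""
    now = datetime.now(timezone.utc)
    stale = now + timedelta(seconds=30)   # assumed track-expiry window
    fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
    uid = uid or f"cuas-{uuid.uuid4()}"
    return (
        f'<event version="2.0" uid="{uid}" type="{cot_type}" how="m-g" '
        f'time="{now.strftime(fmt)}" start="{now.strftime(fmt)}" '
        f'stale="{stale.strftime(fmt)}">'
        f'<point lat="{lat}" lon="{lon}" hae="{alt}" ce="50" le="25"/>'
        f'<detail/></event>'
    )

def send_cot(lat, lon, alt, uid=None, type="a-u-A",
             host="239.2.3.1", port=6969):
    """Fire-and-forget UDP send to the ATAK default multicast group."""
    xml = build_cot(lat, lon, alt, uid=uid, cot_type=type)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(xml.encode(), (host, port))
    sock.close()
```

For a TAK server (rather than multicast mesh SA), swap the UDP socket for a TCP or TLS connection to the server's streaming port.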
Camera + AI Detection
Use your existing YOLO pipeline (see the AI/CV Guide) to detect drones in camera feeds. Train a custom model on drone silhouettes.
# ═══ YOLO DRONE DETECTION — CUSTOM TRAINED ═══
# Datasets for training drone detection models:
# - Drone-vs-Bird: https://github.com/wosdetc/drone-vs-bird
# - Anti-UAV: https://anti-uav.github.io/
# - DUT Anti-UAV: thermal + visible drone detection dataset
# Train on your desktop GPU:
yolo train model=yolov8s.pt data=drone_detection.yaml epochs=200 imgsz=640
# Classes to train: drone, bird, aircraft, helicopter, noise
# The model learns to distinguish drones from birds at distance
# Deploy on a PTZ camera with Jetson:
# 1. Camera → GStreamer → YOLO inference
# 2. Detection → PTZ auto-track (keep drone centered)
# 3. Track data → TAK (hostile air icon on map)
# For thermal: train separate model on LWIR drone signatures
# Drones are hot (motors/batteries) against cold sky — easy to detect
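Step 2 of the deployment above (PTZ auto-track) reduces to computing how far the detection sits from frame center and commanding proportional pan/tilt rates. A sketch, assuming detections arrive from the YOLO stage as (x1, y1, x2, y2, confidence) tuples — the frame size and gain are illustrative, not tied to any specific PTZ SDK:

```python
def bbox_center(det):
    x1, y1, x2, y2, _conf = det
    return (x1 + x2) / 2.0, (y1 + y2) / 2.0

def ptz_rates(det, frame_w=1920, frame_h=1080, gain=0.5):
    """Proportional pan/tilt rates in [-1, 1].
    Positive pan = slew right, positive tilt = slew down (image y grows down)."""
    cx, cy = bbox_center(det)
    pan = gain * (cx - frame_w / 2) / (frame_w / 2)
    tilt = gain * (cy - frame_h / 2) / (frame_h / 2)
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(pan), clamp(tilt)

# Target right of center and slightly high: slew right, tilt up
print(ptz_rates((1400, 400, 1500, 500, 0.9)))
```

A pure proportional loop like this oscillates on fast crossers; adding a derivative term or a short prediction from the last few centers steadies the track.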
Sensor Fusion Architecture
No single sensor is enough. Real C-UAS systems fuse multiple sensors for high-confidence detection.
# ═══ MULTI-SENSOR FUSION VIA ROS ═══
# Each sensor publishes to its own ROS topic:
# /cuas/rf_detections — RF sensor (lat/lon/freq/signal strength)
# /cuas/radar_tracks — Radar (lat/lon/alt/velocity/RCS)
# /cuas/camera_detections — EO/IR AI (bounding box, class, confidence)
# /cuas/remote_id — Remote ID (operator ID, serial, position)
# /cuas/acoustic_bearing — Mic array (azimuth, elevation, confidence)
# Fusion node subscribes to all, correlates by:
# 1. Spatial proximity (detections within 50m = same target)
# 2. Temporal correlation (detections within 2s window)
# 3. Confidence weighting (radar > RF > camera > acoustic)
# Output: /cuas/fused_tracks — high-confidence tracks with threat level
# Publish to TAK for operator decision
# Architecture:
# ┌──────────┐ ┌──────────┐ ┌──────────┐
# │ RF Sensor│ │ Radar │ │ Camera+AI│
# └────┬─────┘ └────┬─────┘ └────┬─────┘
# ┌────▼──────────────▼──────────────▼─────┐
# │ ROS FUSION NODE │
# └────────────────┬────────────────────────┘
# ┌─────▼─────┐
# │ TAK/ATAK │
# └───────────┘
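The fusion node's correlation rules (50 m spatial gate, 2 s temporal window, confidence weighting) can be sketched outside ROS for clarity. Detections are assumed to arrive as dicts with lat/lon, timestamp, confidence, and sensor name; the gates and sensor weights mirror the rules above but the specific values are illustrative:

```python
import math

SENSOR_WEIGHT = {"radar": 1.0, "rf": 0.8, "camera": 0.6, "acoustic": 0.4}

def ground_distance_m(a, b):
    """Equirectangular approximation — plenty accurate at 50 m gating scales."""
    dlat = math.radians(b["lat"] - a["lat"])
    dlon = math.radians(b["lon"] - a["lon"])
    mean_lat = math.radians((a["lat"] + b["lat"]) / 2)
    return 6_371_000 * math.hypot(dlat, dlon * math.cos(mean_lat))

def fuse(detections, gate_m=50.0, gate_s=2.0):
    """Greedy clustering: detections inside both gates join the same track."""
    tracks = []
    for det in sorted(detections, key=lambda d: d["t"]):
        for track in tracks:
            ref = track["dets"][-1]      # compare against the freshest member
            if (abs(det["t"] - ref["t"]) <= gate_s
                    and ground_distance_m(det, ref) <= gate_m):
                track["dets"].append(det)
                break
        else:
            tracks.append({"dets": [det]})
    for track in tracks:
        # Confidence-weighted score; multiple sensor types = high confidence
        track["score"] = sum(SENSOR_WEIGHT[d["sensor"]] * d["conf"]
                             for d in track["dets"])
        track["sensors"] = {d["sensor"] for d in track["dets"]}
    return tracks
```

A production fusion node would replace the greedy nearest-track association with a Kalman filter per track and gated assignment, but the gating logic is the same idea.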
Defeat Methods (Authorized Use Only)
RESTRICTED. Active defeat is illegal for most entities in most jurisdictions. The following is for reference within authorized military, law enforcement, and critical infrastructure protection contexts only.
| Method | How It Works | Pros | Cons | Products |
| --- | --- | --- | --- | --- |
| RF Jamming | Overwhelm control/video freq with noise | Non-kinetic, instant effect, wide area | Affects all devices on freq, drone may not land (autonomous mode), collateral | DroneShield DroneGun, SRC Silent Archer, Blighter AUDS |
| GPS Spoofing | Fake GPS signals redirect drone to capture zone | Precise control of drone path | Affects all GPS receivers in area, complex to implement | Citadel Defense Titan, Battelle DroneDefender (GPS component) |
| Protocol Takeover | Exploit drone protocol vulnerabilities to take control | Full control of target drone | Protocol-specific (works on DJI, not on custom FC), cat-and-mouse with updates | Department 13 MESMER, Zignal |
| Net Capture | Drone-mounted or ground-launched net entangles target | Physical capture preserves evidence | Short range, single shot, target must be tracked | Fortem DroneHunter, SkyWall Patrol |
| Directed Energy | High-power microwave (HPM) or laser damages electronics | Speed of light, low cost per shot, magazine depth | Expensive system, line of sight, power requirements | Raytheon PHASER (HPM), HELWS (laser), BlueHalo LOCUST |
| Kinetic Intercept | Drone-launched interceptor physically collides with target | Works against any target type | Risk of debris, single-use interceptor, needs tracking | Anduril Anvil, CACI SkyTracker + interceptor, Fortem SkyDome |
Drone-Mounted C-UAS (Interceptor Drones)
Your Forge-built drone can BE the C-UAS system. An interceptor drone with AI tracking autonomously pursues and captures/defeats hostile drones.
# ═══ INTERCEPTOR DRONE ARCHITECTURE ═══
# Hardware: Fast FPV-class quad (7"+ for speed) + companion computer
# Sensors: Forward camera + AI (YOLO drone detection model)
# Payload: Net launcher, or direct kinetic intercept (Anduril Anvil style)
# Software stack on companion computer:
# 1. Camera → YOLO drone detection (trained on drone silhouettes)
# 2. Detection → visual servoing (keep target centered in frame)
# 3. Visual servoing → velocity commands to FC via MAVROS
# 4. Approach + capture/defeat at close range
# ROS architecture:
# /camera/image_raw → /yolo_detector → /target/bbox
# /target/bbox → /visual_servo → /mavros/setpoint_velocity/cmd_vel
# Key challenges:
# - Target drone is also moving (pursuit dynamics)
# - Need high FPS detection (30+ Hz) for tracking
# - Prop wash turbulence at close range
# - Legal authorization for kinetic intercept
# See: Anduril Anvil (production system)
# See: Fortem DroneHunter (net capture drone)
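Step 3 of the stack above — bounding box to velocity commands — can be sketched as a pure function before wiring it into MAVROS. The gains and the target bbox fraction (a crude range proxy: bigger box = closer target) are illustrative assumptions:

```python
def pursuit_velocity(bbox, frame_w=1280, frame_h=720,
                     target_frac=0.25, k_lat=2.0, k_vert=1.5, k_fwd=4.0):
    """bbox = (x1, y1, x2, y2) pixels.
    Returns a body-frame setpoint (vx_forward, vy_right, vz_up) in m/s,
    suitable for publishing on /mavros/setpoint_velocity/cmd_vel."""
    x1, y1, x2, y2 = bbox
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
    # Lateral/vertical: drive the target toward frame center
    vy = k_lat * (cx - frame_w / 2) / (frame_w / 2)
    vz = -k_vert * (cy - frame_h / 2) / (frame_h / 2)   # image y points down
    # Forward: close range until bbox width reaches target_frac of the frame
    width_frac = (x2 - x1) / frame_w
    vx = k_fwd * max(0.0, target_frac - width_frac) / target_frac
    return vx, vy, vz
```

In the ROS graph above this sits inside the /visual_servo node, with the returned tuple copied into a geometry_msgs TwistStamped. Bbox width is a noisy range cue; a real interceptor would fuse it with radar range or stereo depth before the terminal phase.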
Tying it all together: Ground C-UAS sensors (radar, RF, camera) detect and track the hostile drone. Track data goes to TAK. Operator authorizes intercept. Interceptor drone receives target coordinates via mesh network. AI visual servoing guides the final approach. The entire Forge guide ecosystem covers every piece of this pipeline: SLAM for navigation, mesh for comms, TAK for situational awareness, AI for targeting.
Back to Implementation Guides