Recordis-dev/wifi-densepose
WiFi DensePose

See through walls with WiFi. No cameras. No wearables. Just radio waves.

WiFi DensePose turns commodity WiFi signals into real-time human pose estimation, vital sign monitoring, and presence detection — all without a single pixel of video. By analyzing Channel State Information (CSI) disturbances caused by human movement, the system reconstructs body position, breathing rate, and heartbeat using physics-based signal processing and machine learning.

Rust 1.85+ License: MIT Tests: 542+ Docker: 132 MB Vital Signs ESP32 Ready

| What | How | Speed |
| --- | --- | --- |
| Pose estimation | CSI subcarrier amplitude/phase → DensePose UV maps | 54K fps (Rust) |
| Breathing detection | Bandpass 0.1-0.5 Hz → FFT peak | 6-30 BPM |
| Heart rate | Bandpass 0.8-2.0 Hz → FFT peak | 40-120 BPM |
| Presence sensing | RSSI variance + motion band power | <1 ms latency |
| Through-wall | Fresnel zone geometry + multipath modeling | Up to 5 m depth |

```bash
# 30 seconds to live sensing — no toolchain required
docker pull ruvnet/wifi-densepose:latest
docker run -p 3000:3000 ruvnet/wifi-densepose:latest
# Open http://localhost:3000
```

Note

CSI-capable hardware required. Pose estimation, vital signs, and through-wall sensing rely on Channel State Information (CSI) — per-subcarrier amplitude and phase data that standard consumer WiFi does not expose. You need CSI-capable hardware (ESP32-S3 or a research NIC) for full functionality. Consumer WiFi laptops can only provide RSSI-based presence detection, which is significantly less capable.

Hardware options for live CSI capture:

| Option | Hardware | Cost | Full CSI | Capabilities |
| --- | --- | --- | --- | --- |
| ESP32 Mesh (recommended) | 3-6x ESP32-S3 + WiFi router | ~$54 | Yes | Pose, breathing, heartbeat, motion, presence |
| Research NIC | Intel 5300 / Atheros AR9580 | ~$50-100 | Yes | Full CSI with 3x3 MIMO |
| Any WiFi | Windows/Linux laptop | $0 | No | RSSI-only: coarse presence and motion |

No hardware? Verify the signal processing pipeline with the deterministic reference signal: `python v1/data/proof/verify.py`


🚀 Key Features

| Feature | What It Means |
| --- | --- |
| 🔒 Privacy-First | Tracks human pose using only WiFi signals — no cameras, no video, no images stored |
| ⚡ Real-Time | Analyzes WiFi signals in under 100 microseconds per frame — fast enough for live monitoring |
| 💓 Vital Signs | Detects breathing rate (6-30 breaths/min) and heart rate (40-120 bpm) without any wearable |
| 👥 Multi-Person | Tracks multiple people simultaneously, each with independent pose and vitals — no hard software limit (physics: ~3-5 per AP with 56 subcarriers, more with multi-AP) |
| 🧱 Through-Wall | WiFi passes through walls, furniture, and debris — works where cameras cannot |
| 🚑 Disaster Response | Detects trapped survivors through rubble and classifies injury severity (START triage) |
| 🐳 One-Command Setup | `docker pull ruvnet/wifi-densepose:latest` — live sensing in 30 seconds, no toolchain needed |
| 📦 Portable Models | Trained models package into a single `.rvf` file — runs on edge, cloud, or browser (WASM) |
| 🦀 810x Faster | Complete Rust rewrite: 54,000 frames/sec pipeline, 132 MB Docker image, 542+ tests |

🏢 Use Cases & Applications

WiFi sensing works anywhere WiFi exists. No new hardware is needed in most cases — just software on existing access points or an $8 ESP32 add-on. Because there are no cameras, deployments sidestep camera-specific privacy regulations (GDPR video surveillance rules, HIPAA imaging) by design.

Scaling: each AP distinguishes ~3-5 people (56 subcarriers). Multi-AP deployments scale roughly linearly — a 4-AP retail mesh covers ~15-20 occupants. There is no hard software limit; the practical ceiling is signal physics.

| Why WiFi sensing wins | Traditional alternative |
| --- | --- |
| 🔒 No video, no GDPR/HIPAA imaging rules | Cameras require consent, signage, data retention policies |
| 🧱 Works through walls, shelving, debris | Cameras need line-of-sight per room |
| 🌙 Works in total darkness | Cameras need IR or visible light |
| 💰 $0-$8 per zone (existing WiFi or ESP32) | Camera systems: $200-$2,000 per zone |
| 🔌 WiFi already deployed everywhere | PIR/radar sensors require new wiring per room |
πŸ₯ Everyday β€” Healthcare, retail, office, hospitality (commodity WiFi)
Use Case What It Does Hardware Key Metric
Elderly care / assisted living Fall detection, nighttime activity monitoring, breathing rate during sleep β€” no wearable compliance needed 1 ESP32-S3 per room ($8) Fall alert <2s
Hospital patient monitoring Continuous breathing + heart rate for non-critical beds without wired sensors; nurse alert on anomaly 1-2 APs per ward Breathing: 6-30 BPM
Emergency room triage Automated occupancy count + wait-time estimation; detect patient distress (abnormal breathing) in waiting areas Existing hospital WiFi Occupancy accuracy >95%
Retail occupancy & flow Real-time foot traffic, dwell time by zone, queue length β€” no cameras, no opt-in, GDPR-friendly Existing store WiFi + 1 ESP32 Dwell resolution ~1m
Office space utilization Which desks/rooms are actually occupied, meeting room no-shows, HVAC optimization based on real presence Existing enterprise WiFi Presence latency <1s
Hotel & hospitality Room occupancy without door sensors, minibar/bathroom usage patterns, energy savings on empty rooms Existing hotel WiFi 15-30% HVAC savings
Restaurants & food service Table turnover tracking, kitchen staff presence, restroom occupancy displays β€” no cameras in dining areas Existing WiFi Queue wait Β±30s
Parking garages Pedestrian presence in stairwells and elevators where cameras have blind spots; security alert if someone lingers Existing WiFi Through-concrete walls
🏟️ Specialized — Events, fitness, education, civic (CSI-capable hardware)

| Use Case | What It Does | Hardware | Key Metric |
| --- | --- | --- | --- |
| Smart home automation | Room-level presence triggers (lights, HVAC, music) that work through walls — no dead zones, no motion-sensor timeouts | 2-3 ESP32-S3 nodes ($24) | Through-wall range ~5m |
| Fitness & sports | Rep counting, posture correction, breathing cadence during exercise — no wearable, no camera in locker rooms | 3+ ESP32-S3 mesh | Pose: 17 keypoints |
| Childcare & schools | Naptime breathing monitoring, playground headcount, restricted-area alerts — privacy-safe for minors | 2-4 ESP32-S3 per zone | Breathing: ±1 BPM |
| Event venues & concerts | Crowd density mapping, crush-risk detection via breathing compression, emergency evacuation flow tracking | Multi-AP mesh (4-8 APs) | Density per m² |
| Stadiums & arenas | Section-level occupancy for dynamic pricing, concession staffing, emergency egress flow modeling | Enterprise AP grid | 15-20 per AP mesh |
| Houses of worship | Attendance counting without facial recognition — privacy-sensitive congregations, multi-room campus tracking | Existing WiFi | Zone-level accuracy |
| Warehouse & logistics | Worker safety zones, forklift proximity alerts, occupancy in hazardous areas — works through shelving and pallets | Industrial AP mesh | Alert latency <500ms |
| Civic infrastructure | Public restroom occupancy (no cameras possible), subway platform crowding, shelter headcount during emergencies | Municipal WiFi + ESP32 | Real-time headcount |
| Museums & galleries | Visitor flow heatmaps, exhibit dwell time, crowd bottleneck alerts — no cameras near artwork (flash/theft risk) | Existing WiFi | Zone dwell ±5s |
🤖 Robotics & Industrial — Autonomous systems, manufacturing, android spatial awareness

WiFi sensing gives robots and autonomous systems a spatial awareness layer that works where LIDAR and cameras fail — through dust, smoke, fog, and around corners. The CSI signal field acts as a "sixth sense" for detecting humans in the environment without requiring line-of-sight.

| Use Case | What It Does | Hardware | Key Metric |
| --- | --- | --- | --- |
| Cobot safety zones | Detect human presence near collaborative robots — auto-slow or stop before contact, even behind obstructions | 2-3 ESP32-S3 per cell | Presence latency <100ms |
| Warehouse AMR navigation | Autonomous mobile robots sense humans around blind corners, through shelving racks — no LIDAR occlusion | ESP32 mesh along aisles | Through-shelf detection |
| Android / humanoid spatial awareness | Ambient human pose sensing for social robots — detect gestures, approach direction, and personal space without cameras always on | Onboard ESP32-S3 module | 17-keypoint pose |
| Manufacturing line monitoring | Worker presence at each station, ergonomic posture alerts, headcount for shift compliance — works through equipment | Industrial AP per zone | Pose + breathing |
| Construction site safety | Exclusion zone enforcement around heavy machinery, fall detection from scaffolding, personnel headcount | Ruggedized ESP32 mesh | Alert <2s, through-dust |
| Agricultural robotics | Detect farm workers near autonomous harvesters in dusty/foggy field conditions where cameras are unreliable | Weatherproof ESP32 nodes | Range ~10m open field |
| Drone landing zones | Verify landing area is clear of humans — WiFi sensing works in rain, dust, and low light where downward cameras fail | Ground ESP32 nodes | Presence: >95% accuracy |
| Clean room monitoring | Personnel tracking without cameras (particle contamination risk from camera fans) — gown compliance via pose | Existing cleanroom WiFi | No particulate emission |
🔥 Extreme — Through-wall, disaster, defense, underground

These scenarios exploit WiFi's ability to penetrate solid materials — concrete, rubble, earth — where no optical or infrared sensor can reach. The WiFi-Mat disaster module (ADR-001) is specifically designed for this tier.

| Use Case | What It Does | Hardware | Key Metric |
| --- | --- | --- | --- |
| Search & rescue (WiFi-Mat) | Detect survivors through rubble/debris via breathing signature, START triage color classification, 3D localization | Portable ESP32 mesh + laptop | Through 30cm concrete |
| Firefighting | Locate occupants through smoke and walls before entry; breathing detection confirms life signs remotely | Portable mesh on truck | Works in zero visibility |
| Prison & secure facilities | Cell occupancy verification, distress detection (abnormal vitals), perimeter sensing — no camera blind spots | Dedicated AP infrastructure | 24/7 vital signs |
| Military / tactical | Through-wall personnel detection, room clearing confirmation, hostage vital signs at standoff distance | Directional WiFi + custom FW | Range: 5m through wall |
| Border & perimeter security | Detect human presence in tunnels, behind fences, in vehicles — passive sensing, no active illumination to reveal position | Concealed ESP32 mesh | Passive / covert |
| Mining & underground | Worker presence in tunnels where GPS/cameras fail, breathing detection after collapse, headcount at safety points | Ruggedized ESP32 mesh | Through rock/earth |
| Maritime & naval | Below-deck personnel tracking through steel bulkheads (limited range, requires tuning), man-overboard detection | Ship WiFi + ESP32 | Through 1-2 bulkheads |
| Wildlife research | Non-invasive animal activity monitoring in enclosures or dens — no light pollution, no visual disturbance | Weatherproof ESP32 nodes | Zero light emission |

📦 Installation

Guided Installer — Interactive hardware detection and profile selection

```bash
./install.sh
```

The installer walks through 7 steps: system detection, toolchain check, WiFi hardware scan, profile recommendation, dependency install, build, and verification.

| Profile | What it installs | Size | Requirements |
| --- | --- | --- | --- |
| verify | Pipeline verification only | ~5 MB | Python 3.8+ |
| python | Full Python API server + sensing | ~500 MB | Python 3.8+ |
| rust | Rust pipeline (~810x faster) | ~200 MB | Rust 1.70+ |
| browser | WASM for in-browser execution | ~10 MB | Rust + wasm-pack |
| iot | ESP32 sensor mesh + aggregator | varies | Rust + ESP-IDF |
| docker | Docker-based deployment | ~1 GB | Docker |
| field | WiFi-Mat disaster response kit | ~62 MB | Rust + wasm-pack |
| full | Everything available | ~2 GB | All toolchains |

```bash
# Non-interactive
./install.sh --profile rust --yes

# Hardware check only
./install.sh --check-only
```
From Source — Rust (primary) or Python

```bash
git clone https://github.com/ruvnet/wifi-densepose.git
cd wifi-densepose

# Rust (primary — 810x faster)
cd rust-port/wifi-densepose-rs
cargo build --release
cargo test --workspace

# Python (legacy v1)
pip install -r requirements.txt
pip install -e .

# Or via pip (quoted so the extras survive zsh globbing)
pip install wifi-densepose
pip install "wifi-densepose[gpu]"   # GPU acceleration
pip install "wifi-densepose[all]"   # All optional deps
```
Docker — Pre-built images, no toolchain needed

```bash
# Rust sensing server (132 MB — recommended)
docker pull ruvnet/wifi-densepose:latest
docker run -p 3000:3000 -p 3001:3001 -p 5005:5005/udp ruvnet/wifi-densepose:latest

# Python sensing pipeline (569 MB)
docker pull ruvnet/wifi-densepose:python
docker run -p 8765:8765 -p 8080:8080 ruvnet/wifi-densepose:python

# Both via docker-compose
cd docker && docker compose up

# Export RVF model
docker run --rm -v "$(pwd)":/out ruvnet/wifi-densepose:latest --export-rvf /out/model.rvf
```

| Image | Tag | Size | Ports |
| --- | --- | --- | --- |
| ruvnet/wifi-densepose | latest, rust | 132 MB | 3000 (REST), 3001 (WS), 5005/udp (ESP32) |
| ruvnet/wifi-densepose | python | 569 MB | 8765 (WS), 8080 (UI) |
System Requirements
  • Rust: 1.70+ (primary runtime — install via rustup)
  • Python: 3.8+ (for verification and legacy v1 API)
  • OS: Linux (Ubuntu 18.04+), macOS (10.15+), Windows 10+
  • Memory: 4 GB RAM minimum, 8 GB+ recommended
  • Storage: 2 GB free space for models and data
  • Network: WiFi interface with CSI capability (optional — the installer detects what you have)
  • GPU: optional (NVIDIA CUDA or Apple Metal)

🚀 Quick Start

First API call in 3 commands

1. Install

```bash
# Fastest path — Docker
docker pull ruvnet/wifi-densepose:latest
docker run -p 3000:3000 ruvnet/wifi-densepose:latest

# Or from source (Rust)
./install.sh --profile rust --yes
```

2. Start the System (Python v1 API)

```python
from wifi_densepose import WiFiDensePose

system = WiFiDensePose()
system.start()
poses = system.get_latest_poses()
print(f"Detected {len(poses)} persons")
system.stop()
```

3. REST API

```bash
# Health check
curl http://localhost:3000/health

# Latest sensing frame
curl http://localhost:3000/api/v1/sensing/latest

# Vital signs
curl http://localhost:3000/api/v1/vital-signs

# Pose estimation
curl http://localhost:3000/api/v1/pose/current

# Server info
curl http://localhost:3000/api/v1/info
```

4. Real-time WebSocket

```python
import asyncio, websockets, json

async def stream():
    async with websockets.connect("ws://localhost:3001/ws/sensing") as ws:
        async for msg in ws:
            data = json.loads(msg)
            print(f"Persons: {len(data.get('persons', []))}")

asyncio.run(stream())
```

📋 Table of Contents

📡 Signal Processing & Sensing — From raw WiFi frames to vital signs

The signal processing stack transforms raw WiFi Channel State Information into actionable human sensing data. Starting from 56-192 subcarrier complex values captured at 20 Hz, the pipeline applies research-grade algorithms (SpotFi phase correction, Hampel outlier rejection, Fresnel zone modeling) to extract breathing rate, heart rate, motion level, and multi-person body pose — all in pure Rust with zero external ML dependencies.

| Section | Description | Docs |
| --- | --- | --- |
| Key Features | Privacy-first sensing, real-time performance, multi-person tracking, Docker | — |
| ESP32-S3 Hardware Pipeline | 20 Hz CSI streaming, binary frame parsing, flash & provision | ADR-018 · Tutorial #34 |
| Vital Sign Detection | Breathing 6-30 BPM, heartbeat 40-120 BPM, FFT peak detection | ADR-021 |
| WiFi Scan Domain Layer | 8-stage RSSI pipeline, multi-BSSID fingerprinting, Windows WiFi | ADR-022 · Tutorial #36 |
| WiFi-Mat Disaster Response | Search & rescue, START triage, 3D localization through debris | ADR-001 · User Guide |
| SOTA Signal Processing | SpotFi, Hampel, Fresnel, STFT spectrogram, subcarrier selection, BVP | ADR-014 |
🧠 Models & Training — DensePose pipeline, RVF containers, SONA adaptation, RuVector integration

The neural pipeline uses a graph transformer with cross-attention to map CSI feature matrices to 17 COCO body keypoints and DensePose UV coordinates. Models are packaged as single-file .rvf containers with progressive loading (Layer A instant, Layer B warm, Layer C full). SONA (Self-Optimizing Neural Architecture) enables continuous on-device adaptation via micro-LoRA + EWC++ without catastrophic forgetting. Signal processing is powered by 5 RuVector crates (v2.0.4) with 7 integration points across the Rust workspace, plus 6 additional vendored crates for inference and graph intelligence.

| Section | Description | Docs |
| --- | --- | --- |
| RVF Model Container | Binary packaging with Ed25519 signing, progressive 3-layer loading, SIMD quantization | ADR-023 |
| Training & Fine-Tuning | 8-phase pure Rust pipeline (7,832 lines), MM-Fi/Wi-Pose pre-training, 6-term composite loss, SONA LoRA | ADR-023 |
| RuVector Crates | 11 vendored Rust crates from ruvector: attention, min-cut, solver, GNN, HNSW, temporal compression, sparse inference | GitHub · Source |
🖥️ Usage & Configuration — CLI flags, API endpoints, hardware setup

The Rust sensing server is the primary interface, offering a comprehensive CLI with flags for data source selection, model loading, training, benchmarking, and RVF export. A REST API (Axum) and WebSocket server provide real-time data access. The Python v1 CLI remains available for legacy workflows.

| Section | Description | Docs |
| --- | --- | --- |
| CLI Usage | `--source`, `--train`, `--benchmark`, `--export-rvf`, `--model`, `--progressive` | — |
| REST API & WebSocket | 6 REST endpoints (sensing, vitals, BSSID, SONA), WebSocket real-time stream | — |
| Hardware Support | ESP32-S3 ($8), Intel 5300 ($15), Atheros AR9580 ($20), Windows RSSI ($0) | ADR-012 · ADR-013 |
βš™οΈ Development & Testing β€” 542+ tests, CI, deployment

The project maintains 542+ pure-Rust tests across 7 crate suites with zero mocks β€” every test runs against real algorithm implementations. Hardware-free simulation mode (--source simulate) enables full-stack testing without physical devices. Docker images are published on Docker Hub for zero-setup deployment.

Section Description Docs
Testing 7 test suites: sensing-server (229), signal (83), mat (139), wifiscan (91), RVF (16), vitals (18) β€”
Deployment Docker images (132 MB Rust / 569 MB Python), docker-compose, env vars β€”
Contributing Fork β†’ branch β†’ test β†’ PR workflow, Rust and Python dev setup β€”
📊 Performance & Benchmarks — Measured throughput, latency, resource usage

All benchmarks are measured on the Rust sensing server using `cargo bench` and the built-in `--benchmark` CLI flag. The Rust v2 implementation delivers an 810x end-to-end speedup over the Python v1 baseline, with motion detection reaching a 5,400x improvement. The vital sign detector processes 11,665 frames/second in a single-threaded benchmark.

| Section | Description | Key Metric |
| --- | --- | --- |
| Performance Metrics | Vital signs, CSI pipeline, motion detection, Docker image, memory | 11,665 fps vitals · 54K fps pipeline |
| Rust vs Python | Side-by-side benchmarks across 5 operations | 810x full pipeline speedup |
📄 Meta — License, changelog, support

WiFi DensePose is MIT-licensed open source, developed by ruvnet. The project has been in active development since March 2025, with 3 major releases delivering the Rust port, SOTA signal processing, disaster response module, and end-to-end training pipeline.

| Section | Description | Link |
| --- | --- | --- |
| Changelog | v2.3.0 (training pipeline + Docker), v2.2.0 (SOTA + WiFi-Mat), v2.1.0 (Rust port) | — |
| License | MIT License | LICENSE |
| Support | Bug reports, feature requests, community discussion | Issues · Discussions |

📡 Signal Processing & Sensing

📡 ESP32-S3 Hardware Pipeline (ADR-018) — 20 Hz CSI streaming, flash & provision

```
ESP32-S3 (STA + promiscuous)     UDP/5005       Rust aggregator
┌─────────────────────────┐    ──────────>    ┌──────────────────┐
│ WiFi CSI callback 20 Hz │    ADR-018        │ Esp32CsiParser   │
│ ADR-018 binary frames   │    binary         │ CsiFrame output  │
│ stream_sender (UDP)     │                   │ presence detect  │
└─────────────────────────┘                   └──────────────────┘
```

| Metric | Measured |
| --- | --- |
| Frame rate | ~20 Hz sustained |
| Subcarriers | 64 / 128 / 192 (LLTF, HT, HT40) |
| Latency | <1 ms (UDP loopback) |
| Presence detection | Motion score 10/10 at 3 m |

```bash
# Pre-built binaries — no toolchain required
# https://github.com/ruvnet/wifi-densepose/releases/tag/v0.1.0-esp32

python -m esptool --chip esp32s3 --port COM7 --baud 460800 \
  write-flash --flash-mode dio --flash-size 4MB \
  0x0 bootloader.bin 0x8000 partition-table.bin 0x10000 esp32-csi-node.bin

python scripts/provision.py --port COM7 \
  --ssid "YourWiFi" --password "secret" --target-ip 192.168.1.20

cargo run -p wifi-densepose-hardware --bin aggregator -- --bind 0.0.0.0:5005 --verbose
```

See firmware/esp32-csi-node/README.md and Tutorial #34.
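To make the aggregator's parsing step concrete, here is a minimal sketch of turning a CSI UDP payload into per-subcarrier amplitudes. The byte layout below (a `u32` sequence number, a `u16` subcarrier count, then interleaved signed I/Q bytes) is invented for illustration only — the real ADR-018 wire format is defined in the firmware README, not here.

```python
# Sketch only: hypothetical frame layout, NOT the actual ADR-018 format.
import math
import struct

def parse_csi_frame(payload: bytes):
    """Parse a hypothetical [u32 seq][u16 n_sub][i8 I, i8 Q] * n_sub frame."""
    seq, n_sub = struct.unpack_from("<IH", payload, 0)
    offset = 6  # end of the 6-byte header
    amplitudes = []
    for k in range(n_sub):
        i, q = struct.unpack_from("<bb", payload, offset + 2 * k)
        amplitudes.append(math.hypot(i, q))  # |I + jQ| per subcarrier
    return seq, amplitudes

# Build a synthetic 4-subcarrier frame and round-trip it through the parser.
payload = struct.pack("<IH", 42, 4) + struct.pack("<8b", 3, 4, 0, 5, -3, -4, 1, 0)
seq, amps = parse_csi_frame(payload)
```

The real parser (`Esp32CsiParser`) additionally handles the 64/128/192-subcarrier variants and presence scoring; this sketch only shows the amplitude extraction idea.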

🦀 Rust Implementation (v2) — 810x faster, 54K fps pipeline

Performance Benchmarks (Validated)

| Operation | Python (v1) | Rust (v2) | Speedup |
| --- | --- | --- | --- |
| CSI Preprocessing (4x64) | ~5 ms | 5.19 µs | ~1000x |
| Phase Sanitization (4x64) | ~3 ms | 3.84 µs | ~780x |
| Feature Extraction (4x64) | ~8 ms | 9.03 µs | ~890x |
| Motion Detection | ~1 ms | 186 ns | ~5400x |
| Full Pipeline | ~15 ms | 18.47 µs | ~810x |
| Vital Signs | N/A | 86 µs | 11,665 fps |

| Resource | Python (v1) | Rust (v2) |
| --- | --- | --- |
| Memory | ~500 MB | ~100 MB |
| Docker Image | 569 MB | 132 MB |
| Tests | 41 | 542+ |
| WASM Support | No | Yes |

```bash
cd rust-port/wifi-densepose-rs
cargo build --release
cargo test --workspace
cargo bench --package wifi-densepose-signal
```
💓 Vital Sign Detection (ADR-021) — Breathing and heartbeat via FFT

| Capability | Range | Method |
| --- | --- | --- |
| Breathing Rate | 6-30 BPM (0.1-0.5 Hz) | Bandpass filter + FFT peak detection |
| Heart Rate | 40-120 BPM (0.8-2.0 Hz) | Bandpass filter + FFT peak detection |
| Sampling Rate | 20 Hz (ESP32 CSI) | Real-time streaming |
| Confidence | 0.0-1.0 per sign | Spectral coherence + signal quality |

```bash
./target/release/sensing-server --source simulate --ui-path ../../ui
curl http://localhost:8080/api/v1/vital-signs
```

See ADR-021.
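The band-limited FFT peak search behind breathing detection can be sketched in a few lines. This is a pure-Python illustration of the method (restrict the spectrum to 0.1-0.5 Hz, take the dominant peak, convert Hz to BPM); the production detector is the Rust implementation described in ADR-021.

```python
# Illustrative breathing-rate estimator: bandpass via band-limited DFT search.
import cmath
import math

def breathing_bpm(samples, fs=20.0, lo=0.1, hi=0.5):
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]            # remove the DC component
    best_f, best_p = 0.0, -1.0
    for k in range(1, n // 2):
        f = k * fs / n
        if lo <= f <= hi:                      # search the breathing band only
            X = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            p = abs(X) ** 2
            if p > best_p:
                best_f, best_p = f, p
    return best_f * 60.0                       # Hz -> breaths per minute

# Synthetic chest-motion signal: 0.3 Hz (18 BPM), 30 s sampled at 20 Hz.
sig = [math.sin(2 * math.pi * 0.3 * t / 20.0) for t in range(600)]
bpm = breathing_bpm(sig)
```

Shifting the same search window to 0.8-2.0 Hz gives the heart-rate path from the table above.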

📡 WiFi Scan Domain Layer (ADR-022) — 8-stage RSSI pipeline for Windows WiFi

| Stage | Purpose |
| --- | --- |
| Predictive Gating | Pre-filter scan results using temporal prediction |
| Attention Weighting | Weight BSSIDs by signal relevance |
| Spatial Correlation | Cross-AP spatial signal correlation |
| Motion Estimation | Detect movement from RSSI variance |
| Breathing Extraction | Extract respiratory rate from sub-Hz oscillations |
| Quality Gating | Reject low-confidence estimates |
| Fingerprint Matching | Location and posture classification via RF fingerprints |
| Orchestration | Fuse all stages into unified sensing output |

```bash
cargo test -p wifi-densepose-wifiscan
```

See ADR-022 and Tutorial #36.
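The Motion Estimation stage reduces to a simple idea: a static room produces a nearly flat RSSI trace, while a moving body makes it swing. A minimal sketch, with thresholds invented for the example (the shipped stage in ADR-022 uses its own gating):

```python
# Illustrative motion estimator: sliding-window RSSI variance.
def motion_score(rssi_window):
    n = len(rssi_window)
    mean = sum(rssi_window) / n
    return sum((r - mean) ** 2 for r in rssi_window) / n  # population variance

static = [-60, -61, -60, -60, -61, -60]   # idle room: sub-dB jitter
moving = [-60, -55, -66, -52, -63, -57]   # person walking: multi-dB swings

# Hypothetical thresholds, chosen only so the toy example separates cleanly.
is_motion = motion_score(moving) > 4.0 and motion_score(static) < 1.0
```

In the real pipeline this score feeds Quality Gating, so a low-confidence window never reaches the fused output.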

🚨 WiFi-Mat: Disaster Response — Search & rescue, START triage, 3D localization

WiFi signals penetrate non-metallic debris (concrete, wood, drywall) where cameras and thermal sensors cannot reach. The WiFi-Mat module (wifi-densepose-mat, 139 tests) uses CSI analysis to detect survivors trapped under rubble, classify their condition using the START triage protocol, and estimate their 3D position — giving rescue teams actionable intelligence within seconds of deployment.

| Capability | How It Works | Performance Target |
| --- | --- | --- |
| Breathing Detection | Bandpass 0.07-1.0 Hz + Fresnel zone modeling detects chest displacement of 5-10 mm at 5 GHz | 4-60 BPM, <500 ms latency |
| Heartbeat Detection | Micro-Doppler shift extraction from fine-grained CSI phase variation | Via ruvector-temporal-tensor |
| 3D Localization | Multi-AP triangulation + CSI fingerprint matching + depth estimation through rubble layers | 3-5 m penetration |
| START Triage | Ensemble classifier votes on breathing + movement + vital stability → P1-P4 priority | <1% false negative |
| Zone Scanning | 16+ concurrent scan zones with periodic re-scan and audit logging | Full disaster site |
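As a worked instance of the Fresnel breathing model: a chest displacement Δd changes the reflected path length by 2Δd, giving a phase swing ΔΦ = 2π · 2Δd / λ and a received-amplitude variation proportional to |sin(ΔΦ/2)| (the relation used by the signal-processing layer). Plugging in the 5 mm displacement at 5 GHz from the table:

```python
# Worked Fresnel-zone numbers: 5 mm chest motion on a 5 GHz carrier.
import math

c = 3.0e8                    # speed of light, m/s
f = 5.0e9                    # 5 GHz carrier
lam = c / f                  # wavelength = 0.06 m (6 cm)

dd = 0.005                   # 5 mm displacement (low end of the 5-10 mm range)
dphi = 2 * math.pi * 2 * dd / lam   # phase swing ~ pi/3 rad
amp = abs(math.sin(dphi / 2))       # normalized amplitude variation = 0.5
```

A millimeter-scale chest motion thus produces a large, easily measurable phase swing at 5 GHz, which is why breathing survives attenuation through debris.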

Triage classification (START protocol compatible):

| Status | Color | Detection Criteria | Priority |
| --- | --- | --- | --- |
| Immediate | Red | Breathing detected, no movement | P1 |
| Delayed | Yellow | Movement + breathing, stable vitals | P2 |
| Minor | Green | Strong movement, responsive patterns | P3 |
| Deceased | Black | No vitals for >30 min continuous scan | P4 |

Deployment modes: portable (single TX/RX handheld), distributed (multiple APs around collapse site), drone-mounted (UAV scanning), vehicle-mounted (mobile command post).
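The triage table above can be read as a small decision function. The sketch below is only an illustration of the classification logic and the fail-safe default (assume life present on ambiguous signals); the shipped classifier is a multi-algorithm ensemble voter, and the 30-minute threshold is the only number taken from the table.

```python
# Illustrative START-compatible decision table, not the production ensemble.
def triage(breathing: bool, movement: bool, minutes_without_vitals: float) -> str:
    if not breathing and not movement:
        # Fail-safe default: only classify Black after a long continuous scan;
        # an ambiguous short scan is treated as a live P1 casualty.
        return "Black (P4)" if minutes_without_vitals > 30 else "Red (P1)"
    if breathing and not movement:
        return "Red (P1)"       # Immediate: breathing detected, no movement
    if breathing and movement:
        return "Yellow (P2)"    # Delayed: movement + breathing, stable vitals
    return "Green (P3)"         # Minor: strong movement, responsive patterns

labels = [triage(True, False, 0), triage(True, True, 0),
          triage(False, False, 45), triage(False, False, 5)]
```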

```rust
use wifi_densepose_mat::{DisasterResponse, DisasterConfig, DisasterType, ScanZone, ZoneBounds};

let config = DisasterConfig::builder()
    .disaster_type(DisasterType::Earthquake)
    .sensitivity(0.85)
    .max_depth(5.0)
    .build();

let mut response = DisasterResponse::new(config);
// `location` is the incident coordinate supplied by the operator.
response.initialize_event(location, "Building collapse")?;
response.add_zone(ScanZone::new("North Wing", ZoneBounds::rectangle(0.0, 0.0, 30.0, 20.0)))?;
response.start_scanning().await?;
```

Safety guarantees: fail-safe defaults (assume life present on ambiguous signals), redundant multi-algorithm voting, complete audit trail, offline-capable (no network required).

🔬 SOTA Signal Processing (ADR-014) — 6 research-grade algorithms

The signal processing layer bridges the gap between raw commodity WiFi hardware output and research-grade sensing accuracy. Each algorithm addresses a specific limitation of naive CSI processing — from hardware-induced phase corruption to environment-dependent multipath interference. All six are implemented in wifi-densepose-signal/src/ with deterministic tests and no mock data.

| Algorithm | What It Does | Why It Matters | Math | Source |
| --- | --- | --- | --- | --- |
| Conjugate Multiplication | Multiplies CSI antenna pairs: H₁[k] × conj(H₂[k]) | Cancels CFO, SFO, and packet detection delay that corrupt raw phase — preserves only environment-caused phase differences | CSI_ratio[k] = H₁[k] · conj(H₂[k]) | SpotFi (SIGCOMM 2015) |
| Hampel Filter | Replaces outliers using running median ± scaled MAD | The Z-score uses mean/std, which are corrupted by the very outliers it detects (the masking effect); Hampel uses median/MAD, resisting up to 50% contamination | σ̂ = 1.4826 × MAD | Standard DSP; WiGest (2015) |
| Fresnel Zone Model | Models signal variation from chest displacement crossing Fresnel zone boundaries | Zero-crossing counting fails in multipath-rich environments; Fresnel predicts where breathing should appear based on TX-RX-body geometry | ΔΦ = 2π × 2Δd / λ, A = \|sin(ΔΦ/2)\| | FarSense (MobiCom 2019) |
| CSI Spectrogram | Sliding-window FFT (STFT) per subcarrier → 2D time-frequency matrix | Breathing = 0.2-0.4 Hz band, walking = 1-2 Hz, static = noise; the 2D structure enables CNN spatial pattern recognition that 1D features miss | S[t,f] = \|Σₙ x[n] w[n−t] e^{−j2πfn}\|² | Standard since 2018 |
| Subcarrier Selection | Ranks subcarriers by motion sensitivity (variance ratio) and selects top-K | Not all subcarriers respond to motion — some sit in multipath nulls; selecting the 10-20 most sensitive improves SNR by 6-10 dB | sensitivity[k] = var_motion / var_static | WiDance (MobiCom 2017) |
| Body Velocity Profile | Extracts velocity distribution from Doppler shifts across subcarriers | BVP is domain-independent — the same velocity profile appears regardless of room layout, furniture, or AP placement; the basis for cross-environment recognition | BVP[v,t] = Σₖ \|STFTₖ[v,t]\| | Widar 3.0 (MobiSys 2019) |

Processing pipeline order: Raw CSI → Conjugate multiplication (phase cleaning) → Hampel filter (outlier removal) → Subcarrier selection (top-K) → CSI spectrogram (time-frequency) → Fresnel model (breathing) + BVP (activity)

See ADR-014 for full mathematical derivations.
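The Hampel filter row is easy to demonstrate: replace a sample with the running median whenever it deviates from that median by more than n·σ̂, with σ̂ = 1.4826 × MAD as in the table. A minimal sketch (window size and threshold are illustrative defaults, not the shipped Rust parameters):

```python
# Illustrative Hampel filter over a 1-D CSI amplitude series.
import statistics

def hampel(xs, half_window=3, n_sigma=3.0):
    out = list(xs)
    for i in range(len(xs)):
        lo, hi = max(0, i - half_window), min(len(xs), i + half_window + 1)
        window = xs[lo:hi]
        med = statistics.median(window)
        mad = statistics.median([abs(v - med) for v in window])
        sigma = 1.4826 * mad                 # robust std-dev estimate from the MAD
        if sigma > 0 and abs(xs[i] - med) > n_sigma * sigma:
            out[i] = med                     # outlier -> replaced by running median
    return out

noisy = [1.0, 1.1, 0.9, 50.0, 1.0, 1.05, 0.95]   # one hardware spike in the trace
clean = hampel(noisy)
```

Note how the spike at index 3 is pulled back to the median while the legitimate small fluctuations are untouched — exactly the masking-resistance argument from the table.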


🧠 Models & Training

📦 RVF Model Container — Single-file deployment with progressive loading

The RuVector Format (RVF) packages an entire trained model — weights, HNSW indexes, quantization codebooks, SONA adaptation deltas, and WASM inference runtime — into a single self-contained binary file. No external dependencies are needed at deployment time.

Container structure (each segment carries a 64-byte header):

```
RVF Container (.rvf)
┌──────────────┐
│ Manifest     │  Magic: 0x52564653 ("RVFS"), type + content hash + compression
├──────────────┤
│ Weights      │  Model parameters (f32/f16/u8)
├──────────────┤
│ HNSW Index   │  Vector search index
├──────────────┤
│ Quant        │  Quantization codebooks
├──────────────┤
│ SONA Profile │  LoRA deltas + EWC++ Fisher matrix
├──────────────┤
│ Witness      │  Ed25519 training proof
├──────────────┤
│ Vitals Config│  Breathing/HR filter parameters
└──────────────┘
```

Deployment targets:

| Target | Quantization | Size | Load Time | Use Case |
| --- | --- | --- | --- | --- |
| ESP32 / IoT | int4 | ~0.7 MB | <5 ms (Layer A) | Presence + breathing only |
| Mobile / WebView | int8 | ~6 MB | ~200 ms (Layer B) | Pose estimation on phone |
| Browser (WASM) | int8 | ~10 MB | ~500 ms (Layer B) | In-browser demo |
| Field (WiFi-Mat) | fp16 | ~62 MB | ~2 s (Layer C) | Full DensePose + disaster triage |
| Server / Cloud | f32 | ~50+ MB | ~3 s (Layer C) | Training + full inference |

| Property | Detail |
| --- | --- |
| Format | Segment-based binary, 20+ segment types, CRC32 integrity per segment |
| Progressive Loading | Layer A (<5 ms): manifest + entry points → Layer B (100 ms-1 s): hot weights + adjacency → Layer C (seconds): full graph |
| Signing | Ed25519 training proofs for verifiable provenance — chain of custody from training data to deployed model |
| Quantization | Per-segment temperature-tiered: f32 (full), f16 (half), u8 (int8), int4 — with SIMD-accelerated distance computation |
| CLI | `--export-rvf` (generate), `--load-rvf` (config), `--save-rvf` (persist), `--model` (inference), `--progressive` (3-layer load) |
```bash
# Export model package
./target/release/sensing-server --export-rvf wifi-densepose-v1.rvf

# Load and run with progressive loading
./target/release/sensing-server --model wifi-densepose-v1.rvf --progressive

# Export via Docker
docker run --rm -v "$(pwd)":/out ruvnet/wifi-densepose:latest --export-rvf /out/model.rvf
```

Built on the rvf crate family (rvf-types, rvf-wire, rvf-manifest, rvf-index, rvf-quant, rvf-crypto, rvf-runtime). See ADR-023.
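The per-segment integrity check amounts to: verify the magic number (0x52564653, "RVFS") and recompute the payload CRC32. The field offsets in the sketch below are placeholders for illustration — the actual 64-byte header layout is defined by the rvf crates and ADR-023.

```python
# Sketch only: hypothetical header fields, NOT the actual RVF layout.
import struct
import zlib

MAGIC = 0x52564653  # "RVFS"

def check_segment(header: bytes, payload: bytes) -> bool:
    """Validate one segment: magic, recorded length, and payload CRC32."""
    magic, seg_type, length, crc = struct.unpack_from("<IIQI", header, 0)
    return magic == MAGIC and length == len(payload) and crc == zlib.crc32(payload)

payload = b"model-weights"
header = struct.pack("<IIQI", MAGIC, 1, len(payload), zlib.crc32(payload))
header = header.ljust(64, b"\x00")   # pad to the 64-byte header size
ok = check_segment(header, payload)
```

The same check run against a tampered payload fails, which is what makes per-segment CRC32 useful for partial (Layer A/B) loads: a corrupt cold segment is caught without touching the rest of the file.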

🧬 Training & Fine-Tuning — MM-Fi/Wi-Pose pre-training, SONA adaptation

The training pipeline implements 8 phases in pure Rust (7,832 lines, zero external ML dependencies). It trains a graph transformer with cross-attention to map CSI feature matrices to 17 COCO body keypoints and DensePose UV coordinates — following the approach of the CMU "DensePose From WiFi" paper (arXiv:2301.00250). RuVector crates provide the core building blocks: ruvector-attention for cross-attention layers, ruvector-mincut for multi-person matching, and ruvector-temporal-tensor for CSI buffer compression.

Three-tier data strategy:

| Tier | Method | Purpose | RuVector Integration |
| --- | --- | --- | --- |
| 1. Pre-train | MM-Fi + Wi-Pose public datasets | Cross-environment generalization (multi-subject, multi-room) | ruvector-temporal-tensor compresses CSI windows (114→56 subcarrier resampling) |
| 2. Fine-tune | ESP32 CSI + camera pseudo-labels | Environment-specific multipath adaptation | ruvector-solver for Fresnel geometry, ruvector-attn-mincut for subcarrier gating |
| 3. SONA adapt | Micro-LoRA (rank-4) + EWC++ | Continuous on-device learning without catastrophic forgetting | SONA architecture (Self-Optimizing Neural Architecture) |
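The 114→56 subcarrier resampling used to align MM-Fi captures with the ESP32 layout can be sketched as plain linear interpolation over the subcarrier axis. This is an illustration of the idea only — the production resampler lives in the Rust dataset loader.

```python
# Illustrative subcarrier resampling: map 114 source bins onto 56 target bins.
def resample(values, target_len):
    n = len(values)
    out = []
    for j in range(target_len):
        pos = j * (n - 1) / (target_len - 1)   # target index on the source axis
        i = int(pos)
        frac = pos - i
        right = values[min(i + 1, n - 1)]
        out.append(values[i] * (1 - frac) + right * frac)  # linear blend
    return out

src = list(range(114))        # stand-in for 114 subcarrier amplitudes
dst = resample(src, 56)       # 56-bin series aligned with the ESP32 layout
```

The same routine handles the 30→56 case for Wi-Pose, where it upsamples instead of downsampling.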

Training pipeline components:

Phase Module What It Does RuVector Crate
1 dataset.rs (850 lines) MM-Fi .npy + Wi-Pose .mat loaders, subcarrier resampling (114→56, 30→56), windowing ruvector-temporal-tensor
2 graph_transformer.rs (855 lines) COCO BodyGraph (17 kp, 16 edges), AntennaGraph, multi-head CrossAttention, GCN message passing ruvector-attention
3 trainer.rs (881 lines) 6-term composite loss (MSE, CE, UV, temporal, bone, symmetry), SGD+momentum, cosine+warmup, PCK/OKS ruvector-mincut (person matching)
4 sona.rs (639 lines) LoRA adapters (A×B delta), EWC++ Fisher regularization, EnvironmentDetector (3-sigma drift) sona
5 sparse_inference.rs (753 lines) NeuronProfiler hot/cold partitioning, SparseLinear (skip cold rows), INT8/FP16 quantization ruvector-sparse-inference
6 rvf_pipeline.rs (1,027 lines) Progressive 3-layer loader, HNSW index, OverlayGraph, RvfModelBuilder ruvector-core (HNSW)
7 rvf_container.rs (914 lines) Binary container format, 6+ segment types, CRC32 integrity rvf
8 main.rs integration --train, --model, --progressive CLI flags, REST endpoints —
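Two of the six loss terms (coordinate MSE and the bone-length penalty) can be illustrated as below. The term formulas and weights in trainer.rs are not specified in this README, so treat this as a sketch under assumed definitions:

```python
def mse_term(pred, target):
    # Mean squared error over 2D keypoint coordinates.
    n = len(pred)
    return sum((px - tx) ** 2 + (py - ty) ** 2
               for (px, py), (tx, ty) in zip(pred, target)) / n

def bone_term(pred, target, edges):
    # Penalize differences in bone (edge) lengths between prediction and
    # ground truth -- encourages anatomically plausible limb proportions.
    def length(kps, a, b):
        return ((kps[a][0] - kps[b][0]) ** 2
                + (kps[a][1] - kps[b][1]) ** 2) ** 0.5
    return sum((length(pred, a, b) - length(target, a, b)) ** 2
               for a, b in edges) / len(edges)

def composite_loss(pred, target, edges, w_mse=1.0, w_bone=0.1):
    # Two of the six terms with placeholder weights; the real trainer
    # also adds CE, UV, temporal, and symmetry terms.
    return w_mse * mse_term(pred, target) + w_bone * bone_term(pred, target, edges)
```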

SONA (Self-Optimizing Neural Architecture) — the continuous adaptation system:

Component What It Does Why It Matters
Micro-LoRA (rank-4) Trains small A×B weight deltas instead of full weights 100x fewer parameters to update → runs on ESP32
EWC++ (Fisher matrix) Penalizes changes to important weights from previous environments Prevents catastrophic forgetting when moving between rooms
EnvironmentDetector Monitors CSI feature drift with 3-sigma threshold Auto-triggers adaptation when the model is moved to a new space
Best-epoch snapshot Saves best validation loss weights, restores before export Prevents shipping overfit final-epoch parameters
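A minimal sketch of the two core SONA mechanisms, the Micro-LoRA weight delta and the EWC quadratic penalty (function names and shapes here are illustrative, not the sona crate API):

```python
def lora_apply(w, a, b):
    """Effective weight W' = W + A x B, where A is (out x r) and B is
    (r x in) with small rank r (rank-4 in SONA). Only A and B are
    trained during adaptation; the base weights W stay frozen."""
    rows, cols, r = len(w), len(w[0]), len(b)
    return [[w[i][j] + sum(a[i][k] * b[k][j] for k in range(r))
             for j in range(cols)] for i in range(rows)]

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: large Fisher values mark weights that mattered
    in previous environments; moving them away from theta_star is
    penalized quadratically, which limits catastrophic forgetting."""
    return lam * sum(f * (t - ts) ** 2
                     for f, t, ts in zip(fisher, theta, theta_star))
```

With rank 4 the adapter adds only 4·(in + out) parameters per layer, which is why adaptation fits on an ESP32-class device.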
# Pre-train on MM-Fi dataset
./target/release/sensing-server --train --dataset data/ --dataset-type mmfi --epochs 100

# Train and export to RVF in one step
./target/release/sensing-server --train --dataset data/ --epochs 100 --save-rvf model.rvf

# Via Docker (no toolchain needed)
docker run --rm -v $(pwd)/data:/data ruvnet/wifi-densepose:latest \
  --train --dataset /data --epochs 100 --export-rvf /data/model.rvf

See ADR-023 · SONA crate · arXiv:2301.00250

🔩 RuVector Crates — 11 vendored signal intelligence crates from github.com/ruvnet/ruvector

5 directly-used crates (v2.0.4, declared in Cargo.toml, 7 integration points):

Crate What It Does Where It's Used in WiFi-DensePose Source
ruvector-attention Scaled dot-product attention, MoE routing, sparse attention model.rs (spatial attention), bvp.rs (sensitivity-weighted velocity profiles) crate
ruvector-mincut Subpolynomial dynamic min-cut O(n^1.5 log n) metrics.rs (DynamicPersonMatcher — multi-person assignment), subcarrier_selection.rs (sensitive/insensitive split) crate
ruvector-attn-mincut Attention-gated spectrogram noise suppression model.rs (antenna attention gating), spectrogram.rs (gate noisy time-frequency bins) crate
ruvector-solver Sparse Neumann series solver O(sqrt(n)) fresnel.rs (TX-body-RX geometry), triangulation.rs (3D localization), subcarrier.rs (sparse interpolation 114→56) crate
ruvector-temporal-tensor Tiered temporal compression (8/7/5/3-bit) dataset.rs (CSI buffer compression), breathing.rs + heartbeat.rs (compressed vital sign spectrograms) crate

6 additional vendored crates (used by training pipeline and inference):

Crate What It Does Source
ruvector-core VectorDB engine, HNSW index, SIMD distance functions, quantization codebooks crate
ruvector-gnn Graph neural network layers, graph attention, EWC-regularized training crate
ruvector-graph-transformer Proof-gated graph transformer with cross-attention crate
ruvector-sparse-inference PowerInfer-style hot/cold neuron partitioning, skip cold rows at runtime crate
ruvector-nervous-system PredictiveLayer, OscillatoryRouter, Hopfield associative memory crate
ruvector-coherence Spectral coherence monitoring, HNSW graph health, Fiedler connectivity crate

The full RuVector ecosystem includes 90+ crates. See github.com/ruvnet/ruvector for the complete library, and vendor/ruvector/ for the vendored source in this project.


πŸ—οΈ System Architecture β€” End-to-end data flow from CSI capture to REST/WebSocket API

End-to-End Pipeline

graph TB
    subgraph HW ["📡 Hardware Layer"]
        direction LR
        R1["WiFi Router 1<br/><small>CSI Source</small>"]
        R2["WiFi Router 2<br/><small>CSI Source</small>"]
        R3["WiFi Router 3<br/><small>CSI Source</small>"]
        ESP["ESP32-S3 Mesh<br/><small>20 Hz · 56 subcarriers</small>"]
        WIN["Windows WiFi<br/><small>RSSI scanning</small>"]
    end

    subgraph INGEST ["⚡ Ingestion"]
        AGG["Aggregator<br/><small>UDP :5005 · ADR-018 frames</small>"]
        BRIDGE["Bridge<br/><small>I/Q → amplitude + phase</small>"]
    end

    subgraph SIGNAL ["🔬 Signal Processing — RuVector v2.0.4"]
        direction TB
        PHASE["Phase Sanitization<br/><small>SpotFi conjugate multiply</small>"]
        HAMPEL["Hampel Filter<br/><small>Outlier rejection · σ=3</small>"]
        SUBSEL["Subcarrier Selection<br/><small>ruvector-mincut · sensitive/insensitive split</small>"]
        SPEC["Spectrogram<br/><small>ruvector-attn-mincut · gated STFT</small>"]
        FRESNEL["Fresnel Geometry<br/><small>ruvector-solver · TX-body-RX distance</small>"]
        BVP["Body Velocity Profile<br/><small>ruvector-attention · weighted BVP</small>"]
    end

    subgraph ML ["🧠 Neural Pipeline"]
        direction TB
        GRAPH["Graph Transformer<br/><small>17 COCO keypoints · 16 edges</small>"]
        CROSS["Cross-Attention<br/><small>CSI features → body pose</small>"]
        SONA["SONA Adapter<br/><small>LoRA rank-4 · EWC++</small>"]
    end

    subgraph VITAL ["💓 Vital Signs"]
        direction LR
        BREATH["Breathing<br/><small>0.1–0.5 Hz · FFT peak</small>"]
        HEART["Heart Rate<br/><small>0.8–2.0 Hz · FFT peak</small>"]
        MOTION["Motion Level<br/><small>Variance + band power</small>"]
    end

    subgraph API ["🌐 Output Layer"]
        direction LR
        REST["REST API<br/><small>Axum :3000 · 6 endpoints</small>"]
        WS["WebSocket<br/><small>:3001 · real-time stream</small>"]
        ANALYTICS["Analytics<br/><small>Fall · Activity · START triage</small>"]
        UI["Web UI<br/><small>Three.js · Gaussian splats</small>"]
    end

    R1 & R2 & R3 --> AGG
    ESP --> AGG
    WIN --> BRIDGE
    AGG --> BRIDGE
    BRIDGE --> PHASE
    PHASE --> HAMPEL
    HAMPEL --> SUBSEL
    SUBSEL --> SPEC
    SPEC --> FRESNEL
    FRESNEL --> BVP
    BVP --> GRAPH
    GRAPH --> CROSS
    CROSS --> SONA
    SONA --> BREATH & HEART & MOTION
    BREATH & HEART & MOTION --> REST & WS & ANALYTICS
    WS --> UI

    style HW fill:#1a1a2e,stroke:#e94560,color:#eee
    style INGEST fill:#16213e,stroke:#0f3460,color:#eee
    style SIGNAL fill:#0f3460,stroke:#533483,color:#eee
    style ML fill:#533483,stroke:#e94560,color:#eee
    style VITAL fill:#2d132c,stroke:#e94560,color:#eee
    style API fill:#1a1a2e,stroke:#0f3460,color:#eee
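The Fresnel Geometry stage in the pipeline above rests on standard Fresnel zone math: the n-th zone radius at a point d1 metres from the transmitter and d2 metres from the receiver is sqrt(n·λ·d1·d2/(d1+d2)). A sketch of that formula (the actual ruvector-solver API differs):

```python
import math

def fresnel_radius(n, d1, d2, freq_hz=2.4e9):
    """Radius of the n-th Fresnel zone at a point d1 m from the
    transmitter and d2 m from the receiver. A body crossing the first
    zone perturbs the received signal most strongly, which is what
    Fresnel-based sensing exploits."""
    c = 299_792_458.0       # speed of light, m/s
    lam = c / freq_hz       # wavelength (~12.5 cm at 2.4 GHz)
    return math.sqrt(n * lam * d1 * d2 / (d1 + d2))
```

At the midpoint of a 4 m link the first-zone radius is roughly 35 cm at 2.4 GHz, so even small body movements modulate the dominant propagation path.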

Signal Processing Detail

graph LR
    subgraph RAW ["Raw CSI Frame"]
        IQ["I/Q Samples<br/><small>56–192 subcarriers × N antennas</small>"]
    end

    subgraph CLEAN ["Phase Cleanup"]
        CONJ["Conjugate Multiply<br/><small>Remove carrier freq offset</small>"]
        UNWRAP["Phase Unwrap<br/><small>Remove 2π discontinuities</small>"]
        HAMPEL2["Hampel Filter<br/><small>Remove impulse noise</small>"]
    end

    subgraph SELECT ["Subcarrier Intelligence"]
        MINCUT["Min-Cut Partition<br/><small>ruvector-mincut</small>"]
        GATE["Attention Gate<br/><small>ruvector-attn-mincut</small>"]
    end

    subgraph EXTRACT ["Feature Extraction"]
        STFT["STFT Spectrogram<br/><small>Time-frequency decomposition</small>"]
        FRESNELZ["Fresnel Zones<br/><small>ruvector-solver</small>"]
        BVPE["BVP Estimation<br/><small>ruvector-attention</small>"]
    end

    subgraph OUT ["Output Features"]
        AMP["Amplitude Matrix"]
        PHASE2["Phase Matrix"]
        DOPPLER["Doppler Shifts"]
        VITALS["Vital Band Power"]
    end

    IQ --> CONJ --> UNWRAP --> HAMPEL2
    HAMPEL2 --> MINCUT --> GATE
    GATE --> STFT --> FRESNELZ --> BVPE
    BVPE --> AMP & PHASE2 & DOPPLER & VITALS

    style RAW fill:#0d1117,stroke:#58a6ff,color:#c9d1d9
    style CLEAN fill:#161b22,stroke:#58a6ff,color:#c9d1d9
    style SELECT fill:#161b22,stroke:#d29922,color:#c9d1d9
    style EXTRACT fill:#161b22,stroke:#3fb950,color:#c9d1d9
    style OUT fill:#0d1117,stroke:#8b949e,color:#c9d1d9
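The Phase Cleanup stages in the diagram above (phase unwrap, Hampel filter) can be sketched in a few lines. The window size is illustrative; the σ=3 threshold follows the pipeline diagram:

```python
import math

def phase_unwrap(phases):
    """Remove 2*pi discontinuities from a raw CSI phase sequence."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        # Shift each step back into (-pi, pi] before accumulating.
        d = (d + math.pi) % (2 * math.pi) - math.pi
        out.append(out[-1] + d)
    return out

def hampel(xs, window=3, sigma=3.0):
    """Hampel filter: replace samples more than `sigma` scaled MADs away
    from the local median with that median (impulse-noise rejection)."""
    out = list(xs)
    for i in range(len(xs)):
        lo, hi = max(0, i - window), min(len(xs), i + window + 1)
        local = sorted(xs[lo:hi])
        med = local[len(local) // 2]
        mad = sorted(abs(x - med) for x in local)[len(local) // 2]
        if abs(xs[i] - med) > sigma * 1.4826 * mad:
            out[i] = med
    return out
```

Note the degenerate case: when the local MAD is zero (a flat window), any deviating sample is treated as an outlier and snapped to the median.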

Deployment Topology

graph TB
    subgraph EDGE ["Edge (ESP32-S3 Mesh)"]
        E1["Node 1<br/><small>Kitchen</small>"]
        E2["Node 2<br/><small>Living room</small>"]
        E3["Node 3<br/><small>Bedroom</small>"]
    end

    subgraph SERVER ["Server (Rust · 132 MB Docker)"]
        SENSE["Sensing Server<br/><small>:3000 REST · :3001 WS · :5005 UDP</small>"]
        RVF["RVF Model<br/><small>Progressive 3-layer load</small>"]
        STORE["Time-Series Store<br/><small>In-memory ring buffer</small>"]
    end

    subgraph CLIENT ["Clients"]
        BROWSER["Browser<br/><small>Three.js UI · Gaussian splats</small>"]
        MOBILE["Mobile App<br/><small>WebSocket stream</small>"]
        DASH["Dashboard<br/><small>REST polling</small>"]
        IOT["Home Automation<br/><small>MQTT bridge</small>"]
    end

    E1 -->|"UDP :5005<br/>ADR-018 frames"| SENSE
    E2 -->|"UDP :5005"| SENSE
    E3 -->|"UDP :5005"| SENSE
    SENSE <--> RVF
    SENSE <--> STORE
    SENSE -->|"WS :3001<br/>real-time JSON"| BROWSER & MOBILE
    SENSE -->|"REST :3000<br/>on-demand"| DASH & IOT

    style EDGE fill:#1a1a2e,stroke:#e94560,color:#eee
    style SERVER fill:#16213e,stroke:#533483,color:#eee
    style CLIENT fill:#0f3460,stroke:#0f3460,color:#eee
Component Crate / Module Description
Aggregator wifi-densepose-hardware ESP32 UDP listener, ADR-018 frame parser, I/Q β†’ amplitude/phase bridge
Signal Processor wifi-densepose-signal SpotFi phase sanitization, Hampel filter, STFT spectrogram, Fresnel geometry, BVP
Subcarrier Selection ruvector-mincut + ruvector-attn-mincut Dynamic sensitive/insensitive partitioning, attention-gated noise suppression
Fresnel Solver ruvector-solver Sparse Neumann series O(sqrt(n)) for TX-body-RX distance estimation
Graph Transformer wifi-densepose-train COCO BodyGraph (17 kp, 16 edges), cross-attention CSI→pose, GCN message passing
SONA sona crate Micro-LoRA (rank-4) adaptation, EWC++ catastrophic forgetting prevention
Vital Signs wifi-densepose-signal FFT-based breathing (0.1-0.5 Hz) and heartbeat (0.8-2.0 Hz) extraction
REST API wifi-densepose-sensing-server Axum server: /api/v1/sensing, /health, /vital-signs, /bssid, /sona
WebSocket wifi-densepose-sensing-server Real-time pose, sensing, and vital sign streaming on :3001
Analytics wifi-densepose-mat Fall detection, activity recognition, START triage (WiFi-Mat disaster module)
Web UI ui/ Three.js scene, Gaussian splat visualization, signal dashboard
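The vital-sign extraction listed in the component table (bandpass plus FFT peak) reduces to picking the dominant frequency bin inside the breathing or heartbeat band. A self-contained sketch, using a direct DFT for clarity where the Rust crate uses a real FFT:

```python
import math

def band_peak_bpm(samples, fs, lo_hz, hi_hz):
    """Find the dominant frequency in [lo_hz, hi_hz] and convert it to
    per-minute rate. Breathing uses the 0.1-0.5 Hz band (6-30 BPM),
    heartbeat 0.8-2.0 Hz (40-120 BPM)."""
    n = len(samples)
    mean = sum(samples) / n
    xs = [s - mean for s in samples]        # remove DC before the DFT
    best_bin, best_mag = None, -1.0
    for k in range(1, n // 2):
        f = k * fs / n
        if not (lo_hz <= f <= hi_hz):
            continue
        # Direct DFT at bin k (illustrative; use an FFT in practice).
        re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(xs))
        im = sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(xs))
        mag = re * re + im * im
        if mag > best_mag:
            best_bin, best_mag = k, mag
    return None if best_bin is None else best_bin * fs / n * 60.0
```

A 20 s window at the ESP32's 20 Hz rate gives 0.05 Hz bin spacing, i.e. 3 BPM resolution before peak interpolation.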

🖥️ CLI Usage

Rust Sensing Server — Primary CLI interface
# Start with simulated data (no hardware)
./target/release/sensing-server --source simulate --ui-path ../../ui

# Start with ESP32 CSI hardware
./target/release/sensing-server --source esp32 --udp-port 5005

# Start with Windows WiFi RSSI
./target/release/sensing-server --source wifi

# Run vital sign benchmark
./target/release/sensing-server --benchmark

# Export RVF model package
./target/release/sensing-server --export-rvf model.rvf

# Train a model
./target/release/sensing-server --train --dataset data/ --epochs 100

# Load trained model with progressive loading
./target/release/sensing-server --model wifi-densepose-v1.rvf --progressive
Flag Description
--source Data source: auto, wifi, esp32, simulate
--http-port HTTP port for UI and REST API (default: 8080)
--ws-port WebSocket port (default: 8765)
--udp-port UDP port for ESP32 CSI frames (default: 5005)
--benchmark Run vital sign benchmark (1000 frames) and exit
--export-rvf Export RVF container package and exit
--load-rvf Load model config from RVF container
--save-rvf Save model state on shutdown
--model Load trained .rvf model for inference
--progressive Enable progressive loading (Layer A instant start)
--train Train a model and exit
--dataset Path to dataset directory (MM-Fi or Wi-Pose)
--epochs Training epochs (default: 100)
REST API & WebSocket — Endpoints reference

REST API (Rust Sensing Server)

GET  /api/v1/sensing              # Latest sensing frame
GET  /api/v1/vital-signs          # Breathing, heart rate, confidence
GET  /api/v1/bssid                # Multi-BSSID registry
GET  /api/v1/model/layers         # Progressive loading status
GET  /api/v1/model/sona/profiles  # SONA profiles
POST /api/v1/model/sona/activate  # Activate SONA profile

WebSocket: ws://localhost:8765/ws/sensing (real-time sensing + vital signs)

Default ports: HTTP 8080, WS 8765. Docker images remap to 3000/3001 via --http-port / --ws-port.
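A minimal polling client for the vital-signs endpoint might look like this. The JSON field names (breathing_bpm, heart_rate_bpm, confidence) are assumptions, since the README does not specify the response schema:

```python
import json
from urllib.request import urlopen

BASE = "http://localhost:8080"   # default HTTP port; Docker remaps to 3000

def vital_signs_url(base=BASE):
    # Endpoint path from the REST reference above.
    return base + "/api/v1/vital-signs"

def parse_vitals(payload):
    """Pull the fields the endpoint is documented to return (breathing,
    heart rate, confidence); the exact JSON key names are assumed."""
    data = json.loads(payload)
    return (data.get("breathing_bpm"),
            data.get("heart_rate_bpm"),
            data.get("confidence"))

def fetch_vitals(base=BASE, timeout=2.0):
    # One blocking poll; loop with a sleep for dashboard-style polling.
    with urlopen(vital_signs_url(base), timeout=timeout) as resp:
        return parse_vitals(resp.read())
```

For real-time updates the WebSocket stream on ws://localhost:8765/ws/sensing avoids polling entirely.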

Hardware Support — Devices, cost, and guides
Hardware CSI Cost Guide
ESP32-S3 Native ~$8 Tutorial #34
Intel 5300 Firmware mod ~$15 Linux iwl-csi
Atheros AR9580 ath9k patch ~$20 Linux only
Any Windows WiFi RSSI only $0 Tutorial #36
Python Legacy CLI — v1 API server commands
wifi-densepose start                    # Start API server
wifi-densepose -c config.yaml start     # Custom config
wifi-densepose -v start                 # Verbose logging
wifi-densepose status                   # Check status
wifi-densepose stop                     # Stop server
wifi-densepose config show              # Show configuration
wifi-densepose db init                  # Initialize database
wifi-densepose tasks list               # List background tasks
Documentation Links

🧪 Testing

542+ tests across 7 suites — zero mocks, hardware-free simulation
# Rust tests (primary — 542+ tests)
cd rust-port/wifi-densepose-rs
cargo test --workspace

# Sensing server tests (229 tests)
cargo test -p wifi-densepose-sensing-server

# Vital sign benchmark
./target/release/sensing-server --benchmark

# Python tests
python -m pytest v1/tests/ -v

# Pipeline verification (no hardware needed)
./verify
Suite Tests What It Covers
sensing-server lib 147 Graph transformer, trainer, SONA, sparse inference, RVF
sensing-server bin 48 CLI integration, WebSocket, REST API
RVF integration 16 Container build, read, progressive load
Vital signs integration 18 FFT detection, breathing, heartbeat
wifi-densepose-signal 83 SOTA algorithms, Doppler, Fresnel
wifi-densepose-mat 139 Disaster response, triage, localization
wifi-densepose-wifiscan 91 8-stage RSSI pipeline

🚀 Deployment

Docker deployment — Production setup with docker-compose
# Rust sensing server (132 MB)
docker pull ruvnet/wifi-densepose:latest
docker run -p 3000:3000 -p 3001:3001 -p 5005:5005/udp ruvnet/wifi-densepose:latest

# Python pipeline (569 MB)
docker pull ruvnet/wifi-densepose:python
docker run -p 8765:8765 -p 8080:8080 ruvnet/wifi-densepose:python

# Both via docker-compose
cd docker && docker compose up

# Export RVF model
docker run --rm -v $(pwd):/out ruvnet/wifi-densepose:latest --export-rvf /out/model.rvf

Environment Variables

RUST_LOG=info                    # Logging level
WIFI_INTERFACE=wlan0             # WiFi interface for RSSI
POSE_CONFIDENCE_THRESHOLD=0.7    # Minimum confidence
POSE_MAX_PERSONS=10              # Max tracked individuals

📊 Performance Metrics

Measured benchmarks — Rust sensing server, validated via cargo bench

Rust Sensing Server

Metric Value
Vital sign detection 11,665 fps (86 µs/frame)
Full CSI pipeline 54,000 fps (18.47 µs/frame)
Motion detection 186 ns (~5,400x vs Python)
Docker image 132 MB
Memory usage ~100 MB
Test count 542+

Python vs Rust

Operation Python Rust Speedup
CSI Preprocessing ~5 ms 5.19 µs 1000x
Phase Sanitization ~3 ms 3.84 µs 780x
Feature Extraction ~8 ms 9.03 µs 890x
Motion Detection ~1 ms 186 ns 5400x
Full Pipeline ~15 ms 18.47 µs 810x

🤝 Contributing

Dev setup, code standards, PR process
git clone https://github.com/ruvnet/wifi-densepose.git
cd wifi-densepose

# Rust development
cd rust-port/wifi-densepose-rs
cargo build --release
cargo test --workspace

# Python development
python -m venv venv && source venv/bin/activate
pip install -r requirements-dev.txt && pip install -e .
pre-commit install
  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes
  4. Push and open a Pull Request

📄 Changelog

Release history

v2.3.0 — 2026-03-01

The largest release to date: it delivers the complete end-to-end training pipeline, Docker images, and vital sign detection. The Rust sensing server now supports full model training, RVF export, and progressive model loading from a single binary.

  • Docker images published — ruvnet/wifi-densepose:latest (132 MB Rust) and :python (569 MB)
  • 8-phase DensePose training pipeline (ADR-023) — dataset loaders (MM-Fi, Wi-Pose), graph transformer with cross-attention, 6-term composite loss, cosine-scheduled SGD, PCK/OKS validation, SONA adaptation, sparse inference engine, RVF model packaging
  • --export-rvf CLI flag — standalone RVF model container generation with vital config, training proof, and SONA profiles
  • --train CLI flag — full training mode with best-epoch snapshotting and checkpoint saving
  • Vital sign detection (ADR-021) — FFT-based breathing (6-30 BPM) and heartbeat (40-120 BPM) extraction, 11,665 fps benchmark
  • WiFi scan domain layer (ADR-022) — 8-stage pure-Rust signal intelligence pipeline for Windows WiFi RSSI
  • New crates — wifi-densepose-vitals (1,863 lines) and wifi-densepose-wifiscan (4,829 lines)
  • 542+ Rust tests — all passing, zero mocks

v2.2.0 — 2026-02-28

Introduced the guided installer, SOTA signal processing algorithms, and the WiFi-Mat disaster response module. This release established the ESP32 hardware path and hardened security.

  • Guided installer — ./install.sh with 7-step hardware detection and 8 install profiles
  • 6 SOTA signal algorithms (ADR-014) — SpotFi conjugate multiplication, Hampel filter, Fresnel zone model, CSI spectrogram, subcarrier selection, body velocity profile
  • WiFi-Mat disaster response — START triage, scan zones, 3D localization, priority alerts — 139 tests
  • ESP32 CSI hardware parser — binary frame parsing with I/Q extraction — 28 tests
  • Security hardening — 10 vulnerabilities fixed (CVE remediation, input validation, path security)

v2.1.0 — 2026-02-28

The foundational Rust release: ported the Python v1 pipeline to Rust with an 810x speedup, integrated the RuVector signal intelligence crates, and added the Three.js real-time visualization.

  • RuVector integration — 11 vendored crates (ADR-002 through ADR-013) for HNSW indexing, attention, GNN, temporal compression, min-cut, solver
  • ESP32 CSI sensor mesh — $54 starter kit with 3-6 ESP32-S3 nodes streaming at 20 Hz
  • Three.js visualization — 3D body model with 17 joints, real-time WebSocket streaming
  • CI verification pipeline — determinism checks and unseeded-random scan across all signal operations

📄 License

MIT License — see LICENSE for details.

📞 Support

GitHub Issues | Discussions | PyPI


WiFi DensePose — Privacy-preserving human pose estimation through WiFi signals.

About

Production-ready implementation of InvisPose, a WiFi-based dense human pose estimation system that enables real-time full-body tracking through walls using commodity mesh routers.
