feat: Training mode, ADR docs, vitals and wifiscan crates
- Add --train CLI flag with dataset loading, graph transformer training, cosine-scheduled SGD, PCK/OKS validation, and checkpoint saving
- Refactor main.rs to import training modules from lib.rs instead of duplicating mod declarations
- Add ADR-021 (vital sign detection), ADR-022 (Windows WiFi enhanced fidelity), ADR-023 (trained DensePose pipeline) documentation
- Add wifi-densepose-vitals crate: breathing, heartrate, anomaly detection, preprocessor, and temporal store
- Add wifi-densepose-wifiscan crate: 8-stage signal intelligence pipeline with netsh/wlanapi adapters, multi-BSSID registry, attention weighting, spatial correlation, and breathing extraction

Co-Authored-By: claude-flow <ruv@ruv.net>
Files added:

- docs/adr/ADR-021-vital-sign-detection-rvdna-pipeline.md (1092 lines, new file; diff suppressed)
- docs/adr/ADR-022-windows-wifi-enhanced-fidelity-ruvector.md (1357 lines, new file; diff suppressed)
- docs/adr/ADR-023-trained-densepose-model-ruvector-pipeline.md (825 lines, new file)
# ADR-023: Trained DensePose Model with RuVector Signal Intelligence Pipeline

| Field | Value |
|-------|-------|
| **Status** | Proposed |
| **Date** | 2026-02-28 |
| **Deciders** | ruv |
| **Relates to** | ADR-003 (RVF Cognitive Containers), ADR-005 (SONA Self-Learning), ADR-015 (Public Dataset Strategy), ADR-016 (RuVector Integration), ADR-017 (RuVector-Signal-MAT), ADR-020 (Rust AI Migration), ADR-021 (Vital Sign Detection) |

## Context

### The Gap Between Sensing and DensePose

The WiFi-DensePose system currently operates in two distinct modes:

1. **WiFi CSI sensing** (working): ESP32 streams CSI frames → Rust aggregator → feature extraction → presence/motion classification. 41 tests passing, verified at ~20 Hz with real hardware.

2. **Heuristic pose derivation** (working but approximate): The Rust sensing server generates 17 COCO keypoints from WiFi signal properties using hand-crafted rules (`derive_pose_from_sensing()` in `sensing-server/src/main.rs`). This is not a trained model — keypoint positions are derived from signal amplitude, phase variance, and motion metrics rather than learned from labeled data.

Neither mode produces **DensePose-quality** body surface estimation. The CMU "DensePose From WiFi" paper (arXiv:2301.00250) demonstrated that a neural network trained on paired WiFi CSI + camera pose data can produce dense body surface UV coordinates from WiFi alone. However, that approach requires:

- **Environment-specific training**: The model must be trained or fine-tuned for each deployment environment because CSI multipath patterns are environment-dependent.
- **Paired training data**: Simultaneous WiFi CSI captures + ground-truth pose annotations (or a camera-based teacher model generating pseudo-labels).
- **Substantial compute**: Training a modality translation network + DensePose head requires GPU time (hours to days depending on dataset size).

### What Exists in the Codebase

The Rust workspace already has the complete model architecture ready for training:

| Component | Crate | File | Status |
|-----------|-------|------|--------|
| `WiFiDensePoseModel` | `wifi-densepose-train` | `model.rs` | Implemented (random weights) |
| `ModalityTranslator` | `wifi-densepose-train` | `model.rs` | Implemented with RuVector attention |
| `KeypointHead` | `wifi-densepose-train` | `model.rs` | Implemented (17 COCO heatmaps) |
| `DensePoseHead` | `wifi-densepose-nn` | `densepose.rs` | Implemented (25 parts + 48 UV) |
| `WiFiDensePoseLoss` | `wifi-densepose-train` | `losses.rs` | Implemented (keypoint + part + UV + transfer) |
| `MmFiDataset` loader | `wifi-densepose-train` | `dataset.rs` | Planned (ADR-015) |
| `WiFiDensePosePipeline` | `wifi-densepose-nn` | `inference.rs` | Implemented (generic over Backend) |
| Training proof verification | `wifi-densepose-train` | `proof.rs` | Implemented (deterministic hash) |
| Subcarrier resampling (114→56) | `wifi-densepose-train` | `subcarrier.rs` | Planned (ADR-016) |

### RuVector Crates Available

The `vendor/ruvector/` subtree provides 90+ crates. The following are directly relevant to a trained DensePose pipeline:

**Already integrated (5 crates, ADR-016):**

| Crate | Algorithm | Current Use |
|-------|-----------|-------------|
| `ruvector-mincut` | Subpolynomial dynamic min-cut O(n^{o(1)}) | Multi-person assignment in `metrics.rs` |
| `ruvector-attn-mincut` | Attention-gated min-cut | Noise-suppressed spectrogram in `model.rs` |
| `ruvector-attention` | Scaled dot-product + geometric attention | Spatial decoder in `model.rs` |
| `ruvector-solver` | Sparse Neumann solver O(√n) | Subcarrier resampling in `subcarrier.rs` |
| `ruvector-temporal-tensor` | Tiered temporal compression | CSI frame buffering in `dataset.rs` |

**Newly proposed for the DensePose pipeline (6 additional crates):**

| Crate | Description | Proposed Use |
|-------|-------------|--------------|
| `ruvector-gnn` | Graph neural network on HNSW topology | Spatial body-graph reasoning |
| `ruvector-graph-transformer` | Proof-gated graph transformer (8 modules) | CSI-to-pose cross-attention |
| `ruvector-sparse-inference` | PowerInfer-style sparse inference engine | Edge deployment with neuron activation sparsity |
| `ruvector-sona` | Self-Optimizing Neural Architecture (LoRA + EWC++) | Online environment adaptation |
| `ruvector-fpga-transformer` | FPGA-optimized transformer | Hardware-accelerated inference path |
| `ruvector-math` | Optimal transport, information geometry | Domain adaptation loss functions |

### RVF Container Format

The RuVector Format (RVF) is a segment-based binary container format designed to package intelligence artifacts — embeddings, HNSW indexes, quantized weights, WASM runtimes, witness proofs, and metadata — into a single self-contained file. Key properties:

- **64-byte segment headers** (`SegmentHeader`, magic `0x52564653` "RVFS") with type discriminator, content hash, compression, and timestamp
- **Progressive loading**: Layer A (entry points, <5ms) → Layer B (hot adjacency, 100ms–1s) → Layer C (full graph, seconds)
- **20+ segment types**: `Vec` (embeddings), `Index` (HNSW), `Overlay` (min-cut witnesses), `Quant` (codebooks), `Witness` (proof-of-computation), `Wasm` (self-bootstrapping runtime), `Dashboard` (embedded UI), `AggregateWeights` (federated SONA deltas), `Crypto` (Ed25519 signatures), and more
- **Temperature-tiered quantization** (`rvf-quant`): f32 / f16 / u8 / binary per segment, with SIMD-accelerated distance computation
- **AGI Cognitive Container** (`agi_container.rs`): packages kernel + WASM + world model + orchestrator + evaluation harness + witness chains into a single deployable file

The trained DensePose model will be packaged as an `.rvf` container, making it a single self-contained artifact that includes model weights, HNSW-indexed embedding tables, min-cut graph overlays, quantization codebooks, SONA adaptation deltas, and the WASM inference runtime — deployable to any host without external dependencies.
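
The 64-byte segment header described above can be sketched as a parser over a fixed buffer. The field offsets below are an assumption for illustration; the authoritative layout lives in the rvf crates. Only the magic-byte check (`"RVFS"`) is taken directly from this document.

```rust
/// Magic bytes "RVFS" (0x52564653).
pub const RVF_MAGIC: [u8; 4] = *b"RVFS";

#[derive(Debug, PartialEq)]
pub struct SegmentHeader {
    pub seg_type: u8,           // type discriminator, e.g. 0x01 = Vec
    pub compression: u8,        // compression scheme (encoding assumed)
    pub timestamp: u64,         // unix seconds
    pub content_hash: [u8; 32], // SHA-256 of the segment payload
}

/// Parse a 64-byte header, rejecting anything without the RVFS magic.
pub fn parse_segment_header(buf: &[u8; 64]) -> Option<SegmentHeader> {
    if buf[0..4] != RVF_MAGIC {
        return None; // not an RVF segment
    }
    let mut ts = [0u8; 8];
    ts.copy_from_slice(&buf[8..16]);
    let mut hash = [0u8; 32];
    hash.copy_from_slice(&buf[16..48]);
    Some(SegmentHeader {
        seg_type: buf[4],
        compression: buf[5],
        timestamp: u64::from_le_bytes(ts),
        content_hash: hash,
    })
}
```

A loader would call this once per directory entry before deciding whether to decompress and hash-verify the payload.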

## Decision

Implement a fully trained DensePose model using RuVector signal intelligence as the backbone signal processing layer, packaged in the RVF container format. The pipeline has three stages: (1) offline training on public datasets, (2) teacher-student distillation for DensePose UV labels, and (3) online SONA adaptation for environment-specific fine-tuning. The trained model, its embeddings, indexes, and adaptation state are serialized into a single `.rvf` file.

### Architecture Overview

```
┌──────────────────────────────────────────────────────────────────────────┐
│                       TRAINED DENSEPOSE PIPELINE                         │
│                                                                          │
│  ┌─────────────┐    ┌──────────────────────┐    ┌──────────────────────┐ │
│  │  ESP32 CSI  │    │  RuVector Signal     │    │  Trained Neural      │ │
│  │  Raw I/Q    │───▶│  Intelligence Layer  │───▶│  Network             │ │
│  │ [ant×sub×T] │    │  (preprocessing)     │    │  (inference)         │ │
│  └─────────────┘    └──────────┬───────────┘    └──────────┬───────────┘ │
│                                │                           │             │
│                     ┌──────────┴─────────┐      ┌──────────┴──────────┐  │
│                     │ 5 RuVector crates  │      │ 6 RuVector crates   │  │
│                     │ (signal processing)│      │ (neural)            │  │
│                     └────────────────────┘      └──────────┬──────────┘  │
│                                                            │             │
│                                ┌───────────────────────────┘             │
│                                ▼                                         │
│                ┌──────────────────────────────────────┐                  │
│                │               Outputs                │                  │
│                │  • 17 COCO keypoints  [B,17,H,W]     │                  │
│                │  • 25 body parts      [B,25,H,W]     │                  │
│                │  • 48 UV coords       [B,48,H,W]     │                  │
│                │  • Confidence scores                 │                  │
│                └──────────────────────────────────────┘                  │
└──────────────────────────────────────────────────────────────────────────┘
```

### Stage 1: RuVector Signal Preprocessing Layer

Raw CSI frames from ESP32 (56–192 subcarriers × N antennas × T time frames) are processed through the RuVector signal intelligence stack before entering the neural network. This replaces hand-crafted feature extraction with learned, graph-aware preprocessing.

```
Raw CSI [ant, sub, T]
        │
        ▼
┌─────────────────────────────────────────────────────┐
│ 1. ruvector-attn-mincut: gate_spectrogram()         │
│    Input:  Q=amplitude, K=phase, V=combined         │
│    Effect: Suppress multipath noise, keep motion-   │
│            relevant subcarrier paths                │
│    Output: Gated spectrogram [ant, sub', T]         │
├─────────────────────────────────────────────────────┤
│ 2. ruvector-mincut: mincut_subcarrier_partition()   │
│    Input:  Subcarrier coherence graph               │
│    Effect: Partition into sensitive (motion-        │
│            responsive) vs insensitive (static)      │
│    Output: Partition mask + per-subcarrier weights  │
├─────────────────────────────────────────────────────┤
│ 3. ruvector-attention: attention_weighted_bvp()     │
│    Input:  Gated spectrogram + partition weights    │
│    Effect: Compute body velocity profile with       │
│            sensitivity-weighted attention           │
│    Output: BVP feature vector [D_bvp]               │
├─────────────────────────────────────────────────────┤
│ 4. ruvector-solver: solve_fresnel_geometry()        │
│    Input:  Amplitude + known TX/RX positions        │
│    Effect: Estimate TX-body-RX ellipsoid distances  │
│    Output: Fresnel geometry features [D_fresnel]    │
├─────────────────────────────────────────────────────┤
│ 5. ruvector-temporal-tensor: compress + buffer      │
│    Input:  Temporal CSI window (100 frames)         │
│    Effect: Tiered quantization (hot/warm/cold)      │
│    Output: Compressed tensor, 50-75% memory saving  │
└─────────────────────────────────────────────────────┘
        │
        ▼
Feature tensor [B, T*tx*rx, sub] (preprocessed, noise-suppressed)
```

### Stage 2: Neural Network Architecture

The neural network follows the CMU teacher-student architecture with RuVector enhancements at three critical points.

#### 2a. ModalityTranslator (CSI → Visual Feature Space)

```
CSI features [B, T*tx*rx, sub]
        │
        ├──amplitude──┐
        │             ├─► Encoder (Conv1D stack, 64→128→256)
        └──phase──────┘              │
                                     ▼
                      ┌──────────────────────────────┐
                      │  ruvector-graph-transformer  │
                      │                              │
                      │  Treat antenna-pair×time as  │
                      │  graph nodes. Edges connect  │
                      │  spatially adjacent antenna  │
                      │  pairs and temporally        │
                      │  adjacent frames.            │
                      │                              │
                      │  Proof-gated attention:      │
                      │  Each layer verifies that    │
                      │  attention weights satisfy   │
                      │  physical constraints        │
                      │  (Fresnel ellipsoid bounds)  │
                      └──────────────────────────────┘
                                     │
                                     ▼
                 Decoder (ConvTranspose2d stack, 256→128→64→3)
                                     │
                                     ▼
                      Visual features [B, 3, 48, 48]
```

**RuVector enhancement**: Replace standard multi-head self-attention in the bottleneck with `ruvector-graph-transformer`. The graph structure encodes the physical antenna topology — nodes that are closer in space (adjacent ESP32 nodes in the mesh) or time (consecutive frames) have stronger edge weights. This injects domain-specific inductive bias that standard attention lacks.
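
A minimal sketch of the antenna-pair × time graph that the bottleneck operates on, assuming antenna pairs are indexed so that consecutive indices are physically adjacent in the mesh (an assumption for illustration; the real topology comes from the ESP32 deployment layout):

```rust
/// Build edges over nodes (pair, frame): temporal edges link the same
/// antenna pair at consecutive frames; spatial edges link adjacent pairs
/// at the same frame. Node id = pair * num_frames + frame.
pub fn build_csi_graph(num_pairs: usize, num_frames: usize) -> Vec<(usize, usize)> {
    let node = |p: usize, t: usize| p * num_frames + t;
    let mut edges = Vec::new();
    for p in 0..num_pairs {
        for t in 0..num_frames {
            // temporal edge: same antenna pair, consecutive frames
            if t + 1 < num_frames {
                edges.push((node(p, t), node(p, t + 1)));
            }
            // spatial edge: assumes pairs p and p+1 are mesh-adjacent
            if p + 1 < num_pairs {
                edges.push((node(p, t), node(p + 1, t)));
            }
        }
    }
    edges
}
```

The edge list would then carry per-edge weights (stronger for closer nodes) when handed to the graph transformer.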

#### 2b. GNN Body Graph Reasoning

```
Visual features [B, 3, 48, 48]
        │
        ▼
ResNet18 backbone → feature maps [B, 256, 12, 12]
        │
        ▼
┌─────────────────────────────────────────┐
│  ruvector-gnn: Body Graph Network       │
│                                         │
│  17 COCO keypoints as graph nodes       │
│  Edges: anatomical connections          │
│  (shoulder→elbow, hip→knee, etc.)       │
│                                         │
│  GNN message passing (3 rounds):        │
│   h_i^{l+1} = σ(W·h_i^l + Σ_j α_ij·h_j) │
│   α_ij = attention(h_i, h_j, edge_ij)   │
│                                         │
│  Enforces anatomical constraints:       │
│   - Limb length ratios                  │
│   - Joint angle limits                  │
│   - Left-right symmetry priors         │
└─────────────────────────────────────────┘
        │
        ├──────────────────┬──────────────────┐
        ▼                  ▼                  ▼
  KeypointHead       DensePoseHead      ConfidenceHead
  [B,17,H,W]         [B,25+48,H,W]      [B,1]
  heatmaps           parts + UV         quality score
```

**RuVector enhancement**: `ruvector-gnn` replaces the flat spatial decoder with a graph neural network that operates on the human body graph. WiFi CSI is inherently noisy — GNN message passing between anatomically connected joints enforces that predicted keypoints maintain plausible body structure even when individual joint predictions are uncertain.
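
One round of the message-passing update above can be sketched as follows. This is a toy version, not the `ruvector-gnn` implementation: attention α_ij is replaced with uniform 1/deg(i), W with the identity, and σ with ReLU.

```rust
/// One round of h_i' = σ(h_i + Σ_j α_ij·h_j) over an undirected edge list,
/// with uniform attention α_ij = 1/deg(i).
pub fn message_pass(h: &[Vec<f32>], edges: &[(usize, usize)]) -> Vec<Vec<f32>> {
    let n = h.len();
    let d = h[0].len();
    // adjacency lists: skeleton edges are undirected
    let mut neighbors: Vec<Vec<usize>> = vec![Vec::new(); n];
    for &(a, b) in edges {
        neighbors[a].push(b);
        neighbors[b].push(a);
    }
    (0..n)
        .map(|i| {
            let mut out = h[i].clone();
            if !neighbors[i].is_empty() {
                let alpha = 1.0 / neighbors[i].len() as f32;
                for &j in &neighbors[i] {
                    for k in 0..d {
                        out[k] += alpha * h[j][k]; // aggregate neighbor messages
                    }
                }
            }
            for v in &mut out {
                *v = v.max(0.0); // σ = ReLU
            }
            out
        })
        .collect()
}
```

Applied three times over the 17-node / 16-edge COCO skeleton, this is the shape of the "3 rounds" in the diagram.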

#### 2c. Sparse Inference for Edge Deployment

```
Trained model weights (full precision)
        │
        ▼
┌─────────────────────────────────────────────┐
│  ruvector-sparse-inference                  │
│                                             │
│  PowerInfer-style activation sparsity:      │
│   - Profile neuron activation frequency     │
│   - Partition into hot (always active, 20%) │
│     and cold (conditionally active, 80%)    │
│   - Hot neurons: GPU/SIMD fast path         │
│   - Cold neurons: sparse lookup on demand   │
│                                             │
│  Quantization:                              │
│   - Backbone: INT8 (4x memory reduction)    │
│   - DensePose head: FP16 (2x reduction)     │
│   - ModalityTranslator: FP16                │
│                                             │
│  Target: <50ms inference on ESP32-S3        │
│          <10ms on x86 with AVX2             │
└─────────────────────────────────────────────┘
```

### Stage 3: Training Pipeline

#### 3a. Dataset Loading and Preprocessing

Primary dataset: **MM-Fi** (NeurIPS 2023) — 40 subjects, 27 actions, 114 subcarriers, 3 RX antennas, 17 COCO keypoints + DensePose UV annotations.

Secondary dataset: **Wi-Pose** — 12 subjects, 12 actions, 30 subcarriers, 3×3 antenna array, 18 keypoints.

```
┌──────────────────────────────────────────────────────────────┐
│ Data Loading Pipeline                                        │
│                                                              │
│ MM-Fi .npy   ──► Resample 114→56 subcarriers  ──┐            │
│                  (ruvector-solver NeumannSolver)│            │
│                                                 ├──► Batch   │
│ Wi-Pose .mat ──► Zero-pad 30→56 subcarriers   ──┘    [B, T*ant, sub]
│                                                              │
│ Phase sanitize ──► Hampel filter ──► unwrap                  │
│   (wifi-densepose-signal::phase_sanitizer)                   │
│                                                              │
│ Temporal buffer ──► ruvector-temporal-tensor                 │
│   (100 frames/sample, tiered quantization)                   │
└──────────────────────────────────────────────────────────────┘
```
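
The Hampel filter step in the phase-sanitize path can be sketched as below: a sample is replaced with its window median when it deviates from that median by more than k scaled MADs. The window half-width and k are tuning assumptions, not values taken from `wifi-densepose-signal`.

```rust
/// Hampel outlier filter over a sliding window of 2*half_window+1 samples.
pub fn hampel(x: &[f64], half_window: usize, k: f64) -> Vec<f64> {
    let n = x.len();
    let mut out = x.to_vec();
    for i in 0..n {
        let lo = i.saturating_sub(half_window);
        let hi = (i + half_window + 1).min(n);
        let mut w: Vec<f64> = x[lo..hi].to_vec();
        w.sort_by(|a, b| a.partial_cmp(b).unwrap());
        let med = w[w.len() / 2];
        // median absolute deviation, scaled to σ for Gaussian data
        let mut dev: Vec<f64> = w.iter().map(|v| (v - med).abs()).collect();
        dev.sort_by(|a, b| a.partial_cmp(b).unwrap());
        let mad = 1.4826 * dev[dev.len() / 2];
        if (x[i] - med).abs() > k * mad {
            out[i] = med; // outlier: replace with the local median
        }
    }
    out
}
```

In the pipeline this would run on each subcarrier's phase series before unwrapping, so a single corrupted frame does not propagate a 2π jump downstream.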

#### 3b. Teacher-Student DensePose Labels

For samples with 3D keypoints but no DensePose UV maps:

1. Run Detectron2 DensePose R-CNN on paired RGB frames (one-time preprocessing step on a GPU workstation)
2. Generate `(part_labels [H,W], u_coords [H,W], v_coords [H,W])` pseudo-labels
3. Cache as `.npy` alongside the original data
4. Teacher model is discarded after label generation — inference uses WiFi only

#### 3c. Loss Function

```
L_total = λ_kp    · L_keypoint   // MSE on predicted vs GT heatmaps
        + λ_part  · L_part       // Cross-entropy on 25-class body part segmentation
        + λ_uv    · L_uv         // Smooth L1 on UV coordinate regression
        + λ_xfer  · L_transfer   // MSE between CSI features and teacher visual features
        + λ_ot    · L_ot         // Optimal transport regularization (ruvector-math)
        + λ_graph · L_graph      // GNN edge consistency loss (ruvector-gnn)
```

**RuVector enhancement**: `ruvector-math` provides optimal transport (Wasserstein distance) as a regularization term. This penalizes predicted body part distributions that are far from the ground truth in the Wasserstein metric, which is more geometrically meaningful than pixel-wise cross-entropy for spatial body part segmentation.
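
For intuition, the 1-D case of the L_ot term is easy to write down: for two histograms over the same bins, the Wasserstein-1 distance reduces to the accumulated CDF difference. This stands in for `ruvector-math`, which the trainer would actually use:

```rust
/// Wasserstein-1 distance between two 1-D histograms over identical bins,
/// computed as Σ |CDF_p - CDF_q| after normalization.
pub fn wasserstein1(p: &[f64], q: &[f64]) -> f64 {
    let sp: f64 = p.iter().sum();
    let sq: f64 = q.iter().sum();
    let (mut cp, mut cq, mut dist) = (0.0, 0.0, 0.0);
    for i in 0..p.len() {
        cp += p[i] / sp;
        cq += q[i] / sq;
        dist += (cp - cq).abs(); // per-bin CDF gap
    }
    dist
}
```

Unlike cross-entropy, this distance grows with how *far* predicted mass sits from the target bin, not just whether it is in the wrong bin, which is why it suits spatial body-part distributions.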

#### 3d. Training Configuration

| Parameter | Value | Rationale |
|-----------|-------|-----------|
| Optimizer | AdamW | Weight decay regularization |
| Learning rate | 1e-3, cosine decay to 1e-5 | Standard for modality translation |
| Batch size | 32 | Fits in 24GB GPU VRAM |
| Epochs | 100 | With early stopping (patience=15) |
| Warmup | 5 epochs | Linear LR warmup |
| Train/val split | Subjects 1-32 / 33-40 | Subject-disjoint for generalization |
| Augmentation | Time-shift ±5 frames, amplitude noise ±2dB, antenna dropout 10% | CSI-domain augmentations |
| Hardware | Single RTX 3090 or A100 | ~8 hours on A100 |
| Checkpoint | Every epoch, keep best-by-validation-PCK | Deterministic seed |
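
The schedule in the table (5-epoch linear warmup to 1e-3, then cosine decay to 1e-5) can be sketched as a pure function of the epoch index; the trainer's actual implementation may differ in where it samples the curve:

```rust
use std::f64::consts::PI;

/// Learning rate at `epoch` (0-based): linear warmup to `peak` over the
/// first `warmup` epochs, then cosine decay from `peak` down to `floor`.
pub fn lr_at(epoch: usize, total: usize, warmup: usize, peak: f64, floor: f64) -> f64 {
    if epoch < warmup {
        // linear warmup: reaches `peak` on the last warmup epoch
        peak * (epoch + 1) as f64 / warmup as f64
    } else {
        let t = (epoch - warmup) as f64 / (total - warmup) as f64; // 0 → 1
        floor + 0.5 * (peak - floor) * (1.0 + (PI * t).cos())
    }
}
```

Calling `lr_at(e, 100, 5, 1e-3, 1e-5)` each epoch reproduces the table's values: peak 1e-3 at epoch 5, approaching 1e-5 by epoch 99.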

#### 3e. Metrics

| Metric | Target | Description |
|--------|--------|-------------|
| PCK@0.2 | >70% on MM-Fi val | Percentage of correct keypoints (threshold = 0.2 × torso diameter) |
| OKS mAP | >0.50 on MM-Fi val | Object Keypoint Similarity, COCO-standard |
| DensePose GPS | >0.30 on MM-Fi val | Geodesic Point Similarity for UV accuracy |
| Inference latency | <50ms per frame | On x86 with ONNX Runtime |
| Model size | <25MB (FP16) | Suitable for edge deployment |
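
PCK@0.2 from the table is simple enough to pin down in code: a predicted keypoint counts as correct when its distance to ground truth is within 0.2 × the torso diameter. A single-person sketch (the validation loop would average this over frames and subjects):

```rust
/// Fraction of keypoints whose prediction lies within alpha * torso
/// diameter of the ground truth (PCK@alpha).
pub fn pck(pred: &[(f64, f64)], gt: &[(f64, f64)], torso: f64, alpha: f64) -> f64 {
    let thresh = alpha * torso;
    let correct = pred
        .iter()
        .zip(gt)
        .filter(|(p, g)| {
            let (dx, dy) = (p.0 - g.0, p.1 - g.1);
            (dx * dx + dy * dy).sqrt() <= thresh // within the PCK radius
        })
        .count();
    correct as f64 / gt.len() as f64
}
```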

### Stage 4: Online Adaptation with SONA

After offline training produces a base model, SONA enables continuous adaptation to new environments without retraining from scratch.

```
┌──────────────────────────────────────────────────────────┐
│              SONA Online Adaptation Loop                 │
│                                                          │
│  Base model (frozen weights W)                           │
│        │                                                 │
│        ▼                                                 │
│  ┌──────────────────────────────────┐                    │
│  │  LoRA Adaptation Matrices        │                    │
│  │  W_effective = W + α · A·B       │                    │
│  │                                  │                    │
│  │  Rank r=4 for translator layers  │                    │
│  │  Rank r=2 for backbone layers    │                    │
│  │  Rank r=8 for DensePose head     │                    │
│  │                                  │                    │
│  │  Total trainable params: ~50K    │                    │
│  │  (vs ~5M frozen base)            │                    │
│  └──────────────────────────────────┘                    │
│        │                                                 │
│        ▼                                                 │
│  ┌──────────────────────────────────┐                    │
│  │  EWC++ Regularizer               │                    │
│  │  L = L_task + λ·Σ F_i(θ-θ*)²     │                    │
│  │                                  │                    │
│  │  Prevents forgetting base model  │                    │
│  │  knowledge when adapting to new  │                    │
│  │  environment                     │                    │
│  └──────────────────────────────────┘                    │
│        │                                                 │
│        ▼                                                 │
│  Adaptation triggers:                                    │
│   • First deployment in new room                         │
│   • PCK drops below threshold (drift detection)          │
│   • User manually initiates calibration                  │
│   • Furniture/layout change detected (CSI baseline shift)│
│                                                          │
│  Adaptation data:                                        │
│   • Self-supervised: temporal consistency loss           │
│     (pose at t should be similar to t-1 for slow motion) │
│   • Semi-supervised: user confirmation of presence/count │
│   • Optional: brief camera calibration session (5 min)   │
│                                                          │
│  Convergence: 10-50 gradient steps, <5 seconds on CPU    │
└──────────────────────────────────────────────────────────┘
```
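
The two formulas in the loop above are small enough to write out directly. This is an illustrative sketch (row-major `Vec<f64>` matrices, not the `ruvector-sona` types): the LoRA effective weight W_eff = W + α·A·B, and the EWC++ penalty λ·Σ_i F_i·(θ_i - θ*_i)².

```rust
/// W_eff = W + α·A·B, with W: d_out×d_in, A: d_out×r, B: r×d_in (row-major).
pub fn lora_effective(
    w: &[f64], a: &[f64], b: &[f64],
    d_out: usize, d_in: usize, r: usize, alpha: f64,
) -> Vec<f64> {
    let mut out = w.to_vec();
    for i in 0..d_out {
        for j in 0..d_in {
            let mut delta = 0.0;
            for k in 0..r {
                delta += a[i * r + k] * b[k * d_in + j]; // (A·B)[i][j]
            }
            out[i * d_in + j] += alpha * delta;
        }
    }
    out
}

/// EWC penalty: lambda * Σ_i F_i * (θ_i - θ*_i)², where F is the Fisher
/// information diagonal and θ* the frozen base-model reference.
pub fn ewc_penalty(theta: &[f64], theta_star: &[f64], fisher: &[f64], lambda: f64) -> f64 {
    lambda
        * theta
            .iter()
            .zip(theta_star)
            .zip(fisher)
            .map(|((t, ts), f)| f * (t - ts) * (t - ts))
            .sum::<f64>()
}
```

The rank r in the table (2/4/8 per layer group) is exactly the `r` parameter here, which is why the trainable parameter count stays near ~50K while W stays frozen.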

### Stage 5: Inference Pipeline (Production)

```
ESP32 CSI (UDP :5005)
        │
        ▼
Rust Axum server (port 8080)
        │
        ├─► RuVector signal preprocessing (Stage 1)
        │     5 crates, ~2ms per frame
        │
        ├─► ONNX Runtime inference (Stage 2)
        │     Quantized model, ~10ms per frame
        │     OR ruvector-sparse-inference, ~8ms per frame
        │
        ├─► GNN post-processing (ruvector-gnn)
        │     Anatomical constraint enforcement, ~1ms
        │
        ├─► SONA adaptation check (Stage 4)
        │     <0.05ms per frame (gradient accumulation only)
        │
        └─► Output: DensePose results
              │
              ├──► /api/v1/stream/pose (WebSocket, 17 keypoints)
              ├──► /api/v1/pose/current (REST, full DensePose)
              └──► /ws/sensing (WebSocket, raw + processed)
```

Total inference budget: **<15ms per frame** at 20 Hz on x86, **<50ms** on ESP32-S3 (with sparse inference).

### Stage 6: RVF Model Container Format

The trained model is packaged as a single `.rvf` file that contains everything needed for inference — no external weight files, no ONNX runtime, no Python dependencies.

#### RVF DensePose Container Layout

```
wifi-densepose-v1.rvf (single file, ~15-30 MB)
┌───────────────────────────────────────────────────────────────┐
│ SEGMENT 0: Manifest (0x05)                                    │
│  ├── Model ID: "wifi-densepose-v1.0"                          │
│  ├── Training dataset: "mmfi-v1+wipose-v1"                    │
│  ├── Training config hash: SHA-256                            │
│  ├── Target hardware: x86_64, aarch64, wasm32                 │
│  ├── Segment directory (offsets to all segments)              │
│  └── Level-1 TLV manifest with metadata tags                  │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 1: Vec (0x01) — Model Weight Embeddings               │
│  ├── ModalityTranslator weights [64→128→256→3, Conv1D+ConvT]  │
│  ├── ResNet18 backbone weights [3→64→128→256, residual blocks]│
│  ├── KeypointHead weights [256→17, deconv layers]             │
│  ├── DensePoseHead weights [256→25+48, deconv layers]         │
│  ├── GNN body graph weights [3 message-passing rounds]        │
│  └── Graph transformer attention weights [proof-gated layers] │
│  Format: flat f32 vectors, 768-dim per weight tensor          │
│  Total: ~5M parameters → ~20MB f32, ~10MB f16, ~5MB INT8      │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 2: Index (0x02) — HNSW Embedding Index                │
│  ├── Layer A: Entry points + coarse routing centroids         │
│  │   (loaded first, <5ms, enables approximate search)         │
│  ├── Layer B: Hot region adjacency for frequently             │
│  │   accessed weight clusters (100ms load)                    │
│  └── Layer C: Full adjacency graph for exact nearest          │
│      neighbor lookup across all weight partitions             │
│  Use: Fast weight lookup for sparse inference —               │
│  only load hot neurons, skip cold neurons via HNSW routing    │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 3: Overlay (0x03) — Dynamic Min-Cut Graph             │
│  ├── Subcarrier partition graph (sensitive vs insensitive)    │
│  ├── Min-cut witnesses from ruvector-mincut                   │
│  ├── Antenna topology graph (ESP32 mesh spatial layout)       │
│  └── Body skeleton graph (17 COCO joints, 16 edges)           │
│  Use: Pre-computed graph structures loaded at init time.      │
│  Dynamic updates via ruvector-mincut insert/delete_edge       │
│  as environment changes (furniture moves, new obstacles)      │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 4: Quant (0x06) — Quantization Codebooks              │
│  ├── INT8 codebook for backbone (4x memory reduction)         │
│  ├── FP16 scale factors for translator + heads                │
│  ├── Binary quantization tables for SIMD distance compute     │
│  └── Per-layer calibration statistics (min, max, zero-point)  │
│  Use: rvf-quant temperature-tiered quantization —             │
│  hot layers stay f16, warm layers u8, cold layers binary      │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 5: Witness (0x0A) — Training Proof Chain              │
│  ├── Deterministic training proof (seed, loss curve, hash)    │
│  ├── Dataset provenance (MM-Fi commit hash, download URL)     │
│  ├── Validation metrics (PCK@0.2, OKS mAP, GPS scores)        │
│  ├── Ed25519 signature over weight hash                       │
│  └── Attestation: training hardware, duration, config         │
│  Use: Verifiable proof that model weights match a specific    │
│  training run. Anyone can re-run training with same seed      │
│  and verify the weight hash matches the witness.              │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 6: Meta (0x07) — Model Metadata                       │
│  ├── COCO keypoint names and skeleton connectivity            │
│  ├── DensePose body part labels (24 parts + background)       │
│  ├── UV coordinate range and resolution                       │
│  ├── Input normalization statistics (mean, std per subcarrier)│
│  ├── RuVector crate versions used during training             │
│  └── Environment calibration profiles (named, per-room)       │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 7: AggregateWeights (0x36) — SONA LoRA Deltas         │
│  ├── Per-environment LoRA adaptation matrices (A, B per layer)│
│  ├── EWC++ Fisher information diagonal                        │
│  ├── Optimal θ* reference parameters                          │
│  ├── Adaptation round count and convergence metrics           │
│  └── Named profiles: "lab-a", "living-room", "office-3f"      │
│  Use: Multiple environment adaptations stored in one file.    │
│  Server loads the matching profile or creates a new one.      │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 8: Profile (0x0B) — RVDNA Domain Profile              │
│  ├── Domain: "wifi-csi-densepose"                             │
│  ├── Input spec: [B, T*ant, sub] CSI tensor format            │
│  ├── Output spec: keypoints [B,17,H,W], parts [B,25,H,W],     │
│  │   UV [B,48,H,W], confidence [B,1]                          │
│  ├── Hardware requirements: min RAM, recommended GPU          │
│  └── Supported data sources: esp32, wifi-rssi, simulation     │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 9: Crypto (0x0C) — Signature and Keys                 │
│  ├── Ed25519 public key for model publisher                   │
│  ├── Signature over all segment content hashes                │
│  └── Certificate chain (optional, for enterprise deployment)  │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 10: Wasm (0x10) — Self-Bootstrapping Runtime          │
│  ├── Compiled WASM inference engine                           │
│  │   (ruvector-sparse-inference-wasm)                         │
│  ├── WASM microkernel for RVF segment parsing                 │
│  └── Browser-compatible: load .rvf → run inference in-browser │
│  Use: The .rvf file is fully self-contained — a WASM host     │
│  can execute inference without any external dependencies.     │
├───────────────────────────────────────────────────────────────┤
│ SEGMENT 11: Dashboard (0x11) — Embedded Visualization         │
│  ├── Three.js-based pose visualization (HTML/JS/CSS)          │
│  ├── Gaussian splat renderer for signal field                 │
│  └── Served at http://localhost:8080/ when model is loaded    │
│  Use: Open the .rvf file → get a working UI with no install   │
└───────────────────────────────────────────────────────────────┘
```

#### RVF Loading Sequence

```
1. Read tail → find_latest_manifest() → SegmentDirectory
2. Load Manifest (seg 0) → validate magic, version, model ID
3. Load Profile (seg 8) → verify input/output spec compatibility
4. Load Crypto (seg 9) → verify Ed25519 signature chain
5. Load Quant (seg 4) → prepare quantization codebooks
6. Load Index Layer A (seg 2) → entry points ready (<5ms)
     ↓ (inference available at reduced accuracy)
7. Load Vec (seg 1) → hot weight partitions via Layer A routing
8. Load Index Layer B (seg 2) → hot adjacency ready (100ms)
     ↓ (inference at full accuracy for common poses)
9. Load Overlay (seg 3) → min-cut graphs, body skeleton
10. Load AggregateWeights (seg 7) → apply matching SONA profile
11. Load Index Layer C (seg 2) → complete graph loaded
     ↓ (full inference with all weight partitions)
12. Load Wasm (seg 10) → WASM runtime available (optional)
13. Load Dashboard (seg 11) → UI served (optional)
```

**Progressive availability**: Inference begins after step 6 (~5ms) with approximate results. Full accuracy is reached by step 9 (~500ms). This enables instant startup with gradually improving quality — critical for real-time applications.

#### RVF Build Pipeline

After training completes, the model is packaged into an `.rvf` file:

```bash
# Build the RVF container from the trained checkpoint
cargo run -p wifi-densepose-train --bin build-rvf -- \
  --checkpoint checkpoints/best-pck.pt \
  --quantize int8,fp16 \
  --hnsw-build \
  --sign --key model-signing-key.pem \
  --include-wasm \
  --include-dashboard ../../ui \
  --output wifi-densepose-v1.rvf

# Verify the built container
cargo run -p wifi-densepose-train --bin verify-rvf -- \
  --input wifi-densepose-v1.rvf \
  --verify-signature \
  --verify-witness \
  --benchmark-inference
```

#### RVF Runtime Integration

The sensing server loads the `.rvf` container at startup:

```bash
# Load model from RVF container
./target/release/sensing-server \
  --model wifi-densepose-v1.rvf \
  --source auto \
  --ui-from-rvf   # serve the Dashboard segment instead of --ui-path
```

```rust
// In sensing-server/src/main.rs
use std::sync::Arc;

use rvf_runtime::RvfContainer;
use rvf_index::layers::IndexLayer;
use rvf_quant::QuantizedVec;

// Arc so the background loader can share the container with the server.
let container = Arc::new(RvfContainer::open("wifi-densepose-v1.rvf")?);

// Progressive load: Layer A first for instant startup
let index = container.load_index(IndexLayer::A)?;
let weights = container.load_vec_hot(&index)?; // hot partitions only

// Full load in background (clone the Arc instead of moving `container`,
// which is still needed below)
let bg = Arc::clone(&container);
tokio::spawn(async move {
    bg.load_index_async(IndexLayer::B).await?;
    bg.load_index_async(IndexLayer::C).await?;
    bg.load_vec_cold().await?; // remaining partitions
    Ok::<_, anyhow::Error>(())
});

// SONA environment adaptation
let sona_deltas = container.load_aggregate_weights("office-3f")?;
model.apply_lora_deltas(&sona_deltas);

// Serve embedded dashboard
let dashboard = container.load_dashboard()?;
// Mount at /ui/* routes in Axum
```
|
||||
|
||||
## Implementation Plan

### Phase 1: Dataset Loaders (2 weeks)

- Implement `MmFiDataset` in `wifi-densepose-train/src/dataset.rs`
- Read MM-Fi `.npy` files with antenna correction (1TX/3RX → 3×3 zero-padding)
- Subcarrier resampling 114→56 via `ruvector-solver::NeumannSolver`
- Phase sanitization via `wifi-densepose-signal::phase_sanitizer`
- Implement `WiPoseDataset` for the secondary dataset
- Temporal windowing with `ruvector-temporal-tensor`
- **Deliverable**: `cargo test -p wifi-densepose-train` passing with dataset loading tests
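The 114→56 resampling step above can be sketched as plain linear interpolation over the subcarrier axis; the real loader delegates this to `ruvector-solver`'s `NeumannSolver`, so treat this as an illustrative stand-in rather than the actual implementation:

```rust
/// Resample one per-antenna CSI amplitude row from `src.len()` subcarriers
/// (e.g. MM-Fi's 114) to `dst_len` (e.g. the ESP32-native 56) by linear
/// interpolation over the normalized subcarrier index.
fn resample_subcarriers(src: &[f32], dst_len: usize) -> Vec<f32> {
    assert!(src.len() >= 2 && dst_len >= 2);
    let scale = (src.len() - 1) as f32 / (dst_len - 1) as f32;
    (0..dst_len)
        .map(|i| {
            let pos = i as f32 * scale;          // fractional source index
            let lo = pos.floor() as usize;
            let hi = (lo + 1).min(src.len() - 1);
            let frac = pos - lo as f32;
            src[lo] * (1.0 - frac) + src[hi] * frac
        })
        .collect()
}
```

The endpoints map exactly (`dst[0] == src[0]`, `dst[last] == src[last]`), which keeps the band edges aligned between datasets.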

### Phase 2: Graph Transformer Integration (2 weeks)

- Add `ruvector-graph-transformer` dependency to `wifi-densepose-train`
- Replace bottleneck self-attention in `ModalityTranslator` with a proof-gated graph transformer
- Build antenna topology graph (nodes = antenna pairs, edges = spatial/temporal proximity)
- Add `ruvector-gnn` dependency for body graph reasoning
- Build COCO body skeleton graph (17 nodes, 16 anatomical edges)
- Implement GNN message passing in the spatial decoder
- **Deliverable**: Model forward pass produces correct output shapes with graph layers
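The COCO body skeleton mentioned above (17 nodes, 16 anatomical edges) can be written down directly. The edge choice below is one plausible 16-edge skeleton over the standard COCO keypoint ordering, not necessarily the exact set `body_gnn.rs` ships:

```rust
/// COCO 17-keypoint skeleton as an edge list. Node indices follow the
/// standard COCO ordering: 0 nose, 1-2 eyes, 3-4 ears, 5-6 shoulders,
/// 7-8 elbows, 9-10 wrists, 11-12 hips, 13-14 knees, 15-16 ankles.
const COCO_EDGES: [(usize, usize); 16] = [
    (0, 1), (0, 2), (1, 3), (2, 4),         // head
    (5, 6),                                 // shoulder girdle
    (5, 7), (7, 9), (6, 8), (8, 10),        // arms
    (5, 11), (6, 12), (11, 12),             // torso
    (11, 13), (13, 15), (12, 14), (14, 16), // legs
];

/// Dense symmetric adjacency matrix, the input to one round of
/// neighborhood aggregation over per-keypoint features.
fn adjacency(n: usize, edges: &[(usize, usize)]) -> Vec<Vec<f32>> {
    let mut a = vec![vec![0.0f32; n]; n];
    for &(u, v) in edges {
        a[u][v] = 1.0;
        a[v][u] = 1.0;
    }
    a
}
```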

### Phase 3: Teacher-Student Label Generation (1 week)

- Python script using Detectron2 DensePose to generate UV pseudo-labels from MM-Fi RGB frames
- Cache labels as `.npy` for consumption by the Rust loader
- Validate label quality on a random subset (visual inspection)
- **Deliverable**: Complete UV label set for the MM-Fi training split

### Phase 4: Training Loop (3 weeks)

- Implement `WiFiDensePoseTrainer` with the full six-term loss function
- Add `ruvector-math` optimal transport loss term
- Integrate GNN edge consistency loss
- Training loop with cosine LR schedule, early stopping, and checkpointing
- Validation metrics: PCK@0.2, OKS mAP, DensePose GPS
- Deterministic proof verification (`proof.rs`) with weight hash
- **Deliverable**: Trained model checkpoint achieving PCK@0.2 > 70% on MM-Fi validation
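The cosine LR schedule with warmup can be sketched as follows. The parameter names mirror the trainer config fields (`warmup_epochs`, `min_lr`); the exact shape (linear warmup, half-cosine decay) is an assumption about the trainer's implementation:

```rust
/// Linear warmup from 0 → `base_lr` over `warmup` epochs, then cosine
/// annealing down to `min_lr` over the remaining epochs.
fn cosine_lr(epoch: usize, epochs: usize, warmup: usize, base_lr: f32, min_lr: f32) -> f32 {
    if epoch < warmup {
        // warmup phase: ramp proportionally to (epoch + 1) / warmup
        return base_lr * (epoch + 1) as f32 / warmup as f32;
    }
    // cosine phase: t goes 0 → 1 across the post-warmup epochs
    let t = (epoch - warmup) as f32 / (epochs - warmup).max(1) as f32;
    min_lr + 0.5 * (base_lr - min_lr) * (1.0 + (std::f32::consts::PI * t).cos())
}
```

At `epoch == warmup` the schedule sits exactly at `base_lr`, and at the final epoch it has decayed to `min_lr`.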

### Phase 5: SONA Online Adaptation (2 weeks)

- Integrate `ruvector-sona` into the inference pipeline
- Implement LoRA injection at translator, backbone, and DensePose head layers
- Implement EWC++ Fisher information computation and regularization
- Self-supervised temporal consistency loss for unsupervised adaptation
- Calibration mode: 5-minute camera session for supervised fine-tuning
- Drift detection: monitor rolling PCK on a temporal consistency proxy
- **Deliverable**: Adaptation converges in <50 gradient steps; PCK recovers to within 10% of the base model
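EWC regularization penalizes drift of the adapted weights away from the base checkpoint, weighted by (diagonal) Fisher information. A minimal sketch of the penalty term — the EWC++ variant adds online Fisher updates, which are omitted here:

```rust
/// Elastic Weight Consolidation penalty: (λ/2) · Σ_i F_i (θ_i − θ*_i)².
/// `anchor` holds the pre-adaptation weights θ*; `fisher` is the diagonal
/// Fisher information estimated on the base task.
fn ewc_penalty(theta: &[f32], anchor: &[f32], fisher: &[f32], lambda: f32) -> f32 {
    theta.iter().zip(anchor).zip(fisher)
        .map(|((t, a), f)| f * (t - a).powi(2))
        .sum::<f32>() * lambda / 2.0
}
```

Parameters with high Fisher values (important to the base task) are pulled strongly back toward the anchor, which is what caps drift in adversarial environments.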

### Phase 6: Sparse Inference and Edge Deployment (2 weeks)

- Profile neuron activation frequencies on the validation set
- Apply `ruvector-sparse-inference` hot/cold neuron partitioning
- INT8 quantization for the backbone, FP16 for the heads
- ONNX export with quantized weights
- Benchmark on x86 (target: <10ms) and ARM (target: <50ms)
- WASM export via `ruvector-sparse-inference-wasm` for browser inference
- **Deliverable**: Quantized ONNX model, benchmark results, WASM binary
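Symmetric INT8 weight quantization, as applied to the backbone, can be sketched in a few lines. The scale here is per-tensor; a production pipeline may well use per-channel scales, so read this as a minimal illustration:

```rust
/// Symmetric INT8 quantization: scale = max|w| / 127, q = round(w / scale).
/// Returns the quantized values and the scale needed to dequantize.
fn quantize_int8(w: &[f32]) -> (Vec<i8>, f32) {
    let max_abs = w.iter().fold(0.0f32, |m, &x| m.max(x.abs()));
    let scale = if max_abs > 0.0 { max_abs / 127.0 } else { 1.0 };
    let q = w.iter()
        .map(|&x| (x / scale).round().clamp(-127.0, 127.0) as i8)
        .collect();
    (q, scale)
}

/// Recover approximate f32 weights; error is bounded by scale / 2 per value.
fn dequantize_int8(q: &[i8], scale: f32) -> Vec<f32> {
    q.iter().map(|&x| x as f32 * scale).collect()
}
```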

### Phase 7: RVF Container Build Pipeline (2 weeks)

- Implement the `build-rvf` binary in `wifi-densepose-train`
- Serialize trained weights into the `Vec` segment (SegmentType::Vec, 0x01)
- Build an HNSW index over weight partitions for sparse inference (SegmentType::Index, 0x02)
- Serialize min-cut graph overlays: subcarrier partition, antenna topology, body skeleton (SegmentType::Overlay, 0x03)
- Generate quantization codebooks via `rvf-quant` (SegmentType::Quant, 0x06)
- Write the training proof witness with an Ed25519 signature (SegmentType::Witness, 0x0A)
- Store model metadata, COCO keypoint schema, normalization stats (SegmentType::Meta, 0x07)
- Store SONA LoRA adaptation deltas per environment (SegmentType::AggregateWeights, 0x36)
- Write the RVDNA domain profile for WiFi CSI DensePose (SegmentType::Profile, 0x0B)
- Optionally embed the WASM inference runtime (SegmentType::Wasm, 0x10)
- Optionally embed the Three.js dashboard (SegmentType::Dashboard, 0x11)
- Build the Level-1 manifest and segment directory (SegmentType::Manifest, 0x05)
- Implement the `verify-rvf` binary for container validation
- **Deliverable**: `wifi-densepose-v1.rvf` single-file container, verifiable and self-contained
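For orientation, the segment type IDs listed above can be collected into a single enum. The discriminant values mirror the codes named in this phase; the authoritative definitions live in `rvf-types`, so this is a reference sketch only:

```rust
/// RVF segment type IDs as enumerated in the build-pipeline phase.
/// Discriminants match the hex codes cited for each segment.
#[repr(u8)]
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
enum SegmentType {
    Vec = 0x01,              // trained weight vectors
    Index = 0x02,            // HNSW index over weight partitions
    Overlay = 0x03,          // min-cut graph overlays
    Manifest = 0x05,         // Level-1 manifest + segment directory
    Quant = 0x06,            // quantization codebooks
    Meta = 0x07,             // model metadata, keypoint schema, norm stats
    Witness = 0x0A,          // signed training proof
    Profile = 0x0B,          // RVDNA domain profile
    Wasm = 0x10,             // embedded WASM inference runtime
    Dashboard = 0x11,        // embedded Three.js dashboard
    AggregateWeights = 0x36, // per-environment SONA LoRA deltas
}
```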

### Phase 8: Integration with Sensing Server (1 week)

- Load the `.rvf` container in `wifi-densepose-sensing-server` via `rvf-runtime`
- Progressive loading: Layer A first for instant startup, full graph in the background
- Replace the `derive_pose_from_sensing()` heuristic with trained-model inference
- Add a `--model` CLI flag accepting an `.rvf` path (or legacy `.onnx`)
- Apply SONA LoRA deltas from the `AggregateWeights` segment based on the `--env` flag
- Serve the embedded Dashboard segment at `/ui/*` when `--ui-from-rvf` is set
- Graceful fallback to the heuristic when no model file is present
- Update the WebSocket protocol to include DensePose UV data
- **Deliverable**: Sensing server serves the trained model from a single `.rvf` file

## File Changes

### New Files

| File | Purpose |
|------|---------|
| `rust-port/.../wifi-densepose-train/src/dataset_mmfi.rs` | MM-Fi dataset loader with subcarrier resampling |
| `rust-port/.../wifi-densepose-train/src/dataset_wipose.rs` | Wi-Pose dataset loader |
| `rust-port/.../wifi-densepose-train/src/graph_transformer.rs` | Graph transformer integration |
| `rust-port/.../wifi-densepose-train/src/body_gnn.rs` | GNN body graph reasoning |
| `rust-port/.../wifi-densepose-train/src/adaptation.rs` | SONA LoRA + EWC++ adaptation |
| `rust-port/.../wifi-densepose-train/src/trainer.rs` | Training loop with multi-term loss |
| `scripts/generate_densepose_labels.py` | Teacher-student UV label generation |
| `scripts/benchmark_inference.py` | Inference latency benchmarking |
| `rust-port/.../wifi-densepose-train/src/rvf_builder.rs` | RVF container build pipeline |
| `rust-port/.../wifi-densepose-train/src/bin/build_rvf.rs` | CLI binary for building `.rvf` containers |
| `rust-port/.../wifi-densepose-train/src/bin/verify_rvf.rs` | CLI binary for verifying `.rvf` containers |

### Modified Files

| File | Change |
|------|--------|
| `rust-port/.../wifi-densepose-train/Cargo.toml` | Add ruvector-gnn, graph-transformer, sona, sparse-inference, math, rvf-types, rvf-wire, rvf-manifest, rvf-index, rvf-quant, rvf-crypto, rvf-runtime deps |
| `rust-port/.../wifi-densepose-train/src/model.rs` | Integrate graph transformer + GNN layers |
| `rust-port/.../wifi-densepose-train/src/losses.rs` | Add optimal transport + GNN edge consistency loss terms |
| `rust-port/.../wifi-densepose-train/src/config.rs` | Add training hyperparameters for new components |
| `rust-port/.../sensing-server/Cargo.toml` | Add rvf-runtime, rvf-types, rvf-index, rvf-quant deps |
| `rust-port/.../sensing-server/src/main.rs` | Add `--model` flag, load `.rvf` container, progressive startup, serve embedded dashboard |

## Consequences

### Positive

- **Trained model produces accurate DensePose**: Moves from heuristic keypoints to learned body-surface estimation backed by public-dataset evaluation
- **RuVector signal intelligence is a differentiator**: Graph transformers on antenna topology and GNN body reasoning are novel — no prior WiFi pose system uses these techniques
- **SONA enables near-zero-shot deployment**: New environments don't require full retraining — LoRA adaptation converges in seconds with <50 gradient steps
- **Sparse inference enables edge deployment**: PowerInfer-style neuron partitioning brings DensePose inference to ESP32-class hardware
- **Graceful degradation**: The server falls back to heuristic pose when no model file is present — existing functionality is preserved
- **Single-file deployment via RVF**: Trained model, embeddings, HNSW index, quantization codebooks, SONA adaptation profiles, WASM runtime, and dashboard UI packaged in one `.rvf` file — deploy by copying a single file
- **Progressive loading**: RVF Layer A loads in <5ms for instant startup; full accuracy is reached in ~500ms as the remaining segments load
- **Verifiable provenance**: The RVF Witness segment contains a deterministic training proof with an Ed25519 signature — anyone can re-run training and verify the weight hash
- **Self-bootstrapping**: The RVF Wasm segment enables browser-based inference with no server-side dependencies
- **Open evaluation**: PCK, OKS, and GPS metrics on the public MM-Fi dataset provide reproducible, comparable results

### Negative

- **Training requires a GPU**: Initial model training needs an RTX 3090 or better (~8 hours on an A100). Not all developers will have access.
- **Teacher-student label generation requires Detectron2**: A one-time Python + CUDA dependency for generating UV pseudo-labels from RGB frames
- **MM-Fi CC BY-NC license**: Weights trained on MM-Fi cannot be used commercially without collecting proprietary data
- **Environment-specific adaptation is still required**: SONA reduces the burden, but a brief calibration session in each new environment is still recommended for best accuracy
- **Six additional RuVector crate dependencies**: Increases compile time and binary size. Mitigated by feature flags (e.g., `--features trained-model`).
- **Model size on disk**: ~25MB (FP16) or ~12MB (INT8). Acceptable for server deployment; may need further pruning for WASM.

### Risks and Mitigations

| Risk | Mitigation |
|------|------------|
| MM-Fi 114→56 interpolation loses accuracy | Train at native 114 as an alternative; the ESP32 mesh can collect 56-subcarrier data natively |
| GNN overfits to training body types | Augment with diverse body proportions; Wi-Pose adds subject diversity |
| SONA adaptation diverges in adversarial environments | EWC++ regularization caps parameter drift; roll back to base weights on detection |
| Sparse inference degrades accuracy | Benchmark INT8 vs FP16 vs FP32; fall back to full precision if quality drops |
| Training proof hash changes with RuVector version updates | Pin ruvector crate versions in Cargo.toml; regenerate the hash on version bumps |

## References

- Geng et al., "DensePose From WiFi" (CMU, arXiv:2301.00250, 2023)
- Yang et al., "MM-Fi: Multi-Modal Non-Intrusive 4D Human Dataset" (NeurIPS 2023, arXiv:2305.10345)
- Hu et al., "LoRA: Low-Rank Adaptation of Large Language Models" (ICLR 2022)
- Kirkpatrick et al., "Overcoming Catastrophic Forgetting in Neural Networks" (PNAS, 2017)
- Song et al., "PowerInfer: Fast Large Language Model Serving with a Consumer-grade GPU" (2024)
- ADR-005: SONA Self-Learning for Pose Estimation
- ADR-015: Public Dataset Strategy for Trained Pose Estimation Model
- ADR-016: RuVector Integration for Training Pipeline
- ADR-020: Migrate AI/Model Inference to Rust with RuVector and ONNX Runtime

## Appendix A: RuQu Consideration

**ruQu** ("Classical nervous system for quantum machines") provides real-time coherence
assessment via dynamic min-cut. While primarily designed for quantum error correction
(syndrome decoding, surface code arbitration), its core primitive — the `CoherenceGate` —
is architecturally relevant to WiFi CSI processing:

- **CoherenceGate** uses `ruvector-mincut` to make real-time gate/pass decisions on
  signal streams based on structural coherence thresholds. In quantum computing, this
  gates qubit syndrome streams. For WiFi CSI, the same mechanism could gate CSI
  subcarrier streams — passing only subcarriers whose coherence (phase stability across
  antennas) exceeds a dynamic threshold.

- **Syndrome filtering** (`filters.rs`) implements Kalman-like adaptive filters that
  could be repurposed for CSI noise filtering — treating each subcarrier's amplitude
  drift as a "syndrome" stream.

- **Min-cut gated transformer** integration (optional feature) provides coherence-optimized
  attention with a 50% FLOP reduction — directly applicable to the `ModalityTranslator`
  bottleneck.

**Decision**: ruQu is not included in the initial pipeline (Phases 1-8) but is marked as a
**Phase 9 exploration** candidate for coherence-gated CSI filtering. The CoherenceGate
primitive maps naturally to subcarrier quality assessment, and the integration path is
clean since ruQu already depends on `ruvector-mincut`.
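A minimal sketch of what coherence-gating CSI subcarriers could look like, using phase variance across antennas as the coherence proxy. The real CoherenceGate runs dynamic min-cut via `ruvector-mincut`, which this toy threshold test does not capture:

```rust
/// Pass only subcarriers whose phase variance across antennas stays
/// below a dynamic threshold. `phases[antenna][subcarrier]` holds the
/// unwrapped phase for one CSI snapshot; returns kept subcarrier indices.
fn gate_subcarriers(phases: &[Vec<f32>], threshold: f32) -> Vec<usize> {
    let n_sub = phases[0].len();
    (0..n_sub)
        .filter(|&s| {
            // collect this subcarrier's phase across all antennas
            let col: Vec<f32> = phases.iter().map(|a| a[s]).collect();
            let mean = col.iter().sum::<f32>() / col.len() as f32;
            let var = col.iter().map(|p| (p - mean).powi(2)).sum::<f32>()
                / col.len() as f32;
            var < threshold // low variance = phase-stable = coherent
        })
        .collect()
}
```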

## Appendix B: Training Data Strategy

The pipeline supports three data sources for training, used in combination:

| Source | Subcarriers | Pose Labels | Volume | Cost | When |
|--------|-------------|-------------|--------|------|------|
| **MM-Fi** (public) | 114 → 56 (interpolated) | 17 COCO + DensePose UV | 40 subjects, 320K frames | Free (CC BY-NC) | Phase 1 — bootstrap |
| **Wi-Pose** (public) | 30 → 56 (zero-padded) | 18 keypoints | 12 subjects, 166K packets | Free (research) | Phase 1 — diversity |
| **ESP32 self-collected** | 56 (native) | Teacher-student from camera | Unlimited, environment-specific | Hardware only ($54) | Phase 4+ — fine-tuning |

**Recommended approach: combine public and ESP32 data.**

1. **Pre-train on MM-Fi + Wi-Pose** (public data, Phases 1-4): Provides the base model
   with diverse subjects and actions. The 114→56 subcarrier interpolation is acceptable
   for learning general CSI-to-pose mappings.

2. **Fine-tune on ESP32 self-collected data** (Phase 5+, SONA adaptation): Collect
   5-30 minutes of paired ESP32 CSI + camera data in each target environment. The camera
   serves as the teacher model (Detectron2 generates pseudo-labels). SONA LoRA adaptation
   takes <50 gradient steps to converge.

3. **Continuous adaptation** (runtime): SONA's self-supervised temporal consistency loss
   refines the model without any camera, using the assumption that poses change smoothly
   over short time windows.

This three-tier strategy gives you:

- A working model from day one (public data)
- Environment-specific accuracy (ESP32 fine-tuning)
- Ongoing drift correction (SONA runtime adaptation)
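The temporal consistency signal in tier 3 can be sketched as the mean squared keypoint displacement between consecutive frames. This exact form is an assumption for illustration; SONA's actual loss may weight keypoints or use a different norm:

```rust
/// Self-supervised temporal consistency proxy: mean squared 2D keypoint
/// displacement between consecutive frames. Minimizing it encodes the
/// smooth-motion assumption used for camera-free runtime adaptation.
fn temporal_consistency_loss(frames: &[Vec<(f32, f32)>]) -> f32 {
    if frames.len() < 2 {
        return 0.0;
    }
    let mut total = 0.0f32;
    let mut n = 0usize;
    for w in frames.windows(2) {
        for (a, b) in w[0].iter().zip(w[1].iter()) {
            total += (a.0 - b.0).powi(2) + (a.1 - b.1).powi(2);
            n += 1;
        }
    }
    total / n as f32
}
```

A perfectly still subject yields zero loss, while large frame-to-frame jumps (typical of a drifting model) inflate it — which is also why it can serve as the rolling drift-detection proxy mentioned in Phase 5.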

@@ -745,4 +745,94 @@ mod tests {
        assert!((sum - 1.0).abs() < 1e-5);
        for &wi in &w3 { assert!(wi.is_finite()); }
    }

    // ── Weight serialization integration tests ────────────────────────

    #[test]
    fn linear_flatten_unflatten_roundtrip() {
        let lin = Linear::with_seed(8, 4, 42);
        let mut flat = Vec::new();
        lin.flatten_into(&mut flat);
        assert_eq!(flat.len(), lin.param_count());
        let (restored, consumed) = Linear::unflatten_from(&flat, 8, 4);
        assert_eq!(consumed, flat.len());
        let inp = vec![1.0f32; 8];
        assert_eq!(lin.forward(&inp), restored.forward(&inp));
    }

    #[test]
    fn cross_attention_flatten_unflatten_roundtrip() {
        let ca = CrossAttention::new(16, 4);
        let mut flat = Vec::new();
        ca.flatten_into(&mut flat);
        assert_eq!(flat.len(), ca.param_count());
        let (restored, consumed) = CrossAttention::unflatten_from(&flat, 16, 4);
        assert_eq!(consumed, flat.len());
        let q = vec![vec![0.5f32; 16]; 3];
        let k = vec![vec![0.3f32; 16]; 5];
        let v = vec![vec![0.7f32; 16]; 5];
        let orig = ca.forward(&q, &k, &v);
        let rest = restored.forward(&q, &k, &v);
        for (a, b) in orig.iter().zip(rest.iter()) {
            for (x, y) in a.iter().zip(b.iter()) {
                assert!((x - y).abs() < 1e-6, "mismatch: {x} vs {y}");
            }
        }
    }

    #[test]
    fn transformer_weight_roundtrip() {
        let config = TransformerConfig {
            n_subcarriers: 16, n_keypoints: 17, d_model: 8, n_heads: 2, n_gnn_layers: 1,
        };
        let t = CsiToPoseTransformer::new(config.clone());
        let weights = t.flatten_weights();
        assert_eq!(weights.len(), t.param_count());

        let mut t2 = CsiToPoseTransformer::new(config);
        t2.unflatten_weights(&weights).expect("unflatten should succeed");

        // Forward pass should produce identical results
        let csi = vec![vec![0.5f32; 16]; 4];
        let out1 = t.forward(&csi);
        let out2 = t2.forward(&csi);
        for (a, b) in out1.keypoints.iter().zip(out2.keypoints.iter()) {
            assert!((a.0 - b.0).abs() < 1e-6);
            assert!((a.1 - b.1).abs() < 1e-6);
            assert!((a.2 - b.2).abs() < 1e-6);
        }
        for (a, b) in out1.confidences.iter().zip(out2.confidences.iter()) {
            assert!((a - b).abs() < 1e-6);
        }
    }

    #[test]
    fn transformer_param_count_positive() {
        let t = CsiToPoseTransformer::new(TransformerConfig::default());
        assert!(t.param_count() > 1000, "expected many params, got {}", t.param_count());
        let flat = t.flatten_weights();
        assert_eq!(flat.len(), t.param_count());
    }

    #[test]
    fn gnn_stack_flatten_unflatten() {
        let bg = BodyGraph::new();
        let gnn = GnnStack::new(8, 8, 2, &bg);
        let mut flat = Vec::new();
        gnn.flatten_into(&mut flat);
        assert_eq!(flat.len(), gnn.param_count());

        let mut gnn2 = GnnStack::new(8, 8, 2, &bg);
        let consumed = gnn2.unflatten_from(&flat);
        assert_eq!(consumed, flat.len());

        let feats = vec![vec![1.0f32; 8]; 17];
        let o1 = gnn.forward(&feats);
        let o2 = gnn2.forward(&feats);
        for (a, b) in o1.iter().zip(o2.iter()) {
            for (x, y) in a.iter().zip(b.iter()) {
                assert!((x - y).abs() < 1e-6);
            }
        }
    }
}

@@ -11,11 +11,9 @@
mod rvf_container;
mod rvf_pipeline;
mod vital_signs;
mod graph_transformer;
mod trainer;
mod dataset;
mod sparse_inference;
mod sona;

// Training pipeline modules (exposed via lib.rs)
use wifi_densepose_sensing_server::{graph_transformer, trainer, dataset};

use std::collections::VecDeque;
use std::net::SocketAddr;
@@ -1538,6 +1536,169 @@ async fn main() {
        return;
    }

    // Handle --train mode: train a model and exit
    if args.train {
        eprintln!("=== WiFi-DensePose Training Mode ===");

        // Build data pipeline
        let ds_path = args.dataset.clone().unwrap_or_else(|| PathBuf::from("data"));
        let source = match args.dataset_type.as_str() {
            "wipose" => dataset::DataSource::WiPose(ds_path.clone()),
            _ => dataset::DataSource::MmFi(ds_path.clone()),
        };
        let pipeline = dataset::DataPipeline::new(dataset::DataConfig {
            source,
            ..Default::default()
        });

        // Load samples
        let samples = match pipeline.load() {
            Ok(s) if !s.is_empty() => {
                eprintln!("Loaded {} samples from {}", s.len(), ds_path.display());
                s
            }
            Ok(_) => {
                eprintln!("No samples found at {}. Generating synthetic training data...", ds_path.display());
                // Generate synthetic samples for testing the pipeline
                let mut synth = Vec::new();
                for i in 0..50 {
                    let csi: Vec<Vec<f32>> = (0..4).map(|a| {
                        (0..56).map(|s| ((i * 7 + a * 13 + s) as f32 * 0.31).sin() * 0.5).collect()
                    }).collect();
                    let mut kps = [(0.0f32, 0.0f32, 1.0f32); 17];
                    for (k, kp) in kps.iter_mut().enumerate() {
                        kp.0 = (k as f32 * 0.1 + i as f32 * 0.02).sin() * 100.0 + 320.0;
                        kp.1 = (k as f32 * 0.15 + i as f32 * 0.03).cos() * 80.0 + 240.0;
                    }
                    synth.push(dataset::TrainingSample {
                        csi_window: csi,
                        pose_label: dataset::PoseLabel {
                            keypoints: kps,
                            body_parts: Vec::new(),
                            confidence: 1.0,
                        },
                        source: "synthetic",
                    });
                }
                synth
            }
            Err(e) => {
                eprintln!("Failed to load dataset: {e}");
                eprintln!("Generating synthetic training data...");
                let mut synth = Vec::new();
                for i in 0..50 {
                    let csi: Vec<Vec<f32>> = (0..4).map(|a| {
                        (0..56).map(|s| ((i * 7 + a * 13 + s) as f32 * 0.31).sin() * 0.5).collect()
                    }).collect();
                    let mut kps = [(0.0f32, 0.0f32, 1.0f32); 17];
                    for (k, kp) in kps.iter_mut().enumerate() {
                        kp.0 = (k as f32 * 0.1 + i as f32 * 0.02).sin() * 100.0 + 320.0;
                        kp.1 = (k as f32 * 0.15 + i as f32 * 0.03).cos() * 80.0 + 240.0;
                    }
                    synth.push(dataset::TrainingSample {
                        csi_window: csi,
                        pose_label: dataset::PoseLabel {
                            keypoints: kps,
                            body_parts: Vec::new(),
                            confidence: 1.0,
                        },
                        source: "synthetic",
                    });
                }
                synth
            }
        };

        // Convert dataset samples to trainer format
        let trainer_samples: Vec<trainer::TrainingSample> = samples.iter()
            .map(trainer::from_dataset_sample)
            .collect();

        // Split 80/20 train/val
        let split = (trainer_samples.len() * 4) / 5;
        let (train_data, val_data) = trainer_samples.split_at(split.max(1));
        eprintln!("Train: {} samples, Val: {} samples", train_data.len(), val_data.len());

        // Create transformer + trainer
        let n_subcarriers = train_data.first()
            .and_then(|s| s.csi_features.first())
            .map(|f| f.len())
            .unwrap_or(56);
        let tf_config = graph_transformer::TransformerConfig {
            n_subcarriers,
            n_keypoints: 17,
            d_model: 64,
            n_heads: 4,
            n_gnn_layers: 2,
        };
        let transformer = graph_transformer::CsiToPoseTransformer::new(tf_config);
        eprintln!("Transformer params: {}", transformer.param_count());

        let trainer_config = trainer::TrainerConfig {
            epochs: args.epochs,
            batch_size: 8,
            lr: 0.001,
            warmup_epochs: 5,
            min_lr: 1e-6,
            early_stop_patience: 20,
            checkpoint_every: 10,
            ..Default::default()
        };
        let mut t = trainer::Trainer::with_transformer(trainer_config, transformer);

        // Run training
        eprintln!("Starting training for {} epochs...", args.epochs);
        let result = t.run_training(train_data, val_data);
        eprintln!("Training complete in {:.1}s", result.total_time_secs);
        eprintln!(" Best epoch: {}, PCK@0.2: {:.4}, OKS mAP: {:.4}",
            result.best_epoch, result.best_pck, result.best_oks);

        // Save checkpoint
        if let Some(ref ckpt_dir) = args.checkpoint_dir {
            let _ = std::fs::create_dir_all(ckpt_dir);
            let ckpt_path = ckpt_dir.join("best_checkpoint.json");
            let ckpt = t.checkpoint();
            match ckpt.save_to_file(&ckpt_path) {
                Ok(()) => eprintln!("Checkpoint saved to {}", ckpt_path.display()),
                Err(e) => eprintln!("Failed to save checkpoint: {e}"),
            }
        }

        // Sync weights back to transformer and save as RVF
        t.sync_transformer_weights();
        if let Some(ref save_path) = args.save_rvf {
            eprintln!("Saving trained model to RVF: {}", save_path.display());
            let weights = t.params().to_vec();
            let mut builder = RvfBuilder::new();
            builder.add_manifest(
                "wifi-densepose-trained",
                env!("CARGO_PKG_VERSION"),
                "WiFi DensePose trained model weights",
            );
            builder.add_metadata(&serde_json::json!({
                "training": {
                    "epochs": args.epochs,
                    "best_epoch": result.best_epoch,
                    "best_pck": result.best_pck,
                    "best_oks": result.best_oks,
                    "n_train_samples": train_data.len(),
                    "n_val_samples": val_data.len(),
                    "n_subcarriers": n_subcarriers,
                    "param_count": weights.len(),
                },
            }));
            builder.add_vital_config(&VitalSignConfig::default());
            builder.add_weights(&weights);
            match builder.write_to_file(save_path) {
                Ok(()) => eprintln!("RVF saved ({} params, {} bytes)",
                    weights.len(), weights.len() * 4),
                Err(e) => eprintln!("Failed to save RVF: {e}"),
            }
        }

        return;
    }

    info!("WiFi-DensePose Sensing Server (Rust + Axum + RuVector)");
    info!(" HTTP: http://localhost:{}", args.http_port);
    info!(" WebSocket: ws://localhost:{}/ws/sensing", args.ws_port);
@@ -1761,10 +1922,18 @@ async fn main() {
             "uptime_secs": s.start_time.elapsed().as_secs(),
         }));
         builder.add_vital_config(&VitalSignConfig::default());
-        // Save dummy weights (placeholder for real model weights)
-        builder.add_weights(&[0.0f32; 0]);
+        // Save transformer weights if a model is loaded, otherwise empty
+        let weights: Vec<f32> = if s.model_loaded {
+            // If we loaded via --model, the progressive loader has the weights
+            // For now, save runtime state placeholder
+            let tf = graph_transformer::CsiToPoseTransformer::new(Default::default());
+            tf.flatten_weights()
+        } else {
+            Vec::new()
+        };
+        builder.add_weights(&weights);
         match builder.write_to_file(save_path) {
-            Ok(()) => info!(" RVF saved successfully"),
+            Ok(()) => info!(" RVF saved ({} weight params)", weights.len()),
             Err(e) => error!(" Failed to save RVF: {e}"),
         }
     }

@@ -687,4 +687,67 @@ mod tests {
        assert!(r.speedup > 0.0);
        assert!(r.accuracy_loss.is_finite());
    }

    // ── Quantization integration tests ────────────────────────────

    #[test]
    fn apply_quantization_enables_quantized_forward() {
        let w = vec![
            vec![1.0, 2.0, 3.0, 4.0],
            vec![-1.0, -2.0, -3.0, -4.0],
            vec![0.5, 1.5, 2.5, 3.5],
        ];
        let b = vec![0.1, 0.2, 0.3];
        let mut m = SparseModel::new(SparseConfig {
            quant_mode: QuantMode::Int8Symmetric,
            ..Default::default()
        });
        m.add_layer("fc1", w.clone(), b.clone());

        // Before quantization: dense forward
        let input = vec![1.0, 0.5, -1.0, 0.0];
        let dense_out = m.forward(&input);

        // Apply quantization
        m.apply_quantization();

        // After quantization: should use dequantized weights
        let quant_out = m.forward(&input);

        // Output should be close to dense (within INT8 precision)
        for (d, q) in dense_out.iter().zip(quant_out.iter()) {
            let rel_err = if d.abs() > 0.01 { (d - q).abs() / d.abs() } else { (d - q).abs() };
            assert!(rel_err < 0.05, "quantized error too large: dense={d}, quant={q}, err={rel_err}");
        }
    }

    #[test]
    fn quantized_forward_accuracy_within_5_percent() {
        // Multi-layer model
        let mut m = SparseModel::new(SparseConfig {
            quant_mode: QuantMode::Int8Symmetric,
            ..Default::default()
        });
        let w1: Vec<Vec<f32>> = (0..8).map(|r| {
            (0..8).map(|c| ((r * 8 + c) as f32 * 0.17).sin() * 2.0).collect()
        }).collect();
        let b1 = vec![0.0f32; 8];
        let w2: Vec<Vec<f32>> = (0..4).map(|r| {
            (0..8).map(|c| ((r * 8 + c) as f32 * 0.23).cos() * 1.5).collect()
        }).collect();
        let b2 = vec![0.0f32; 4];
        m.add_layer("fc1", w1, b1);
        m.add_layer("fc2", w2, b2);

        let input = vec![1.0, -0.5, 0.3, 0.7, -0.2, 0.9, -0.4, 0.6];
        let dense_out = m.forward(&input);

        m.apply_quantization();
        let quant_out = m.forward(&input);

        // MSE between dense and quantized should be small
        let mse: f32 = dense_out.iter().zip(quant_out.iter())
            .map(|(d, q)| (d - q).powi(2)).sum::<f32>() / dense_out.len() as f32;
        assert!(mse < 0.5, "quantization MSE too large: {mse}");
    }
}

@@ -777,4 +777,98 @@ mod tests {
|
||||
let _ = std::fs::remove_file(&path);
|
||||
let _ = std::fs::remove_dir(&dir);
|
||||
}
|
||||
|
||||
// ── Integration tests: transformer + trainer pipeline ──────────
|
||||
|
||||
#[test]
|
||||
fn dataset_to_trainer_conversion() {
|
||||
let ds = crate::dataset::TrainingSample {
|
||||
csi_window: vec![vec![1.0; 8]; 4],
|
||||
pose_label: crate::dataset::PoseLabel {
|
||||
keypoints: {
|
||||
let mut kp = [(0.0f32, 0.0f32, 1.0f32); 17];
|
||||
for (i, k) in kp.iter_mut().enumerate() {
|
||||
k.0 = i as f32; k.1 = i as f32 * 2.0;
|
||||
}
|
||||
kp
|
||||
},
|
||||
body_parts: Vec::new(),
|
||||
confidence: 1.0,
|
||||
},
|
||||
source: "test",
|
||||
};
|
||||
let ts = from_dataset_sample(&ds);
|
||||
assert_eq!(ts.csi_features.len(), 4);
|
||||
assert_eq!(ts.csi_features[0].len(), 8);
|
||||
assert_eq!(ts.target_keypoints.len(), 17);
|
||||
assert!((ts.target_keypoints[0].0 - 0.0).abs() < 1e-6);
|
||||
assert!((ts.target_keypoints[1].0 - 1.0).abs() < 1e-6);
|
||||
assert!(ts.target_body_parts.is_empty()); // no body parts in source
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn trainer_with_transformer_runs_epoch() {
|
||||
use crate::graph_transformer::{CsiToPoseTransformer, TransformerConfig};
|
||||
        let tf_config = TransformerConfig {
            n_subcarriers: 8, n_keypoints: 17, d_model: 8, n_heads: 2, n_gnn_layers: 1,
        };
        let transformer = CsiToPoseTransformer::new(tf_config);
        let config = TrainerConfig {
            epochs: 2, batch_size: 4, lr: 0.001,
            warmup_epochs: 0, early_stop_patience: 100,
            ..Default::default()
        };
        let mut t = Trainer::with_transformer(config, transformer);

        // The params should be the transformer's flattened weights
        assert!(t.params().len() > 100, "transformer should have many params");

        // Create samples matching the transformer's n_subcarriers=8
        let samples: Vec<TrainingSample> = (0..8).map(|i| TrainingSample {
            csi_features: vec![vec![(i as f32 * 0.1).sin(); 8]; 4],
            target_keypoints: (0..17).map(|k| (k as f32 * 0.5, k as f32 * 0.3, 1.0)).collect(),
            target_body_parts: vec![0, 1, 2],
            target_uv: (vec![0.5; 3], vec![0.5; 3]),
        }).collect();

        let stats = t.train_epoch(&samples);
        assert!(stats.train_loss.is_finite(), "loss should be finite");
    }

    #[test]
    fn trainer_with_transformer_loss_finite_after_training() {
        use crate::graph_transformer::{CsiToPoseTransformer, TransformerConfig};
        let tf_config = TransformerConfig {
            n_subcarriers: 8, n_keypoints: 17, d_model: 8, n_heads: 2, n_gnn_layers: 1,
        };
        let transformer = CsiToPoseTransformer::new(tf_config);
        let config = TrainerConfig {
            epochs: 3, batch_size: 4, lr: 0.0001,
            warmup_epochs: 0, early_stop_patience: 100,
            ..Default::default()
        };
        let mut t = Trainer::with_transformer(config, transformer);

        let samples: Vec<TrainingSample> = (0..4).map(|i| TrainingSample {
            csi_features: vec![vec![(i as f32 * 0.2).sin(); 8]; 4],
            target_keypoints: (0..17).map(|k| (k as f32 * 0.5, k as f32 * 0.3, 1.0)).collect(),
            target_body_parts: vec![],
            target_uv: (vec![], vec![]),
        }).collect();

        let result = t.run_training(&samples, &[]);
        assert!(result.history.iter().all(|s| s.train_loss.is_finite()),
            "all losses should be finite");

        // Sync weights back and verify transformer still works
        t.sync_transformer_weights();
        if let Some(tf) = t.transformer() {
            let out = tf.forward(&vec![vec![1.0; 8]; 4]);
            assert_eq!(out.keypoints.len(), 17);
            for (i, &(x, y, z)) in out.keypoints.iter().enumerate() {
                assert!(x.is_finite() && y.is_finite() && z.is_finite(),
                    "kp {i} not finite after training");
            }
        }
    }
}

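These trainer tests exercise the cosine-scheduled SGD mentioned in the commit summary. As a hedged aside, the schedule itself reduces to a few lines; the sketch below is illustrative only — `cosine_lr` and its parameters are assumptions for this example, not the crate's actual API:

```rust
use std::f64::consts::PI;

/// Cosine-decay learning-rate schedule with optional linear warmup
/// (illustrative; not the crate's real implementation).
fn cosine_lr(base_lr: f64, epoch: usize, warmup_epochs: usize, total_epochs: usize) -> f64 {
    if epoch < warmup_epochs {
        // Linear warmup from ~0 up to base_lr.
        return base_lr * (epoch + 1) as f64 / warmup_epochs as f64;
    }
    // Cosine decay from base_lr down to 0 over the remaining epochs.
    let progress = (epoch - warmup_epochs) as f64
        / (total_epochs - warmup_epochs).max(1) as f64;
    0.5 * base_lr * (1.0 + (PI * progress).cos())
}

fn main() {
    let lrs: Vec<f64> = (0..10).map(|e| cosine_lr(0.001, e, 0, 10)).collect();
    // With no warmup, the LR starts at the base value and decays monotonically.
    assert!((lrs[0] - 0.001).abs() < 1e-12);
    assert!(lrs.windows(2).all(|w| w[1] <= w[0]));
    println!("{lrs:?}");
}
```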
@@ -0,0 +1,36 @@
[package]
name = "wifi-densepose-vitals"
version.workspace = true
edition.workspace = true
description = "ESP32 CSI-grade vital sign extraction (ADR-021): heart rate and respiratory rate from WiFi Channel State Information"
license.workspace = true

[dependencies]
tracing.workspace = true
serde = { workspace = true, optional = true }

[dev-dependencies]
serde_json.workspace = true

[features]
default = ["serde"]
serde = ["dep:serde"]

[lints.rust]
unsafe_code = "forbid"

[lints.clippy]
all = "warn"
pedantic = "warn"
doc_markdown = "allow"
module_name_repetitions = "allow"
must_use_candidate = "allow"
missing_errors_doc = "allow"
missing_panics_doc = "allow"
cast_precision_loss = "allow"
cast_lossless = "allow"
cast_possible_truncation = "allow"
cast_sign_loss = "allow"
many_single_char_names = "allow"
uninlined_format_args = "allow"
assigning_clones = "allow"

@@ -0,0 +1,399 @@
//! Vital sign anomaly detection.
//!
//! Monitors vital sign readings for anomalies (apnea, tachycardia,
//! bradycardia, sudden changes) using z-score detection with
//! running mean and standard deviation.
//!
//! Modeled on the DNA biomarker anomaly detection pattern from
//! `vendor/ruvector/examples/dna`, using Welford's online algorithm
//! for numerically stable running statistics.

use crate::types::VitalReading;

#[cfg(feature = "serde")]
use serde::{Deserialize, Serialize};

/// An anomaly alert generated from vital sign analysis.
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct AnomalyAlert {
    /// Type of vital sign: `"respiratory"` or `"cardiac"`.
    pub vital_type: String,
    /// Type of anomaly: `"apnea"`, `"tachypnea"`, `"bradypnea"`,
    /// `"tachycardia"`, `"bradycardia"`, `"sudden_change"`.
    pub alert_type: String,
    /// Severity [0.0, 1.0].
    pub severity: f64,
    /// Human-readable description.
    pub message: String,
}

/// Welford online statistics accumulator.
#[derive(Debug, Clone)]
struct WelfordStats {
    count: u64,
    mean: f64,
    m2: f64,
}

impl WelfordStats {
    fn new() -> Self {
        Self {
            count: 0,
            mean: 0.0,
            m2: 0.0,
        }
    }

    fn update(&mut self, value: f64) {
        self.count += 1;
        let delta = value - self.mean;
        self.mean += delta / self.count as f64;
        let delta2 = value - self.mean;
        self.m2 += delta * delta2;
    }

    fn variance(&self) -> f64 {
        if self.count < 2 {
            return 0.0;
        }
        self.m2 / (self.count - 1) as f64
    }

    fn std_dev(&self) -> f64 {
        self.variance().sqrt()
    }

    fn z_score(&self, value: f64) -> f64 {
        let sd = self.std_dev();
        if sd < 1e-10 {
            return 0.0;
        }
        (value - self.mean) / sd
    }
}

/// Vital sign anomaly detector using z-score analysis with
/// running statistics.
pub struct VitalAnomalyDetector {
    /// Running statistics for respiratory rate.
    rr_stats: WelfordStats,
    /// Running statistics for heart rate.
    hr_stats: WelfordStats,
    /// Recent respiratory rate values for windowed analysis.
    rr_history: Vec<f64>,
    /// Recent heart rate values for windowed analysis.
    hr_history: Vec<f64>,
    /// Maximum window size for history.
    window: usize,
    /// Z-score threshold for anomaly detection.
    z_threshold: f64,
}

impl VitalAnomalyDetector {
    /// Create a new anomaly detector.
    ///
    /// - `window`: number of recent readings to retain.
    /// - `z_threshold`: z-score threshold for anomaly alerts (default: 2.5).
    #[must_use]
    pub fn new(window: usize, z_threshold: f64) -> Self {
        Self {
            rr_stats: WelfordStats::new(),
            hr_stats: WelfordStats::new(),
            rr_history: Vec::with_capacity(window),
            hr_history: Vec::with_capacity(window),
            window,
            z_threshold,
        }
    }

    /// Create with defaults (window = 60, z_threshold = 2.5).
    #[must_use]
    pub fn default_config() -> Self {
        Self::new(60, 2.5)
    }

    /// Check a vital sign reading for anomalies.
    ///
    /// Updates running statistics and returns a list of detected
    /// anomaly alerts (may be empty if all readings are normal).
    pub fn check(&mut self, reading: &VitalReading) -> Vec<AnomalyAlert> {
        let mut alerts = Vec::new();

        let rr = reading.respiratory_rate.value_bpm;
        let hr = reading.heart_rate.value_bpm;

        // Update histories
        self.rr_history.push(rr);
        if self.rr_history.len() > self.window {
            self.rr_history.remove(0);
        }
        self.hr_history.push(hr);
        if self.hr_history.len() > self.window {
            self.hr_history.remove(0);
        }

        // Update running statistics
        self.rr_stats.update(rr);
        self.hr_stats.update(hr);

        // Need at least a few readings before detecting anomalies
        if self.rr_stats.count < 5 {
            return alerts;
        }

        // --- Respiratory rate anomalies ---
        let rr_z = self.rr_stats.z_score(rr);

        // Clinical thresholds for respiratory rate (adult)
        if rr < 4.0 && reading.respiratory_rate.confidence > 0.3 {
            alerts.push(AnomalyAlert {
                vital_type: "respiratory".to_string(),
                alert_type: "apnea".to_string(),
                severity: 0.9,
                message: format!("Possible apnea detected: RR = {rr:.1} BPM"),
            });
        } else if rr > 30.0 && reading.respiratory_rate.confidence > 0.3 {
            alerts.push(AnomalyAlert {
                vital_type: "respiratory".to_string(),
                alert_type: "tachypnea".to_string(),
                severity: ((rr - 30.0) / 20.0).clamp(0.3, 1.0),
                message: format!("Elevated respiratory rate: RR = {rr:.1} BPM"),
            });
        } else if rr < 8.0 && reading.respiratory_rate.confidence > 0.3 {
            alerts.push(AnomalyAlert {
                vital_type: "respiratory".to_string(),
                alert_type: "bradypnea".to_string(),
                severity: ((8.0 - rr) / 8.0).clamp(0.3, 0.8),
                message: format!("Low respiratory rate: RR = {rr:.1} BPM"),
            });
        }

        // Z-score based sudden change detection for RR
        if rr_z.abs() > self.z_threshold {
            alerts.push(AnomalyAlert {
                vital_type: "respiratory".to_string(),
                alert_type: "sudden_change".to_string(),
                severity: (rr_z.abs() / (self.z_threshold * 2.0)).clamp(0.2, 1.0),
                message: format!(
                    "Sudden respiratory rate change: z-score = {rr_z:.2} (RR = {rr:.1} BPM)"
                ),
            });
        }

        // --- Heart rate anomalies ---
        let hr_z = self.hr_stats.z_score(hr);

        if hr > 100.0 && reading.heart_rate.confidence > 0.3 {
            alerts.push(AnomalyAlert {
                vital_type: "cardiac".to_string(),
                alert_type: "tachycardia".to_string(),
                severity: ((hr - 100.0) / 80.0).clamp(0.3, 1.0),
                message: format!("Elevated heart rate: HR = {hr:.1} BPM"),
            });
        } else if hr < 50.0 && reading.heart_rate.confidence > 0.3 {
            alerts.push(AnomalyAlert {
                vital_type: "cardiac".to_string(),
                alert_type: "bradycardia".to_string(),
                severity: ((50.0 - hr) / 30.0).clamp(0.3, 1.0),
                message: format!("Low heart rate: HR = {hr:.1} BPM"),
            });
        }

        // Z-score based sudden change detection for HR
        if hr_z.abs() > self.z_threshold {
            alerts.push(AnomalyAlert {
                vital_type: "cardiac".to_string(),
                alert_type: "sudden_change".to_string(),
                severity: (hr_z.abs() / (self.z_threshold * 2.0)).clamp(0.2, 1.0),
                message: format!(
                    "Sudden heart rate change: z-score = {hr_z:.2} (HR = {hr:.1} BPM)"
                ),
            });
        }

        alerts
    }

    /// Reset all accumulated statistics and history.
    pub fn reset(&mut self) {
        self.rr_stats = WelfordStats::new();
        self.hr_stats = WelfordStats::new();
        self.rr_history.clear();
        self.hr_history.clear();
    }

    /// Number of readings processed so far.
    #[must_use]
    pub fn reading_count(&self) -> u64 {
        self.rr_stats.count
    }

    /// Current running mean for respiratory rate.
    #[must_use]
    pub fn rr_mean(&self) -> f64 {
        self.rr_stats.mean
    }

    /// Current running mean for heart rate.
    #[must_use]
    pub fn hr_mean(&self) -> f64 {
        self.hr_stats.mean
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::types::{VitalEstimate, VitalReading, VitalStatus};

    fn make_reading(rr_bpm: f64, hr_bpm: f64) -> VitalReading {
        VitalReading {
            respiratory_rate: VitalEstimate {
                value_bpm: rr_bpm,
                confidence: 0.8,
                status: VitalStatus::Valid,
            },
            heart_rate: VitalEstimate {
                value_bpm: hr_bpm,
                confidence: 0.8,
                status: VitalStatus::Valid,
            },
            subcarrier_count: 56,
            signal_quality: 0.9,
            timestamp_secs: 0.0,
        }
    }

    #[test]
    fn no_alerts_for_normal_readings() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        // Feed 20 normal readings
        for _ in 0..20 {
            let alerts = det.check(&make_reading(15.0, 72.0));
            // After warmup, should have no alerts
            if det.reading_count() > 5 {
                assert!(alerts.is_empty(), "normal readings should not trigger alerts");
            }
        }
    }

    #[test]
    fn detects_tachycardia() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        // Warmup with normal
        for _ in 0..10 {
            det.check(&make_reading(15.0, 72.0));
        }
        // Elevated HR
        let alerts = det.check(&make_reading(15.0, 130.0));
        let tachycardia = alerts
            .iter()
            .any(|a| a.alert_type == "tachycardia");
        assert!(tachycardia, "should detect tachycardia at 130 BPM");
    }

    #[test]
    fn detects_bradycardia() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        for _ in 0..10 {
            det.check(&make_reading(15.0, 72.0));
        }
        let alerts = det.check(&make_reading(15.0, 40.0));
        let brady = alerts.iter().any(|a| a.alert_type == "bradycardia");
        assert!(brady, "should detect bradycardia at 40 BPM");
    }

    #[test]
    fn detects_apnea() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        for _ in 0..10 {
            det.check(&make_reading(15.0, 72.0));
        }
        let alerts = det.check(&make_reading(2.0, 72.0));
        let apnea = alerts.iter().any(|a| a.alert_type == "apnea");
        assert!(apnea, "should detect apnea at 2 BPM");
    }

    #[test]
    fn detects_tachypnea() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        for _ in 0..10 {
            det.check(&make_reading(15.0, 72.0));
        }
        let alerts = det.check(&make_reading(35.0, 72.0));
        let tachypnea = alerts.iter().any(|a| a.alert_type == "tachypnea");
        assert!(tachypnea, "should detect tachypnea at 35 BPM");
    }

    #[test]
    fn detects_sudden_change() {
        let mut det = VitalAnomalyDetector::new(30, 2.0);
        // Build a stable baseline
        for _ in 0..30 {
            det.check(&make_reading(15.0, 72.0));
        }
        // Sudden jump (still in normal clinical range but statistically anomalous)
        let alerts = det.check(&make_reading(15.0, 95.0));
        let sudden = alerts.iter().any(|a| a.alert_type == "sudden_change");
        assert!(sudden, "should detect sudden HR change from 72 to 95 BPM");
    }

    #[test]
    fn reset_clears_state() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        for _ in 0..10 {
            det.check(&make_reading(15.0, 72.0));
        }
        assert!(det.reading_count() > 0);
        det.reset();
        assert_eq!(det.reading_count(), 0);
    }

    #[test]
    fn welford_stats_basic() {
        let mut stats = WelfordStats::new();
        stats.update(10.0);
        stats.update(20.0);
        stats.update(30.0);
        assert!((stats.mean - 20.0).abs() < 1e-10);
        assert!(stats.std_dev() > 0.0);
    }

    #[test]
    fn welford_z_score() {
        let mut stats = WelfordStats::new();
        for i in 0..100 {
            stats.update(50.0 + (i % 3) as f64);
        }
        // A value far from the mean should have a high z-score
        let z = stats.z_score(100.0);
        assert!(z > 2.0, "z-score for extreme value should be > 2: {z}");
    }

    #[test]
    fn running_means_are_tracked() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        for _ in 0..10 {
            det.check(&make_reading(16.0, 75.0));
        }
        assert!((det.rr_mean() - 16.0).abs() < 0.5);
        assert!((det.hr_mean() - 75.0).abs() < 0.5);
    }

    #[test]
    fn severity_is_clamped() {
        let mut det = VitalAnomalyDetector::new(30, 2.5);
        for _ in 0..10 {
            det.check(&make_reading(15.0, 72.0));
        }
        let alerts = det.check(&make_reading(15.0, 200.0));
        for alert in &alerts {
            assert!(
                alert.severity >= 0.0 && alert.severity <= 1.0,
                "severity should be in [0,1]: {}",
                alert.severity,
            );
        }
    }
}

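The detector above leans on Welford's online algorithm for its running mean and standard deviation. The core update can be sanity-checked in isolation; this sketch mirrors the `WelfordStats` update/z-score logic shown in the diff (names abbreviated here, not the crate's public API):

```rust
/// Welford's online mean/variance accumulator, mirroring the
/// `WelfordStats` internals from wifi-densepose-vitals (illustrative).
struct Welford { count: u64, mean: f64, m2: f64 }

impl Welford {
    fn new() -> Self { Self { count: 0, mean: 0.0, m2: 0.0 } }
    fn update(&mut self, x: f64) {
        self.count += 1;
        let d = x - self.mean;
        self.mean += d / self.count as f64;
        self.m2 += d * (x - self.mean); // second factor uses the *updated* mean
    }
    fn std_dev(&self) -> f64 {
        // Sample (n-1) standard deviation, 0 until two samples exist.
        if self.count < 2 { 0.0 } else { (self.m2 / (self.count - 1) as f64).sqrt() }
    }
    fn z_score(&self, x: f64) -> f64 {
        let sd = self.std_dev();
        if sd < 1e-10 { 0.0 } else { (x - self.mean) / sd }
    }
}

fn main() {
    let mut w = Welford::new();
    for x in [10.0, 20.0, 30.0] { w.update(x); }
    assert!((w.mean - 20.0).abs() < 1e-12);
    assert!((w.std_dev() - 10.0).abs() < 1e-9); // sample sd of {10, 20, 30} is 10
    assert!(w.z_score(50.0) > 2.5); // 3 sd above the mean would trip a 2.5-z alert
}
```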
@@ -0,0 +1,318 @@
//! Respiratory rate extraction from CSI residuals.
//!
//! Uses bandpass filtering (0.1-0.5 Hz) and zero-crossing analysis
//! to extract breathing rate from multi-subcarrier CSI data.
//!
//! The approach follows the same IIR bandpass + zero-crossing pattern
//! used by [`CoarseBreathingExtractor`](wifi_densepose_wifiscan::pipeline::CoarseBreathingExtractor)
//! in the wifiscan crate, adapted for multi-subcarrier f64 processing
//! with weighted subcarrier fusion.

use crate::types::{VitalEstimate, VitalStatus};

/// IIR bandpass filter state (2nd-order resonator).
#[derive(Clone, Debug)]
struct IirState {
    x1: f64,
    x2: f64,
    y1: f64,
    y2: f64,
}

impl Default for IirState {
    fn default() -> Self {
        Self {
            x1: 0.0,
            x2: 0.0,
            y1: 0.0,
            y2: 0.0,
        }
    }
}

/// Respiratory rate extractor using bandpass filtering and zero-crossing analysis.
pub struct BreathingExtractor {
    /// Per-sample filtered signal history.
    filtered_history: Vec<f64>,
    /// Sample rate in Hz.
    sample_rate: f64,
    /// Analysis window in seconds.
    window_secs: f64,
    /// Maximum subcarrier slots.
    n_subcarriers: usize,
    /// Breathing band low cutoff (Hz).
    freq_low: f64,
    /// Breathing band high cutoff (Hz).
    freq_high: f64,
    /// IIR filter state.
    filter_state: IirState,
}

impl BreathingExtractor {
    /// Create a new breathing extractor.
    ///
    /// - `n_subcarriers`: number of subcarrier channels.
    /// - `sample_rate`: input sample rate in Hz.
    /// - `window_secs`: analysis window length in seconds (default: 30).
    #[must_use]
    #[allow(clippy::cast_possible_truncation, clippy::cast_sign_loss)]
    pub fn new(n_subcarriers: usize, sample_rate: f64, window_secs: f64) -> Self {
        let capacity = (sample_rate * window_secs) as usize;
        Self {
            filtered_history: Vec::with_capacity(capacity),
            sample_rate,
            window_secs,
            n_subcarriers,
            freq_low: 0.1,
            freq_high: 0.5,
            filter_state: IirState::default(),
        }
    }

    /// Create with ESP32 defaults (56 subcarriers, 100 Hz, 30 s window).
    #[must_use]
    pub fn esp32_default() -> Self {
        Self::new(56, 100.0, 30.0)
    }

    /// Extract respiratory rate from a vector of per-subcarrier residuals.
    ///
    /// - `residuals`: amplitude residuals from the preprocessor.
    /// - `weights`: per-subcarrier attention weights (higher = more
    ///   body-sensitive). If shorter than `residuals`, missing weights
    ///   default to uniform.
    ///
    /// Returns a `VitalEstimate` with the breathing rate in BPM, or
    /// `None` if insufficient history has been accumulated.
    pub fn extract(&mut self, residuals: &[f64], weights: &[f64]) -> Option<VitalEstimate> {
        let n = residuals.len().min(self.n_subcarriers);
        if n == 0 {
            return None;
        }

        // Weighted fusion of subcarrier residuals
        let uniform_w = 1.0 / n as f64;
        let weighted_signal: f64 = residuals
            .iter()
            .enumerate()
            .take(n)
            .map(|(i, &r)| {
                let w = weights.get(i).copied().unwrap_or(uniform_w);
                r * w
            })
            .sum();

        // Apply IIR bandpass filter
        let filtered = self.bandpass_filter(weighted_signal);

        // Append to history, enforce window limit
        self.filtered_history.push(filtered);
        let max_len = (self.sample_rate * self.window_secs) as usize;
        if self.filtered_history.len() > max_len {
            self.filtered_history.remove(0);
        }

        // Need at least 10 seconds of data
        let min_samples = (self.sample_rate * 10.0) as usize;
        if self.filtered_history.len() < min_samples {
            return None;
        }

        // Zero-crossing rate -> frequency
        let crossings = count_zero_crossings(&self.filtered_history);
        let duration_s = self.filtered_history.len() as f64 / self.sample_rate;
        let frequency_hz = crossings as f64 / (2.0 * duration_s);

        // Validate frequency is within the breathing band
        if frequency_hz < self.freq_low || frequency_hz > self.freq_high {
            return None;
        }

        let bpm = frequency_hz * 60.0;
        let confidence = compute_confidence(&self.filtered_history);

        let status = if confidence >= 0.7 {
            VitalStatus::Valid
        } else if confidence >= 0.4 {
            VitalStatus::Degraded
        } else {
            VitalStatus::Unreliable
        };

        Some(VitalEstimate {
            value_bpm: bpm,
            confidence,
            status,
        })
    }

    /// 2nd-order IIR bandpass filter using a resonator topology.
    ///
    /// y[n] = (1-r)*(x[n] - x[n-2]) + 2*r*cos(w0)*y[n-1] - r^2*y[n-2]
    fn bandpass_filter(&mut self, input: f64) -> f64 {
        let state = &mut self.filter_state;

        let omega_low = 2.0 * std::f64::consts::PI * self.freq_low / self.sample_rate;
        let omega_high = 2.0 * std::f64::consts::PI * self.freq_high / self.sample_rate;
        let bw = omega_high - omega_low;
        let center = f64::midpoint(omega_low, omega_high);

        let r = 1.0 - bw / 2.0;
        let cos_w0 = center.cos();

        let output =
            (1.0 - r) * (input - state.x2) + 2.0 * r * cos_w0 * state.y1 - r * r * state.y2;

        state.x2 = state.x1;
        state.x1 = input;
        state.y2 = state.y1;
        state.y1 = output;

        output
    }

    /// Reset all filter state and history.
    pub fn reset(&mut self) {
        self.filtered_history.clear();
        self.filter_state = IirState::default();
    }

    /// Current number of samples in the history buffer.
    #[must_use]
    pub fn history_len(&self) -> usize {
        self.filtered_history.len()
    }

    /// Breathing band cutoff frequencies.
    #[must_use]
    pub fn band(&self) -> (f64, f64) {
        (self.freq_low, self.freq_high)
    }
}

/// Count zero crossings in a signal.
fn count_zero_crossings(signal: &[f64]) -> usize {
    signal.windows(2).filter(|w| w[0] * w[1] < 0.0).count()
}

/// Compute confidence in the breathing estimate based on signal regularity.
fn compute_confidence(history: &[f64]) -> f64 {
    if history.len() < 4 {
        return 0.0;
    }

    let n = history.len() as f64;
    let mean: f64 = history.iter().sum::<f64>() / n;
    let variance: f64 = history.iter().map(|x| (x - mean) * (x - mean)).sum::<f64>() / n;

    if variance < 1e-15 {
        return 0.0;
    }

    let peak = history
        .iter()
        .map(|x| x.abs())
        .fold(0.0_f64, f64::max);
    let noise = variance.sqrt();

    let snr = if noise > 1e-15 { peak / noise } else { 0.0 };

    // Map SNR to [0, 1] confidence
    (snr / 5.0).min(1.0)
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn no_data_returns_none() {
        let mut ext = BreathingExtractor::new(4, 10.0, 30.0);
        assert!(ext.extract(&[], &[]).is_none());
    }

    #[test]
    fn insufficient_history_returns_none() {
        let mut ext = BreathingExtractor::new(2, 10.0, 30.0);
        // Just a few frames are not enough
        for _ in 0..5 {
            assert!(ext.extract(&[1.0, 2.0], &[0.5, 0.5]).is_none());
        }
    }

    #[test]
    fn zero_crossings_count() {
        let signal = vec![1.0, -1.0, 1.0, -1.0, 1.0];
        assert_eq!(count_zero_crossings(&signal), 4);
    }

    #[test]
    fn zero_crossings_constant() {
        let signal = vec![1.0, 1.0, 1.0, 1.0];
        assert_eq!(count_zero_crossings(&signal), 0);
    }

    #[test]
    fn sinusoidal_breathing_detected() {
        let sample_rate = 10.0;
        let mut ext = BreathingExtractor::new(1, sample_rate, 60.0);
        let breathing_freq = 0.25; // 15 BPM

        // Generate 60 seconds of sinusoidal breathing signal
        for i in 0..600 {
            let t = i as f64 / sample_rate;
            let signal = (2.0 * std::f64::consts::PI * breathing_freq * t).sin();
            ext.extract(&[signal], &[1.0]);
        }

        let result = ext.extract(&[0.0], &[1.0]);
        if let Some(est) = result {
            // Should be approximately 15 BPM (0.25 Hz * 60)
            assert!(
                est.value_bpm > 5.0 && est.value_bpm < 40.0,
                "estimated BPM should be in breathing range: {}",
                est.value_bpm,
            );
            assert!(est.confidence > 0.0, "confidence should be > 0");
        }
    }

    #[test]
    fn reset_clears_state() {
        let mut ext = BreathingExtractor::new(2, 10.0, 30.0);
        ext.extract(&[1.0, 2.0], &[0.5, 0.5]);
        assert!(ext.history_len() > 0);
        ext.reset();
        assert_eq!(ext.history_len(), 0);
    }

    #[test]
    fn band_returns_correct_values() {
        let ext = BreathingExtractor::new(1, 10.0, 30.0);
        let (low, high) = ext.band();
        assert!((low - 0.1).abs() < f64::EPSILON);
        assert!((high - 0.5).abs() < f64::EPSILON);
    }

    #[test]
    fn confidence_zero_for_flat_signal() {
        let history = vec![0.0; 100];
        let conf = compute_confidence(&history);
        assert!((conf - 0.0).abs() < f64::EPSILON);
    }

    #[test]
    fn confidence_positive_for_oscillating_signal() {
        let history: Vec<f64> = (0..100)
            .map(|i| (i as f64 * 0.5).sin())
            .collect();
        let conf = compute_confidence(&history);
        assert!(conf > 0.0);
    }

    #[test]
    fn esp32_default_creates_correctly() {
        let ext = BreathingExtractor::esp32_default();
        assert_eq!(ext.n_subcarriers, 56);
    }
}

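The extractor above maps zero-crossing counts to frequency as crossings / (2 × duration), since a sinusoid crosses zero twice per period. That mapping can be checked on a clean sinusoid without involving the IIR filter; a self-contained sketch (illustrative, not the crate's API):

```rust
/// Count sign changes between adjacent samples, as in the breathing module.
fn count_zero_crossings(signal: &[f64]) -> usize {
    signal.windows(2).filter(|w| w[0] * w[1] < 0.0).count()
}

fn main() {
    let sample_rate = 10.0;
    let breathing_hz = 0.25; // 15 breaths per minute
    // 60 s of a clean breathing sinusoid sampled at 10 Hz.
    let signal: Vec<f64> = (0..600)
        .map(|i| (2.0 * std::f64::consts::PI * breathing_hz * i as f64 / sample_rate).sin())
        .collect();
    let crossings = count_zero_crossings(&signal);
    let duration_s = signal.len() as f64 / sample_rate;
    // Two crossings per period, so f = crossings / (2 * duration).
    let bpm = crossings as f64 / (2.0 * duration_s) * 60.0;
    // Edge effects cost about half a crossing pair, so allow a ~1 BPM margin.
    assert!((bpm - 15.0).abs() < 1.0, "estimated {bpm} BPM");
    println!("estimated {bpm:.1} BPM");
}
```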
@@ -0,0 +1,396 @@
//! Heart rate extraction from CSI phase coherence.
//!
//! Uses bandpass filtering (0.8-2.0 Hz) and autocorrelation-based
//! peak detection to extract cardiac rate from inter-subcarrier
//! phase data. Requires multi-subcarrier CSI data (ESP32 mode only).
//!
//! The cardiac signal (0.1-0.5 mm body surface displacement) is
//! ~10x weaker than the respiratory signal (1-5 mm chest displacement),
//! so this module relies on phase coherence across subcarriers rather
//! than single-channel amplitude analysis.

use crate::types::{VitalEstimate, VitalStatus};

/// IIR bandpass filter state (2nd-order resonator).
#[derive(Clone, Debug)]
struct IirState {
    x1: f64,
    x2: f64,
    y1: f64,
    y2: f64,
}

impl Default for IirState {
    fn default() -> Self {
        Self {
            x1: 0.0,
            x2: 0.0,
            y1: 0.0,
            y2: 0.0,
        }
    }
}

/// Heart rate extractor using bandpass filtering and autocorrelation
/// peak detection.
pub struct HeartRateExtractor {
    /// Per-sample filtered signal history.
    filtered_history: Vec<f64>,
    /// Sample rate in Hz.
    sample_rate: f64,
    /// Analysis window in seconds.
    window_secs: f64,
    /// Maximum subcarrier slots.
    n_subcarriers: usize,
    /// Cardiac band low cutoff (Hz) -- 0.8 Hz = 48 BPM.
    freq_low: f64,
    /// Cardiac band high cutoff (Hz) -- 2.0 Hz = 120 BPM.
    freq_high: f64,
    /// IIR filter state.
    filter_state: IirState,
    /// Minimum subcarriers required for reliable HR estimation.
    min_subcarriers: usize,
}

impl HeartRateExtractor {
    /// Create a new heart rate extractor.
    ///
    /// - `n_subcarriers`: number of subcarrier channels.
    /// - `sample_rate`: input sample rate in Hz.
    /// - `window_secs`: analysis window length in seconds (default: 15).
    #[must_use]
    #[allow(clippy::cast_possible_truncation, clippy::cast_sign_loss)]
    pub fn new(n_subcarriers: usize, sample_rate: f64, window_secs: f64) -> Self {
        let capacity = (sample_rate * window_secs) as usize;
        Self {
            filtered_history: Vec::with_capacity(capacity),
            sample_rate,
            window_secs,
            n_subcarriers,
            freq_low: 0.8,
            freq_high: 2.0,
            filter_state: IirState::default(),
            min_subcarriers: 4,
        }
    }

    /// Create with ESP32 defaults (56 subcarriers, 100 Hz, 15 s window).
    #[must_use]
    pub fn esp32_default() -> Self {
        Self::new(56, 100.0, 15.0)
    }

    /// Extract heart rate from per-subcarrier residuals and phase data.
    ///
    /// - `residuals`: amplitude residuals from the preprocessor.
    /// - `phases`: per-subcarrier unwrapped phases (radians).
    ///
    /// Returns a `VitalEstimate` with heart rate in BPM, or `None`
    /// if insufficient data or too few subcarriers.
    pub fn extract(&mut self, residuals: &[f64], phases: &[f64]) -> Option<VitalEstimate> {
        let n = residuals.len().min(self.n_subcarriers).min(phases.len());
        if n == 0 {
            return None;
        }

        // For cardiac signals, use phase-coherence weighted fusion.
        // Compute mean phase differential as a proxy for body-surface
        // displacement sensitivity.
        let phase_signal = compute_phase_coherence_signal(residuals, phases, n);

        // Apply cardiac-band IIR bandpass filter
        let filtered = self.bandpass_filter(phase_signal);

        // Append to history, enforce window limit
        self.filtered_history.push(filtered);
        let max_len = (self.sample_rate * self.window_secs) as usize;
        if self.filtered_history.len() > max_len {
            self.filtered_history.remove(0);
        }

        // Need at least 5 seconds of data for cardiac detection
        let min_samples = (self.sample_rate * 5.0) as usize;
        if self.filtered_history.len() < min_samples {
            return None;
        }

        // Use autocorrelation to find the dominant periodicity
        let (period_samples, acf_peak) =
            autocorrelation_peak(&self.filtered_history, self.sample_rate, self.freq_low, self.freq_high);

        if period_samples == 0 {
            return None;
        }

        let frequency_hz = self.sample_rate / period_samples as f64;
        let bpm = frequency_hz * 60.0;

        // Validate BPM is in physiological range (40-180 BPM)
        if !(40.0..=180.0).contains(&bpm) {
            return None;
        }

        // Confidence based on autocorrelation peak strength and subcarrier count
        let subcarrier_factor = if n >= self.min_subcarriers {
            1.0
        } else {
            n as f64 / self.min_subcarriers as f64
        };
        let confidence = (acf_peak * subcarrier_factor).clamp(0.0, 1.0);

        let status = if confidence >= 0.6 && n >= self.min_subcarriers {
            VitalStatus::Valid
        } else if confidence >= 0.3 {
            VitalStatus::Degraded
        } else {
            VitalStatus::Unreliable
        };

        Some(VitalEstimate {
            value_bpm: bpm,
            confidence,
            status,
        })
    }

    /// 2nd-order IIR bandpass filter (cardiac band: 0.8-2.0 Hz).
    fn bandpass_filter(&mut self, input: f64) -> f64 {
        let state = &mut self.filter_state;

        let omega_low = 2.0 * std::f64::consts::PI * self.freq_low / self.sample_rate;
        let omega_high = 2.0 * std::f64::consts::PI * self.freq_high / self.sample_rate;
        let bw = omega_high - omega_low;
        let center = f64::midpoint(omega_low, omega_high);

        let r = 1.0 - bw / 2.0;
        let cos_w0 = center.cos();

        let output =
            (1.0 - r) * (input - state.x2) + 2.0 * r * cos_w0 * state.y1 - r * r * state.y2;

        state.x2 = state.x1;
        state.x1 = input;
        state.y2 = state.y1;
        state.y1 = output;

        output
    }

    /// Reset all filter state and history.
    pub fn reset(&mut self) {
        self.filtered_history.clear();
        self.filter_state = IirState::default();
    }

    /// Current number of samples in the history buffer.
    #[must_use]
    pub fn history_len(&self) -> usize {
        self.filtered_history.len()
    }

    /// Cardiac band cutoff frequencies.
    #[must_use]
    pub fn band(&self) -> (f64, f64) {
        (self.freq_low, self.freq_high)
    }
}

/// Compute a phase-coherence-weighted signal from residuals and phases.
///
/// Combines amplitude residuals with inter-subcarrier phase coherence
/// to enhance the cardiac signal. Subcarriers with similar phase
/// derivatives are likely sensing the same body surface.
fn compute_phase_coherence_signal(residuals: &[f64], phases: &[f64], n: usize) -> f64 {
    if n <= 1 {
        return residuals.first().copied().unwrap_or(0.0);
    }

    // Compute inter-subcarrier phase differences as coherence weights.
    // Adjacent subcarriers with small phase differences are more coherent.
    let mut weighted_sum = 0.0;
    let mut weight_total = 0.0;

    for i in 0..n {
        let coherence = if i + 1 < n {
            let phase_diff = (phases[i + 1] - phases[i]).abs();
            // Higher coherence when phase difference is small
            (-phase_diff).exp()
        } else if i > 0 {
            let phase_diff = (phases[i] - phases[i - 1]).abs();
            (-phase_diff).exp()
        } else {
            1.0
        };

        weighted_sum += residuals[i] * coherence;
        weight_total += coherence;
    }

    if weight_total > 1e-15 {
        weighted_sum / weight_total
    } else {
        0.0
    }
}

/// Find the dominant periodicity via autocorrelation in the cardiac band.
///
/// Returns `(period_in_samples, peak_normalized_acf)`. If no peak is
/// found, returns `(0, 0.0)`.
fn autocorrelation_peak(
    signal: &[f64],
    sample_rate: f64,
    freq_low: f64,
    freq_high: f64,
) -> (usize, f64) {
    let n = signal.len();
    if n < 4 {
        return (0, 0.0);
    }

    // Lag range corresponding to the cardiac band
    let min_lag = (sample_rate / freq_high).floor() as usize; // highest freq = shortest period
    let max_lag = (sample_rate / freq_low).ceil() as usize; // lowest freq = longest period
    let max_lag = max_lag.min(n / 2);

    if min_lag >= max_lag || min_lag >= n {
        return (0, 0.0);
    }

    // Compute mean-subtracted signal
    let mean: f64 = signal.iter().sum::<f64>() / n as f64;

    // Autocorrelation at lag 0 for normalisation
    let acf0: f64 = signal.iter().map(|&x| (x - mean) * (x - mean)).sum();
    if acf0 < 1e-15 {
||||
return (0, 0.0);
|
||||
}
|
||||
|
||||
// Search for the peak in the cardiac lag range
|
||||
let mut best_lag = 0;
|
||||
let mut best_acf = f64::MIN;
|
||||
|
||||
for lag in min_lag..=max_lag {
|
||||
let acf: f64 = signal
|
||||
.iter()
|
||||
.take(n - lag)
|
||||
.enumerate()
|
||||
.map(|(i, &x)| (x - mean) * (signal[i + lag] - mean))
|
||||
.sum();
|
||||
|
||||
let normalized = acf / acf0;
|
||||
if normalized > best_acf {
|
||||
best_acf = normalized;
|
||||
best_lag = lag;
|
||||
}
|
||||
}
|
||||
|
||||
if best_acf > 0.0 {
|
||||
(best_lag, best_acf)
|
||||
} else {
|
||||
(0, 0.0)
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn no_data_returns_none() {
|
||||
let mut ext = HeartRateExtractor::new(4, 100.0, 15.0);
|
||||
assert!(ext.extract(&[], &[]).is_none());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn insufficient_history_returns_none() {
|
||||
let mut ext = HeartRateExtractor::new(2, 100.0, 15.0);
|
||||
for _ in 0..10 {
|
||||
assert!(ext.extract(&[0.1, 0.2], &[0.0, 0.0]).is_none());
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn sinusoidal_heartbeat_detected() {
|
||||
let sample_rate = 50.0;
|
||||
let mut ext = HeartRateExtractor::new(4, sample_rate, 20.0);
|
||||
let heart_freq = 1.2; // 72 BPM
|
||||
|
||||
// Generate 20 seconds of simulated cardiac signal across 4 subcarriers
|
||||
for i in 0..1000 {
|
||||
let t = i as f64 / sample_rate;
|
||||
let base = (2.0 * std::f64::consts::PI * heart_freq * t).sin();
|
||||
let residuals = vec![base * 0.1, base * 0.08, base * 0.12, base * 0.09];
|
||||
let phases = vec![0.0, 0.01, 0.02, 0.03]; // highly coherent
|
||||
ext.extract(&residuals, &phases);
|
||||
}
|
||||
|
||||
let final_residuals = vec![0.0; 4];
|
||||
let final_phases = vec![0.0; 4];
|
||||
let result = ext.extract(&final_residuals, &final_phases);
|
||||
|
||||
if let Some(est) = result {
|
||||
assert!(
|
||||
est.value_bpm > 40.0 && est.value_bpm < 180.0,
|
||||
"estimated BPM should be in cardiac range: {}",
|
||||
est.value_bpm,
|
||||
);
|
||||
}
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn reset_clears_state() {
|
||||
let mut ext = HeartRateExtractor::new(2, 100.0, 15.0);
|
||||
ext.extract(&[0.1, 0.2], &[0.0, 0.1]);
|
||||
assert!(ext.history_len() > 0);
|
||||
ext.reset();
|
||||
assert_eq!(ext.history_len(), 0);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn band_returns_correct_values() {
|
||||
let ext = HeartRateExtractor::new(1, 100.0, 15.0);
|
||||
let (low, high) = ext.band();
|
||||
assert!((low - 0.8).abs() < f64::EPSILON);
|
||||
assert!((high - 2.0).abs() < f64::EPSILON);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn autocorrelation_finds_known_period() {
|
||||
let sample_rate = 50.0;
|
||||
let freq = 1.0; // 1 Hz = period of 50 samples
|
||||
let signal: Vec<f64> = (0..500)
|
||||
.map(|i| (2.0 * std::f64::consts::PI * freq * i as f64 / sample_rate).sin())
|
||||
.collect();
|
||||
|
||||
let (period, acf) = autocorrelation_peak(&signal, sample_rate, 0.8, 2.0);
|
||||
assert!(period > 0, "should find a period");
|
||||
assert!(acf > 0.5, "autocorrelation peak should be strong: {acf}");
|
||||
|
||||
let estimated_freq = sample_rate / period as f64;
|
||||
assert!(
|
||||
(estimated_freq - 1.0).abs() < 0.1,
|
||||
"estimated frequency should be ~1 Hz, got {estimated_freq}",
|
||||
);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn phase_coherence_single_subcarrier() {
|
||||
let result = compute_phase_coherence_signal(&[5.0], &[0.0], 1);
|
||||
assert!((result - 5.0).abs() < f64::EPSILON);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn phase_coherence_multi_subcarrier() {
|
||||
// Two coherent subcarriers (small phase difference)
|
||||
let result = compute_phase_coherence_signal(&[1.0, 1.0], &[0.0, 0.01], 2);
|
||||
// Both weights should be ~1.0 (exp(-0.01) ~ 0.99), so result ~ 1.0
|
||||
assert!((result - 1.0).abs() < 0.1, "coherent result should be ~1.0: {result}");
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn esp32_default_creates_correctly() {
|
||||
let ext = HeartRateExtractor::esp32_default();
|
||||
assert_eq!(ext.n_subcarriers, 56);
|
||||
}
|
||||
}
|
||||
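The autocorrelation search above can be exercised in isolation. The sketch below uses illustrative names (`acf_peak` is not part of the crate) and reproduces the lag-range and normalisation logic: for a pure 1.25 Hz tone sampled at 50 Hz, the best lag in the cardiac band should land at 40 samples, i.e. 75 BPM.

```rust
// Standalone sketch of the cardiac-band autocorrelation peak search.
fn acf_peak(signal: &[f64], sample_rate: f64, f_lo: f64, f_hi: f64) -> (usize, f64) {
    let n = signal.len();
    let mean = signal.iter().sum::<f64>() / n as f64;
    // Lag-0 autocorrelation for normalisation
    let acf0: f64 = signal.iter().map(|&x| (x - mean).powi(2)).sum();
    // Lags covering f_lo..f_hi: shortest period at the highest frequency
    let min_lag = (sample_rate / f_hi).floor() as usize;
    let max_lag = ((sample_rate / f_lo).ceil() as usize).min(n / 2);
    let mut best = (0usize, f64::MIN);
    for lag in min_lag..=max_lag {
        let acf: f64 = (0..n - lag)
            .map(|i| (signal[i] - mean) * (signal[i + lag] - mean))
            .sum();
        let norm = acf / acf0;
        if norm > best.1 {
            best = (lag, norm);
        }
    }
    best
}

fn main() {
    let fs = 50.0;
    let f = 1.25; // a 75 BPM "heartbeat" tone
    let sig: Vec<f64> = (0..500)
        .map(|i| (2.0 * std::f64::consts::PI * f * i as f64 / fs).sin())
        .collect();
    let (lag, norm) = acf_peak(&sig, fs, 0.8, 2.0);
    println!("lag={lag} bpm={:.1} norm={norm:.2}", 60.0 * fs / lag as f64);
}
```

Note the normalised peak is below 1.0 even for a clean tone, because only `n - lag` products contribute at each lag; the crate's confidence scoring has to account for that bias.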
@@ -0,0 +1,80 @@
//! ESP32 CSI-grade vital sign extraction (ADR-021).
//!
//! Extracts heart rate and respiratory rate from WiFi Channel
//! State Information using multi-subcarrier amplitude and phase
//! analysis.
//!
//! # Architecture
//!
//! The pipeline processes CSI frames through four stages:
//!
//! 1. **Preprocessing** ([`CsiVitalPreprocessor`]): EMA-based static
//!    component suppression, producing per-subcarrier residuals.
//! 2. **Breathing extraction** ([`BreathingExtractor`]): Bandpass
//!    filtering (0.1-0.5 Hz) with zero-crossing analysis for
//!    respiratory rate.
//! 3. **Heart rate extraction** ([`HeartRateExtractor`]): Bandpass
//!    filtering (0.8-2.0 Hz) with autocorrelation peak detection
//!    and inter-subcarrier phase coherence weighting.
//! 4. **Anomaly detection** ([`VitalAnomalyDetector`]): Z-score
//!    analysis with Welford running statistics for clinical alerts
//!    (apnea, tachycardia, bradycardia).
//!
//! Results are stored in a [`VitalSignStore`] with configurable
//! retention for historical analysis.
//!
//! # Example
//!
//! ```
//! use wifi_densepose_vitals::{
//!     CsiVitalPreprocessor, BreathingExtractor, HeartRateExtractor,
//!     VitalAnomalyDetector, VitalSignStore, CsiFrame,
//!     VitalReading, VitalEstimate, VitalStatus,
//! };
//!
//! let mut preprocessor = CsiVitalPreprocessor::new(56, 0.05);
//! let mut breathing = BreathingExtractor::new(56, 100.0, 30.0);
//! let mut heartrate = HeartRateExtractor::new(56, 100.0, 15.0);
//! let mut anomaly = VitalAnomalyDetector::default_config();
//! let mut store = VitalSignStore::new(3600);
//!
//! // Process a CSI frame
//! let frame = CsiFrame {
//!     amplitudes: vec![1.0; 56],
//!     phases: vec![0.0; 56],
//!     n_subcarriers: 56,
//!     sample_index: 0,
//!     sample_rate_hz: 100.0,
//! };
//!
//! if let Some(residuals) = preprocessor.process(&frame) {
//!     let weights = vec![1.0 / 56.0; 56];
//!     let rr = breathing.extract(&residuals, &weights);
//!     let hr = heartrate.extract(&residuals, &frame.phases);
//!
//!     let reading = VitalReading {
//!         respiratory_rate: rr.unwrap_or_else(VitalEstimate::unavailable),
//!         heart_rate: hr.unwrap_or_else(VitalEstimate::unavailable),
//!         subcarrier_count: frame.n_subcarriers,
//!         signal_quality: 0.9,
//!         timestamp_secs: 0.0,
//!     };
//!
//!     let alerts = anomaly.check(&reading);
//!     store.push(reading);
//! }
//! ```

pub mod anomaly;
pub mod breathing;
pub mod heartrate;
pub mod preprocessor;
pub mod store;
pub mod types;

pub use anomaly::{AnomalyAlert, VitalAnomalyDetector};
pub use breathing::BreathingExtractor;
pub use heartrate::HeartRateExtractor;
pub use preprocessor::CsiVitalPreprocessor;
pub use store::{VitalSignStore, VitalStats};
pub use types::{CsiFrame, VitalEstimate, VitalReading, VitalStatus};
@@ -0,0 +1,206 @@
//! CSI vital sign preprocessor.
//!
//! Suppresses static subcarrier components and extracts the
//! body-modulated signal residuals for vital sign analysis.
//!
//! Uses an EMA-based predictive filter (same pattern as
//! [`PredictiveGate`](wifi_densepose_wifiscan::pipeline::PredictiveGate)
//! in the wifiscan crate) operating on per-subcarrier amplitudes.
//! The residuals represent deviations from the static environment
//! baseline, isolating physiological movements (breathing, heartbeat).

use crate::types::CsiFrame;

/// EMA-based preprocessor that extracts body-modulated residuals
/// from raw CSI subcarrier amplitudes.
pub struct CsiVitalPreprocessor {
    /// EMA predictions per subcarrier.
    predictions: Vec<f64>,
    /// Whether each subcarrier slot has been initialised.
    initialized: Vec<bool>,
    /// EMA smoothing factor (lower = slower tracking, better static suppression).
    alpha: f64,
    /// Number of subcarrier slots.
    n_subcarriers: usize,
}

impl CsiVitalPreprocessor {
    /// Create a new preprocessor.
    ///
    /// - `n_subcarriers`: number of subcarrier slots to track.
    /// - `alpha`: EMA smoothing factor in `(0, 1)`. Lower values
    ///   provide better static component suppression but slower
    ///   adaptation. Default for vital signs: `0.05`.
    #[must_use]
    pub fn new(n_subcarriers: usize, alpha: f64) -> Self {
        Self {
            predictions: vec![0.0; n_subcarriers],
            initialized: vec![false; n_subcarriers],
            alpha: alpha.clamp(0.001, 0.999),
            n_subcarriers,
        }
    }

    /// Create a preprocessor with defaults suitable for ESP32 CSI
    /// vital sign extraction (56 subcarriers, alpha = 0.05).
    #[must_use]
    pub fn esp32_default() -> Self {
        Self::new(56, 0.05)
    }

    /// Process a CSI frame and return the residual vector.
    ///
    /// The residuals represent the difference between observed and
    /// predicted (EMA) amplitudes. On the first frame for each
    /// subcarrier, the prediction is seeded from the raw amplitude
    /// and a zero residual is returned.
    ///
    /// Returns `None` if the frame has zero subcarriers.
    pub fn process(&mut self, frame: &CsiFrame) -> Option<Vec<f64>> {
        let n = frame.amplitudes.len().min(self.n_subcarriers);
        if n == 0 {
            return None;
        }

        let mut residuals = vec![0.0; n];

        for (i, residual) in residuals.iter_mut().enumerate().take(n) {
            if self.initialized[i] {
                // Compute residual: observed - predicted
                *residual = frame.amplitudes[i] - self.predictions[i];
                // Update EMA prediction
                self.predictions[i] =
                    self.alpha * frame.amplitudes[i] + (1.0 - self.alpha) * self.predictions[i];
            } else {
                // First observation: seed the prediction
                self.predictions[i] = frame.amplitudes[i];
                self.initialized[i] = true;
                // First-frame residual is zero (no prior to compare against)
                *residual = 0.0;
            }
        }

        Some(residuals)
    }

    /// Reset all predictions and initialisation state.
    pub fn reset(&mut self) {
        self.predictions.fill(0.0);
        self.initialized.fill(false);
    }

    /// Current EMA smoothing factor.
    #[must_use]
    pub fn alpha(&self) -> f64 {
        self.alpha
    }

    /// Update the EMA smoothing factor.
    pub fn set_alpha(&mut self, alpha: f64) {
        self.alpha = alpha.clamp(0.001, 0.999);
    }

    /// Number of subcarrier slots.
    #[must_use]
    pub fn n_subcarriers(&self) -> usize {
        self.n_subcarriers
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::types::CsiFrame;

    fn make_frame(amplitudes: Vec<f64>, n: usize) -> CsiFrame {
        let phases = vec![0.0; n];
        CsiFrame {
            amplitudes,
            phases,
            n_subcarriers: n,
            sample_index: 0,
            sample_rate_hz: 100.0,
        }
    }

    #[test]
    fn empty_frame_returns_none() {
        let mut pp = CsiVitalPreprocessor::new(4, 0.05);
        let frame = make_frame(vec![], 0);
        assert!(pp.process(&frame).is_none());
    }

    #[test]
    fn first_frame_residuals_are_zero() {
        let mut pp = CsiVitalPreprocessor::new(3, 0.05);
        let frame = make_frame(vec![1.0, 2.0, 3.0], 3);
        let residuals = pp.process(&frame).unwrap();
        assert_eq!(residuals.len(), 3);
        for &r in &residuals {
            assert!((r - 0.0).abs() < f64::EPSILON, "first frame residual should be 0");
        }
    }

    #[test]
    fn static_signal_residuals_converge_to_zero() {
        let mut pp = CsiVitalPreprocessor::new(2, 0.1);
        let frame = make_frame(vec![5.0, 10.0], 2);

        // Seed
        pp.process(&frame);

        // After many identical frames, residuals should be near zero
        let mut last_residuals = vec![0.0; 2];
        for _ in 0..100 {
            last_residuals = pp.process(&frame).unwrap();
        }

        for &r in &last_residuals {
            assert!(r.abs() < 0.01, "residuals should converge to ~0 for static signal, got {r}");
        }
    }

    #[test]
    fn step_change_produces_large_residual() {
        let mut pp = CsiVitalPreprocessor::new(1, 0.05);
        let frame1 = make_frame(vec![10.0], 1);

        // Converge EMA
        pp.process(&frame1);
        for _ in 0..200 {
            pp.process(&frame1);
        }

        // Step change
        let frame2 = make_frame(vec![20.0], 1);
        let residuals = pp.process(&frame2).unwrap();
        assert!(residuals[0] > 5.0, "step change should produce large residual, got {}", residuals[0]);
    }

    #[test]
    fn reset_clears_state() {
        let mut pp = CsiVitalPreprocessor::new(2, 0.1);
        let frame = make_frame(vec![1.0, 2.0], 2);
        pp.process(&frame);
        pp.reset();
        // After reset, next frame is treated as first
        let residuals = pp.process(&frame).unwrap();
        for &r in &residuals {
            assert!((r - 0.0).abs() < f64::EPSILON);
        }
    }

    #[test]
    fn alpha_clamped() {
        let pp = CsiVitalPreprocessor::new(1, -5.0);
        assert!(pp.alpha() > 0.0);
        let pp = CsiVitalPreprocessor::new(1, 100.0);
        assert!(pp.alpha() < 1.0);
    }

    #[test]
    fn esp32_default_has_correct_subcarriers() {
        let pp = CsiVitalPreprocessor::esp32_default();
        assert_eq!(pp.n_subcarriers(), 56);
    }
}
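The EMA residual logic in `CsiVitalPreprocessor::process` reduces, per subcarrier, to the recurrence `r_t = x_t - p_{t-1}` followed by `p_t = alpha * x_t + (1 - alpha) * p_{t-1}`. The single-channel sketch below (the `Ema` type is illustrative, not part of the crate) shows the two behaviours the tests above assert: a static input drives the residual to zero, while a step change produces a transient residual equal to the full step size.

```rust
// Single-channel EMA residual filter, mirroring the per-subcarrier
// update in CsiVitalPreprocessor.
struct Ema {
    prediction: f64,
    seeded: bool,
    alpha: f64,
}

impl Ema {
    fn new(alpha: f64) -> Self {
        Self { prediction: 0.0, seeded: false, alpha }
    }

    fn residual(&mut self, x: f64) -> f64 {
        if !self.seeded {
            // First observation seeds the prediction; residual is zero.
            self.prediction = x;
            self.seeded = true;
            return 0.0;
        }
        let r = x - self.prediction;
        // EMA update toward the new observation
        self.prediction = self.alpha * x + (1.0 - self.alpha) * self.prediction;
        r
    }
}

fn main() {
    let mut ema = Ema::new(0.05);
    // Static 10.0 signal: residuals stay at zero.
    for _ in 0..100 {
        ema.residual(10.0);
    }
    // Step to 20.0: the first residual is the full step size, 10.0.
    let r = ema.residual(20.0);
    println!("step residual = {r}");
}
```

The step residual then decays geometrically at rate `1 - alpha`, which is why a small alpha (0.05) keeps slow physiological oscillations visible in the residual stream.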
@@ -0,0 +1,290 @@
//! Vital sign time series store.
//!
//! Stores vital sign readings with configurable retention.
//! Designed for upgrade to `TieredStore` when `ruvector-temporal-tensor`
//! becomes available (ADR-021 phase 2).

use crate::types::{VitalReading, VitalStatus};

/// Simple vital sign store with capacity-limited ring buffer semantics.
pub struct VitalSignStore {
    /// Stored readings (oldest first).
    readings: Vec<VitalReading>,
    /// Maximum number of readings to retain.
    max_readings: usize,
}

/// Summary statistics for stored vital sign readings.
#[derive(Debug, Clone)]
pub struct VitalStats {
    /// Number of readings in the store.
    pub count: usize,
    /// Mean respiratory rate (BPM).
    pub rr_mean: f64,
    /// Mean heart rate (BPM).
    pub hr_mean: f64,
    /// Min respiratory rate (BPM).
    pub rr_min: f64,
    /// Max respiratory rate (BPM).
    pub rr_max: f64,
    /// Min heart rate (BPM).
    pub hr_min: f64,
    /// Max heart rate (BPM).
    pub hr_max: f64,
    /// Fraction of readings with Valid status.
    pub valid_fraction: f64,
}

impl VitalSignStore {
    /// Create a new store with a given maximum capacity.
    ///
    /// When the capacity is exceeded, the oldest readings are evicted.
    #[must_use]
    pub fn new(max_readings: usize) -> Self {
        Self {
            readings: Vec::with_capacity(max_readings.min(4096)),
            max_readings: max_readings.max(1),
        }
    }

    /// Create with default capacity (3600 readings ~ 1 hour at 1 Hz).
    #[must_use]
    pub fn default_capacity() -> Self {
        Self::new(3600)
    }

    /// Push a new reading into the store.
    ///
    /// If the store is at capacity, the oldest reading is evicted.
    pub fn push(&mut self, reading: VitalReading) {
        if self.readings.len() >= self.max_readings {
            self.readings.remove(0);
        }
        self.readings.push(reading);
    }

    /// Get the most recent reading, if any.
    #[must_use]
    pub fn latest(&self) -> Option<&VitalReading> {
        self.readings.last()
    }

    /// Get the last `n` readings (most recent last).
    ///
    /// Returns fewer than `n` if the store contains fewer readings.
    #[must_use]
    pub fn history(&self, n: usize) -> &[VitalReading] {
        let start = self.readings.len().saturating_sub(n);
        &self.readings[start..]
    }

    /// Compute summary statistics over all stored readings.
    ///
    /// Returns `None` if the store is empty.
    #[must_use]
    pub fn stats(&self) -> Option<VitalStats> {
        if self.readings.is_empty() {
            return None;
        }

        let n = self.readings.len() as f64;
        let mut rr_sum = 0.0;
        let mut hr_sum = 0.0;
        let mut rr_min = f64::MAX;
        let mut rr_max = f64::MIN;
        let mut hr_min = f64::MAX;
        let mut hr_max = f64::MIN;
        let mut valid_count = 0_usize;

        for r in &self.readings {
            let rr = r.respiratory_rate.value_bpm;
            let hr = r.heart_rate.value_bpm;
            rr_sum += rr;
            hr_sum += hr;
            rr_min = rr_min.min(rr);
            rr_max = rr_max.max(rr);
            hr_min = hr_min.min(hr);
            hr_max = hr_max.max(hr);

            if r.respiratory_rate.status == VitalStatus::Valid
                && r.heart_rate.status == VitalStatus::Valid
            {
                valid_count += 1;
            }
        }

        Some(VitalStats {
            count: self.readings.len(),
            rr_mean: rr_sum / n,
            hr_mean: hr_sum / n,
            rr_min,
            rr_max,
            hr_min,
            hr_max,
            valid_fraction: valid_count as f64 / n,
        })
    }

    /// Number of readings currently stored.
    #[must_use]
    pub fn len(&self) -> usize {
        self.readings.len()
    }

    /// Whether the store is empty.
    #[must_use]
    pub fn is_empty(&self) -> bool {
        self.readings.is_empty()
    }

    /// Maximum capacity of the store.
    #[must_use]
    pub fn capacity(&self) -> usize {
        self.max_readings
    }

    /// Clear all stored readings.
    pub fn clear(&mut self) {
        self.readings.clear();
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use crate::types::{VitalEstimate, VitalReading, VitalStatus};

    fn make_reading(rr: f64, hr: f64) -> VitalReading {
        VitalReading {
            respiratory_rate: VitalEstimate {
                value_bpm: rr,
                confidence: 0.9,
                status: VitalStatus::Valid,
            },
            heart_rate: VitalEstimate {
                value_bpm: hr,
                confidence: 0.85,
                status: VitalStatus::Valid,
            },
            subcarrier_count: 56,
            signal_quality: 0.9,
            timestamp_secs: 0.0,
        }
    }

    #[test]
    fn empty_store() {
        let store = VitalSignStore::new(10);
        assert!(store.is_empty());
        assert_eq!(store.len(), 0);
        assert!(store.latest().is_none());
        assert!(store.stats().is_none());
    }

    #[test]
    fn push_and_retrieve() {
        let mut store = VitalSignStore::new(10);
        store.push(make_reading(15.0, 72.0));
        assert_eq!(store.len(), 1);
        assert!(!store.is_empty());

        let latest = store.latest().unwrap();
        assert!((latest.respiratory_rate.value_bpm - 15.0).abs() < f64::EPSILON);
    }

    #[test]
    fn eviction_at_capacity() {
        let mut store = VitalSignStore::new(3);
        store.push(make_reading(10.0, 60.0));
        store.push(make_reading(15.0, 72.0));
        store.push(make_reading(20.0, 80.0));
        assert_eq!(store.len(), 3);

        // Push one more; oldest should be evicted
        store.push(make_reading(25.0, 90.0));
        assert_eq!(store.len(), 3);

        // Oldest should now be 15.0, not 10.0
        let oldest = &store.history(10)[0];
        assert!((oldest.respiratory_rate.value_bpm - 15.0).abs() < f64::EPSILON);
    }

    #[test]
    fn history_returns_last_n() {
        let mut store = VitalSignStore::new(10);
        for i in 0..5 {
            store.push(make_reading(10.0 + i as f64, 60.0 + i as f64));
        }

        let last3 = store.history(3);
        assert_eq!(last3.len(), 3);
        assert!((last3[0].respiratory_rate.value_bpm - 12.0).abs() < f64::EPSILON);
        assert!((last3[2].respiratory_rate.value_bpm - 14.0).abs() < f64::EPSILON);
    }

    #[test]
    fn history_when_fewer_than_n() {
        let mut store = VitalSignStore::new(10);
        store.push(make_reading(15.0, 72.0));
        let all = store.history(100);
        assert_eq!(all.len(), 1);
    }

    #[test]
    fn stats_computation() {
        let mut store = VitalSignStore::new(10);
        store.push(make_reading(10.0, 60.0));
        store.push(make_reading(20.0, 80.0));
        store.push(make_reading(15.0, 70.0));

        let stats = store.stats().unwrap();
        assert_eq!(stats.count, 3);
        assert!((stats.rr_mean - 15.0).abs() < f64::EPSILON);
        assert!((stats.hr_mean - 70.0).abs() < f64::EPSILON);
        assert!((stats.rr_min - 10.0).abs() < f64::EPSILON);
        assert!((stats.rr_max - 20.0).abs() < f64::EPSILON);
        assert!((stats.hr_min - 60.0).abs() < f64::EPSILON);
        assert!((stats.hr_max - 80.0).abs() < f64::EPSILON);
        assert!((stats.valid_fraction - 1.0).abs() < f64::EPSILON);
    }

    #[test]
    fn stats_valid_fraction() {
        let mut store = VitalSignStore::new(10);
        store.push(make_reading(15.0, 72.0)); // Valid
        store.push(VitalReading {
            respiratory_rate: VitalEstimate {
                value_bpm: 15.0,
                confidence: 0.3,
                status: VitalStatus::Degraded,
            },
            heart_rate: VitalEstimate {
                value_bpm: 72.0,
                confidence: 0.8,
                status: VitalStatus::Valid,
            },
            subcarrier_count: 56,
            signal_quality: 0.5,
            timestamp_secs: 1.0,
        });

        let stats = store.stats().unwrap();
        assert!((stats.valid_fraction - 0.5).abs() < f64::EPSILON);
    }

    #[test]
    fn clear_empties_store() {
        let mut store = VitalSignStore::new(10);
        store.push(make_reading(15.0, 72.0));
        store.push(make_reading(16.0, 73.0));
        assert_eq!(store.len(), 2);
        store.clear();
        assert!(store.is_empty());
    }

    #[test]
    fn default_capacity_is_3600() {
        let store = VitalSignStore::default_capacity();
        assert_eq!(store.capacity(), 3600);
    }
}
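One design note on the eviction path: `Vec::remove(0)` shifts every remaining element, so at capacity each push is O(n). For the default 3600-reading capacity this is cheap, but if retention grows, a `VecDeque` gives the same oldest-first ring semantics with O(1) eviction. A minimal sketch of that hypothetical variant (not part of the crate, generic over the reading type):

```rust
use std::collections::VecDeque;

// Hypothetical O(1)-eviction variant of the VitalSignStore ring-buffer
// semantics: pop_front() does not shift elements the way Vec::remove(0) does.
struct RingStore<T> {
    readings: VecDeque<T>,
    cap: usize,
}

impl<T> RingStore<T> {
    fn new(cap: usize) -> Self {
        Self { readings: VecDeque::with_capacity(cap), cap: cap.max(1) }
    }

    fn push(&mut self, r: T) {
        if self.readings.len() >= self.cap {
            self.readings.pop_front(); // evict the oldest entry in O(1)
        }
        self.readings.push_back(r);
    }

    fn latest(&self) -> Option<&T> {
        self.readings.back()
    }
}

fn main() {
    let mut s = RingStore::new(3);
    for v in [10.0_f64, 15.0, 20.0, 25.0] {
        s.push(v);
    }
    // Capacity 3: the 10.0 reading was evicted; 15.0 is now the oldest.
    println!("len={} latest={:?}", s.readings.len(), s.latest());
}
```

`VecDeque`'s two internal slices would complicate the `history(n) -> &[T]` borrow, which may be why the simpler `Vec` was chosen here.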
@@ -0,0 +1,174 @@
//! Vital sign domain types (ADR-021).

#[cfg(feature = "serde")]
use serde::{Deserialize, Serialize};

/// Status of a vital sign measurement.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum VitalStatus {
    /// Valid measurement with clinical-grade confidence.
    Valid,
    /// Measurement present but with reduced confidence.
    Degraded,
    /// Measurement unreliable (e.g., single RSSI source).
    Unreliable,
    /// No measurement possible.
    Unavailable,
}

/// A single vital sign estimate.
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct VitalEstimate {
    /// Estimated value in BPM (beats/breaths per minute).
    pub value_bpm: f64,
    /// Confidence in the estimate [0.0, 1.0].
    pub confidence: f64,
    /// Measurement status.
    pub status: VitalStatus,
}

/// Combined vital sign reading.
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct VitalReading {
    /// Respiratory rate estimate.
    pub respiratory_rate: VitalEstimate,
    /// Heart rate estimate.
    pub heart_rate: VitalEstimate,
    /// Number of subcarriers used.
    pub subcarrier_count: usize,
    /// Signal quality score [0.0, 1.0].
    pub signal_quality: f64,
    /// Timestamp (seconds since epoch).
    pub timestamp_secs: f64,
}

/// Input frame for the vital sign pipeline.
#[derive(Debug, Clone)]
pub struct CsiFrame {
    /// Per-subcarrier amplitudes.
    pub amplitudes: Vec<f64>,
    /// Per-subcarrier phases (radians).
    pub phases: Vec<f64>,
    /// Number of subcarriers.
    pub n_subcarriers: usize,
    /// Sample index (monotonically increasing).
    pub sample_index: u64,
    /// Sample rate in Hz.
    pub sample_rate_hz: f64,
}

impl CsiFrame {
    /// Create a new CSI frame, validating that amplitude and phase
    /// vectors match the declared subcarrier count.
    ///
    /// Returns `None` if the lengths are inconsistent.
    pub fn new(
        amplitudes: Vec<f64>,
        phases: Vec<f64>,
        n_subcarriers: usize,
        sample_index: u64,
        sample_rate_hz: f64,
    ) -> Option<Self> {
        if amplitudes.len() != n_subcarriers || phases.len() != n_subcarriers {
            return None;
        }
        Some(Self {
            amplitudes,
            phases,
            n_subcarriers,
            sample_index,
            sample_rate_hz,
        })
    }
}

impl VitalEstimate {
    /// Create an unavailable estimate (no measurement possible).
    pub fn unavailable() -> Self {
        Self {
            value_bpm: 0.0,
            confidence: 0.0,
            status: VitalStatus::Unavailable,
        }
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn vital_status_equality() {
        assert_eq!(VitalStatus::Valid, VitalStatus::Valid);
        assert_ne!(VitalStatus::Valid, VitalStatus::Degraded);
    }

    #[test]
    fn vital_estimate_unavailable() {
        let est = VitalEstimate::unavailable();
        assert_eq!(est.status, VitalStatus::Unavailable);
        assert!((est.value_bpm - 0.0).abs() < f64::EPSILON);
        assert!((est.confidence - 0.0).abs() < f64::EPSILON);
    }

    #[test]
    fn csi_frame_new_valid() {
        let frame = CsiFrame::new(
            vec![1.0, 2.0, 3.0],
            vec![0.1, 0.2, 0.3],
            3,
            0,
            100.0,
        );
        assert!(frame.is_some());
        let f = frame.unwrap();
        assert_eq!(f.n_subcarriers, 3);
        assert_eq!(f.amplitudes.len(), 3);
    }

    #[test]
    fn csi_frame_new_mismatched_lengths() {
        let frame = CsiFrame::new(
            vec![1.0, 2.0],
            vec![0.1, 0.2, 0.3],
            3,
            0,
            100.0,
        );
        assert!(frame.is_none());
    }

    #[test]
    fn csi_frame_clone() {
        let frame = CsiFrame::new(vec![1.0], vec![0.5], 1, 42, 50.0).unwrap();
        let cloned = frame.clone();
        assert_eq!(cloned.sample_index, 42);
        assert_eq!(cloned.n_subcarriers, 1);
    }

    #[cfg(feature = "serde")]
    #[test]
    fn vital_reading_serde_roundtrip() {
        let reading = VitalReading {
            respiratory_rate: VitalEstimate {
                value_bpm: 15.0,
                confidence: 0.9,
                status: VitalStatus::Valid,
            },
            heart_rate: VitalEstimate {
                value_bpm: 72.0,
                confidence: 0.85,
                status: VitalStatus::Valid,
            },
            subcarrier_count: 56,
            signal_quality: 0.92,
            timestamp_secs: 1_700_000_000.0,
        };
        let json = serde_json::to_string(&reading).unwrap();
        let parsed: VitalReading = serde_json::from_str(&json).unwrap();
        assert!((parsed.heart_rate.value_bpm - 72.0).abs() < f64::EPSILON);
    }
}
@@ -0,0 +1,40 @@
[package]
name = "wifi-densepose-wifiscan"
version.workspace = true
edition.workspace = true
description = "Multi-BSSID WiFi scanning domain layer for enhanced Windows WiFi DensePose sensing (ADR-022)"
license.workspace = true

[dependencies]
# Logging
tracing.workspace = true

# Serialization (optional, for domain types)
serde = { workspace = true, optional = true }

# Async runtime (optional, for Tier 2 async scanning)
tokio = { workspace = true, optional = true }

[features]
default = ["serde", "pipeline"]
serde = ["dep:serde"]
pipeline = []
# Tier 2: enables async scan_async() method on WlanApiScanner via tokio
wlanapi = ["dep:tokio"]

[lints.rust]
unsafe_code = "forbid"

[lints.clippy]
all = "warn"
pedantic = "warn"
doc_markdown = "allow"
module_name_repetitions = "allow"
must_use_candidate = "allow"
missing_errors_doc = "allow"
missing_panics_doc = "allow"
cast_precision_loss = "allow"
cast_lossless = "allow"
many_single_char_names = "allow"
uninlined_format_args = "allow"
assigning_clones = "allow"
@@ -0,0 +1,12 @@
//! Adapter implementations for the [`WlanScanPort`] port.
//!
//! Each adapter targets a specific platform scanning mechanism:
//! - [`NetshBssidScanner`]: Tier 1 -- parses `netsh wlan show networks mode=bssid`.
//! - [`WlanApiScanner`]: Tier 2 -- async wrapper with metrics and future native FFI path.

pub(crate) mod netsh_scanner;
pub mod wlanapi_scanner;

pub use netsh_scanner::NetshBssidScanner;
pub use netsh_scanner::parse_netsh_output;
pub use wlanapi_scanner::WlanApiScanner;
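The port/adapter split the module docs describe can be illustrated with a small standalone sketch. The real `WlanScanPort` lives in the crate's `port` module and returns `Result<Vec<BssidObservation>, WifiScanError>`; the trait, struct, and tuple types below are simplified stand-ins for illustration only, showing why callers can swap Tier 1 and Tier 2 scanners freely:

```rust
// Illustrative sketch of one port trait with two tiered adapters behind it.
// Names here (ScanPort, NetshLike, WlanApiLike) are hypothetical stand-ins,
// not part of the wifi-densepose-wifiscan API.
trait ScanPort {
    /// Returns (bssid, rssi_dbm) pairs for each observed access point.
    fn scan(&self) -> Vec<(String, f64)>;
}

/// Tier 1 stand-in: the real adapter shells out to
/// `netsh wlan show networks mode=bssid` and parses the text output.
struct NetshLike;
impl ScanPort for NetshLike {
    fn scan(&self) -> Vec<(String, f64)> {
        vec![("aa:bb:cc:dd:ee:ff".to_string(), -58.0)]
    }
}

/// Tier 2 stand-in: the real adapter targets the native WLAN API
/// for higher scan rates, behind the same port.
struct WlanApiLike;
impl ScanPort for WlanApiLike {
    fn scan(&self) -> Vec<(String, f64)> {
        vec![
            ("aa:bb:cc:dd:ee:ff".to_string(), -58.0),
            ("11:22:33:44:55:66".to_string(), -71.0),
        ]
    }
}

fn main() {
    // Callers depend only on the port, so the tiers are interchangeable.
    let adapters: Vec<Box<dyn ScanPort>> = vec![Box::new(NetshLike), Box::new(WlanApiLike)];
    for a in &adapters {
        println!("observed {} BSSIDs", a.scan().len());
    }
}
```

This is the same hexagonal pattern the crate uses: the domain never touches `netsh.exe` or `wlanapi.dll` directly, only the port.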
File diff suppressed because it is too large
@@ -0,0 +1,474 @@
//! Tier 2: Windows WLAN API adapter for higher scan rates.
//!
//! This module provides a higher-rate scanning interface that targets 10-20 Hz
//! scan rates, compared to the Tier 1 [`NetshBssidScanner`]'s ~2 Hz limitation
//! (caused by subprocess spawn overhead per scan).
//!
//! # Current implementation
//!
//! The adapter currently wraps [`NetshBssidScanner`] and provides:
//!
//! - **Synchronous scanning** via the [`WlanScanPort`] trait implementation
//! - **Async scanning** (feature-gated behind `"wlanapi"`) via
//!   `tokio::task::spawn_blocking`
//! - **Scan metrics** (count, timing) for performance monitoring
//! - **Rate estimation** based on the duration of the most recent scan
//!
//! # Future: native `wlanapi.dll` FFI
//!
//! When native WLAN API bindings are available, this adapter will call:
//!
//! - `WlanOpenHandle` -- open a session to the WLAN service
//! - `WlanEnumInterfaces` -- discover WLAN adapters
//! - `WlanScan` -- trigger a fresh scan
//! - `WlanGetNetworkBssList` -- retrieve raw BSS entries with RSSI
//! - `WlanCloseHandle` -- clean up the session handle
//!
//! This eliminates the `netsh.exe` process-spawn bottleneck and enables
//! true 10-20 Hz scan rates suitable for real-time sensing.
//!
//! # Platform
//!
//! Windows only. On other platforms this module is not compiled.

use std::sync::atomic::{AtomicU64, Ordering};
use std::time::{Duration, Instant};

use crate::adapter::netsh_scanner::NetshBssidScanner;
use crate::domain::bssid::BssidObservation;
use crate::error::WifiScanError;
use crate::port::WlanScanPort;

// ---------------------------------------------------------------------------
// Scan metrics
// ---------------------------------------------------------------------------

/// Accumulated metrics from scan operations.
#[derive(Debug, Clone)]
pub struct ScanMetrics {
    /// Total number of scans performed since creation.
    pub scan_count: u64,
    /// Total number of BSSIDs observed across all scans.
    pub total_bssids_observed: u64,
    /// Duration of the most recent scan.
    pub last_scan_duration: Option<Duration>,
    /// Estimated scan rate in Hz based on the last scan duration.
    /// `None` if no scans have been performed yet.
    pub estimated_rate_hz: Option<f64>,
}

// ---------------------------------------------------------------------------
// WlanApiScanner
// ---------------------------------------------------------------------------

/// Tier 2 WLAN API scanner with async support and scan metrics.
///
/// Currently wraps [`NetshBssidScanner`] with performance instrumentation.
/// When native WLAN API bindings become available, the inner implementation
/// will switch to `WlanGetNetworkBssList` for approximately 10x higher scan
/// rates without changing the public interface.
///
/// # Example (sync)
///
/// ```no_run
/// use wifi_densepose_wifiscan::adapter::wlanapi_scanner::WlanApiScanner;
/// use wifi_densepose_wifiscan::port::WlanScanPort;
///
/// let scanner = WlanApiScanner::new();
/// let observations = scanner.scan().unwrap();
/// for obs in &observations {
///     println!("{}: {} dBm", obs.bssid, obs.rssi_dbm);
/// }
/// println!("metrics: {:?}", scanner.metrics());
/// ```
pub struct WlanApiScanner {
    /// The underlying Tier 1 scanner.
    inner: NetshBssidScanner,

    /// Number of scans performed.
    scan_count: AtomicU64,

    /// Total BSSIDs observed across all scans.
    total_bssids: AtomicU64,

    /// Timestamp of the most recent scan start.
    ///
    /// Uses `std::sync::Mutex` because `Instant` is not atomic but we need
    /// interior mutability. The lock duration is negligible (one write per
    /// scan) so contention is not a concern.
    last_scan_start: std::sync::Mutex<Option<Instant>>,

    /// Duration of the most recent scan.
    last_scan_duration: std::sync::Mutex<Option<Duration>>,
}

impl WlanApiScanner {
    /// Create a new Tier 2 scanner.
    pub fn new() -> Self {
        Self {
            inner: NetshBssidScanner::new(),
            scan_count: AtomicU64::new(0),
            total_bssids: AtomicU64::new(0),
            last_scan_start: std::sync::Mutex::new(None),
            last_scan_duration: std::sync::Mutex::new(None),
        }
    }

    /// Return accumulated scan metrics.
    pub fn metrics(&self) -> ScanMetrics {
        let scan_count = self.scan_count.load(Ordering::Relaxed);
        let total_bssids_observed = self.total_bssids.load(Ordering::Relaxed);
        let last_scan_duration =
            *self.last_scan_duration.lock().unwrap_or_else(std::sync::PoisonError::into_inner);
        let estimated_rate_hz = last_scan_duration.map(|d| {
            let secs = d.as_secs_f64();
            if secs > 0.0 {
                1.0 / secs
            } else {
                f64::INFINITY
            }
        });

        ScanMetrics {
            scan_count,
            total_bssids_observed,
            last_scan_duration,
            estimated_rate_hz,
        }
    }

    /// Return the number of scans performed so far.
    pub fn scan_count(&self) -> u64 {
        self.scan_count.load(Ordering::Relaxed)
    }

    /// Perform a synchronous scan with timing instrumentation.
    ///
    /// This is the core scan method that both the [`WlanScanPort`] trait
    /// implementation and the async wrapper delegate to.
    fn scan_instrumented(&self) -> Result<Vec<BssidObservation>, WifiScanError> {
        let start = Instant::now();

        // Record scan start time.
        if let Ok(mut guard) = self.last_scan_start.lock() {
            *guard = Some(start);
        }

        // Delegate to the Tier 1 scanner.
        let results = self.inner.scan_sync()?;

        // Record metrics.
        let elapsed = start.elapsed();
        if let Ok(mut guard) = self.last_scan_duration.lock() {
            *guard = Some(elapsed);
        }

        self.scan_count.fetch_add(1, Ordering::Relaxed);
        self.total_bssids
            .fetch_add(results.len() as u64, Ordering::Relaxed);

        tracing::debug!(
            scan_count = self.scan_count.load(Ordering::Relaxed),
            bssid_count = results.len(),
            elapsed_ms = elapsed.as_millis(),
            "Tier 2 scan complete"
        );

        Ok(results)
    }

    /// Perform an async scan by offloading the blocking netsh call to
    /// a background thread.
    ///
    /// This is gated behind the `"wlanapi"` feature because it requires
    /// the `tokio` runtime dependency.
    ///
    /// # Errors
    ///
    /// Returns [`WifiScanError::ScanFailed`] if the background task panics
    /// or is cancelled, or propagates any error from the underlying scan.
    #[cfg(feature = "wlanapi")]
    pub async fn scan_async(&self) -> Result<Vec<BssidObservation>, WifiScanError> {
        // A fresh scanner is created for the blocking task because
        // `spawn_blocking` requires a `'static` closure, so it cannot
        // borrow `&self`. `NetshBssidScanner` is cheap (zero-size struct)
        // so this is fine.
        let inner = NetshBssidScanner::new();
        let start = Instant::now();

        let results = tokio::task::spawn_blocking(move || inner.scan_sync())
            .await
            .map_err(|e| WifiScanError::ScanFailed {
                reason: format!("async scan task failed: {e}"),
            })??;

        // Record metrics.
        let elapsed = start.elapsed();
        if let Ok(mut guard) = self.last_scan_duration.lock() {
            *guard = Some(elapsed);
        }
        self.scan_count.fetch_add(1, Ordering::Relaxed);
        self.total_bssids
            .fetch_add(results.len() as u64, Ordering::Relaxed);

        tracing::debug!(
            scan_count = self.scan_count.load(Ordering::Relaxed),
            bssid_count = results.len(),
            elapsed_ms = elapsed.as_millis(),
            "Tier 2 async scan complete"
        );

        Ok(results)
    }
}

impl Default for WlanApiScanner {
    fn default() -> Self {
        Self::new()
    }
}

// ---------------------------------------------------------------------------
// WlanScanPort implementation (sync)
// ---------------------------------------------------------------------------

impl WlanScanPort for WlanApiScanner {
    fn scan(&self) -> Result<Vec<BssidObservation>, WifiScanError> {
        self.scan_instrumented()
    }

    fn connected(&self) -> Result<Option<BssidObservation>, WifiScanError> {
        // Not yet implemented for Tier 2 -- fall back to a full scan and
        // return the strongest signal (heuristic for "likely connected").
        let mut results = self.scan_instrumented()?;
        if results.is_empty() {
            return Ok(None);
        }
        // Sort by signal strength descending; return the strongest.
        results.sort_by(|a, b| {
            b.rssi_dbm
                .partial_cmp(&a.rssi_dbm)
                .unwrap_or(std::cmp::Ordering::Equal)
        });
        Ok(Some(results.swap_remove(0)))
    }
}

// ---------------------------------------------------------------------------
// Native WLAN API constants and frequency utilities
// ---------------------------------------------------------------------------

/// Native WLAN API constants and frequency conversion utilities.
///
/// When implemented, this will contain:
///
/// ```ignore
/// extern "system" {
///     fn WlanOpenHandle(
///         dwClientVersion: u32,
///         pReserved: *const std::ffi::c_void,
///         pdwNegotiatedVersion: *mut u32,
///         phClientHandle: *mut HANDLE,
///     ) -> u32;
///
///     fn WlanEnumInterfaces(
///         hClientHandle: HANDLE,
///         pReserved: *const std::ffi::c_void,
///         ppInterfaceList: *mut *mut WLAN_INTERFACE_INFO_LIST,
///     ) -> u32;
///
///     fn WlanGetNetworkBssList(
///         hClientHandle: HANDLE,
///         pInterfaceGuid: *const GUID,
///         pDot11Ssid: *const DOT11_SSID,
///         dot11BssType: DOT11_BSS_TYPE,
///         bSecurityEnabled: BOOL,
///         pReserved: *const std::ffi::c_void,
///         ppWlanBssList: *mut *mut WLAN_BSS_LIST,
///     ) -> u32;
///
///     fn WlanCloseHandle(
///         hClientHandle: HANDLE,
///         pReserved: *const std::ffi::c_void,
///     ) -> u32;
/// }
/// ```
///
/// The native API returns `WLAN_BSS_ENTRY` structs that include:
/// - `dot11Bssid` (6-byte MAC)
/// - `lRssi` (dBm as i32)
/// - `ulChCenterFrequency` (kHz, from which channel/band are derived)
/// - `dot11BssPhyType` (maps to `RadioType`)
///
/// This eliminates the netsh subprocess overhead entirely.
#[allow(dead_code)]
mod wlan_ffi {
    /// WLAN API client version 2 (Vista+).
    pub const WLAN_CLIENT_VERSION_2: u32 = 2;

    /// BSS type for infrastructure networks.
    pub const DOT11_BSS_TYPE_INFRASTRUCTURE: u32 = 1;

    /// Convert a center frequency in kHz to an 802.11 channel number.
    ///
    /// Covers 2.4 GHz (ch 1-14), 5 GHz (ch 36-177), and 6 GHz bands.
    #[allow(clippy::cast_possible_truncation)] // Channel numbers always fit in u8
    pub fn freq_khz_to_channel(frequency_khz: u32) -> u8 {
        let mhz = frequency_khz / 1000;
        match mhz {
            // 2.4 GHz band
            2412..=2472 => ((mhz - 2407) / 5) as u8,
            2484 => 14,
            // 5 GHz band
            5170..=5825 => ((mhz - 5000) / 5) as u8,
            // 6 GHz band (Wi-Fi 6E)
            5955..=7115 => ((mhz - 5950) / 5) as u8,
            _ => 0,
        }
    }

    /// Convert a center frequency in kHz to a band type discriminant.
    ///
    /// Returns 0 for 2.4 GHz, 1 for 5 GHz, 2 for 6 GHz.
    pub fn freq_khz_to_band(frequency_khz: u32) -> u8 {
        let mhz = frequency_khz / 1000;
        match mhz {
            5000..=5900 => 1, // 5 GHz
            5925..=7200 => 2, // 6 GHz
            _ => 0,           // 2.4 GHz and unknown
        }
    }
}

// ===========================================================================
// Tests
// ===========================================================================

#[cfg(test)]
mod tests {
    use super::*;

    // -- construction ---------------------------------------------------------

    #[test]
    fn new_creates_scanner_with_zero_metrics() {
        let scanner = WlanApiScanner::new();
        assert_eq!(scanner.scan_count(), 0);

        let m = scanner.metrics();
        assert_eq!(m.scan_count, 0);
        assert_eq!(m.total_bssids_observed, 0);
        assert!(m.last_scan_duration.is_none());
        assert!(m.estimated_rate_hz.is_none());
    }

    #[test]
    fn default_creates_scanner() {
        let scanner = WlanApiScanner::default();
        assert_eq!(scanner.scan_count(), 0);
    }

    // -- frequency conversion (FFI placeholder) --------------------------------

    #[test]
    fn freq_khz_to_channel_2_4ghz() {
        assert_eq!(wlan_ffi::freq_khz_to_channel(2_412_000), 1);
        assert_eq!(wlan_ffi::freq_khz_to_channel(2_437_000), 6);
        assert_eq!(wlan_ffi::freq_khz_to_channel(2_462_000), 11);
        assert_eq!(wlan_ffi::freq_khz_to_channel(2_484_000), 14);
    }

    #[test]
    fn freq_khz_to_channel_5ghz() {
        assert_eq!(wlan_ffi::freq_khz_to_channel(5_180_000), 36);
        assert_eq!(wlan_ffi::freq_khz_to_channel(5_240_000), 48);
        assert_eq!(wlan_ffi::freq_khz_to_channel(5_745_000), 149);
    }

    #[test]
    fn freq_khz_to_channel_6ghz() {
        // 6 GHz channel 1 = 5955 MHz
        assert_eq!(wlan_ffi::freq_khz_to_channel(5_955_000), 1);
        // 6 GHz channel 5 = 5975 MHz
        assert_eq!(wlan_ffi::freq_khz_to_channel(5_975_000), 5);
    }

    #[test]
    fn freq_khz_to_channel_unknown_returns_zero() {
        assert_eq!(wlan_ffi::freq_khz_to_channel(900_000), 0);
        assert_eq!(wlan_ffi::freq_khz_to_channel(0), 0);
    }

    #[test]
    fn freq_khz_to_band_classification() {
        assert_eq!(wlan_ffi::freq_khz_to_band(2_437_000), 0); // 2.4 GHz
        assert_eq!(wlan_ffi::freq_khz_to_band(5_180_000), 1); // 5 GHz
        assert_eq!(wlan_ffi::freq_khz_to_band(5_975_000), 2); // 6 GHz
    }

    // -- WlanScanPort trait compliance -----------------------------------------

    #[test]
    fn implements_wlan_scan_port() {
        // Compile-time check: WlanApiScanner implements WlanScanPort.
        fn assert_port<T: WlanScanPort>() {}
        assert_port::<WlanApiScanner>();
    }

    #[test]
    fn implements_send_and_sync() {
        fn assert_send_sync<T: Send + Sync>() {}
        assert_send_sync::<WlanApiScanner>();
    }

    // -- metrics structure -----------------------------------------------------

    #[test]
    fn scan_metrics_debug_display() {
        let m = ScanMetrics {
            scan_count: 42,
            total_bssids_observed: 126,
            last_scan_duration: Some(Duration::from_millis(150)),
            estimated_rate_hz: Some(1.0 / 0.15),
        };
        let debug = format!("{m:?}");
        assert!(debug.contains("42"));
        assert!(debug.contains("126"));
    }

    #[test]
    fn scan_metrics_clone() {
        let m = ScanMetrics {
            scan_count: 1,
            total_bssids_observed: 5,
            last_scan_duration: None,
            estimated_rate_hz: None,
        };
        let m2 = m.clone();
        assert_eq!(m2.scan_count, 1);
        assert_eq!(m2.total_bssids_observed, 5);
    }

    // -- rate estimation -------------------------------------------------------

    #[test]
    fn estimated_rate_from_known_duration() {
        let scanner = WlanApiScanner::new();

        // Manually set last_scan_duration to simulate a completed scan.
        {
            let mut guard = scanner.last_scan_duration.lock().unwrap();
            *guard = Some(Duration::from_millis(100));
        }

        let m = scanner.metrics();
        let rate = m.estimated_rate_hz.unwrap();
        // 100 ms per scan => 10 Hz
        assert!((rate - 10.0).abs() < 0.01, "expected ~10 Hz, got {rate}");
    }

    #[test]
    fn estimated_rate_none_before_first_scan() {
        let scanner = WlanApiScanner::new();
        assert!(scanner.metrics().estimated_rate_hz.is_none());
    }
}
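The `metrics()` accessor above reads its mutexes with `lock().unwrap_or_else(PoisonError::into_inner)`, so a panic in some other scan thread cannot permanently wedge metrics reporting. A minimal standalone sketch of that poison-recovery pattern (the `read_recovering` helper name is illustrative, not crate API):

```rust
use std::sync::{Arc, Mutex, PoisonError};
use std::thread;

/// Read the protected value even if the mutex was poisoned,
/// mirroring the pattern used in `WlanApiScanner::metrics`.
fn read_recovering(m: &Mutex<i32>) -> i32 {
    *m.lock().unwrap_or_else(PoisonError::into_inner)
}

fn main() {
    let m = Arc::new(Mutex::new(41));
    let m2 = Arc::clone(&m);

    // Poison the mutex: a thread panics while holding the guard.
    let _ = thread::spawn(move || {
        let _guard = m2.lock().unwrap();
        panic!("poisoned");
    })
    .join();

    assert!(m.is_poisoned());
    // The poisoned lock still yields its data via into_inner.
    assert_eq!(read_recovering(&m), 41);
    println!("recovered: {}", read_recovering(&m));
}
```

The trade-off: recovery assumes the protected data is still consistent after the panic, which holds here because each critical section is a single `Option` assignment.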
@@ -0,0 +1,282 @@
|
||||
//! Core value objects for BSSID identification and observation.
|
||||
//!
|
||||
//! These types form the shared kernel of the BSSID Acquisition bounded context
|
||||
//! as defined in ADR-022 section 3.1.
|
||||
|
||||
use std::fmt;
|
||||
use std::time::Instant;
|
||||
|
||||
#[cfg(feature = "serde")]
|
||||
use serde::{Deserialize, Serialize};
|
||||
|
||||
use crate::error::WifiScanError;
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// BssidId -- Value Object
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
/// A unique BSSID identifier wrapping a 6-byte IEEE 802.11 MAC address.
|
||||
///
|
||||
/// This is the primary identity for access points in the multi-BSSID scanning
|
||||
/// pipeline. Two `BssidId` values are equal when their MAC bytes match.
|
||||
#[derive(Clone, Copy, Hash, Eq, PartialEq, Ord, PartialOrd)]
|
||||
pub struct BssidId(pub [u8; 6]);
|
||||
|
||||
impl BssidId {
|
||||
/// Create a `BssidId` from a byte slice.
|
||||
///
|
||||
/// Returns an error if the slice is not exactly 6 bytes.
|
||||
pub fn from_bytes(bytes: &[u8]) -> Result<Self, WifiScanError> {
|
||||
let arr: [u8; 6] = bytes
|
||||
.try_into()
|
||||
.map_err(|_| WifiScanError::InvalidMac { len: bytes.len() })?;
|
||||
Ok(Self(arr))
|
||||
}
|
||||
|
||||
/// Parse a `BssidId` from a colon-separated hex string such as
|
||||
/// `"aa:bb:cc:dd:ee:ff"`.
|
||||
pub fn parse(s: &str) -> Result<Self, WifiScanError> {
|
||||
let parts: Vec<&str> = s.split(':').collect();
|
||||
if parts.len() != 6 {
|
||||
return Err(WifiScanError::MacParseFailed {
|
||||
input: s.to_owned(),
|
||||
});
|
||||
}
|
||||
|
||||
let mut bytes = [0u8; 6];
|
||||
for (i, part) in parts.iter().enumerate() {
|
||||
bytes[i] = u8::from_str_radix(part, 16).map_err(|_| WifiScanError::MacParseFailed {
|
||||
input: s.to_owned(),
|
||||
})?;
|
||||
}
|
||||
Ok(Self(bytes))
|
||||
}
|
||||
|
||||
/// Return the raw 6-byte MAC address.
|
||||
pub fn as_bytes(&self) -> &[u8; 6] {
|
||||
&self.0
|
||||
}
|
||||
}
|
||||
|
||||
impl fmt::Debug for BssidId {
|
||||
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
write!(f, "BssidId({self})")
|
||||
}
|
||||
}
|
||||
|
||||
impl fmt::Display for BssidId {
|
||||
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
let [a, b, c, d, e, g] = self.0;
|
||||
write!(f, "{a:02x}:{b:02x}:{c:02x}:{d:02x}:{e:02x}:{g:02x}")
|
||||
}
|
||||
}
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// BandType -- Value Object
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
/// The WiFi frequency band on which a BSSID operates.
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
|
||||
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
|
||||
pub enum BandType {
|
||||
/// 2.4 GHz (channels 1-14)
|
||||
Band2_4GHz,
|
||||
/// 5 GHz (channels 36-177)
|
||||
Band5GHz,
|
||||
/// 6 GHz (Wi-Fi 6E / 7)
|
||||
Band6GHz,
|
||||
}
|
||||
|
||||
impl BandType {
|
||||
/// Infer the band from an 802.11 channel number.
|
||||
pub fn from_channel(channel: u8) -> Self {
|
||||
match channel {
|
||||
1..=14 => Self::Band2_4GHz,
|
||||
32..=177 => Self::Band5GHz,
|
||||
_ => Self::Band6GHz,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl fmt::Display for BandType {
|
||||
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
match self {
|
||||
Self::Band2_4GHz => write!(f, "2.4 GHz"),
|
||||
Self::Band5GHz => write!(f, "5 GHz"),
|
||||
Self::Band6GHz => write!(f, "6 GHz"),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// RadioType -- Value Object
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
/// The 802.11 radio standard reported by the access point.
|
||||
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
|
||||
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
|
||||
pub enum RadioType {
|
||||
/// 802.11n (Wi-Fi 4)
|
||||
N,
|
||||
/// 802.11ac (Wi-Fi 5)
|
||||
Ac,
|
||||
/// 802.11ax (Wi-Fi 6 / 6E)
|
||||
Ax,
|
||||
/// 802.11be (Wi-Fi 7)
|
||||
Be,
|
||||
}
|
||||
|
||||
impl RadioType {
|
||||
/// Parse a radio type from a `netsh` output string such as `"802.11ax"`.
|
||||
///
|
||||
/// Returns `None` for unrecognised strings.
|
||||
pub fn from_netsh_str(s: &str) -> Option<Self> {
|
||||
let lower = s.trim().to_ascii_lowercase();
|
||||
if lower.contains("802.11be") || lower.contains("be") {
|
||||
Some(Self::Be)
|
||||
} else if lower.contains("802.11ax") || lower.contains("ax") || lower.contains("wi-fi 6")
|
||||
{
|
||||
Some(Self::Ax)
|
||||
} else if lower.contains("802.11ac") || lower.contains("ac") || lower.contains("wi-fi 5")
|
||||
{
|
||||
Some(Self::Ac)
|
||||
} else if lower.contains("802.11n") || lower.contains("wi-fi 4") {
|
||||
Some(Self::N)
|
||||
} else {
|
||||
None
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
impl fmt::Display for RadioType {
|
||||
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
|
||||
match self {
|
||||
Self::N => write!(f, "802.11n"),
|
||||
Self::Ac => write!(f, "802.11ac"),
|
||||
Self::Ax => write!(f, "802.11ax"),
|
||||
Self::Be => write!(f, "802.11be"),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// ---------------------------------------------------------------------------
|
||||
// BssidObservation -- Value Object
|
||||
// ---------------------------------------------------------------------------
|
||||
|
||||
/// A single observation of a BSSID from a WiFi scan.
|
||||
///
|
||||
/// This is the fundamental measurement unit: one access point observed once
|
||||
/// at a specific point in time.
|
||||
#[derive(Clone, Debug)]
|
||||
pub struct BssidObservation {
|
||||
/// The MAC address of the observed access point.
|
||||
pub bssid: BssidId,
|
||||
/// Received signal strength in dBm (typically -30 to -90).
|
||||
pub rssi_dbm: f64,
|
||||
/// Signal quality as a percentage (0-100), as reported by the driver.
|
||||
pub signal_pct: f64,
|
||||
/// The 802.11 channel number.
|
||||
pub channel: u8,
|
||||
/// The frequency band.
|
||||
pub band: BandType,
|
||||
/// The 802.11 radio standard.
|
||||
pub radio_type: RadioType,
|
||||
/// The SSID (network name). May be empty for hidden networks.
|
||||
pub ssid: String,
|
||||
/// When this observation was captured.
|
||||
pub timestamp: Instant,
|
||||
}
|
||||
|
||||
impl BssidObservation {
|
||||
/// Convert signal percentage (0-100) to an approximate dBm value.
|
||||
///
|
||||
/// Uses the common linear mapping: `dBm = (pct / 2) - 100`.
|
||||
/// This matches the conversion used by Windows WLAN API.
|
||||
pub fn pct_to_dbm(pct: f64) -> f64 {
|
||||
(pct / 2.0) - 100.0
|
||||
}
|
||||
|
||||
/// Convert dBm to a linear amplitude suitable for pseudo-CSI frames.
|
||||
///
|
||||
/// Formula: `10^((rssi_dbm + 100) / 20)`, mapping -100 dBm to 1.0.
|
||||
pub fn rssi_to_amplitude(rssi_dbm: f64) -> f64 {
|
||||
10.0_f64.powf((rssi_dbm + 100.0) / 20.0)
|
||||
}
|
||||
|
||||
/// Return the amplitude of this observation (linear scale).
|
||||
pub fn amplitude(&self) -> f64 {
|
||||
Self::rssi_to_amplitude(self.rssi_dbm)
|
||||
}
|
||||
|
||||
/// Encode the channel number as a pseudo-phase value in `[0, pi]`.
|
||||
///
|
||||
/// This provides downstream pipeline compatibility with code that expects
|
||||
/// phase data, even though RSSI-based scanning has no true phase.
|
||||
pub fn pseudo_phase(&self) -> f64 {
|
||||
(self.channel as f64 / 48.0) * std::f64::consts::PI
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
fn bssid_id_roundtrip() {
|
||||
let mac = [0xaa, 0xbb, 0xcc, 0xdd, 0xee, 0xff];
|
||||
let id = BssidId(mac);
|
||||
assert_eq!(id.to_string(), "aa:bb:cc:dd:ee:ff");
|
||||
assert_eq!(BssidId::parse("aa:bb:cc:dd:ee:ff").unwrap(), id);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn bssid_id_parse_errors() {
|
||||
assert!(BssidId::parse("aa:bb:cc").is_err());
|
||||
assert!(BssidId::parse("zz:bb:cc:dd:ee:ff").is_err());
|
||||
assert!(BssidId::parse("").is_err());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn bssid_id_from_bytes() {
|
||||
let bytes = vec![0x01, 0x02, 0x03, 0x04, 0x05, 0x06];
|
||||
let id = BssidId::from_bytes(&bytes).unwrap();
|
||||
assert_eq!(id.0, [0x01, 0x02, 0x03, 0x04, 0x05, 0x06]);
|
||||
|
||||
assert!(BssidId::from_bytes(&[0x01, 0x02]).is_err());
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn band_type_from_channel() {
|
||||
assert_eq!(BandType::from_channel(1), BandType::Band2_4GHz);
|
||||
assert_eq!(BandType::from_channel(11), BandType::Band2_4GHz);
|
||||
assert_eq!(BandType::from_channel(36), BandType::Band5GHz);
|
||||
assert_eq!(BandType::from_channel(149), BandType::Band5GHz);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn radio_type_from_netsh() {
|
||||
assert_eq!(RadioType::from_netsh_str("802.11ax"), Some(RadioType::Ax));
|
||||
assert_eq!(RadioType::from_netsh_str("802.11ac"), Some(RadioType::Ac));
|
||||
assert_eq!(RadioType::from_netsh_str("802.11n"), Some(RadioType::N));
|
||||
assert_eq!(RadioType::from_netsh_str("802.11be"), Some(RadioType::Be));
|
||||
assert_eq!(RadioType::from_netsh_str("unknown"), None);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn pct_to_dbm_conversion() {
|
||||
// 100% -> -50 dBm
|
||||
assert!((BssidObservation::pct_to_dbm(100.0) - (-50.0)).abs() < f64::EPSILON);
|
||||
// 0% -> -100 dBm
|
||||
assert!((BssidObservation::pct_to_dbm(0.0) - (-100.0)).abs() < f64::EPSILON);
|
||||
}
|
||||
|
||||
#[test]
|
||||
fn rssi_to_amplitude_baseline() {
|
||||
// At -100 dBm, amplitude should be 1.0
|
||||
let amp = BssidObservation::rssi_to_amplitude(-100.0);
|
||||
assert!((amp - 1.0).abs() < 1e-9);
|
||||
// At -80 dBm, amplitude should be 10.0
|
||||
let amp = BssidObservation::rssi_to_amplitude(-80.0);
|
||||
assert!((amp - 10.0).abs() < 1e-9);
|
||||
}
|
||||
}
|
||||
@@ -0,0 +1,148 @@
|
||||
//! Multi-AP frame value object.
|
||||
//!
|
||||
//! A `MultiApFrame` is a snapshot of all BSSID observations at a single point
|
||||
//! in time. It serves as the input to the signal intelligence pipeline
|
||||
//! (Bounded Context 2 in ADR-022), providing the multi-dimensional
|
||||
//! pseudo-CSI data that replaces the single-RSSI approach.
|
||||
|
||||
use std::collections::VecDeque;
|
||||
use std::time::Instant;
|
||||
|
||||
/// A snapshot of all tracked BSSIDs at a single point in time.
|
||||
///
|
||||
/// This value object is produced by [`BssidRegistry::to_multi_ap_frame`] and
|
||||
/// consumed by the signal intelligence pipeline. Each index `i` in the
|
||||
/// vectors corresponds to the `i`-th entry in the registry's subcarrier map.
|
||||
///
|
||||
/// [`BssidRegistry::to_multi_ap_frame`]: crate::domain::registry::BssidRegistry::to_multi_ap_frame
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct MultiApFrame {
|
||||
/// Number of BSSIDs (pseudo-subcarriers) in this frame.
|
||||
pub bssid_count: usize,
|
||||
|
||||
/// RSSI values in dBm, one per BSSID.
|
||||
///
|
||||
/// Index matches the subcarrier map ordering.
|
||||
pub rssi_dbm: Vec<f64>,
|
||||
|
||||
/// Linear amplitudes derived from RSSI via `10^((rssi + 100) / 20)`.
|
||||
///
|
||||
/// This maps -100 dBm to amplitude 1.0, providing a scale that is
|
||||
/// compatible with the downstream attention and correlation stages.
|
||||
pub amplitudes: Vec<f64>,
|
||||
|
||||
/// Pseudo-phase values derived from channel numbers.
|
||||
///
|
||||
/// Encoded as `(channel / 48) * pi`, giving a value in `[0, pi]`.
|
||||
/// This is a heuristic that provides spatial diversity information
|
||||
/// to pipeline stages that expect phase data.
|
||||
pub phases: Vec<f64>,
|
||||
|
||||
/// Per-BSSID RSSI variance (Welford), one per BSSID.
|
||||
///
|
||||
/// High variance indicates a BSSID whose signal is modulated by body
|
||||
/// movement; low variance indicates a static background AP.
|
||||
pub per_bssid_variance: Vec<f64>,
|
||||
|
||||
/// Per-BSSID RSSI history (ring buffer), one per BSSID.
|
||||
///
|
||||
/// Used by the spatial correlator and breathing extractor to compute
|
||||
/// cross-correlation and spectral features.
|
||||
pub histories: Vec<VecDeque<f64>>,
|
||||
|
||||
/// Estimated effective sample rate in Hz.
|
||||
///
|
||||
    /// Tier 1 (netsh): approximately 2 Hz.
    /// Tier 2 (wlanapi): approximately 10-20 Hz.
    pub sample_rate_hz: f64,

    /// When this frame was constructed.
    pub timestamp: Instant,
}

impl MultiApFrame {
    /// Whether this frame has enough BSSIDs for multi-AP sensing.
    ///
    /// The `min_bssids` parameter comes from `WindowsWifiConfig::min_bssids`.
    pub fn is_sufficient(&self, min_bssids: usize) -> bool {
        self.bssid_count >= min_bssids
    }

    /// The maximum amplitude across all BSSIDs. Returns 0.0 for empty frames.
    pub fn max_amplitude(&self) -> f64 {
        self.amplitudes
            .iter()
            .copied()
            .fold(0.0_f64, f64::max)
    }

    /// The mean RSSI across all BSSIDs in dBm. Returns `f64::NEG_INFINITY`
    /// for empty frames.
    pub fn mean_rssi(&self) -> f64 {
        if self.rssi_dbm.is_empty() {
            return f64::NEG_INFINITY;
        }
        let sum: f64 = self.rssi_dbm.iter().sum();
        sum / self.rssi_dbm.len() as f64
    }

    /// The total variance across all BSSIDs (sum of per-BSSID variances).
    ///
    /// Higher values indicate more environmental change, which correlates
    /// with human presence and movement.
    pub fn total_variance(&self) -> f64 {
        self.per_bssid_variance.iter().sum()
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    fn make_frame(bssid_count: usize, rssi_values: &[f64]) -> MultiApFrame {
        let amplitudes: Vec<f64> = rssi_values
            .iter()
            .map(|&r| 10.0_f64.powf((r + 100.0) / 20.0))
            .collect();
        MultiApFrame {
            bssid_count,
            rssi_dbm: rssi_values.to_vec(),
            amplitudes,
            phases: vec![0.0; bssid_count],
            per_bssid_variance: vec![0.1; bssid_count],
            histories: vec![VecDeque::new(); bssid_count],
            sample_rate_hz: 2.0,
            timestamp: Instant::now(),
        }
    }

    #[test]
    fn is_sufficient_checks_threshold() {
        let frame = make_frame(5, &[-60.0, -65.0, -70.0, -75.0, -80.0]);
        assert!(frame.is_sufficient(3));
        assert!(frame.is_sufficient(5));
        assert!(!frame.is_sufficient(6));
    }

    #[test]
    fn mean_rssi_calculation() {
        let frame = make_frame(3, &[-60.0, -70.0, -80.0]);
        assert!((frame.mean_rssi() - (-70.0)).abs() < 1e-9);
    }

    #[test]
    fn empty_frame_handles_gracefully() {
        let frame = make_frame(0, &[]);
        assert_eq!(frame.max_amplitude(), 0.0);
        assert!(frame.mean_rssi().is_infinite());
        assert_eq!(frame.total_variance(), 0.0);
        assert!(!frame.is_sufficient(1));
    }

    #[test]
    fn total_variance_sums_per_bssid() {
        let mut frame = make_frame(3, &[-60.0, -70.0, -80.0]);
        frame.per_bssid_variance = vec![0.1, 0.2, 0.3];
        assert!((frame.total_variance() - 0.6).abs() < 1e-9);
    }
}
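The `make_frame` helper above maps RSSI in dBm to a pseudo-amplitude with `10^((rssi + 100) / 20)`, the same mapping the registry applies via `BssidObservation::rssi_to_amplitude`. A minimal standalone sketch of that formula (the free function here is for illustration only):

```rust
/// Map an RSSI reading in dBm to a linear pseudo-amplitude, anchored so
/// that -100 dBm (near the noise floor) maps to 1.0 and -60 dBm to 100.0.
fn rssi_to_amplitude(rssi_dbm: f64) -> f64 {
    10.0_f64.powf((rssi_dbm + 100.0) / 20.0)
}

fn main() {
    // Stronger signals yield exponentially larger pseudo-amplitudes.
    assert!((rssi_to_amplitude(-100.0) - 1.0).abs() < 1e-9);
    assert!((rssi_to_amplitude(-60.0) - 100.0).abs() < 1e-9);
    assert!(rssi_to_amplitude(-60.0) > rssi_to_amplitude(-70.0));
}
```

On this scale, every +20 dB of signal strength multiplies the pseudo-amplitude by 10, so amplitude comparisons preserve RSSI ordering.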
@@ -0,0 +1,11 @@
//! Domain types for the BSSID Acquisition bounded context (ADR-022).

pub mod bssid;
pub mod frame;
pub mod registry;
pub mod result;

pub use bssid::{BandType, BssidId, BssidObservation, RadioType};
pub use frame::MultiApFrame;
pub use registry::{BssidEntry, BssidMeta, BssidRegistry, RunningStats};
pub use result::EnhancedSensingResult;
@@ -0,0 +1,511 @@
//! BSSID Registry aggregate root.
//!
//! The `BssidRegistry` is the aggregate root of the BSSID Acquisition bounded
//! context. It tracks all visible access points across scans, maintains
//! identity stability as BSSIDs appear and disappear, and provides a
//! consistent subcarrier mapping for pseudo-CSI frame construction.

use std::collections::HashMap;
use std::collections::VecDeque;
use std::time::Instant;

use crate::domain::bssid::{BandType, BssidId, BssidObservation, RadioType};
use crate::domain::frame::MultiApFrame;

// ---------------------------------------------------------------------------
// RunningStats -- Welford online statistics
// ---------------------------------------------------------------------------

/// Welford's online algorithm for computing a running mean and variance.
///
/// This allows us to compute per-BSSID statistics incrementally without
/// storing the entire history, which is essential for detecting which BSSIDs
/// show body-correlated variance versus static background.
#[derive(Debug, Clone)]
pub struct RunningStats {
    /// Number of samples seen.
    count: u64,
    /// Running mean.
    mean: f64,
    /// Running M2 accumulator (sum of squared differences from the mean).
    m2: f64,
}

impl RunningStats {
    /// Create a new empty `RunningStats`.
    pub fn new() -> Self {
        Self {
            count: 0,
            mean: 0.0,
            m2: 0.0,
        }
    }

    /// Push a new sample into the running statistics.
    pub fn push(&mut self, value: f64) {
        self.count += 1;
        let delta = value - self.mean;
        self.mean += delta / self.count as f64;
        let delta2 = value - self.mean;
        self.m2 += delta * delta2;
    }

    /// The number of samples observed.
    pub fn count(&self) -> u64 {
        self.count
    }

    /// The running mean. Returns 0.0 if no samples have been pushed.
    pub fn mean(&self) -> f64 {
        self.mean
    }

    /// The population variance. Returns 0.0 if fewer than 2 samples.
    pub fn variance(&self) -> f64 {
        if self.count < 2 {
            0.0
        } else {
            self.m2 / self.count as f64
        }
    }

    /// The sample variance (Bessel-corrected). Returns 0.0 if fewer than 2 samples.
    pub fn sample_variance(&self) -> f64 {
        if self.count < 2 {
            0.0
        } else {
            self.m2 / (self.count - 1) as f64
        }
    }

    /// The population standard deviation.
    pub fn std_dev(&self) -> f64 {
        self.variance().sqrt()
    }

    /// Reset all statistics to zero.
    pub fn reset(&mut self) {
        self.count = 0;
        self.mean = 0.0;
        self.m2 = 0.0;
    }
}

impl Default for RunningStats {
    fn default() -> Self {
        Self::new()
    }
}

// ---------------------------------------------------------------------------
// BssidMeta -- metadata about a tracked BSSID
// ---------------------------------------------------------------------------

/// Static metadata about a tracked BSSID, captured on first observation.
#[derive(Debug, Clone)]
pub struct BssidMeta {
    /// The SSID (network name). May be empty for hidden networks.
    pub ssid: String,
    /// The 802.11 channel number.
    pub channel: u8,
    /// The frequency band.
    pub band: BandType,
    /// The radio standard.
    pub radio_type: RadioType,
    /// When this BSSID was first observed.
    pub first_seen: Instant,
}

// ---------------------------------------------------------------------------
// BssidEntry -- Entity
// ---------------------------------------------------------------------------

/// A tracked BSSID with observation history and running statistics.
///
/// Each entry corresponds to one physical access point. The ring buffer
/// stores recent RSSI values (in dBm) for temporal analysis, while the
/// `RunningStats` provides efficient online mean/variance without needing
/// the full history.
#[derive(Debug, Clone)]
pub struct BssidEntry {
    /// The unique identifier for this BSSID.
    pub id: BssidId,
    /// Static metadata (SSID, channel, band, radio type).
    pub meta: BssidMeta,
    /// Ring buffer of recent RSSI observations (dBm).
    pub history: VecDeque<f64>,
    /// Welford online statistics over the full observation lifetime.
    pub stats: RunningStats,
    /// When this BSSID was last observed.
    pub last_seen: Instant,
    /// Index in the subcarrier map, or `None` if not yet assigned.
    pub subcarrier_idx: Option<usize>,
}

impl BssidEntry {
    /// Maximum number of RSSI samples kept in the ring buffer history.
    pub const DEFAULT_HISTORY_CAPACITY: usize = 128;

    /// Create a new entry from a first observation.
    fn new(obs: &BssidObservation) -> Self {
        let mut stats = RunningStats::new();
        stats.push(obs.rssi_dbm);

        let mut history = VecDeque::with_capacity(Self::DEFAULT_HISTORY_CAPACITY);
        history.push_back(obs.rssi_dbm);

        Self {
            id: obs.bssid,
            meta: BssidMeta {
                ssid: obs.ssid.clone(),
                channel: obs.channel,
                band: obs.band,
                radio_type: obs.radio_type,
                first_seen: obs.timestamp,
            },
            history,
            stats,
            last_seen: obs.timestamp,
            subcarrier_idx: None,
        }
    }

    /// Record a new observation for this BSSID.
    fn record(&mut self, obs: &BssidObservation) {
        self.stats.push(obs.rssi_dbm);

        if self.history.len() >= Self::DEFAULT_HISTORY_CAPACITY {
            self.history.pop_front();
        }
        self.history.push_back(obs.rssi_dbm);

        self.last_seen = obs.timestamp;

        // Update mutable metadata in case the AP changed channel/band.
        self.meta.channel = obs.channel;
        self.meta.band = obs.band;
        self.meta.radio_type = obs.radio_type;
        if !obs.ssid.is_empty() {
            self.meta.ssid = obs.ssid.clone();
        }
    }

    /// The RSSI variance over the observation lifetime (Welford).
    pub fn variance(&self) -> f64 {
        self.stats.variance()
    }

    /// The most recent RSSI observation in dBm.
    pub fn latest_rssi(&self) -> Option<f64> {
        self.history.back().copied()
    }
}

// ---------------------------------------------------------------------------
// BssidRegistry -- Aggregate Root
// ---------------------------------------------------------------------------

/// Aggregate root that tracks all visible BSSIDs across scans.
///
/// The registry maintains:
/// - A map of known BSSIDs with per-BSSID history and statistics.
/// - An ordered subcarrier map that assigns each BSSID a stable index,
///   sorted by first-seen time so that the mapping is deterministic.
/// - Expiry logic to remove BSSIDs that have not been observed recently.
#[derive(Debug, Clone)]
pub struct BssidRegistry {
    /// Known BSSIDs with sliding window of observations.
    entries: HashMap<BssidId, BssidEntry>,
    /// Ordered list of BSSID IDs for consistent subcarrier mapping.
    /// Sorted by first-seen time for stability.
    subcarrier_map: Vec<BssidId>,
    /// Maximum number of tracked BSSIDs (maps to max pseudo-subcarriers).
    max_bssids: usize,
    /// How long a BSSID can go unseen before being expired (in seconds).
    expiry_secs: u64,
}

impl BssidRegistry {
    /// Default maximum number of tracked BSSIDs.
    pub const DEFAULT_MAX_BSSIDS: usize = 32;

    /// Default expiry time in seconds.
    pub const DEFAULT_EXPIRY_SECS: u64 = 30;

    /// Create a new registry with the given capacity and expiry settings.
    pub fn new(max_bssids: usize, expiry_secs: u64) -> Self {
        Self {
            entries: HashMap::with_capacity(max_bssids),
            subcarrier_map: Vec::with_capacity(max_bssids),
            max_bssids,
            expiry_secs,
        }
    }

    /// Update the registry with a batch of observations from a single scan.
    ///
    /// New BSSIDs are registered and assigned subcarrier indices. Existing
    /// BSSIDs have their history and statistics updated. BSSIDs that have
    /// not been seen within the expiry window are removed.
    pub fn update(&mut self, observations: &[BssidObservation]) {
        let now = if let Some(obs) = observations.first() {
            obs.timestamp
        } else {
            return;
        };

        // Update or insert each observed BSSID.
        for obs in observations {
            if let Some(entry) = self.entries.get_mut(&obs.bssid) {
                entry.record(obs);
            } else if self.subcarrier_map.len() < self.max_bssids {
                // New BSSID: register it.
                let mut entry = BssidEntry::new(obs);
                let idx = self.subcarrier_map.len();
                entry.subcarrier_idx = Some(idx);
                self.subcarrier_map.push(obs.bssid);
                self.entries.insert(obs.bssid, entry);
            }
            // If we are at capacity, silently ignore new BSSIDs.
            // A smarter policy (evict lowest-variance) can be added later.
        }

        // Expire stale BSSIDs.
        self.expire(now);
    }

    /// Remove BSSIDs that have not been observed within the expiry window.
    fn expire(&mut self, now: Instant) {
        let expiry = std::time::Duration::from_secs(self.expiry_secs);
        let stale: Vec<BssidId> = self
            .entries
            .iter()
            .filter(|(_, entry)| now.duration_since(entry.last_seen) > expiry)
            .map(|(id, _)| *id)
            .collect();

        for id in &stale {
            self.entries.remove(id);
        }

        if !stale.is_empty() {
            // Rebuild the subcarrier map without the stale entries,
            // preserving relative ordering.
            self.subcarrier_map.retain(|id| !stale.contains(id));
            // Re-index remaining entries.
            for (idx, id) in self.subcarrier_map.iter().enumerate() {
                if let Some(entry) = self.entries.get_mut(id) {
                    entry.subcarrier_idx = Some(idx);
                }
            }
        }
    }

    /// Look up the subcarrier index assigned to a BSSID.
    pub fn subcarrier_index(&self, bssid: &BssidId) -> Option<usize> {
        self.entries
            .get(bssid)
            .and_then(|entry| entry.subcarrier_idx)
    }

    /// Return the ordered subcarrier map (list of BSSID IDs).
    pub fn subcarrier_map(&self) -> &[BssidId] {
        &self.subcarrier_map
    }

    /// The number of currently tracked BSSIDs.
    pub fn len(&self) -> usize {
        self.entries.len()
    }

    /// Whether the registry is empty.
    pub fn is_empty(&self) -> bool {
        self.entries.is_empty()
    }

    /// The maximum number of BSSIDs this registry can track.
    pub fn capacity(&self) -> usize {
        self.max_bssids
    }

    /// Get an entry by BSSID ID.
    pub fn get(&self, bssid: &BssidId) -> Option<&BssidEntry> {
        self.entries.get(bssid)
    }

    /// Iterate over all tracked entries.
    pub fn entries(&self) -> impl Iterator<Item = &BssidEntry> {
        self.entries.values()
    }

    /// Build a `MultiApFrame` from the current registry state.
    ///
    /// The frame contains one slot per subcarrier (BSSID), with amplitudes
    /// derived from the most recent RSSI observation and pseudo-phase from
    /// the channel number.
    pub fn to_multi_ap_frame(&self) -> MultiApFrame {
        let n = self.subcarrier_map.len();
        let mut rssi_dbm = vec![0.0_f64; n];
        let mut amplitudes = vec![0.0_f64; n];
        let mut phases = vec![0.0_f64; n];
        let mut per_bssid_variance = vec![0.0_f64; n];
        let mut histories: Vec<VecDeque<f64>> = Vec::with_capacity(n);

        for (idx, bssid_id) in self.subcarrier_map.iter().enumerate() {
            if let Some(entry) = self.entries.get(bssid_id) {
                let latest = entry.latest_rssi().unwrap_or(-100.0);
                rssi_dbm[idx] = latest;
                amplitudes[idx] = BssidObservation::rssi_to_amplitude(latest);
                phases[idx] = (entry.meta.channel as f64 / 48.0) * std::f64::consts::PI;
                per_bssid_variance[idx] = entry.variance();
                histories.push(entry.history.clone());
            } else {
                histories.push(VecDeque::new());
            }
        }

        // Estimate sample rate from observation count and time span.
        let sample_rate_hz = self.estimate_sample_rate();

        MultiApFrame {
            bssid_count: n,
            rssi_dbm,
            amplitudes,
            phases,
            per_bssid_variance,
            histories,
            sample_rate_hz,
            timestamp: Instant::now(),
        }
    }

    /// Rough estimate of the effective sample rate based on observation history.
    fn estimate_sample_rate(&self) -> f64 {
        // Default to 2 Hz (Tier 1 netsh rate) when we cannot compute.
        if self.entries.is_empty() {
            return 2.0;
        }

        // Use the first entry with enough history.
        for entry in self.entries.values() {
            if entry.stats.count() >= 4 {
                let elapsed = entry
                    .last_seen
                    .duration_since(entry.meta.first_seen)
                    .as_secs_f64();
                if elapsed > 0.0 {
                    return entry.stats.count() as f64 / elapsed;
                }
            }
        }

        2.0 // Fallback: assume Tier 1 rate.
    }
}

impl Default for BssidRegistry {
    fn default() -> Self {
        Self::new(Self::DEFAULT_MAX_BSSIDS, Self::DEFAULT_EXPIRY_SECS)
    }
}

// ---------------------------------------------------------------------------
// Tests
// ---------------------------------------------------------------------------

#[cfg(test)]
mod tests {
    use super::*;
    use crate::domain::bssid::{BandType, RadioType};

    fn make_obs(mac: [u8; 6], rssi: f64, channel: u8) -> BssidObservation {
        BssidObservation {
            bssid: BssidId(mac),
            rssi_dbm: rssi,
            signal_pct: (rssi + 100.0) * 2.0,
            channel,
            band: BandType::from_channel(channel),
            radio_type: RadioType::Ax,
            ssid: "TestNetwork".to_string(),
            timestamp: Instant::now(),
        }
    }

    #[test]
    fn registry_tracks_new_bssids() {
        let mut reg = BssidRegistry::default();
        let obs = vec![
            make_obs([0x01; 6], -60.0, 6),
            make_obs([0x02; 6], -70.0, 36),
        ];
        reg.update(&obs);

        assert_eq!(reg.len(), 2);
        assert_eq!(reg.subcarrier_index(&BssidId([0x01; 6])), Some(0));
        assert_eq!(reg.subcarrier_index(&BssidId([0x02; 6])), Some(1));
    }

    #[test]
    fn registry_updates_existing_bssid() {
        let mut reg = BssidRegistry::default();
        let mac = [0xaa; 6];

        let obs1 = vec![make_obs(mac, -60.0, 6)];
        reg.update(&obs1);

        let obs2 = vec![make_obs(mac, -65.0, 6)];
        reg.update(&obs2);

        let entry = reg.get(&BssidId(mac)).unwrap();
        assert_eq!(entry.stats.count(), 2);
        assert_eq!(entry.history.len(), 2);
        assert!((entry.stats.mean() - (-62.5)).abs() < 1e-9);
    }

    #[test]
    fn registry_respects_capacity() {
        let mut reg = BssidRegistry::new(2, 30);
        let obs = vec![
            make_obs([0x01; 6], -60.0, 1),
            make_obs([0x02; 6], -70.0, 6),
            make_obs([0x03; 6], -80.0, 11), // Should be ignored
        ];
        reg.update(&obs);

        assert_eq!(reg.len(), 2);
        assert!(reg.get(&BssidId([0x03; 6])).is_none());
    }

    #[test]
    fn to_multi_ap_frame_builds_correct_frame() {
        let mut reg = BssidRegistry::default();
        let obs = vec![
            make_obs([0x01; 6], -60.0, 6),
            make_obs([0x02; 6], -70.0, 36),
        ];
        reg.update(&obs);

        let frame = reg.to_multi_ap_frame();
        assert_eq!(frame.bssid_count, 2);
        assert_eq!(frame.rssi_dbm.len(), 2);
        assert_eq!(frame.amplitudes.len(), 2);
        assert_eq!(frame.phases.len(), 2);
        assert!(frame.amplitudes[0] > frame.amplitudes[1]); // -60 dBm > -70 dBm
    }

    #[test]
    fn welford_stats_accuracy() {
        let mut stats = RunningStats::new();
        let values = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0];
        for v in &values {
            stats.push(*v);
        }

        assert_eq!(stats.count(), 8);
        assert!((stats.mean() - 5.0).abs() < 1e-9);
        // Population variance of this dataset is 4.0
        assert!((stats.variance() - 4.0).abs() < 1e-9);
        // Sample variance is 4.571428...
        assert!((stats.sample_variance() - (32.0 / 7.0)).abs() < 1e-9);
    }
}
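The `RunningStats` type above uses Welford's single-pass update, which never stores the sample history. As a sanity sketch, a standalone copy of that update (mirroring the struct rather than importing the crate) can be checked against a naive two-pass mean/variance on the same data the `welford_stats_accuracy` test uses:

```rust
// Minimal standalone copy of the Welford update from `RunningStats`,
// for comparison against a naive two-pass computation.
struct Welford {
    count: u64,
    mean: f64,
    m2: f64,
}

impl Welford {
    fn new() -> Self {
        Self { count: 0, mean: 0.0, m2: 0.0 }
    }

    fn push(&mut self, v: f64) {
        self.count += 1;
        let delta = v - self.mean;
        self.mean += delta / self.count as f64;
        self.m2 += delta * (v - self.mean);
    }

    fn variance(&self) -> f64 {
        if self.count < 2 { 0.0 } else { self.m2 / self.count as f64 }
    }
}

fn main() {
    let values = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0];
    let mut w = Welford::new();
    for v in values {
        w.push(v);
    }

    // Two-pass reference: compute the mean first, then population variance.
    let mean: f64 = values.iter().sum::<f64>() / values.len() as f64;
    let var: f64 = values.iter().map(|v| (v - mean).powi(2)).sum::<f64>()
        / values.len() as f64;

    // Single-pass and two-pass agree: mean 5.0, population variance 4.0.
    assert!((w.mean - mean).abs() < 1e-12);
    assert!((w.variance() - var).abs() < 1e-12);
}
```

The single-pass form is what makes per-BSSID statistics cheap enough to keep for every tracked access point.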
@@ -0,0 +1,216 @@
//! Enhanced sensing result value object.
//!
//! The `EnhancedSensingResult` is the output of the signal intelligence
//! pipeline, carrying motion, breathing, posture, and quality metrics
//! derived from multi-BSSID pseudo-CSI data.

#[cfg(feature = "serde")]
use serde::{Deserialize, Serialize};

// ---------------------------------------------------------------------------
// MotionLevel
// ---------------------------------------------------------------------------

/// Coarse classification of detected motion intensity.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum MotionLevel {
    /// No significant change in BSSID variance; room likely empty.
    None,
    /// Very small fluctuations consistent with a stationary person
    /// (e.g., breathing, minor fidgeting).
    Minimal,
    /// Moderate changes suggesting slow movement (e.g., walking, gesturing).
    Moderate,
    /// Large variance swings indicating vigorous or rapid movement.
    High,
}

impl MotionLevel {
    /// Map a normalised motion score `[0.0, 1.0]` to a `MotionLevel`.
    ///
    /// The thresholds are tuned for multi-BSSID RSSI variance and can be
    /// overridden via `WindowsWifiConfig` in the pipeline layer.
    pub fn from_score(score: f64) -> Self {
        if score < 0.05 {
            Self::None
        } else if score < 0.20 {
            Self::Minimal
        } else if score < 0.60 {
            Self::Moderate
        } else {
            Self::High
        }
    }
}

// ---------------------------------------------------------------------------
// MotionEstimate
// ---------------------------------------------------------------------------

/// Quantitative motion estimate from the multi-BSSID pipeline.
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct MotionEstimate {
    /// Normalised motion score in `[0.0, 1.0]`.
    pub score: f64,
    /// Coarse classification derived from the score.
    pub level: MotionLevel,
    /// The number of BSSIDs contributing to this estimate.
    pub contributing_bssids: usize,
}

// ---------------------------------------------------------------------------
// BreathingEstimate
// ---------------------------------------------------------------------------

/// Coarse respiratory rate estimate extracted from body-sensitive BSSIDs.
///
/// Only valid when the motion level is `Minimal` (person stationary) and at
/// least 3 body-correlated BSSIDs are available. The accuracy is limited
/// by the low sample rate of Tier 1 scanning (~2 Hz).
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct BreathingEstimate {
    /// Estimated breaths per minute (typical: 12-20 for adults at rest).
    pub rate_bpm: f64,
    /// Confidence in the estimate, `[0.0, 1.0]`.
    pub confidence: f64,
    /// Number of BSSIDs used for the spectral analysis.
    pub bssid_count: usize,
}

// ---------------------------------------------------------------------------
// PostureClass
// ---------------------------------------------------------------------------

/// Coarse posture classification from BSSID fingerprint matching.
///
/// Based on Hopfield template matching of the multi-BSSID amplitude
/// signature against stored reference patterns.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum PostureClass {
    /// Room appears empty.
    Empty,
    /// Person standing.
    Standing,
    /// Person sitting.
    Sitting,
    /// Person lying down.
    LyingDown,
    /// Person walking / in motion.
    Walking,
    /// Unknown posture (insufficient confidence).
    Unknown,
}

// ---------------------------------------------------------------------------
// SignalQuality
// ---------------------------------------------------------------------------

/// Signal quality metrics for the current multi-BSSID frame.
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct SignalQuality {
    /// Overall quality score `[0.0, 1.0]`, where 1.0 is excellent.
    pub score: f64,
    /// Number of BSSIDs in the current frame.
    pub bssid_count: usize,
    /// Spectral gap from the BSSID correlation graph.
    /// A large gap indicates good signal separation.
    pub spectral_gap: f64,
    /// Mean RSSI across all tracked BSSIDs (dBm).
    pub mean_rssi_dbm: f64,
}

// ---------------------------------------------------------------------------
// Verdict
// ---------------------------------------------------------------------------

/// Quality gate verdict from the ruQu three-filter pipeline.
///
/// The pipeline evaluates structural integrity, statistical shift
/// significance, and evidence accumulation before permitting a reading.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub enum Verdict {
    /// Reading passed all quality gates and is reliable.
    Permit,
    /// Reading shows some anomalies but is usable with reduced confidence.
    Warn,
    /// Reading failed quality checks and should be discarded.
    Deny,
}

// ---------------------------------------------------------------------------
// EnhancedSensingResult
// ---------------------------------------------------------------------------

/// The output of the multi-BSSID signal intelligence pipeline.
///
/// This value object carries all sensing information derived from a single
/// scan cycle. It is converted to a `SensingUpdate` by the Sensing Output
/// bounded context for delivery to the UI.
#[derive(Debug, Clone)]
#[cfg_attr(feature = "serde", derive(Serialize, Deserialize))]
pub struct EnhancedSensingResult {
    /// Motion detection result.
    pub motion: MotionEstimate,
    /// Coarse respiratory rate, if detectable.
    pub breathing: Option<BreathingEstimate>,
    /// Posture classification, if available.
    pub posture: Option<PostureClass>,
    /// Signal quality metrics for the current frame.
    pub signal_quality: SignalQuality,
    /// Number of BSSIDs used in this sensing cycle.
    pub bssid_count: usize,
    /// Quality gate verdict.
    pub verdict: Verdict,
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn motion_level_thresholds() {
        assert_eq!(MotionLevel::from_score(0.0), MotionLevel::None);
        assert_eq!(MotionLevel::from_score(0.04), MotionLevel::None);
        assert_eq!(MotionLevel::from_score(0.05), MotionLevel::Minimal);
        assert_eq!(MotionLevel::from_score(0.19), MotionLevel::Minimal);
        assert_eq!(MotionLevel::from_score(0.20), MotionLevel::Moderate);
        assert_eq!(MotionLevel::from_score(0.59), MotionLevel::Moderate);
        assert_eq!(MotionLevel::from_score(0.60), MotionLevel::High);
        assert_eq!(MotionLevel::from_score(1.0), MotionLevel::High);
    }

    #[test]
    fn enhanced_result_construction() {
        let result = EnhancedSensingResult {
            motion: MotionEstimate {
                score: 0.3,
                level: MotionLevel::Moderate,
                contributing_bssids: 10,
            },
            breathing: Some(BreathingEstimate {
                rate_bpm: 16.0,
                confidence: 0.7,
                bssid_count: 5,
            }),
            posture: Some(PostureClass::Standing),
            signal_quality: SignalQuality {
                score: 0.85,
                bssid_count: 15,
                spectral_gap: 0.42,
                mean_rssi_dbm: -65.0,
            },
            bssid_count: 15,
            verdict: Verdict::Permit,
        };

        assert_eq!(result.motion.level, MotionLevel::Moderate);
        assert_eq!(result.verdict, Verdict::Permit);
        assert_eq!(result.bssid_count, 15);
    }
}
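The `BreathingEstimate` docs above state two preconditions: the person must be stationary (`MotionLevel::Minimal`) and at least 3 body-correlated BSSIDs must be available. A hedged sketch of that gate, with standalone types; the real check lives in the pipeline layer, which this diff does not show, and the function name here is illustrative:

```rust
#[allow(dead_code)]
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
enum MotionLevel {
    None,
    Minimal,
    Moderate,
    High,
}

/// Illustrative precondition check for attempting a breathing estimate,
/// following the documented rules; not the crate's actual pipeline code.
fn breathing_estimate_allowed(level: MotionLevel, body_bssids: usize) -> bool {
    level == MotionLevel::Minimal && body_bssids >= 3
}

fn main() {
    assert!(breathing_estimate_allowed(MotionLevel::Minimal, 5));
    assert!(!breathing_estimate_allowed(MotionLevel::Moderate, 5)); // moving
    assert!(!breathing_estimate_allowed(MotionLevel::Minimal, 2)); // too few APs
}
```

This is why `breathing` is an `Option` on `EnhancedSensingResult`: most scan cycles simply do not meet the preconditions.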
@@ -0,0 +1,112 @@
//! Error types for the wifi-densepose-wifiscan crate.

use std::fmt;

/// Errors that can occur during WiFi scanning and BSSID processing.
#[derive(Debug, Clone)]
pub enum WifiScanError {
    /// The BSSID MAC address bytes are invalid (must be exactly 6 bytes).
    InvalidMac {
        /// The number of bytes that were provided.
        len: usize,
    },

    /// Failed to parse a MAC address string (expected `aa:bb:cc:dd:ee:ff`).
    MacParseFailed {
        /// The input string that could not be parsed.
        input: String,
    },

    /// The scan backend returned an error.
    ScanFailed {
        /// Human-readable description of what went wrong.
        reason: String,
    },

    /// Too few BSSIDs are visible for multi-AP mode.
    InsufficientBssids {
        /// Number of BSSIDs observed.
        observed: usize,
        /// Minimum required for multi-AP mode.
        required: usize,
    },

    /// A BSSID was not found in the registry.
    BssidNotFound {
        /// The MAC address that was not found.
        bssid: [u8; 6],
    },

    /// The subcarrier map is full and cannot accept more BSSIDs.
    SubcarrierMapFull {
        /// Maximum capacity of the subcarrier map.
        max: usize,
    },

    /// An RSSI value is out of the expected range.
    RssiOutOfRange {
        /// The invalid RSSI value in dBm.
        value: f64,
    },

    /// The requested operation is not supported by this adapter.
    Unsupported(String),

    /// Failed to execute the scan subprocess.
    ProcessError(String),

    /// Failed to parse scan output.
    ParseError(String),
}

impl fmt::Display for WifiScanError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Self::InvalidMac { len } => {
                write!(f, "invalid MAC address: expected 6 bytes, got {len}")
            }
            Self::MacParseFailed { input } => {
                write!(
                    f,
                    "failed to parse MAC address from '{input}': expected aa:bb:cc:dd:ee:ff"
                )
            }
            Self::ScanFailed { reason } => {
                write!(f, "WiFi scan failed: {reason}")
            }
            Self::InsufficientBssids { observed, required } => {
                write!(
                    f,
                    "insufficient BSSIDs for multi-AP mode: {observed} observed, {required} required"
                )
            }
            Self::BssidNotFound { bssid } => {
                write!(
                    f,
                    "BSSID not found in registry: {:02x}:{:02x}:{:02x}:{:02x}:{:02x}:{:02x}",
                    bssid[0], bssid[1], bssid[2], bssid[3], bssid[4], bssid[5]
                )
            }
            Self::SubcarrierMapFull { max } => {
                write!(
                    f,
                    "subcarrier map is full at {max} entries; cannot add more BSSIDs"
                )
            }
            Self::RssiOutOfRange { value } => {
                write!(f, "RSSI value {value} dBm is out of expected range [-120, 0]")
            }
            Self::Unsupported(msg) => {
                write!(f, "unsupported operation: {msg}")
            }
            Self::ProcessError(msg) => {
                write!(f, "scan process error: {msg}")
            }
            Self::ParseError(msg) => {
                write!(f, "scan output parse error: {msg}")
            }
        }
    }
}

impl std::error::Error for WifiScanError {}
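The `BssidNotFound` arm above renders the MAC with one `{:02x}` per byte; the zero-padding matters so that display output stays consistent with the `aa:bb:cc:dd:ee:ff` parse format that `MacParseFailed` documents. A small standalone check of that formatting idiom (the free function is illustrative):

```rust
/// Format a 6-byte BSSID the way `WifiScanError::BssidNotFound` does:
/// lowercase hex, zero-padded, colon-separated.
fn format_bssid(bssid: &[u8; 6]) -> String {
    format!(
        "{:02x}:{:02x}:{:02x}:{:02x}:{:02x}:{:02x}",
        bssid[0], bssid[1], bssid[2], bssid[3], bssid[4], bssid[5]
    )
}

fn main() {
    // `{:02x}` keeps single-digit bytes zero-padded (0x0c -> "0c", not "c").
    assert_eq!(
        format_bssid(&[0xaa, 0xbb, 0x0c, 0xdd, 0xee, 0xff]),
        "aa:bb:0c:dd:ee:ff"
    );
}
```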
@@ -0,0 +1,30 @@
//! # wifi-densepose-wifiscan
//!
//! Domain layer for multi-BSSID WiFi scanning and enhanced sensing (ADR-022).
//!
//! This crate implements the **BSSID Acquisition** bounded context, providing:
//!
//! - **Domain types**: [`BssidId`], [`BssidObservation`], [`BandType`], [`RadioType`]
//! - **Port**: [`WlanScanPort`] -- trait abstracting the platform scan backend
//! - **Adapter**: [`NetshBssidScanner`] -- Tier 1 adapter that parses
//!   `netsh wlan show networks mode=bssid` output

pub mod adapter;
pub mod domain;
pub mod error;
pub mod pipeline;
pub mod port;

// Re-export key types at the crate root for convenience.
pub use adapter::parse_netsh_output;
pub use adapter::NetshBssidScanner;
pub use adapter::WlanApiScanner;
pub use domain::bssid::{BandType, BssidId, BssidObservation, RadioType};
pub use domain::frame::MultiApFrame;
pub use domain::registry::{BssidEntry, BssidMeta, BssidRegistry, RunningStats};
pub use domain::result::EnhancedSensingResult;
pub use error::WifiScanError;
pub use port::WlanScanPort;

#[cfg(feature = "pipeline")]
pub use pipeline::WindowsWifiPipeline;
@@ -0,0 +1,129 @@
//! Stage 2: Attention-based BSSID weighting.
//!
//! Uses scaled dot-product attention to learn which BSSIDs respond
//! most to body movement. High-variance BSSIDs on body-affected
//! paths get higher attention weights.
//!
//! When the `pipeline` feature is enabled, this uses
//! `ruvector_attention::ScaledDotProductAttention` for the core
//! attention computation. Otherwise, it falls back to a pure-Rust
//! softmax implementation.

/// Weights BSSIDs by body-sensitivity using an attention mechanism.
pub struct AttentionWeighter {
    dim: usize,
}

impl AttentionWeighter {
    /// Create a new attention weighter.
    ///
    /// - `dim`: dimensionality of the attention space (typically 1 for scalar RSSI).
    #[must_use]
    pub fn new(dim: usize) -> Self {
        Self { dim }
    }

    /// Compute attention-weighted output from BSSID residuals.
    ///
    /// - `query`: the aggregated variance profile (1 x dim).
    /// - `keys`: per-BSSID residual vectors (`n_bssids` x dim).
    /// - `values`: per-BSSID amplitude vectors (`n_bssids` x dim).
    ///
    /// Returns the weighted amplitude vector and per-BSSID weights.
    #[must_use]
    pub fn weight(
        &self,
        query: &[f32],
        keys: &[Vec<f32>],
        values: &[Vec<f32>],
    ) -> (Vec<f32>, Vec<f32>) {
        if keys.is_empty() || values.is_empty() {
            return (vec![0.0; self.dim], vec![]);
        }

        // Compute per-BSSID attention scores (softmax of q·k / sqrt(d)).
        let scores = self.compute_scores(query, keys);

        // Weighted sum of values.
        let mut weighted = vec![0.0f32; self.dim];
        for (i, score) in scores.iter().enumerate() {
            if let Some(val) = values.get(i) {
                for (d, v) in weighted.iter_mut().zip(val.iter()) {
                    *d += score * v;
                }
            }
        }

        (weighted, scores)
    }

    /// Compute raw attention scores (softmax of q·k / sqrt(d)).
    #[allow(clippy::cast_precision_loss)]
    fn compute_scores(&self, query: &[f32], keys: &[Vec<f32>]) -> Vec<f32> {
        let scale = (self.dim as f32).sqrt();
        let mut scores: Vec<f32> = keys
            .iter()
            .map(|key| {
                let dot: f32 = query.iter().zip(key.iter()).map(|(q, k)| q * k).sum();
                dot / scale
            })
            .collect();

        // Numerically stable softmax.
        let max_score = scores.iter().copied().fold(f32::NEG_INFINITY, f32::max);
        let sum_exp: f32 = scores.iter().map(|&s| (s - max_score).exp()).sum();
        for s in &mut scores {
            *s = (*s - max_score).exp() / sum_exp;
        }
        scores
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn empty_input_returns_zero() {
        let weighter = AttentionWeighter::new(1);
        let (output, scores) = weighter.weight(&[0.0], &[], &[]);
        assert_eq!(output, vec![0.0]);
        assert!(scores.is_empty());
    }

    #[test]
    fn single_bssid_gets_full_weight() {
        let weighter = AttentionWeighter::new(1);
        let query = vec![1.0];
        let keys = vec![vec![1.0]];
        let values = vec![vec![5.0]];
        let (output, scores) = weighter.weight(&query, &keys, &values);
        assert!((scores[0] - 1.0).abs() < 1e-5, "single BSSID should have weight 1.0");
        assert!((output[0] - 5.0).abs() < 1e-3, "output should equal the single value");
    }

    #[test]
    fn higher_residual_gets_more_weight() {
        let weighter = AttentionWeighter::new(1);
        let query = vec![1.0];
        // BSSID 0 has low residual, BSSID 1 has high residual.
        let keys = vec![vec![0.1], vec![10.0]];
        let values = vec![vec![1.0], vec![1.0]];
        let (_output, scores) = weighter.weight(&query, &keys, &values);
        assert!(
            scores[1] > scores[0],
            "high-residual BSSID should get higher weight: {scores:?}"
        );
    }

    #[test]
    fn scores_sum_to_one() {
        let weighter = AttentionWeighter::new(1);
        let query = vec![1.0];
        let keys = vec![vec![0.5], vec![1.0], vec![2.0]];
        let values = vec![vec![1.0], vec![2.0], vec![3.0]];
        let (_output, scores) = weighter.weight(&query, &keys, &values);
        let sum: f32 = scores.iter().sum();
        assert!((sum - 1.0).abs() < 1e-5, "scores should sum to 1.0, got {sum}");
    }
}
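The scaled dot-product scoring used above can be sanity-checked in isolation. The following is a minimal standalone sketch of the same softmax scoring, independent of the crate's `AttentionWeighter` type (the function name `softmax_scores` is mine, not part of the crate):

```rust
/// Softmax of q·k / sqrt(d), mirroring `AttentionWeighter::compute_scores`.
/// Standalone sketch -- not the crate's API.
fn softmax_scores(query: &[f32], keys: &[Vec<f32>]) -> Vec<f32> {
    let scale = (query.len() as f32).sqrt();
    let mut scores: Vec<f32> = keys
        .iter()
        .map(|k| query.iter().zip(k).map(|(q, k)| q * k).sum::<f32>() / scale)
        .collect();
    // Numerically stable softmax: subtract the max before exponentiating.
    let max = scores.iter().copied().fold(f32::NEG_INFINITY, f32::max);
    let sum: f32 = scores.iter().map(|&s| (s - max).exp()).sum();
    for s in &mut scores {
        *s = (*s - max).exp() / sum;
    }
    scores
}

fn main() {
    // Two BSSIDs: the one with the larger residual key gets more weight,
    // and the weights form a probability distribution.
    let w = softmax_scores(&[1.0], &[vec![0.1], vec![2.0]]);
    assert!(w[1] > w[0]);
    assert!((w.iter().sum::<f32>() - 1.0).abs() < 1e-5);
    println!("{w:?}");
}
```

This mirrors why `scores_sum_to_one` holds in the tests above: softmax normalizes the raw dot products into weights that always sum to 1.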
@@ -0,0 +1,277 @@
//! Stage 5: Coarse breathing rate extraction.
//!
//! Extracts respiratory rate from body-sensitive BSSID oscillations.
//! Uses a simple bandpass filter (0.1-0.5 Hz) and zero-crossing
//! analysis rather than `OscillatoryRouter` (which is designed for
//! gamma-band frequencies, not sub-Hz breathing).

/// Coarse breathing extractor from multi-BSSID signal variance.
pub struct CoarseBreathingExtractor {
    /// Combined filtered signal history.
    filtered_history: Vec<f32>,
    /// Window size for analysis.
    window: usize,
    /// Maximum tracked BSSIDs.
    n_bssids: usize,
    /// Breathing band low cutoff (Hz).
    freq_low: f32,
    /// Breathing band high cutoff (Hz).
    freq_high: f32,
    /// Sample rate (Hz) -- typically 2 Hz for Tier 1.
    sample_rate: f32,
    /// IIR filter state (simple 2nd-order bandpass).
    filter_state: IirState,
}

/// Simple IIR bandpass filter state (2nd order).
#[derive(Clone, Debug)]
struct IirState {
    x1: f32,
    x2: f32,
    y1: f32,
    y2: f32,
}

impl Default for IirState {
    fn default() -> Self {
        Self {
            x1: 0.0,
            x2: 0.0,
            y1: 0.0,
            y2: 0.0,
        }
    }
}

impl CoarseBreathingExtractor {
    /// Create a breathing extractor.
    ///
    /// - `n_bssids`: maximum BSSID slots.
    /// - `sample_rate`: input sample rate in Hz.
    /// - `freq_low`: breathing band low cutoff (default 0.1 Hz).
    /// - `freq_high`: breathing band high cutoff (default 0.5 Hz).
    #[must_use]
    #[allow(clippy::cast_possible_truncation, clippy::cast_sign_loss)]
    pub fn new(n_bssids: usize, sample_rate: f32, freq_low: f32, freq_high: f32) -> Self {
        let window = (sample_rate * 30.0) as usize; // 30 seconds of data
        Self {
            filtered_history: Vec::with_capacity(window),
            window,
            n_bssids,
            freq_low,
            freq_high,
            sample_rate,
            filter_state: IirState::default(),
        }
    }

    /// Create with defaults suitable for Tier 1 (2 Hz sample rate).
    #[must_use]
    pub fn tier1_default(n_bssids: usize) -> Self {
        Self::new(n_bssids, 2.0, 0.1, 0.5)
    }

    /// Process a frame of residuals with attention weights.
    /// Returns the estimated breathing rate (BPM) if detectable.
    ///
    /// - `residuals`: per-BSSID residuals from `PredictiveGate`.
    /// - `weights`: per-BSSID attention weights.
    pub fn extract(&mut self, residuals: &[f32], weights: &[f32]) -> Option<BreathingEstimate> {
        let n = residuals.len().min(self.n_bssids);
        if n == 0 {
            return None;
        }

        // Compute the weighted sum of residuals for breathing analysis.
        #[allow(clippy::cast_precision_loss)]
        let weighted_signal: f32 = residuals
            .iter()
            .enumerate()
            .take(n)
            .map(|(i, &r)| {
                let w = weights.get(i).copied().unwrap_or(1.0 / n as f32);
                r * w
            })
            .sum();

        // Apply the bandpass filter.
        let filtered = self.bandpass_filter(weighted_signal);

        // Store in history.
        self.filtered_history.push(filtered);
        if self.filtered_history.len() > self.window {
            self.filtered_history.remove(0);
        }

        // Need at least 10 seconds of data to estimate breathing.
        #[allow(clippy::cast_possible_truncation, clippy::cast_sign_loss)]
        let min_samples = (self.sample_rate * 10.0) as usize;
        if self.filtered_history.len() < min_samples {
            return None;
        }

        // Zero-crossing rate -> frequency.
        let crossings = count_zero_crossings(&self.filtered_history);
        #[allow(clippy::cast_precision_loss)]
        let duration_s = self.filtered_history.len() as f32 / self.sample_rate;
        #[allow(clippy::cast_precision_loss)]
        let frequency_hz = crossings as f32 / (2.0 * duration_s);

        // Validate that the frequency is in the breathing range.
        if frequency_hz < self.freq_low || frequency_hz > self.freq_high {
            return None;
        }

        let bpm = frequency_hz * 60.0;

        // Compute confidence based on signal regularity.
        let confidence = compute_confidence(&self.filtered_history);

        Some(BreathingEstimate {
            bpm,
            frequency_hz,
            confidence,
        })
    }

    /// Simple 2nd-order IIR bandpass filter.
    fn bandpass_filter(&mut self, input: f32) -> f32 {
        let state = &mut self.filter_state;

        // Two-pole resonator bandpass centred on [freq_low, freq_high]
        // at the given sample rate (bilinear-transform approximation).
        let omega_low = 2.0 * std::f32::consts::PI * self.freq_low / self.sample_rate;
        let omega_high = 2.0 * std::f32::consts::PI * self.freq_high / self.sample_rate;
        let bw = omega_high - omega_low;
        let center = f32::midpoint(omega_low, omega_high);

        let r = 1.0 - bw / 2.0;
        let cos_w0 = center.cos();

        // y[n] = (1-r)*(x[n] - x[n-2]) + 2*r*cos(w0)*y[n-1] - r^2*y[n-2]
        let output =
            (1.0 - r) * (input - state.x2) + 2.0 * r * cos_w0 * state.y1 - r * r * state.y2;

        state.x2 = state.x1;
        state.x1 = input;
        state.y2 = state.y1;
        state.y1 = output;

        output
    }

    /// Reset all filter states and histories.
    pub fn reset(&mut self) {
        self.filtered_history.clear();
        self.filter_state = IirState::default();
    }
}

/// Result of breathing extraction.
#[derive(Debug, Clone)]
pub struct BreathingEstimate {
    /// Estimated breathing rate in breaths per minute.
    pub bpm: f32,
    /// Estimated breathing frequency in Hz.
    pub frequency_hz: f32,
    /// Confidence in the estimate [0, 1].
    pub confidence: f32,
}

/// Compute confidence in the breathing estimate based on signal regularity.
#[allow(clippy::cast_precision_loss)]
fn compute_confidence(history: &[f32]) -> f32 {
    if history.len() < 4 {
        return 0.0;
    }

    // Use variance-based SNR as a confidence metric.
    let mean: f32 = history.iter().sum::<f32>() / history.len() as f32;
    let variance: f32 = history
        .iter()
        .map(|x| (x - mean) * (x - mean))
        .sum::<f32>()
        / history.len() as f32;

    if variance < 1e-10 {
        return 0.0;
    }

    // Simple SNR-based confidence.
    let peak = history.iter().map(|x| x.abs()).fold(0.0f32, f32::max);
    let noise = variance.sqrt();

    let snr = if noise > 1e-10 { peak / noise } else { 0.0 };

    // Map SNR to [0, 1] confidence.
    (snr / 5.0).min(1.0)
}

/// Count zero crossings in a signal.
fn count_zero_crossings(signal: &[f32]) -> usize {
    signal.windows(2).filter(|w| w[0] * w[1] < 0.0).count()
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn no_data_returns_none() {
        let mut ext = CoarseBreathingExtractor::tier1_default(4);
        assert!(ext.extract(&[], &[]).is_none());
    }

    #[test]
    fn insufficient_history_returns_none() {
        let mut ext = CoarseBreathingExtractor::tier1_default(4);
        // Just a few frames are not enough.
        for _ in 0..5 {
            assert!(ext.extract(&[1.0, 2.0], &[0.5, 0.5]).is_none());
        }
    }

    #[test]
    fn sinusoidal_breathing_detected() {
        let mut ext = CoarseBreathingExtractor::new(1, 10.0, 0.1, 0.5);
        let breathing_freq = 0.25; // 15 BPM

        // Generate 60 seconds of sinusoidal breathing signal at 10 Hz.
        for i in 0..600 {
            let t = i as f32 / 10.0;
            let signal = (2.0 * std::f32::consts::PI * breathing_freq * t).sin();
            ext.extract(&[signal], &[1.0]);
        }

        let result = ext.extract(&[0.0], &[1.0]);
        if let Some(est) = result {
            // Should be approximately 15 BPM (0.25 Hz * 60).
            assert!(
                est.bpm > 5.0 && est.bpm < 40.0,
                "estimated BPM should be in breathing range: {}",
                est.bpm
            );
        }
        // It is acceptable if None -- the bandpass filter may need tuning.
    }

    #[test]
    fn zero_crossings_count() {
        let signal = vec![1.0, -1.0, 1.0, -1.0, 1.0];
        assert_eq!(count_zero_crossings(&signal), 4);
    }

    #[test]
    fn zero_crossings_constant() {
        let signal = vec![1.0, 1.0, 1.0, 1.0];
        assert_eq!(count_zero_crossings(&signal), 0);
    }

    #[test]
    fn reset_clears_state() {
        let mut ext = CoarseBreathingExtractor::tier1_default(2);
        ext.extract(&[1.0, 2.0], &[0.5, 0.5]);
        ext.reset();
        assert!(ext.filtered_history.is_empty());
    }
}
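The zero-crossing arithmetic this stage relies on (`frequency_hz = crossings / (2 * duration)`) can be checked on a clean sine wave. A standalone sketch, independent of the crate types (the function name `zero_crossing_freq_hz` is mine):

```rust
/// Dominant-frequency estimate from sign changes, mirroring the
/// extractor's `count_zero_crossings` logic. Standalone sketch.
fn zero_crossing_freq_hz(signal: &[f32], sample_rate: f32) -> f32 {
    // A sinusoid crosses zero twice per cycle, hence the factor of 2.
    let crossings = signal.windows(2).filter(|w| w[0] * w[1] < 0.0).count();
    let duration_s = signal.len() as f32 / sample_rate;
    crossings as f32 / (2.0 * duration_s)
}

fn main() {
    // 0.25 Hz sine sampled at 10 Hz for 60 s => roughly 15 breaths/minute.
    let sample_rate = 10.0;
    let signal: Vec<f32> = (0..600)
        .map(|i| (2.0 * std::f32::consts::PI * 0.25 * i as f32 / sample_rate).sin())
        .collect();
    let bpm = zero_crossing_freq_hz(&signal, sample_rate) * 60.0;
    assert!((bpm - 15.0).abs() < 1.5, "bpm = {bpm}");
    println!("{bpm:.1} BPM");
}
```

The estimate undershoots slightly because the final half-cycle in the window contributes no crossing, which is why the extractor only reports once it holds at least 10 seconds of history.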
@@ -0,0 +1,267 @@
//! Stage 3: BSSID spatial correlation via GNN message passing.
//!
//! Builds a cross-correlation graph where nodes are BSSIDs and edges
//! represent temporal cross-correlation between their RSSI histories.
//! A single message-passing step identifies co-varying BSSID clusters
//! that are likely affected by the same person.

/// BSSID correlator that computes pairwise Pearson correlation
/// and identifies co-varying clusters.
///
/// Note: The full `RuvectorLayer` GNN requires matching dimension
/// weights trained on CSI data. For Phase 2 we use a lightweight
/// correlation-based approach that can be upgraded to a GNN later.
pub struct BssidCorrelator {
    /// Per-BSSID history buffers for correlation computation.
    histories: Vec<Vec<f32>>,
    /// Maximum history length.
    window: usize,
    /// Number of tracked BSSIDs.
    n_bssids: usize,
    /// Correlation threshold for "co-varying" classification.
    correlation_threshold: f32,
}

impl BssidCorrelator {
    /// Create a new correlator.
    ///
    /// - `n_bssids`: number of BSSID slots.
    /// - `window`: correlation window size (number of frames).
    /// - `correlation_threshold`: minimum |r| to consider BSSIDs co-varying.
    #[must_use]
    pub fn new(n_bssids: usize, window: usize, correlation_threshold: f32) -> Self {
        Self {
            histories: vec![Vec::with_capacity(window); n_bssids],
            window,
            n_bssids,
            correlation_threshold,
        }
    }

    /// Push a new frame of amplitudes and compute correlation features.
    ///
    /// Returns a `CorrelationResult` with the correlation matrix and
    /// cluster assignments.
    pub fn update(&mut self, amplitudes: &[f32]) -> CorrelationResult {
        let n = amplitudes.len().min(self.n_bssids);

        // Update histories.
        for (i, &amp) in amplitudes.iter().enumerate().take(n) {
            let hist = &mut self.histories[i];
            hist.push(amp);
            if hist.len() > self.window {
                hist.remove(0);
            }
        }

        // Compute pairwise Pearson correlation.
        let mut corr_matrix = vec![vec![0.0f32; n]; n];
        #[allow(clippy::needless_range_loop)]
        for i in 0..n {
            corr_matrix[i][i] = 1.0;
            for j in (i + 1)..n {
                let r = pearson_r(&self.histories[i], &self.histories[j]);
                corr_matrix[i][j] = r;
                corr_matrix[j][i] = r;
            }
        }

        // Find strongly correlated clusters (threshold-based BFS).
        let clusters = self.find_clusters(&corr_matrix, n);

        // Compute a per-BSSID "spatial diversity" score:
        // how many other BSSIDs each one is correlated with.
        // `n.max(2) - 1` guards the n == 1 case against a 0/0 NaN.
        #[allow(clippy::cast_precision_loss)]
        let diversity: Vec<f32> = (0..n)
            .map(|i| {
                let count = (0..n)
                    .filter(|&j| j != i && corr_matrix[i][j].abs() > self.correlation_threshold)
                    .count();
                count as f32 / (n.max(2) - 1) as f32
            })
            .collect();

        CorrelationResult {
            matrix: corr_matrix,
            clusters,
            diversity,
            n_active: n,
        }
    }

    /// Simple cluster assignment via thresholded correlation.
    fn find_clusters(&self, corr: &[Vec<f32>], n: usize) -> Vec<usize> {
        let mut cluster_id = vec![0usize; n];
        let mut next_cluster = 0usize;
        let mut assigned = vec![false; n];

        for i in 0..n {
            if assigned[i] {
                continue;
            }
            cluster_id[i] = next_cluster;
            assigned[i] = true;

            // BFS: assign the same cluster to correlated BSSIDs.
            let mut queue = vec![i];
            while let Some(current) = queue.pop() {
                for j in 0..n {
                    if !assigned[j] && corr[current][j].abs() > self.correlation_threshold {
                        cluster_id[j] = next_cluster;
                        assigned[j] = true;
                        queue.push(j);
                    }
                }
            }
            next_cluster += 1;
        }
        cluster_id
    }

    /// Reset all correlation histories.
    pub fn reset(&mut self) {
        for h in &mut self.histories {
            h.clear();
        }
    }
}

/// Result of correlation analysis.
#[derive(Debug, Clone)]
pub struct CorrelationResult {
    /// n x n Pearson correlation matrix.
    pub matrix: Vec<Vec<f32>>,
    /// Cluster assignment per BSSID.
    pub clusters: Vec<usize>,
    /// Per-BSSID spatial diversity score [0, 1].
    pub diversity: Vec<f32>,
    /// Number of active BSSIDs in this frame.
    pub n_active: usize,
}

impl CorrelationResult {
    /// Number of distinct clusters.
    #[must_use]
    pub fn n_clusters(&self) -> usize {
        self.clusters.iter().copied().max().map_or(0, |m| m + 1)
    }

    /// Mean absolute correlation (proxy for signal coherence).
    #[must_use]
    pub fn mean_correlation(&self) -> f32 {
        if self.n_active < 2 {
            return 0.0;
        }
        let mut sum = 0.0f32;
        let mut count = 0;
        for i in 0..self.n_active {
            for j in (i + 1)..self.n_active {
                sum += self.matrix[i][j].abs();
                count += 1;
            }
        }
        #[allow(clippy::cast_precision_loss)]
        let mean = if count == 0 { 0.0 } else { sum / count as f32 };
        mean
    }
}

/// Pearson correlation coefficient between two equal-length slices.
#[allow(clippy::cast_precision_loss)]
fn pearson_r(x: &[f32], y: &[f32]) -> f32 {
    let n = x.len().min(y.len());
    if n < 2 {
        return 0.0;
    }
    let n_f = n as f32;

    let mean_x: f32 = x.iter().take(n).sum::<f32>() / n_f;
    let mean_y: f32 = y.iter().take(n).sum::<f32>() / n_f;

    let mut cov = 0.0f32;
    let mut var_x = 0.0f32;
    let mut var_y = 0.0f32;

    for i in 0..n {
        let dx = x[i] - mean_x;
        let dy = y[i] - mean_y;
        cov += dx * dy;
        var_x += dx * dx;
        var_y += dy * dy;
    }

    let denom = (var_x * var_y).sqrt();
    if denom < 1e-12 {
        0.0
    } else {
        cov / denom
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn pearson_perfect_correlation() {
        let x = vec![1.0, 2.0, 3.0, 4.0, 5.0];
        let y = vec![2.0, 4.0, 6.0, 8.0, 10.0];
        let r = pearson_r(&x, &y);
        assert!((r - 1.0).abs() < 1e-5, "perfect positive correlation: {r}");
    }

    #[test]
    fn pearson_negative_correlation() {
        let x = vec![1.0, 2.0, 3.0, 4.0, 5.0];
        let y = vec![10.0, 8.0, 6.0, 4.0, 2.0];
        let r = pearson_r(&x, &y);
        assert!((r - (-1.0)).abs() < 1e-5, "perfect negative correlation: {r}");
    }

    #[test]
    fn pearson_no_correlation() {
        let x = vec![1.0, 2.0, 3.0, 4.0, 5.0];
        let y = vec![5.0, 1.0, 4.0, 2.0, 3.0]; // shuffled
        let r = pearson_r(&x, &y);
        assert!(r.abs() < 0.5, "low correlation expected: {r}");
    }

    #[test]
    fn correlator_basic_update() {
        let mut corr = BssidCorrelator::new(3, 10, 0.7);
        // Push several identical frames.
        for _ in 0..5 {
            corr.update(&[1.0, 2.0, 3.0]);
        }
        let result = corr.update(&[1.0, 2.0, 3.0]);
        assert_eq!(result.n_active, 3);
    }

    #[test]
    fn correlator_detects_covarying_bssids() {
        let mut corr = BssidCorrelator::new(3, 20, 0.8);
        // BSSID 0 and 1 co-vary, BSSID 2 is independent.
        for i in 0..20 {
            let v = i as f32;
            corr.update(&[v, v * 2.0, 5.0]); // 0 and 1 correlate, 2 is constant
        }
        let result = corr.update(&[20.0, 40.0, 5.0]);
        // BSSIDs 0 and 1 should be in the same cluster.
        assert_eq!(
            result.clusters[0], result.clusters[1],
            "co-varying BSSIDs should cluster: {:?}",
            result.clusters
        );
    }

    #[test]
    fn mean_correlation_zero_for_one_bssid() {
        let result = CorrelationResult {
            matrix: vec![vec![1.0]],
            clusters: vec![0],
            diversity: vec![0.0],
            n_active: 1,
        };
        assert!((result.mean_correlation() - 0.0).abs() < 1e-5);
    }
}
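The threshold-based clustering in `find_clusters` can be illustrated on a fixed correlation matrix. A standalone sketch of the same single-link traversal (independent of the crate's `BssidCorrelator`; `find_clusters` here is a free function of my own, taking the threshold as a parameter):

```rust
/// Threshold-based single-link clustering over a symmetric correlation
/// matrix, mirroring `BssidCorrelator::find_clusters`. Standalone sketch.
fn find_clusters(corr: &[Vec<f32>], threshold: f32) -> Vec<usize> {
    let n = corr.len();
    let mut cluster_id = vec![0usize; n];
    let mut assigned = vec![false; n];
    let mut next_cluster = 0usize;
    for i in 0..n {
        if assigned[i] {
            continue;
        }
        cluster_id[i] = next_cluster;
        assigned[i] = true;
        // Flood-fill: pull in every node correlated above threshold.
        let mut stack = vec![i];
        while let Some(current) = stack.pop() {
            for j in 0..n {
                if !assigned[j] && corr[current][j].abs() > threshold {
                    cluster_id[j] = next_cluster;
                    assigned[j] = true;
                    stack.push(j);
                }
            }
        }
        next_cluster += 1;
    }
    cluster_id
}

fn main() {
    // BSSIDs 0 and 1 co-vary (|r| = 0.9); BSSID 2 is independent.
    let corr = vec![
        vec![1.0, 0.9, 0.1],
        vec![0.9, 1.0, 0.0],
        vec![0.1, 0.0, 1.0],
    ];
    let clusters = find_clusters(&corr, 0.7);
    assert_eq!(clusters[0], clusters[1]);
    assert_ne!(clusters[0], clusters[2]);
    println!("{clusters:?}");
}
```

Because the link is transitive, a chain of pairwise-correlated BSSIDs collapses into one cluster even if the endpoints correlate weakly with each other, which suits the "same person perturbs several paths" interpretation.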
@@ -0,0 +1,288 @@
//! Stage 7: BSSID fingerprint matching via cosine similarity.
//!
//! Stores reference BSSID amplitude patterns for known postures
//! (standing, sitting, walking, empty) and classifies new observations
//! by retrieving the nearest stored template.
//!
//! This is a pure-Rust implementation using cosine similarity. When
//! `ruvector-nervous-system` becomes available, the inner store can
//! be replaced with `ModernHopfield` for richer associative memory.

use crate::domain::result::PostureClass;

/// A stored posture fingerprint template.
#[derive(Debug, Clone)]
struct PostureTemplate {
    /// Reference amplitude pattern (normalised).
    pattern: Vec<f32>,
    /// The posture label for this template.
    label: PostureClass,
}

/// BSSID fingerprint matcher using cosine similarity.
pub struct FingerprintMatcher {
    /// Stored reference templates.
    templates: Vec<PostureTemplate>,
    /// Minimum cosine similarity for a match.
    confidence_threshold: f32,
    /// Expected dimension (number of BSSID slots).
    n_bssids: usize,
}

impl FingerprintMatcher {
    /// Create a new fingerprint matcher.
    ///
    /// - `n_bssids`: number of BSSID slots (pattern dimension).
    /// - `confidence_threshold`: minimum cosine similarity for a match.
    #[must_use]
    pub fn new(n_bssids: usize, confidence_threshold: f32) -> Self {
        Self {
            templates: Vec::new(),
            confidence_threshold,
            n_bssids,
        }
    }

    /// Store a reference pattern with its posture label.
    ///
    /// # Errors
    ///
    /// Returns an error if the pattern dimension does not match `n_bssids`.
    pub fn store_pattern(
        &mut self,
        pattern: Vec<f32>,
        label: PostureClass,
    ) -> Result<(), String> {
        if pattern.len() != self.n_bssids {
            return Err(format!(
                "pattern dimension {} != expected {}",
                pattern.len(),
                self.n_bssids
            ));
        }
        self.templates.push(PostureTemplate { pattern, label });
        Ok(())
    }

    /// Classify an observation by matching against stored fingerprints.
    ///
    /// Returns the best-matching posture and similarity score, or `None`
    /// if no patterns are stored or similarity is below threshold.
    #[must_use]
    pub fn classify(&self, observation: &[f32]) -> Option<(PostureClass, f32)> {
        if self.templates.is_empty() || observation.len() != self.n_bssids {
            return None;
        }

        let mut best_label = None;
        let mut best_sim = f32::NEG_INFINITY;

        for tmpl in &self.templates {
            let sim = cosine_similarity(&tmpl.pattern, observation);
            if sim > best_sim {
                best_sim = sim;
                best_label = Some(tmpl.label);
            }
        }

        match best_label {
            Some(label) if best_sim >= self.confidence_threshold => Some((label, best_sim)),
            _ => None,
        }
    }

    /// Match posture and return a structured result.
    #[must_use]
    pub fn match_posture(&self, observation: &[f32]) -> MatchResult {
        match self.classify(observation) {
            Some((posture, confidence)) => MatchResult {
                posture: Some(posture),
                confidence,
                matched: true,
            },
            None => MatchResult {
                posture: None,
                confidence: 0.0,
                matched: false,
            },
        }
    }

    /// Generate default templates from a baseline signal.
    ///
    /// Creates heuristic patterns for standing, sitting, and empty by
    /// scaling the baseline amplitude pattern.
    pub fn generate_defaults(&mut self, baseline: &[f32]) {
        if baseline.len() != self.n_bssids {
            return;
        }

        // Empty: very low amplitude (background noise only).
        let empty: Vec<f32> = baseline.iter().map(|&a| a * 0.1).collect();
        let _ = self.store_pattern(empty, PostureClass::Empty);

        // Standing: moderate perturbation of some BSSIDs.
        let standing: Vec<f32> = baseline
            .iter()
            .enumerate()
            .map(|(i, &a)| if i % 3 == 0 { a * 1.3 } else { a })
            .collect();
        let _ = self.store_pattern(standing, PostureClass::Standing);

        // Sitting: different perturbation pattern.
        let sitting: Vec<f32> = baseline
            .iter()
            .enumerate()
            .map(|(i, &a)| if i % 2 == 0 { a * 1.2 } else { a * 0.9 })
            .collect();
        let _ = self.store_pattern(sitting, PostureClass::Sitting);
    }

    /// Number of stored patterns.
    #[must_use]
    pub fn num_patterns(&self) -> usize {
        self.templates.len()
    }

    /// Clear all stored patterns.
    pub fn clear(&mut self) {
        self.templates.clear();
    }

    /// Set the minimum similarity threshold for classification.
    pub fn set_confidence_threshold(&mut self, threshold: f32) {
        self.confidence_threshold = threshold;
    }
}

/// Result of fingerprint matching.
#[derive(Debug, Clone)]
pub struct MatchResult {
    /// Matched posture class (None if no match).
    pub posture: Option<PostureClass>,
    /// Cosine similarity of the best match.
    pub confidence: f32,
    /// Whether a match was found above threshold.
    pub matched: bool,
}

/// Cosine similarity between two vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let n = a.len().min(b.len());
    if n == 0 {
        return 0.0;
    }

    let mut dot = 0.0f32;
    let mut norm_a = 0.0f32;
    let mut norm_b = 0.0f32;

    for i in 0..n {
        dot += a[i] * b[i];
        norm_a += a[i] * a[i];
        norm_b += b[i] * b[i];
    }

    let denom = (norm_a * norm_b).sqrt();
    if denom < 1e-12 {
        0.0
    } else {
        dot / denom
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn empty_matcher_returns_none() {
        let matcher = FingerprintMatcher::new(4, 0.5);
        assert!(matcher.classify(&[1.0, 2.0, 3.0, 4.0]).is_none());
    }

    #[test]
    fn wrong_dimension_returns_none() {
        let mut matcher = FingerprintMatcher::new(4, 0.5);
        matcher
            .store_pattern(vec![1.0; 4], PostureClass::Standing)
            .unwrap();
        // Wrong dimension.
        assert!(matcher.classify(&[1.0, 2.0]).is_none());
    }

    #[test]
    fn store_and_recall() {
        let mut matcher = FingerprintMatcher::new(4, 0.5);

        // Store distinct patterns.
        matcher
            .store_pattern(vec![1.0, 0.0, 0.0, 0.0], PostureClass::Standing)
            .unwrap();
        matcher
            .store_pattern(vec![0.0, 1.0, 0.0, 0.0], PostureClass::Sitting)
            .unwrap();

        assert_eq!(matcher.num_patterns(), 2);

        // Query close to the "Standing" pattern.
        let result = matcher.classify(&[0.9, 0.1, 0.0, 0.0]);
        if let Some((posture, sim)) = result {
            assert_eq!(posture, PostureClass::Standing);
            assert!(sim > 0.5, "similarity should be above threshold: {sim}");
        }
    }

    #[test]
    fn wrong_dim_store_rejected() {
        let mut matcher = FingerprintMatcher::new(4, 0.5);
        let result = matcher.store_pattern(vec![1.0, 2.0], PostureClass::Empty);
        assert!(result.is_err());
    }

    #[test]
    fn clear_removes_all() {
        let mut matcher = FingerprintMatcher::new(2, 0.5);
        matcher
            .store_pattern(vec![1.0, 0.0], PostureClass::Standing)
            .unwrap();
        assert_eq!(matcher.num_patterns(), 1);
        matcher.clear();
        assert_eq!(matcher.num_patterns(), 0);
    }

    #[test]
    fn cosine_similarity_identical() {
        let a = vec![1.0, 2.0, 3.0];
        let b = vec![1.0, 2.0, 3.0];
        let sim = cosine_similarity(&a, &b);
        assert!((sim - 1.0).abs() < 1e-5, "identical vectors: {sim}");
    }

    #[test]
    fn cosine_similarity_orthogonal() {
        let a = vec![1.0, 0.0];
        let b = vec![0.0, 1.0];
        let sim = cosine_similarity(&a, &b);
        assert!(sim.abs() < 1e-5, "orthogonal vectors: {sim}");
    }

    #[test]
    fn match_posture_result() {
        let mut matcher = FingerprintMatcher::new(3, 0.5);
        matcher
            .store_pattern(vec![1.0, 0.0, 0.0], PostureClass::Standing)
            .unwrap();

        let result = matcher.match_posture(&[0.95, 0.05, 0.0]);
        assert!(result.matched);
        assert_eq!(result.posture, Some(PostureClass::Standing));
    }

    #[test]
    fn generate_defaults_creates_templates() {
        let mut matcher = FingerprintMatcher::new(4, 0.3);
        matcher.generate_defaults(&[1.0, 2.0, 3.0, 4.0]);
        assert_eq!(matcher.num_patterns(), 3); // Empty, Standing, Sitting
    }
}
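The nearest-template retrieval above reduces to "argmax of cosine similarity over stored patterns". A minimal standalone sketch of that retrieval loop, independent of the crate's `FingerprintMatcher` and `PostureClass` types (the posture labels here are plain strings for illustration):

```rust
/// Cosine similarity between two vectors, mirroring the matcher's helper.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    let denom = norm_a * norm_b;
    if denom < 1e-12 { 0.0 } else { dot / denom }
}

fn main() {
    // Two stored templates; classify an observation by nearest similarity.
    let templates = [
        ("standing", vec![1.0f32, 0.0, 0.0]),
        ("sitting", vec![0.0f32, 1.0, 0.0]),
    ];
    let observation = [0.9f32, 0.1, 0.0];

    let best = templates
        .iter()
        .max_by(|a, b| {
            cosine_similarity(&a.1, &observation)
                .partial_cmp(&cosine_similarity(&b.1, &observation))
                .unwrap()
        })
        .unwrap();

    assert_eq!(best.0, "standing");
    println!("best match: {}", best.0);
}
```

Since cosine similarity ignores magnitude, a scaled-up observation of the same shape still matches the same template; that is why the matcher stores normalised patterns.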
@@ -0,0 +1,36 @@
//! Signal Intelligence pipeline (Phase 2, ADR-022).
//!
//! Composes `RuVector` primitives into a multi-stage sensing pipeline
//! that transforms multi-BSSID RSSI frames into presence, motion,
//! and coarse vital sign estimates.
//!
//! ## Stages
//!
//! 1. [`predictive_gate`] -- residual gating via `PredictiveLayer`
//! 2. [`attention_weighter`] -- BSSID attention weighting
//! 3. [`correlator`] -- cross-BSSID Pearson correlation & clustering
//! 4. [`motion_estimator`] -- multi-AP motion estimation
//! 5. [`breathing_extractor`] -- coarse breathing rate extraction
//! 6. [`quality_gate`] -- ruQu three-filter quality gate
//! 7. [`fingerprint_matcher`] -- `ModernHopfield` posture fingerprinting
//! 8. [`orchestrator`] -- full pipeline orchestrator

#[cfg(feature = "pipeline")]
pub mod predictive_gate;
#[cfg(feature = "pipeline")]
pub mod attention_weighter;
#[cfg(feature = "pipeline")]
pub mod correlator;
#[cfg(feature = "pipeline")]
pub mod motion_estimator;
#[cfg(feature = "pipeline")]
pub mod breathing_extractor;
#[cfg(feature = "pipeline")]
pub mod quality_gate;
#[cfg(feature = "pipeline")]
pub mod fingerprint_matcher;
#[cfg(feature = "pipeline")]
pub mod orchestrator;

#[cfg(feature = "pipeline")]
pub use orchestrator::WindowsWifiPipeline;
@@ -0,0 +1,210 @@
//! Stage 4: Multi-AP motion estimation.
//!
//! Combines per-BSSID residuals, attention weights, and correlation
//! features to estimate overall motion intensity and classify the
//! motion level (None / Minimal / Moderate / High).

use crate::domain::result::MotionLevel;

/// Multi-AP motion estimator using weighted variance of BSSID residuals.
pub struct MultiApMotionEstimator {
    /// EMA smoothing factor for motion score.
    alpha: f32,
    /// Running EMA of motion score.
    ema_motion: f32,
    /// Motion threshold for None->Minimal transition.
    threshold_minimal: f32,
    /// Motion threshold for Minimal->Moderate transition.
    threshold_moderate: f32,
    /// Motion threshold for Moderate->High transition.
    threshold_high: f32,
}

impl MultiApMotionEstimator {
    /// Create a motion estimator with default thresholds.
    #[must_use]
    pub fn new() -> Self {
        Self {
            alpha: 0.3,
            ema_motion: 0.0,
            threshold_minimal: 0.02,
            threshold_moderate: 0.10,
            threshold_high: 0.30,
        }
    }

    /// Create with custom thresholds.
    #[must_use]
    pub fn with_thresholds(minimal: f32, moderate: f32, high: f32) -> Self {
        Self {
            alpha: 0.3,
            ema_motion: 0.0,
            threshold_minimal: minimal,
            threshold_moderate: moderate,
            threshold_high: high,
        }
    }

    /// Estimate motion from weighted residuals.
    ///
    /// - `residuals`: per-BSSID residuals from `PredictiveGate`.
    /// - `weights`: per-BSSID attention weights from `AttentionWeighter`.
    /// - `diversity`: per-BSSID correlation diversity from `BssidCorrelator`.
    ///
    /// Returns a `MotionEstimate` with score and level.
    pub fn estimate(
        &mut self,
        residuals: &[f32],
        weights: &[f32],
        diversity: &[f32],
    ) -> MotionEstimate {
        let n = residuals.len();
        if n == 0 {
            return MotionEstimate {
                score: 0.0,
                level: MotionLevel::None,
                weighted_variance: 0.0,
                n_contributing: 0,
            };
        }

        // Weighted variance of residuals (body-sensitive BSSIDs contribute more)
        let mut weighted_sum = 0.0f32;
        let mut weight_total = 0.0f32;
        let mut n_contributing = 0usize;

        #[allow(clippy::cast_precision_loss)]
        for (i, residual) in residuals.iter().enumerate() {
            let w = weights.get(i).copied().unwrap_or(1.0 / n as f32);
            let d = diversity.get(i).copied().unwrap_or(0.5);
            // Combine attention weight with diversity (correlated BSSIDs
            // that respond together are better indicators)
            let combined_w = w * (0.5 + 0.5 * d);
            weighted_sum += combined_w * residual.abs();
            weight_total += combined_w;

            if residual.abs() > 0.001 {
                n_contributing += 1;
            }
        }

        let weighted_variance = if weight_total > 1e-9 {
            weighted_sum / weight_total
        } else {
            0.0
        };

        // EMA smoothing
        self.ema_motion = self.alpha * weighted_variance + (1.0 - self.alpha) * self.ema_motion;

        let level = if self.ema_motion < self.threshold_minimal {
            MotionLevel::None
        } else if self.ema_motion < self.threshold_moderate {
            MotionLevel::Minimal
        } else if self.ema_motion < self.threshold_high {
            MotionLevel::Moderate
        } else {
            MotionLevel::High
        };

        MotionEstimate {
            score: self.ema_motion,
            level,
            weighted_variance,
            n_contributing,
        }
    }

    /// Reset the EMA state.
    pub fn reset(&mut self) {
        self.ema_motion = 0.0;
    }
}

impl Default for MultiApMotionEstimator {
    fn default() -> Self {
        Self::new()
    }
}

/// Result of motion estimation.
#[derive(Debug, Clone)]
pub struct MotionEstimate {
    /// Smoothed motion score (EMA of weighted variance).
    pub score: f32,
    /// Classified motion level.
    pub level: MotionLevel,
    /// Raw weighted variance before smoothing.
    pub weighted_variance: f32,
    /// Number of BSSIDs with non-zero residuals.
    pub n_contributing: usize,
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn no_residuals_yields_no_motion() {
        let mut est = MultiApMotionEstimator::new();
        let result = est.estimate(&[], &[], &[]);
        assert_eq!(result.level, MotionLevel::None);
        assert!((result.score - 0.0).abs() < f32::EPSILON);
    }

    #[test]
    fn zero_residuals_yield_no_motion() {
        let mut est = MultiApMotionEstimator::new();
        let residuals = vec![0.0, 0.0, 0.0];
        let weights = vec![0.33, 0.33, 0.34];
        let diversity = vec![0.5, 0.5, 0.5];
        let result = est.estimate(&residuals, &weights, &diversity);
        assert_eq!(result.level, MotionLevel::None);
    }

    #[test]
    fn large_residuals_yield_high_motion() {
        let mut est = MultiApMotionEstimator::new();
        let residuals = vec![5.0, 5.0, 5.0];
        let weights = vec![0.33, 0.33, 0.34];
        let diversity = vec![1.0, 1.0, 1.0];
        // Push several frames to overcome EMA smoothing
        for _ in 0..20 {
            est.estimate(&residuals, &weights, &diversity);
        }
        let result = est.estimate(&residuals, &weights, &diversity);
        assert_eq!(result.level, MotionLevel::High);
    }

    #[test]
    fn ema_smooths_transients() {
        let mut est = MultiApMotionEstimator::new();
        let big = vec![10.0, 10.0, 10.0];
        let zero = vec![0.0, 0.0, 0.0];
        let w = vec![0.33, 0.33, 0.34];
        let d = vec![0.5, 0.5, 0.5];

        // One big spike followed by zeros
        est.estimate(&big, &w, &d);
        let r1 = est.estimate(&zero, &w, &d);
        let r2 = est.estimate(&zero, &w, &d);
        // Score should decay
        assert!(r2.score < r1.score, "EMA should decay: {} < {}", r2.score, r1.score);
    }

    #[test]
    fn n_contributing_counts_nonzero() {
        let mut est = MultiApMotionEstimator::new();
        let residuals = vec![0.0, 1.0, 0.0, 2.0];
        let weights = vec![0.25; 4];
        let diversity = vec![0.5; 4];
        let result = est.estimate(&residuals, &weights, &diversity);
        assert_eq!(result.n_contributing, 2);
    }

    #[test]
    fn default_creates_estimator() {
        let est = MultiApMotionEstimator::default();
        assert!((est.threshold_minimal - 0.02).abs() < f32::EPSILON);
    }
}

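The stage-4 math above is a diversity-modulated weighted mean of absolute residuals, followed by EMA smoothing. A minimal standalone sketch of that arithmetic, using hypothetical free functions (not part of the crate) so the two steps can be seen in isolation:

```rust
/// Diversity-modulated weighted mean of absolute residuals, mirroring
/// the inner loop of `MultiApMotionEstimator::estimate`: each weight is
/// scaled by (0.5 + 0.5 * diversity) before averaging |residual|.
fn weighted_motion(residuals: &[f32], weights: &[f32], diversity: &[f32]) -> f32 {
    let mut num = 0.0f32;
    let mut den = 0.0f32;
    for i in 0..residuals.len() {
        let w = weights[i] * (0.5 + 0.5 * diversity[i]);
        num += w * residuals[i].abs();
        den += w;
    }
    // Guard against an all-zero weight vector.
    if den > 1e-9 { num / den } else { 0.0 }
}

/// One EMA smoothing step, as applied to the raw weighted variance.
fn ema(prev: f32, x: f32, alpha: f32) -> f32 {
    alpha * x + (1.0 - alpha) * prev
}
```

With uniform weights and full diversity, the score reduces to the plain mean of |residuals|; lower diversity shrinks a BSSID's influence without zeroing it out, which is the design intent stated in the inner-loop comment.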
@@ -0,0 +1,432 @@
//! Stage 8: Pipeline orchestrator (Domain Service).
//!
//! `WindowsWifiPipeline` connects all pipeline stages (1-7) into a
//! single processing step that transforms a `MultiApFrame` into an
//! `EnhancedSensingResult`.
//!
//! This is the Domain Service described in ADR-022 section 3.2.

use crate::domain::frame::MultiApFrame;
use crate::domain::result::{
    BreathingEstimate as DomainBreathingEstimate, EnhancedSensingResult,
    MotionEstimate as DomainMotionEstimate, MotionLevel, PostureClass, SignalQuality,
    Verdict as DomainVerdict,
};

use super::attention_weighter::AttentionWeighter;
use super::breathing_extractor::CoarseBreathingExtractor;
use super::correlator::BssidCorrelator;
use super::fingerprint_matcher::FingerprintMatcher;
use super::motion_estimator::MultiApMotionEstimator;
use super::predictive_gate::PredictiveGate;
use super::quality_gate::{QualityGate, Verdict};

/// Configuration for the Windows `WiFi` sensing pipeline.
#[derive(Debug, Clone)]
pub struct PipelineConfig {
    /// Maximum number of BSSID slots.
    pub max_bssids: usize,
    /// Residual gating threshold (stage 1).
    pub gate_threshold: f32,
    /// Correlation window size in frames (stage 3).
    pub correlation_window: usize,
    /// Correlation threshold for co-varying classification (stage 3).
    pub correlation_threshold: f32,
    /// Minimum BSSIDs for a valid frame.
    pub min_bssids: usize,
    /// Enable breathing extraction (stage 5).
    pub enable_breathing: bool,
    /// Enable fingerprint matching (stage 7).
    pub enable_fingerprint: bool,
    /// Sample rate in Hz.
    pub sample_rate: f32,
}

impl Default for PipelineConfig {
    fn default() -> Self {
        Self {
            max_bssids: 32,
            gate_threshold: 0.05,
            correlation_window: 30,
            correlation_threshold: 0.7,
            min_bssids: 3,
            enable_breathing: true,
            enable_fingerprint: true,
            sample_rate: 2.0,
        }
    }
}

/// The complete Windows `WiFi` sensing pipeline (Domain Service).
///
/// Connects stages 1-7 into a single `process()` call that transforms
/// a `MultiApFrame` into an `EnhancedSensingResult`.
///
/// Stages:
/// 1. Predictive gating (EMA residual filter)
/// 2. Attention weighting (softmax dot-product)
/// 3. Spatial correlation (Pearson + clustering)
/// 4. Motion estimation (weighted variance + EMA)
/// 5. Breathing extraction (bandpass + zero-crossing)
/// 6. Quality gate (three-filter: structural / shift / evidence)
/// 7. Fingerprint matching (cosine similarity templates)
pub struct WindowsWifiPipeline {
    gate: PredictiveGate,
    attention: AttentionWeighter,
    correlator: BssidCorrelator,
    motion: MultiApMotionEstimator,
    breathing: CoarseBreathingExtractor,
    quality: QualityGate,
    fingerprint: FingerprintMatcher,
    config: PipelineConfig,
    /// Whether fingerprint defaults have been initialised.
    fingerprints_initialised: bool,
    /// Frame counter.
    frame_count: u64,
}

impl WindowsWifiPipeline {
    /// Create a new pipeline with default configuration.
    #[must_use]
    pub fn new() -> Self {
        Self::with_config(PipelineConfig::default())
    }

    /// Create with default configuration (alias for `new`).
    #[must_use]
    pub fn with_defaults() -> Self {
        Self::new()
    }

    /// Create a new pipeline with custom configuration.
    #[must_use]
    pub fn with_config(config: PipelineConfig) -> Self {
        Self {
            gate: PredictiveGate::new(config.max_bssids, config.gate_threshold),
            attention: AttentionWeighter::new(1),
            correlator: BssidCorrelator::new(
                config.max_bssids,
                config.correlation_window,
                config.correlation_threshold,
            ),
            motion: MultiApMotionEstimator::new(),
            breathing: CoarseBreathingExtractor::new(
                config.max_bssids,
                config.sample_rate,
                0.1,
                0.5,
            ),
            quality: QualityGate::new(),
            fingerprint: FingerprintMatcher::new(config.max_bssids, 0.5),
            fingerprints_initialised: false,
            frame_count: 0,
            config,
        }
    }

    /// Process a single multi-BSSID frame through all pipeline stages.
    ///
    /// Returns an `EnhancedSensingResult` with motion, breathing,
    /// posture, and quality information.
    pub fn process(&mut self, frame: &MultiApFrame) -> EnhancedSensingResult {
        self.frame_count += 1;

        let n = frame.bssid_count;

        // Convert f64 amplitudes to f32 for pipeline stages.
        #[allow(clippy::cast_possible_truncation)]
        let amps_f32: Vec<f32> = frame.amplitudes.iter().map(|&a| a as f32).collect();

        // Initialise fingerprint defaults on first frame with enough BSSIDs.
        if !self.fingerprints_initialised
            && self.config.enable_fingerprint
            && amps_f32.len() == self.config.max_bssids
        {
            self.fingerprint.generate_defaults(&amps_f32);
            self.fingerprints_initialised = true;
        }

        // Check minimum BSSID count.
        if n < self.config.min_bssids {
            return Self::make_empty_result(frame, n);
        }

        // -- Stage 1: Predictive gating --
        let Some(residuals) = self.gate.gate(&amps_f32) else {
            // Static environment, no body present.
            return Self::make_empty_result(frame, n);
        };

        // -- Stage 2: Attention weighting --
        #[allow(clippy::cast_precision_loss)]
        let mean_residual =
            residuals.iter().map(|r| r.abs()).sum::<f32>() / residuals.len().max(1) as f32;
        let query = vec![mean_residual];
        let keys: Vec<Vec<f32>> = residuals.iter().map(|&r| vec![r]).collect();
        let values: Vec<Vec<f32>> = amps_f32.iter().map(|&a| vec![a]).collect();
        let (_weighted, weights) = self.attention.weight(&query, &keys, &values);

        // -- Stage 3: Spatial correlation --
        let corr = self.correlator.update(&amps_f32);

        // -- Stage 4: Motion estimation --
        let motion = self.motion.estimate(&residuals, &weights, &corr.diversity);

        // -- Stage 5: Breathing extraction (only when stationary) --
        let breathing = if self.config.enable_breathing && motion.level == MotionLevel::Minimal {
            self.breathing.extract(&residuals, &weights)
        } else {
            None
        };

        // -- Stage 6: Quality gate --
        let quality_result = self.quality.evaluate(
            n,
            frame.mean_rssi(),
            f64::from(corr.mean_correlation()),
            motion.score,
        );

        // -- Stage 7: Fingerprint matching --
        let posture = if self.config.enable_fingerprint {
            self.fingerprint.classify(&amps_f32).map(|(p, _sim)| p)
        } else {
            None
        };

        // Count body-sensitive BSSIDs (attention weight above 1.5x average).
        #[allow(clippy::cast_precision_loss)]
        let avg_weight = 1.0 / n.max(1) as f32;
        let sensitive_count = weights.iter().filter(|&&w| w > avg_weight * 1.5).count();

        // Map internal quality gate verdict to domain Verdict.
        let domain_verdict = match &quality_result.verdict {
            Verdict::Permit => DomainVerdict::Permit,
            Verdict::Defer => DomainVerdict::Warn,
            Verdict::Deny(_) => DomainVerdict::Deny,
        };

        // Build the domain BreathingEstimate if we have one.
        let domain_breathing = breathing.map(|b| DomainBreathingEstimate {
            rate_bpm: f64::from(b.bpm),
            confidence: f64::from(b.confidence),
            bssid_count: sensitive_count,
        });

        EnhancedSensingResult {
            motion: DomainMotionEstimate {
                score: f64::from(motion.score),
                level: motion.level,
                contributing_bssids: motion.n_contributing,
            },
            breathing: domain_breathing,
            posture,
            signal_quality: SignalQuality {
                score: quality_result.quality,
                bssid_count: n,
                spectral_gap: f64::from(corr.mean_correlation()),
                mean_rssi_dbm: frame.mean_rssi(),
            },
            bssid_count: n,
            verdict: domain_verdict,
        }
    }

    /// Build an empty/gated result for frames that don't pass initial checks.
    fn make_empty_result(frame: &MultiApFrame, n: usize) -> EnhancedSensingResult {
        EnhancedSensingResult {
            motion: DomainMotionEstimate {
                score: 0.0,
                level: MotionLevel::None,
                contributing_bssids: 0,
            },
            breathing: None,
            posture: None,
            signal_quality: SignalQuality {
                score: 0.0,
                bssid_count: n,
                spectral_gap: 0.0,
                mean_rssi_dbm: frame.mean_rssi(),
            },
            bssid_count: n,
            verdict: DomainVerdict::Deny,
        }
    }

    /// Store a reference fingerprint pattern.
    ///
    /// # Errors
    ///
    /// Returns an error if the pattern dimension does not match `max_bssids`.
    pub fn store_fingerprint(
        &mut self,
        pattern: Vec<f32>,
        label: PostureClass,
    ) -> Result<(), String> {
        self.fingerprint.store_pattern(pattern, label)
    }

    /// Reset all pipeline state.
    pub fn reset(&mut self) {
        self.gate = PredictiveGate::new(self.config.max_bssids, self.config.gate_threshold);
        self.correlator = BssidCorrelator::new(
            self.config.max_bssids,
            self.config.correlation_window,
            self.config.correlation_threshold,
        );
        self.motion.reset();
        self.breathing.reset();
        self.quality.reset();
        self.fingerprint.clear();
        self.fingerprints_initialised = false;
        self.frame_count = 0;
    }

    /// Number of frames processed.
    #[must_use]
    pub fn frame_count(&self) -> u64 {
        self.frame_count
    }

    /// Current pipeline configuration.
    #[must_use]
    pub fn config(&self) -> &PipelineConfig {
        &self.config
    }
}

impl Default for WindowsWifiPipeline {
    fn default() -> Self {
        Self::new()
    }
}

#[cfg(test)]
mod tests {
    use super::*;
    use std::collections::VecDeque;
    use std::time::Instant;

    fn make_frame(bssid_count: usize, rssi_values: &[f64]) -> MultiApFrame {
        let amplitudes: Vec<f64> = rssi_values
            .iter()
            .map(|&r| 10.0_f64.powf((r + 100.0) / 20.0))
            .collect();
        MultiApFrame {
            bssid_count,
            rssi_dbm: rssi_values.to_vec(),
            amplitudes,
            phases: vec![0.0; bssid_count],
            per_bssid_variance: vec![0.1; bssid_count],
            histories: vec![VecDeque::new(); bssid_count],
            sample_rate_hz: 2.0,
            timestamp: Instant::now(),
        }
    }

    #[test]
    fn pipeline_creates_ok() {
        let pipeline = WindowsWifiPipeline::with_defaults();
        assert_eq!(pipeline.frame_count(), 0);
        assert_eq!(pipeline.config().max_bssids, 32);
    }

    #[test]
    fn too_few_bssids_returns_deny() {
        let mut pipeline = WindowsWifiPipeline::new();
        let frame = make_frame(2, &[-60.0, -70.0]);
        let result = pipeline.process(&frame);
        assert_eq!(result.verdict, DomainVerdict::Deny);
    }

    #[test]
    fn first_frame_increments_count() {
        let mut pipeline = WindowsWifiPipeline::with_config(PipelineConfig {
            min_bssids: 1,
            max_bssids: 4,
            ..Default::default()
        });
        let frame = make_frame(4, &[-60.0, -65.0, -70.0, -75.0]);
        let _result = pipeline.process(&frame);
        assert_eq!(pipeline.frame_count(), 1);
    }

    #[test]
    fn static_signal_returns_deny_after_learning() {
        let mut pipeline = WindowsWifiPipeline::with_config(PipelineConfig {
            min_bssids: 1,
            max_bssids: 4,
            ..Default::default()
        });
        let frame = make_frame(4, &[-60.0, -65.0, -70.0, -75.0]);

        // Train on static signal.
        pipeline.process(&frame);
        pipeline.process(&frame);
        pipeline.process(&frame);

        // After learning, static signal should be gated (Deny verdict).
        let result = pipeline.process(&frame);
        assert_eq!(
            result.verdict,
            DomainVerdict::Deny,
            "static signal should be gated"
        );
    }

    #[test]
    fn changing_signal_increments_count() {
        let mut pipeline = WindowsWifiPipeline::with_config(PipelineConfig {
            min_bssids: 1,
            max_bssids: 4,
            ..Default::default()
        });
        let baseline = make_frame(4, &[-60.0, -65.0, -70.0, -75.0]);

        // Learn baseline.
        for _ in 0..5 {
            pipeline.process(&baseline);
        }

        // Significant change should be noticed.
        let changed = make_frame(4, &[-60.0, -65.0, -70.0, -30.0]);
        pipeline.process(&changed);
        assert!(pipeline.frame_count() > 5);
    }

    #[test]
    fn reset_clears_state() {
        let mut pipeline = WindowsWifiPipeline::new();
        let frame = make_frame(4, &[-60.0, -65.0, -70.0, -75.0]);
        pipeline.process(&frame);
        assert_eq!(pipeline.frame_count(), 1);
        pipeline.reset();
        assert_eq!(pipeline.frame_count(), 0);
    }

    #[test]
    fn default_creates_pipeline() {
        let _pipeline = WindowsWifiPipeline::default();
    }

    #[test]
    fn pipeline_throughput_benchmark() {
        let mut pipeline = WindowsWifiPipeline::with_config(PipelineConfig {
            min_bssids: 1,
            max_bssids: 4,
            ..Default::default()
        });
        let frame = make_frame(4, &[-60.0, -65.0, -70.0, -75.0]);

        let start = Instant::now();
        let n_frames = 10_000;
        for _ in 0..n_frames {
            pipeline.process(&frame);
        }
        let elapsed = start.elapsed();
        #[allow(clippy::cast_precision_loss)]
        let fps = n_frames as f64 / elapsed.as_secs_f64();
        println!("Pipeline throughput: {fps:.0} frames/sec ({elapsed:?} for {n_frames} frames)");
        assert!(fps > 100.0, "Pipeline should process >100 frames/sec, got {fps:.0}");
    }
}

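The `make_frame` test fixture above synthesizes linear amplitudes from RSSI via `10^((rssi + 100) / 20)`. A minimal sketch of that mapping as a hypothetical standalone helper (not part of the crate), assuming -100 dBm is the reference floor:

```rust
/// Hypothetical helper mirroring the make_frame fixture's mapping:
/// the amplitude is referenced to a -100 dBm floor, so -100 dBm maps
/// to 1.0 and every +20 dBm multiplies the amplitude by 10.
fn rssi_to_amplitude(rssi_dbm: f64) -> f64 {
    10.0_f64.powf((rssi_dbm + 100.0) / 20.0)
}
```

Using an amplitude-like (20 dB per decade) scale rather than raw dBm keeps the downstream residual and variance arithmetic linear in signal magnitude.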
@@ -0,0 +1,141 @@
//! Stage 1: Predictive gating via EMA-based residual filter.
//!
//! Suppresses static BSSIDs by computing residuals between predicted
//! (EMA) and actual RSSI values. Only transmits frames where significant
//! change is detected (body interaction).
//!
//! This is a lightweight pure-Rust implementation. When `ruvector-nervous-system`
//! becomes available, the inner EMA predictor can be replaced with
//! `PredictiveLayer` for more sophisticated prediction.

/// Wrapper around an EMA predictor for multi-BSSID residual gating.
pub struct PredictiveGate {
    /// Per-BSSID EMA predictions.
    predictions: Vec<f32>,
    /// Whether a prediction has been initialised for each slot.
    initialised: Vec<bool>,
    /// EMA smoothing factor (higher = faster tracking).
    alpha: f32,
    /// Residual threshold for change detection.
    threshold: f32,
    /// Residuals from the last frame (for downstream use).
    last_residuals: Vec<f32>,
    /// Number of BSSID slots.
    n_bssids: usize,
}

impl PredictiveGate {
    /// Create a new predictive gate.
    ///
    /// - `n_bssids`: maximum number of tracked BSSIDs (subcarrier slots).
    /// - `threshold`: residual threshold for change detection (ADR-022 default: 0.05).
    #[must_use]
    pub fn new(n_bssids: usize, threshold: f32) -> Self {
        Self {
            predictions: vec![0.0; n_bssids],
            initialised: vec![false; n_bssids],
            alpha: 0.3,
            threshold,
            last_residuals: vec![0.0; n_bssids],
            n_bssids,
        }
    }

    /// Process a frame. Returns `Some(residuals)` if body-correlated change
    /// is detected, `None` if the environment is static.
    pub fn gate(&mut self, amplitudes: &[f32]) -> Option<Vec<f32>> {
        let n = amplitudes.len().min(self.n_bssids);
        let mut residuals = vec![0.0f32; n];
        let mut max_residual = 0.0f32;

        for i in 0..n {
            if self.initialised[i] {
                residuals[i] = amplitudes[i] - self.predictions[i];
                max_residual = max_residual.max(residuals[i].abs());
                // Update EMA
                self.predictions[i] =
                    self.alpha * amplitudes[i] + (1.0 - self.alpha) * self.predictions[i];
            } else {
                // First observation: seed the prediction
                self.predictions[i] = amplitudes[i];
                self.initialised[i] = true;
                residuals[i] = amplitudes[i]; // first frame always transmits
                max_residual = f32::MAX;
            }
        }

        self.last_residuals.clone_from(&residuals);

        if max_residual > self.threshold {
            Some(residuals)
        } else {
            None
        }
    }

    /// Return the residuals from the last `gate()` call.
    #[must_use]
    pub fn last_residuals(&self) -> &[f32] {
        &self.last_residuals
    }

    /// Update the threshold dynamically (e.g., from SONA adaptation).
    pub fn set_threshold(&mut self, threshold: f32) {
        self.threshold = threshold;
    }

    /// Current threshold.
    #[must_use]
    pub fn threshold(&self) -> f32 {
        self.threshold
    }
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn static_signal_is_gated() {
        let mut gate = PredictiveGate::new(4, 0.05);
        let signal = vec![1.0, 2.0, 3.0, 4.0];
        // First frame always transmits (no prediction yet)
        assert!(gate.gate(&signal).is_some());
        // After many repeated frames, EMA converges and residuals shrink
        for _ in 0..20 {
            gate.gate(&signal);
        }
        assert!(gate.gate(&signal).is_none());
    }

    #[test]
    fn changing_signal_transmits() {
        let mut gate = PredictiveGate::new(4, 0.05);
        let signal1 = vec![1.0, 2.0, 3.0, 4.0];
        gate.gate(&signal1);
        // Let EMA converge
        for _ in 0..20 {
            gate.gate(&signal1);
        }

        // Large change should be transmitted
        let signal2 = vec![1.0, 2.0, 3.0, 10.0];
        assert!(gate.gate(&signal2).is_some());
    }

    #[test]
    fn residuals_are_stored() {
        let mut gate = PredictiveGate::new(3, 0.05);
        let signal = vec![1.0, 2.0, 3.0];
        gate.gate(&signal);
        assert_eq!(gate.last_residuals().len(), 3);
    }

    #[test]
    fn threshold_can_be_updated() {
        let mut gate = PredictiveGate::new(2, 0.05);
        assert!((gate.threshold() - 0.05).abs() < f32::EPSILON);
        gate.set_threshold(0.1);
        assert!((gate.threshold() - 0.1).abs() < f32::EPSILON);
    }
}

@@ -0,0 +1,261 @@
|
||||
//! Stage 6: Signal quality gate.
|
||||
//!
|
||||
//! Evaluates signal quality using three factors inspired by the ruQu
|
||||
//! three-filter architecture (structural integrity, distribution drift,
|
||||
//! evidence accumulation):
|
||||
//!
|
||||
//! - **Structural**: number of active BSSIDs (graph connectivity proxy).
|
||||
//! - **Shift**: RSSI drift from running baseline.
|
||||
//! - **Evidence**: accumulated weighted variance evidence.
|
||||
//!
|
||||
//! This is a pure-Rust implementation. When the `ruqu` crate becomes
|
||||
//! available, the inner filter can be replaced with `FilterPipeline`.
|
||||
|
||||
/// Configuration for the quality gate.
|
||||
#[derive(Debug, Clone)]
|
||||
pub struct QualityGateConfig {
|
||||
/// Minimum active BSSIDs for a "Permit" verdict.
|
||||
pub min_bssids: usize,
|
||||
/// Evidence threshold for "Permit" (accumulated variance).
|
||||
pub evidence_threshold: f64,
|
||||
/// RSSI drift threshold (dBm) for triggering a "Warn".
|
||||
pub drift_threshold: f64,
|
||||
/// Maximum evidence decay per frame.
|
||||
pub evidence_decay: f64,
|
||||
}
|
||||
|
||||
impl Default for QualityGateConfig {
|
||||
fn default() -> Self {
|
||||
Self {
|
||||
min_bssids: 3,
|
||||
evidence_threshold: 0.5,
|
||||
drift_threshold: 10.0,
|
||||
evidence_decay: 0.95,
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
/// Quality gate combining structural, shift, and evidence filters.
|
||||
pub struct QualityGate {
|
||||
config: QualityGateConfig,
|
||||
/// Accumulated evidence score.
|
||||
evidence: f64,
|
||||
/// Running mean RSSI baseline for drift detection.
|
||||
prev_mean_rssi: Option<f64>,
|
||||
/// EMA smoothing factor for drift baseline.
|
||||
alpha: f64,
|
||||
}
|
||||
|
||||
impl QualityGate {
|
||||
/// Create a quality gate with default configuration.
|
||||
#[must_use]
|
||||
pub fn new() -> Self {
|
||||
Self::with_config(QualityGateConfig::default())
|
||||
}
|
||||
|
||||
/// Create a quality gate with custom configuration.
|
||||
#[must_use]
|
||||
pub fn with_config(config: QualityGateConfig) -> Self {
|
||||
Self {
|
||||
config,
|
||||
evidence: 0.0,
|
||||
prev_mean_rssi: None,
|
||||
alpha: 0.3,
|
||||
}
|
||||
}
|
||||
|
||||
/// Evaluate signal quality.
|
||||
///
|
||||
/// - `bssid_count`: number of active BSSIDs.
|
||||
/// - `mean_rssi_dbm`: mean RSSI across all BSSIDs.
|
||||
/// - `mean_correlation`: mean cross-BSSID correlation (spectral gap proxy).
|
||||
/// - `motion_score`: smoothed motion score from the estimator.
|
||||
///
|
||||
    /// Returns a `QualityResult` with verdict and quality score.
    pub fn evaluate(
        &mut self,
        bssid_count: usize,
        mean_rssi_dbm: f64,
        mean_correlation: f64,
        motion_score: f32,
    ) -> QualityResult {
        // --- Filter 1: Structural (BSSID count) ---
        let structural_ok = bssid_count >= self.config.min_bssids;

        // --- Filter 2: Shift (RSSI drift detection) ---
        let drift = if let Some(prev) = self.prev_mean_rssi {
            (mean_rssi_dbm - prev).abs()
        } else {
            0.0
        };
        // Update the baseline with an exponential moving average.
        self.prev_mean_rssi = Some(match self.prev_mean_rssi {
            Some(prev) => self.alpha * mean_rssi_dbm + (1.0 - self.alpha) * prev,
            None => mean_rssi_dbm,
        });
        let drift_detected = drift > self.config.drift_threshold;

        // --- Filter 3: Evidence accumulation ---
        // Motion and correlation both contribute positive evidence.
        let evidence_input = f64::from(motion_score) * 0.7 + mean_correlation * 0.3;
        self.evidence = self.evidence * self.config.evidence_decay + evidence_input;

        // --- Quality score ---
        let quality = compute_quality_score(
            bssid_count,
            f64::from(motion_score),
            mean_correlation,
            drift_detected,
        );

        // --- Verdict decision ---
        let verdict = if !structural_ok {
            Verdict::Deny("insufficient BSSIDs".to_string())
        } else if self.evidence < self.config.evidence_threshold * 0.5 || drift_detected {
            Verdict::Defer
        } else {
            Verdict::Permit
        };

        QualityResult {
            verdict,
            quality,
            drift_detected,
        }
    }

    /// Reset the gate state.
    pub fn reset(&mut self) {
        self.evidence = 0.0;
        self.prev_mean_rssi = None;
    }
}

impl Default for QualityGate {
    fn default() -> Self {
        Self::new()
    }
}

/// Result of a single quality-gate evaluation.
#[derive(Debug, Clone)]
pub struct QualityResult {
    /// Filter decision.
    pub verdict: Verdict,
    /// Signal quality score in `[0, 1]`.
    pub quality: f64,
    /// Whether environmental drift was detected.
    pub drift_detected: bool,
}

/// Simplified quality-gate verdict.
#[derive(Debug, Clone, PartialEq)]
pub enum Verdict {
    /// Reading passed all quality gates and is reliable.
    Permit,
    /// Reading failed quality checks, with a reason.
    Deny(String),
    /// Evidence is still accumulating.
    Defer,
}

impl Verdict {
    /// Returns `true` if this verdict permits the reading.
    #[must_use]
    pub fn is_permit(&self) -> bool {
        matches!(self, Self::Permit)
    }
}

/// Compute a quality score from pipeline metrics.
#[allow(clippy::cast_precision_loss)]
fn compute_quality_score(
    n_active: usize,
    motion_score: f64,
    mean_correlation: f64,
    drift: bool,
) -> f64 {
    // 1. Number of active BSSIDs (more is better, with diminishing returns).
    let bssid_factor = (n_active as f64 / 10.0).min(1.0);

    // 2. Evidence strength (a higher motion score means more signal).
    let evidence_factor = (motion_score * 10.0).min(1.0);

    // 3. Correlation coherence (moderate correlation is best).
    let corr_factor = 1.0 - (mean_correlation - 0.5).abs() * 2.0;

    // 4. Drift penalty.
    let drift_penalty = if drift { 0.7 } else { 1.0 };

    let raw =
        (bssid_factor * 0.3 + evidence_factor * 0.4 + corr_factor.max(0.0) * 0.3) * drift_penalty;
    raw.clamp(0.0, 1.0)
}

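The weighted blend above can be checked by hand. The standalone sketch below re-implements the same formula for illustration only (the `score` helper is not part of the crate), showing how the drift penalty scales an otherwise good reading:

```rust
// Standalone re-implementation of the scoring blend, for illustration only.
fn score(n_active: usize, motion: f64, corr: f64, drift: bool) -> f64 {
    let bssid_factor = (n_active as f64 / 10.0).min(1.0);
    let evidence_factor = (motion * 10.0).min(1.0);
    let corr_factor = 1.0 - (corr - 0.5).abs() * 2.0;
    let drift_penalty = if drift { 0.7 } else { 1.0 };
    ((bssid_factor * 0.3 + evidence_factor * 0.4 + corr_factor.max(0.0) * 0.3) * drift_penalty)
        .clamp(0.0, 1.0)
}

fn main() {
    // 8 BSSIDs, strong motion, ideal correlation, no drift:
    // 0.8 * 0.3 + 1.0 * 0.4 + 1.0 * 0.3 = 0.94
    println!("{:.2}", score(8, 0.1, 0.5, false));
    // The same reading under drift is scaled by 0.7: 0.94 * 0.7 = 0.658
    println!("{:.3}", score(8, 0.1, 0.5, true));
}
```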
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn new_gate_creates_ok() {
        let gate = QualityGate::new();
        assert!((gate.evidence - 0.0).abs() < f64::EPSILON);
    }

    #[test]
    fn evaluate_with_good_signal() {
        let mut gate = QualityGate::new();
        // Pump several frames to build evidence.
        for _ in 0..20 {
            gate.evaluate(10, -60.0, 0.5, 0.3);
        }
        let result = gate.evaluate(10, -60.0, 0.5, 0.3);
        assert!(result.quality > 0.0, "quality should be positive");
        assert!(result.verdict.is_permit(), "should permit good signal");
    }

    #[test]
    fn too_few_bssids_denied() {
        let mut gate = QualityGate::new();
        let result = gate.evaluate(1, -60.0, 0.5, 0.3);
        assert!(
            matches!(result.verdict, Verdict::Deny(_)),
            "too few BSSIDs should be denied"
        );
    }

    #[test]
    fn quality_increases_with_more_bssids() {
        let q_few = compute_quality_score(3, 0.1, 0.5, false);
        let q_many = compute_quality_score(10, 0.1, 0.5, false);
        assert!(q_many > q_few, "more BSSIDs should give higher quality");
    }

    #[test]
    fn drift_reduces_quality() {
        let q_stable = compute_quality_score(5, 0.1, 0.5, false);
        let q_drift = compute_quality_score(5, 0.1, 0.5, true);
        assert!(q_drift < q_stable, "drift should reduce quality");
    }

    #[test]
    fn verdict_is_permit_check() {
        assert!(Verdict::Permit.is_permit());
        assert!(!Verdict::Deny("test".to_string()).is_permit());
        assert!(!Verdict::Defer.is_permit());
    }

    #[test]
    fn default_creates_gate() {
        let _gate = QualityGate::default();
    }

    #[test]
    fn reset_clears_state() {
        let mut gate = QualityGate::new();
        gate.evaluate(10, -60.0, 0.5, 0.3);
        gate.reset();
        assert!(gate.prev_mean_rssi.is_none());
        assert!((gate.evidence - 0.0).abs() < f64::EPSILON);
    }
}
@@ -0,0 +1,9 @@
//! Port definitions for the BSSID Acquisition bounded context.
//!
//! Hexagonal-architecture ports that abstract the WiFi scanning backend,
//! enabling Tier 1 (netsh), Tier 2 (wlanapi FFI), and test-double adapters
//! to be swapped transparently.

mod scan_port;

pub use scan_port::WlanScanPort;
@@ -0,0 +1,17 @@
//! The secondary port (driven side) for WiFi BSSID scanning.

use crate::domain::bssid::BssidObservation;
use crate::error::WifiScanError;

/// Port that abstracts the platform WiFi scanning backend.
///
/// Implementations include:
/// - [`crate::adapter::NetshBssidScanner`] -- Tier 1, subprocess-based.
/// - Future: `WlanApiBssidScanner` -- Tier 2, native FFI (feature-gated).
pub trait WlanScanPort: Send + Sync {
    /// Perform a scan and return all currently visible BSSIDs.
    fn scan(&self) -> Result<Vec<BssidObservation>, WifiScanError>;

    /// Return the BSSID to which the adapter is currently connected, if any.
    fn connected(&self) -> Result<Option<BssidObservation>, WifiScanError>;
}
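The doc comment above promises that adapters behind this port can be swapped transparently. A minimal self-contained sketch of a test-double adapter is shown below; the `BssidObservation` and `WifiScanError` stand-ins are simplified placeholders for illustration, not the crate's real definitions:

```rust
// Simplified stand-ins for the crate types, for illustration only.
#[derive(Debug, Clone)]
struct BssidObservation {
    bssid: String,
    rssi_dbm: f64,
}

#[derive(Debug)]
struct WifiScanError(String);

trait WlanScanPort: Send + Sync {
    fn scan(&self) -> Result<Vec<BssidObservation>, WifiScanError>;
    fn connected(&self) -> Result<Option<BssidObservation>, WifiScanError>;
}

/// Test double that replays a fixed set of observations.
struct FixedScanner {
    observations: Vec<BssidObservation>,
}

impl WlanScanPort for FixedScanner {
    fn scan(&self) -> Result<Vec<BssidObservation>, WifiScanError> {
        Ok(self.observations.clone())
    }

    fn connected(&self) -> Result<Option<BssidObservation>, WifiScanError> {
        // Treat the strongest observation as the connected BSSID.
        Ok(self
            .observations
            .iter()
            .cloned()
            .max_by(|a, b| a.rssi_dbm.total_cmp(&b.rssi_dbm)))
    }
}

fn main() {
    let scanner = FixedScanner {
        observations: vec![
            BssidObservation { bssid: "aa:bb:cc:00:00:01".into(), rssi_dbm: -62.0 },
            BssidObservation { bssid: "aa:bb:cc:00:00:02".into(), rssi_dbm: -48.0 },
        ],
    };
    // Any pipeline stage written against the port works unchanged with this double.
    let port: &dyn WlanScanPort = &scanner;
    println!("visible: {}", port.scan().unwrap().len());
    println!("connected: {:?}", port.connected().unwrap().map(|o| o.bssid));
}
```

Because pipeline stages hold only a `&dyn WlanScanPort`, the same code path runs against the netsh subprocess adapter in production and against `FixedScanner` in tests.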