# wifi-densepose-train

Complete training pipeline for WiFi-DensePose, integrated with all five ruvector crates.

## Overview
`wifi-densepose-train` provides everything needed to train the WiFi-to-DensePose model: dataset
loading, subcarrier interpolation, loss functions, evaluation metrics, and the training loop
orchestrator. It supports both the MM-Fi dataset (NeurIPS 2023) and deterministic synthetic data
for reproducible experiments.

Without the `tch-backend` feature, the crate still provides the dataset, configuration, and
subcarrier interpolation APIs needed for data preprocessing and proof verification.
## Features

- **MM-Fi dataset loader** -- Reads the MM-Fi multimodal dataset (NeurIPS 2023) from disk with memory-mapped `.npy` files.
- **Synthetic dataset** -- Deterministic, fixed-seed CSI generation for unit tests and proofs.
- **Subcarrier interpolation** -- 114 -> 56 subcarrier compression via `ruvector-solver` sparse interpolation with variance-based selection.
- **Loss functions** (`tch-backend`) -- Pose estimation losses including MSE, OKS, and combined multi-task loss.
- **Metrics** (`tch-backend`) -- PCKh, OKS-AP, and per-keypoint evaluation with `ruvector-mincut`-based person matching.
- **Training orchestrator** (`tch-backend`) -- Full training loop with learning rate scheduling, gradient clipping, checkpointing, and reproducible proofs.
- **All 5 ruvector crates** -- `ruvector-mincut`, `ruvector-attn-mincut`, `ruvector-temporal-tensor`, `ruvector-solver`, and `ruvector-attention` integrated across dataset loading, metrics, and model attention.
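The variance-based selection step of the subcarrier compression can be sketched in plain Rust. This is a hypothetical illustration only: the real `interpolate_subcarriers` in `subcarrier.rs` also performs sparse interpolation via `ruvector-solver`, and the function name and signature below are invented for the sketch.

```rust
/// Pick the `keep` highest-variance subcarriers from an amplitude matrix
/// (rows = time samples, columns = subcarriers), returned in original order.
/// Illustrative only; not the crate's actual API.
fn select_subcarriers(amplitude: &[Vec<f64>], keep: usize) -> Vec<usize> {
    let n_sub = amplitude[0].len();
    // Per-subcarrier variance across time samples.
    let mut var: Vec<(usize, f64)> = (0..n_sub)
        .map(|s| {
            let col: Vec<f64> = amplitude.iter().map(|row| row[s]).collect();
            let mean = col.iter().sum::<f64>() / col.len() as f64;
            let v = col.iter().map(|x| (x - mean).powi(2)).sum::<f64>() / col.len() as f64;
            (s, v)
        })
        .collect();
    // Sort by descending variance, keep the top `keep`, restore index order.
    var.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal));
    let mut idx: Vec<usize> = var.into_iter().take(keep).map(|(s, _)| s).collect();
    idx.sort_unstable();
    idx
}

fn main() {
    // Toy example: 3 time samples x 4 subcarriers; columns 1 and 3 fluctuate,
    // columns 0 and 2 are constant, so keeping 2 selects indices [1, 3].
    let amp = vec![
        vec![1.0, 5.0, 1.0, 9.0],
        vec![1.0, 1.0, 1.0, 1.0],
        vec![1.0, 5.0, 1.0, 9.0],
    ];
    println!("kept subcarriers: {:?}", select_subcarriers(&amp, 2));
}
```

In the real pipeline the discarded subcarriers are not simply dropped; their information is folded back in via sparse interpolation, which is why 114 channels compress to 56 without losing the motion-sensitive structure.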
## Feature flags

| Flag | Default | Description |
|---|---|---|
| `tch-backend` | no | Enable PyTorch training via `tch-rs` |
| `cuda` | no | CUDA GPU acceleration (implies `tch-backend`) |
## Binaries

| Binary | Description |
|---|---|
| `train` | Main training entry point |
| `verify-training` | Proof verification (requires `tch-backend`) |
## Quick Start

```rust
use wifi_densepose_train::config::TrainingConfig;
use wifi_densepose_train::dataset::{CsiDataset, SyntheticConfig, SyntheticCsiDataset};

// Build and validate the training configuration.
let config = TrainingConfig::default();
config.validate().expect("config is valid");

// Create a synthetic dataset (deterministic, fixed-seed).
let syn_cfg = SyntheticConfig::default();
let dataset = SyntheticCsiDataset::new(200, syn_cfg);

// Load one sample and inspect its CSI amplitude tensor.
let sample = dataset.get(0).unwrap();
println!("amplitude shape: {:?}", sample.amplitude.shape());
```
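The determinism that `SyntheticCsiDataset` and the training proofs rely on comes down to seeding a PRNG per sample index, so `get(i)` always yields bit-identical data. A minimal standalone sketch of that idea, using a splitmix64-style generator (the actual generator in `dataset.rs` may differ; all names here are invented):

```rust
/// Splitmix64 step: cheap, high-quality 64-bit PRNG. Illustrative only.
fn splitmix64(state: &mut u64) -> u64 {
    *state = state.wrapping_add(0x9E3779B97F4A7C15);
    let mut z = *state;
    z = (z ^ (z >> 30)).wrapping_mul(0xBF58476D1CE4E5B9);
    z = (z ^ (z >> 27)).wrapping_mul(0x94D049BB133111EB);
    z ^ (z >> 31)
}

/// Generate `n` pseudo-CSI amplitudes in [0, 1) for sample `index`.
/// Seeding from (dataset seed, sample index) makes every sample reproducible.
fn synthetic_amplitudes(seed: u64, index: u64, n: usize) -> Vec<f64> {
    let mut state = seed ^ index.wrapping_mul(0xA0761D6478BD642F);
    (0..n)
        .map(|_| (splitmix64(&mut state) >> 11) as f64 / (1u64 << 53) as f64)
        .collect()
}

fn main() {
    // Same (seed, index) twice -> bit-identical output, the property that
    // fixed-seed unit tests and reproducible training proofs depend on.
    let a = synthetic_amplitudes(42, 0, 56);
    let b = synthetic_amplitudes(42, 0, 56);
    assert_eq!(a, b);
    println!("first amplitude: {:.6}", a[0]);
}
```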
## Architecture

```text
wifi-densepose-train/src/
  lib.rs        -- Re-exports, VERSION
  config.rs     -- TrainingConfig, hyperparameters, validation
  dataset.rs    -- CsiDataset trait, MmFiDataset, SyntheticCsiDataset, DataLoader
  error.rs      -- TrainError, ConfigError, DatasetError, SubcarrierError
  subcarrier.rs -- interpolate_subcarriers (114 -> 56), variance-based selection
  losses.rs     -- (tch) MSE, OKS, multi-task loss [feature-gated]
  metrics.rs    -- (tch) PCKh, OKS-AP, person matching [feature-gated]
  model.rs      -- (tch) Model definition with attention [feature-gated]
  proof.rs      -- (tch) Deterministic training proofs [feature-gated]
  trainer.rs    -- (tch) Training loop orchestrator [feature-gated]
```
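The OKS quantity that both `losses.rs` and `metrics.rs` (OKS-AP) build on is the standard COCO Object Keypoint Similarity. A minimal standalone sketch of the per-pose score, assuming the COCO definition (the crate's tensor-based implementation is feature-gated behind `tch-backend`; this function signature is invented for illustration):

```rust
/// COCO-style OKS: mean over visible keypoints of exp(-d_i^2 / (2 s^2 k_i^2)),
/// where d_i is the prediction/ground-truth distance, s the object scale, and
/// k_i a per-keypoint falloff constant. Illustrative only.
fn oks(pred: &[(f64, f64)], gt: &[(f64, f64)], visible: &[bool], k: &[f64], s: f64) -> f64 {
    let mut sum = 0.0;
    let mut count = 0.0;
    for i in 0..gt.len() {
        if !visible[i] {
            continue; // only labelled keypoints contribute
        }
        let (dx, dy) = (pred[i].0 - gt[i].0, pred[i].1 - gt[i].1);
        let d2 = dx * dx + dy * dy;
        sum += (-d2 / (2.0 * s * s * k[i] * k[i])).exp();
        count += 1.0;
    }
    if count == 0.0 { 0.0 } else { sum / count }
}

fn main() {
    // A perfect prediction scores exactly 1.0.
    let gt = vec![(0.0, 0.0), (1.0, 1.0)];
    let vis = vec![true, true];
    let k = vec![0.1, 0.1];
    println!("perfect OKS = {}", oks(&gt, &gt, &vis, &k, 10.0));
}
```

OKS-AP then thresholds this score (e.g. at 0.5..0.95) to count matches, the same way IoU thresholds drive box AP; person matching for the multi-person case is where `ruvector-mincut` comes in.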
## Related Crates

| Crate | Role |
|---|---|
| `wifi-densepose-signal` | Signal preprocessing consumed by dataset loaders |
| `wifi-densepose-nn` | Inference engine that loads trained models |
| `ruvector-mincut` | Person matching in metrics |
| `ruvector-attn-mincut` | Attention-weighted graph cuts |
| `ruvector-temporal-tensor` | Compressed CSI buffering in datasets |
| `ruvector-solver` | Sparse subcarrier interpolation |
| `ruvector-attention` | Spatial attention in model |
## License

MIT OR Apache-2.0