docs/adr/ADR-029-ruvsense-multistatic-sensing-mode.md
Normal file
400
docs/adr/ADR-029-ruvsense-multistatic-sensing-mode.md
Normal file
@@ -0,0 +1,400 @@
|
||||
# ADR-029: Project RuvSense -- Sensing-First RF Mode for Multistatic WiFi DensePose

| Field | Value |
|-------|-------|
| **Status** | Proposed |
| **Date** | 2026-03-02 |
| **Deciders** | ruv |
| **Codename** | **RuvSense** -- RuVector-Enhanced Sensing for Multistatic Fidelity |
| **Relates to** | ADR-012 (ESP32 Mesh), ADR-014 (SOTA Signal Processing), ADR-016 (RuVector Training), ADR-017 (RuVector Signal+MAT), ADR-018 (ESP32 Implementation), ADR-024 (AETHER Embeddings), ADR-026 (Survivor Track Lifecycle), ADR-027 (MERIDIAN Generalization) |

---

## 1. Context

### 1.1 The Fidelity Gap

Current WiFi-DensePose achieves functional pose estimation from a single ESP32 AP, but five fidelity metrics prevent production deployment:

| Metric | Current (Single ESP32) | Required (Production) | Root Cause |
|--------|------------------------|----------------------|------------|
| Torso keypoint jitter | ~15cm RMS | <3cm RMS | Single viewpoint, 20 MHz bandwidth, no temporal smoothing |
| Multi-person separation | Fails >2 people, frequent ID swaps | 4+ people, zero swaps over 10 min | Underdetermined with 1 TX-RX link; no person-specific features |
| Small motion sensitivity | Gross movement only | Breathing at 3m, heartbeat at 1.5m | Insufficient phase sensitivity at 2.4 GHz; noise floor too high |
| Update rate | ~10 Hz effective | 20 Hz | Single-channel serial CSI collection |
| Temporal stability | Drifts within hours | Stable over days | No coherence gating; model absorbs environmental drift |

### 1.2 The Insight: Sensing-First RF Mode on Existing Silicon

You do not need to invent a new WiFi standard. The winning move is a **sensing-first RF mode** that rides on existing silicon (ESP32-S3), existing bands (2.4/5 GHz), and existing regulations (802.11n NDP frames). The fidelity improvement comes from three physical levers:

1. **Bandwidth**: Channel-hopping across 2.4 GHz channels 1/6/11 triples effective bandwidth from 20 MHz to 60 MHz, giving 3x finer multipath separation
2. **Carrier frequency**: Dual-band sensing (2.4 + 5 GHz) roughly doubles phase sensitivity to small motion
3. **Viewpoints**: A multistatic ESP32 mesh (4 nodes = 12 TX-RX links) provides 360-degree geometric diversity
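
The carrier-frequency lever follows from the relation Δφ = 2π · Δd · f / c: phase response scales linearly with carrier frequency. A minimal illustrative sketch (the helper name `phase_shift_rad` is an assumption, not part of the codebase):

```rust
/// Phase shift (radians) induced by a path-length change `delta_m` (meters)
/// at carrier frequency `freq_hz`: delta_phi = 2*pi * delta_d * f / c.
fn phase_shift_rad(delta_m: f64, freq_hz: f64) -> f64 {
    const C: f64 = 299_792_458.0; // speed of light, m/s
    2.0 * std::f64::consts::PI * delta_m * freq_hz / C
}
```

A 1 mm chest displacement from breathing produces roughly twice the phase rotation at 5 GHz as at 2.4 GHz, which is the basis of the "doubles phase sensitivity" claim above.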

### 1.3 Acceptance Test

**Two people in a room, 20 Hz update rate, stable tracks for 10 minutes with no identity swaps and low jitter in the torso keypoints.**

Quantified:

- Torso keypoint jitter < 30 mm RMS (hips, shoulders, spine)
- Zero identity swaps over 600 seconds (12,000 frames)
- 20 Hz output rate (50 ms cycle time)
- Breathing SNR > 10 dB at 3 m (validates small-motion sensitivity)

---

## 2. Decision

### 2.1 Architecture Overview

Implement RuvSense as a new bounded context within `wifi-densepose-signal`, consisting of 6 modules:

```
wifi-densepose-signal/src/ruvsense/
├── mod.rs             // Module exports, RuvSense pipeline orchestrator
├── multiband.rs       // Multi-band CSI frame fusion (§2.3)
├── phase_align.rs     // Cross-channel phase alignment (§2.3)
├── multistatic.rs     // Multi-node viewpoint fusion (§2.4)
├── coherence.rs       // Coherence metric computation (§2.5)
├── coherence_gate.rs  // Gated update policy (§2.6)
└── pose_tracker.rs    // 17-keypoint Kalman tracker with re-ID (§2.7)
```

### 2.2 Channel-Hopping Firmware (ESP32-S3)

Modify the ESP32 firmware (`firmware/esp32-csi-node/main/csi_collector.c`) to cycle through non-overlapping channels at configurable dwell times:

```c
// Channel hop table (populated from NVS at boot)
static uint8_t  s_hop_channels[6] = {1, 6, 11, 36, 40, 44};
static uint8_t  s_hop_count = 3;   // default: 2.4 GHz only
static uint32_t s_dwell_ms = 50;   // 50 ms per channel
```

At a 100 Hz raw CSI rate with 50 ms dwell across 3 channels, each channel yields ~33 frames/second. The existing ADR-018 binary frame format already carries `channel_freq_mhz` at offset 8, so no wire format change is needed.

**NDP frame injection:** `esp_wifi_80211_tx()` injects deterministic Null Data Packet frames (preamble-only, no payload, ~24 us airtime) at GPIO-triggered intervals. This is sensing-first: the primary purpose of the RF emission is CSI measurement, not data communication.
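
The hop schedule and its rate implications can be modeled host-side in a few lines (a sketch only; `channel_at` and `per_channel_rate_hz` are illustrative helpers, not firmware API):

```rust
/// Which channel the node dwells on at time `t_ms`, cycling through
/// `hops` every `dwell_ms`. Mirrors the firmware's hop table logic.
fn channel_at(t_ms: u32, hops: &[u8], dwell_ms: u32) -> u8 {
    let slot = (t_ms / dwell_ms) as usize % hops.len();
    hops[slot]
}

/// Effective per-channel CSI rate: each channel is visited 1/n of the time.
fn per_channel_rate_hz(raw_hz: f32, n_channels: u32) -> f32 {
    raw_hz / n_channels as f32
}
```

With the default table `[1, 6, 11]` and 50 ms dwell, `per_channel_rate_hz(100.0, 3)` recovers the ~33 frames/second figure quoted above.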

### 2.3 Multi-Band Frame Fusion

Aggregate per-channel CSI frames into a wideband virtual snapshot:

```rust
/// Fused multi-band CSI from one node at one time slot.
pub struct MultiBandCsiFrame {
    pub node_id: u8,
    pub timestamp_us: u64,
    /// One canonical-56 row per channel, ordered by center frequency.
    pub channel_frames: Vec<CanonicalCsiFrame>,
    /// Center frequencies (MHz) for each channel row.
    pub frequencies_mhz: Vec<u32>,
    /// Cross-channel coherence score (0.0-1.0).
    pub coherence: f32,
}
```

Cross-channel phase alignment uses `ruvector-solver::NeumannSolver` to solve for the channel-dependent phase rotation introduced by the ESP32 local oscillator during channel hops. The system models the per-channel phases as:

```
[Φ₁, Φ₆, Φ₁₁] = [Φ_body + δ₁, Φ_body + δ₆, Φ_body + δ₁₁]
```

NeumannSolver fits the `δ` offsets from the static subcarrier components (which should have zero body-caused phase shift), then removes them.
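
The offset-removal step can be sketched with a plain mean over the static subcarriers (a minimal sketch; the production pipeline fits `δ` with `ruvector-solver::NeumannSolver`, and `remove_channel_offset` is an illustrative name):

```rust
/// Estimate the per-channel LO phase offset delta from subcarriers flagged
/// as static (near-zero body-induced phase), then subtract it from every
/// subcarrier phase of that channel. Returns the fitted offset.
fn remove_channel_offset(phases: &mut [f32], static_idx: &[usize]) -> f32 {
    // Mean phase over the static subcarriers approximates delta.
    let delta = static_idx.iter().map(|&i| phases[i]).sum::<f32>()
        / static_idx.len().max(1) as f32;
    for p in phases.iter_mut() {
        *p -= delta;
    }
    delta
}
```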

### 2.4 Multistatic Viewpoint Fusion

With N ESP32 nodes, collect N `MultiBandCsiFrame` per time slot and fuse with geometric diversity:

**TDMA Sensing Schedule (4 nodes):**

| Slot | TX | RX₁ | RX₂ | RX₃ | Duration |
|------|-----|-----|-----|-----|----------|
| 0 | Node A | B | C | D | 4 ms |
| 1 | Node B | A | C | D | 4 ms |
| 2 | Node C | A | B | D | 4 ms |
| 3 | Node D | A | B | C | 4 ms |
| 4 | -- | Processing + fusion | | | 30 ms |
| **Total** | | | | | **50 ms = 20 Hz** |

Synchronization: a GPIO pulse from the aggregator node marks cycle start. Clock drift of ±10 ppm over 50 ms is ~0.5 us, well within the 1 ms guard interval.

**Cross-node fusion** uses `ruvector-attn-mincut::attn_mincut`, where time-frequency cells from different nodes attend to each other. Cells showing correlated motion energy across nodes (body reflection) are amplified; cells with single-node energy (local multipath artifact) are suppressed.

**Multi-person separation** via `ruvector-mincut::DynamicMinCut`:

1. Build a cross-link temporal correlation graph (nodes = TX-RX links, edges = correlation coefficients)
2. `DynamicMinCut` partitions the graph into K clusters (one per detected person)
3. Attention fusion (§5.3 of the research doc) runs independently per cluster
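
Step 1 amounts to filling a link-by-link correlation matrix. A minimal sketch, with Pearson correlation standing in for whatever correlation measure the production code uses (`correlation_matrix` and `pearson` are illustrative names; `DynamicMinCut` then partitions the result):

```rust
/// Edge weights for the cross-link graph: entry (i, j) is the Pearson
/// correlation of the motion-energy series of links i and j.
fn correlation_matrix(series: &[Vec<f32>]) -> Vec<Vec<f32>> {
    let n = series.len();
    let mut m = vec![vec![0.0; n]; n];
    for i in 0..n {
        for j in 0..n {
            m[i][j] = pearson(&series[i], &series[j]);
        }
    }
    m
}

fn pearson(a: &[f32], b: &[f32]) -> f32 {
    let n = a.len() as f32;
    let (ma, mb) = (a.iter().sum::<f32>() / n, b.iter().sum::<f32>() / n);
    let cov: f32 = a.iter().zip(b).map(|(x, y)| (x - ma) * (y - mb)).sum();
    let (va, vb): (f32, f32) = (
        a.iter().map(|x| (x - ma).powi(2)).sum(),
        b.iter().map(|y| (y - mb).powi(2)).sum(),
    );
    cov / (va.sqrt() * vb.sqrt()).max(1e-9)
}
```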

### 2.5 Coherence Metric

Per-link coherence quantifies consistency with recent history:

```rust
/// Variance-weighted coherence between the current CSI vector and a
/// reference vector, in [0, 1]. Low-variance subcarriers carry more weight.
pub fn coherence_score(
    current: &[f32],
    reference: &[f32],
    variance: &[f32],
) -> f32 {
    let (score_sum, weight_sum) = current
        .iter()
        .zip(reference.iter())
        .zip(variance.iter())
        .map(|((&c, &r), &v)| {
            // z-score of the deviation, guarded against zero variance
            let z = (c - r).abs() / v.sqrt().max(1e-6);
            let weight = 1.0 / (v + 1e-6);
            ((-0.5 * z * z).exp() * weight, weight)
        })
        .fold((0.0_f32, 0.0_f32), |(sc, sw), (s, w)| (sc + s, sw + w));
    score_sum / weight_sum.max(1e-6)
}
```

The static/dynamic decomposition uses `ruvector-solver` to separate environmental drift (slow, global) from body motion (fast, subcarrier-specific).

### 2.6 Coherence-Gated Update Policy

```rust
pub enum GateDecision {
    /// Coherence > 0.85: Full Kalman measurement update
    Accept(Pose),
    /// 0.5 < coherence <= 0.85: Kalman predict only (3x inflated noise)
    PredictOnly,
    /// Coherence <= 0.5: Reject measurement entirely
    Reject,
    /// >10s continuous low coherence: Trigger SONA recalibration (ADR-005)
    Recalibrate,
}
```

When `Recalibrate` fires:

1. Freeze output at the last known good pose
2. Collect 200 frames (10s) of unlabeled CSI
3. Run AETHER contrastive TTT (ADR-024) to adapt the encoder
4. Update SONA LoRA weights (ADR-005), <1ms per update
5. Resume sensing with adapted model
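
The threshold policy is a pure function of the coherence score and the running low-coherence duration. A compact sketch (the payload-free `Gate` enum and `decide` are illustrative stand-ins; the real `GateDecision::Accept` carries the `Pose` measurement):

```rust
#[derive(Debug, PartialEq)]
enum Gate {
    Accept,
    PredictOnly,
    Reject,
    Recalibrate,
}

/// Thresholds from the GateDecision enum: recalibration takes priority,
/// then the coherence bands pick the update path.
fn decide(coherence: f32, low_coherence_secs: f32) -> Gate {
    if low_coherence_secs > 10.0 {
        Gate::Recalibrate
    } else if coherence > 0.85 {
        Gate::Accept
    } else if coherence > 0.5 {
        Gate::PredictOnly
    } else {
        Gate::Reject
    }
}
```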

### 2.7 Pose Tracker (17-Keypoint Kalman with Re-ID)

Lift the Kalman + lifecycle + re-ID infrastructure from `wifi-densepose-mat/src/tracking/` (ADR-026) into the RuvSense bounded context, extended for 17-keypoint skeletons:

| Parameter | Value | Rationale |
|-----------|-------|-----------|
| State dimension | 6 per keypoint (x,y,z,vx,vy,vz) | Constant-velocity model |
| Process noise σ_a | 0.3 m/s² | Normal walking acceleration |
| Measurement noise σ_obs | 0.08 m | Target <8cm RMS at torso |
| Mahalanobis gate | χ²(3) = 9.0 | 3σ ellipsoid (same as ADR-026) |
| Birth hits | 2 frames (100ms at 20Hz) | Reject single-frame noise |
| Loss misses | 5 frames (250ms) | Brief occlusion tolerance |
| Re-ID feature | AETHER 128-dim embedding | Body-shape discriminative (ADR-024) |
| Re-ID window | 5 seconds | Sufficient for crossing recovery |

**Track assignment** uses `ruvector-mincut`'s `DynamicPersonMatcher` (already integrated in `metrics.rs`, ADR-016) with a joint position + embedding cost:

```
cost(track_i, det_j) = 0.6 * mahalanobis(track_i, det_j.position)
                     + 0.4 * (1 - cosine_sim(track_i.embedding, det_j.embedding))
```
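
The cost above can be sketched as a pure function over a precomputed Mahalanobis distance and raw embeddings (a minimal sketch; `assignment_cost` and `cosine_sim` are illustrative names, and the real matcher is `ruvector-mincut`'s `DynamicPersonMatcher`):

```rust
/// Joint assignment cost: 0.6 * Mahalanobis position distance plus
/// 0.4 * cosine embedding distance, per the formula above.
fn assignment_cost(mahalanobis: f32, track_emb: &[f32], det_emb: &[f32]) -> f32 {
    0.6 * mahalanobis + 0.4 * (1.0 - cosine_sim(track_emb, det_emb))
}

fn cosine_sim(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let (na, nb) = (
        a.iter().map(|x| x * x).sum::<f32>().sqrt(),
        b.iter().map(|x| x * x).sum::<f32>().sqrt(),
    );
    dot / (na * nb).max(1e-9)
}
```

Identical embeddings reduce the cost to pure position distance; orthogonal embeddings add the full 0.4 identity penalty.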

---

## 3. GOAP Integration Plan (Goal-Oriented Action Planning)

### 3.1 Action Dependency Graph

```
Phase 1: Foundation
  Action 1: Channel-Hopping Firmware ────────────────────────┐
      │                                                      │
      v                                                      │
  Action 2: Multi-Band Frame Fusion ──→ Action 6: Coherence  │
      │                                       Metric         │
      v                                         │            │
  Action 3: Multistatic Mesh                    v            │
      │                                 Action 7: Coherence  │
      v                                       Gate           │
Phase 2: Tracking                               │            │
  Action 4: Pose Tracker ←──────────────────────┘            │
      │                                                      │
      v                                                      │
  Action 5: End-to-End Pipeline @ 20 Hz ←────────────────────┘
      │
      v
Phase 4: Hardening
  Action 8: AETHER Track Re-ID
      │
      v
  Action 9: ADR-029 Documentation (this document)
```

### 3.2 Cost and RuVector Mapping

| # | Action | Cost | Preconditions | RuVector Crates | Effects |
|---|--------|------|---------------|-----------------|---------|
| 1 | Channel-hopping firmware | 4/10 | ESP32 firmware exists | None (pure C) | `bandwidth_extended = true` |
| 2 | Multi-band frame fusion | 5/10 | Action 1 | `solver`, `attention` | `fused_multi_band_frame = true` |
| 3 | Multistatic mesh aggregation | 5/10 | Action 2 | `mincut`, `attn-mincut` | `multistatic_mesh = true` |
| 4 | Pose tracker | 4/10 | Actions 3, 7 | `mincut` | `pose_tracker = true` |
| 5 | End-to-end pipeline | 6/10 | Actions 2-4 | `temporal-tensor`, `attention` | `20hz_update = true` |
| 6 | Coherence metric | 3/10 | Action 2 | `solver` | `coherence_metric = true` |
| 7 | Coherence gate | 3/10 | Action 6 | `attn-mincut` | `coherence_gating = true` |
| 8 | AETHER re-ID | 4/10 | Actions 4, 7 | `attention` | `identity_stable = true` |
| 9 | ADR documentation | 2/10 | All above | None | Decision documented |

**Total cost: 36 units. Minimum viable path to the acceptance test: Actions 1-7 = 30 units.**

### 3.3 Latency Budget (50 ms cycle)

| Stage | Budget | Method |
|-------|--------|--------|
| UDP receive + parse | <1 ms | ADR-018 binary, 148 bytes, zero-alloc |
| Multi-band fusion | ~2 ms | NeumannSolver on 2×2 phase alignment |
| Multistatic fusion | ~3 ms | attn_mincut on 3-6 nodes × 64 velocity bins |
| Model inference | ~30-40 ms | CsiToPoseTransformer (lightweight, no ResNet) |
| Kalman update | <1 ms | 17 independent 6D filters, stack-allocated |
| **Total** | **~37-47 ms** | **Fits in the 50 ms budget** |

---

## 4. Hardware Bill of Materials

| Component | Qty | Unit Cost | Purpose |
|-----------|-----|-----------|---------|
| ESP32-S3-DevKitC-1 | 4 | $10 | TX/RX sensing nodes |
| ESP32-S3-DevKitC-1 | 1 | $10 | Aggregator (or x86/RPi host) |
| External 5dBi antenna | 4-8 | $3 | Improved gain, directional coverage |
| USB-C hub (4 port) | 1 | $15 | Power distribution |
| Wall mount brackets | 4 | $2 | Ceiling/wall installation |
| **Total** | | **$73-91** | Complete 4-node mesh |

---

## 5. RuVector v2.0.4 Integration Map

All five published crates are exercised:

| Crate | Actions | Integration Point | Algorithmic Advantage |
|-------|---------|-------------------|----------------------|
| `ruvector-solver` | 2, 6 | Phase alignment; coherence matrix decomposition | O(√n) Neumann convergence |
| `ruvector-attention` | 2, 5, 8 | Cross-channel weighting; ring buffer; embedding similarity | Sublinear attention for small d |
| `ruvector-mincut` | 3, 4 | Viewpoint diversity partitioning; track assignment | O(n^1.5 log n) dynamic updates |
| `ruvector-attn-mincut` | 3, 7 | Cross-node spectrogram fusion; coherence gating | Attention + mincut in one pass |
| `ruvector-temporal-tensor` | 5 | Compressed sensing window ring buffer | 50-75% memory reduction |

---

## 6. IEEE 802.11bf Alignment

RuvSense's TDMA sensing schedule is forward-compatible with IEEE 802.11bf (WLAN Sensing, published 2024):

| RuvSense Concept | 802.11bf Equivalent |
|-----------------|---------------------|
| TX slot | Sensing Initiator |
| RX slot | Sensing Responder |
| TDMA cycle | Sensing Measurement Instance |
| NDP frame | Sensing NDP |
| Aggregator | Sensing Session Owner |

When commercial APs support 802.11bf, the ESP32 mesh can interoperate by translating its TDMA schedule slots into 802.11bf Sensing Trigger frames.

---

## 7. Dependency Changes

### Firmware (C)

New files:

- `firmware/esp32-csi-node/main/sensing_schedule.h`
- `firmware/esp32-csi-node/main/sensing_schedule.c`

Modified files:

- `firmware/esp32-csi-node/main/csi_collector.c` (add channel hopping, link tagging)
- `firmware/esp32-csi-node/main/main.c` (add GPIO sync, TDMA timer)

### Rust

New module: `crates/wifi-densepose-signal/src/ruvsense/` (6 files, ~1500 lines estimated)

Modified files:

- `crates/wifi-densepose-signal/src/lib.rs` (export the `ruvsense` module)
- `crates/wifi-densepose-signal/Cargo.toml` (no new deps; all ruvector crates already present per ADR-017)
- `crates/wifi-densepose-sensing-server/src/main.rs` (wire the RuvSense pipeline into WebSocket output)

No new workspace dependencies. All ruvector crates are already in the workspace `Cargo.toml`.

---

## 8. Implementation Priority

| Priority | Actions | Weeks | Milestone |
|----------|---------|-------|-----------|
| P0 | 1 (firmware) | 2 | Channel-hopping ESP32 prototype |
| P0 | 2 (multi-band) | 2 | Wideband virtual frames |
| P1 | 3 (multistatic) | 2 | Multi-node fusion |
| P1 | 4 (tracker) | 1 | 17-keypoint Kalman |
| P1 | 6, 7 (coherence) | 1 | Gated updates |
| P2 | 5 (end-to-end) | 2 | 20 Hz pipeline |
| P2 | 8 (AETHER re-ID) | 1 | Identity hardening |
| P3 | 9 (docs) | 0.5 | This ADR finalized |
| **Total** | | **~10 weeks** | **Acceptance test** |

---

## 9. Consequences

### 9.1 Positive

- **3x bandwidth improvement** without hardware changes (channel hopping on the existing ESP32)
- **12 independent viewpoints** from 4 commodity $10 nodes (C(4,2) × 2 link directions)
- **20 Hz update rate** with Kalman-smoothed output for sub-30mm torso jitter
- **Days-long stability** via coherence gating + SONA recalibration
- **All five ruvector crates exercised** — consistent algorithmic foundation
- **$73-91 total BOM** — accessible for research and production
- **802.11bf forward-compatible** — investment protected as commercial sensing arrives
- **Cognitum upgrade path** — same software stack, swap the ESP32 for a higher-bandwidth front end

### 9.2 Negative

- **4-node deployment** requires physical installation and calibration of node positions
- **TDMA scheduling** reduces per-node CSI rate (each node transmits only 1/4 of the time)
- **Channel hopping** introduces ~1-5 ms gaps during `esp_wifi_set_channel()` transitions
- **5 GHz CSI on ESP32-S3** may not be available (the ESP32-C6 supports it natively)
- **Coherence gate** may reject valid measurements during fast body motion (mitigation: gate only on static-subcarrier coherence)

### 9.3 Risks

| Risk | Probability | Impact | Mitigation |
|------|-------------|--------|------------|
| ESP32 channel hop causes CSI gaps | Medium | Reduced effective rate | Measure gap duration; increase dwell if >5 ms |
| 5 GHz CSI unavailable on S3 | High | Lose frequency diversity | Fallback: 3-channel 2.4 GHz still provides 3x BW; ESP32-C6 for dual-band |
| Model inference >40 ms | Medium | Miss 20 Hz target | Run the model at 10 Hz; Kalman predict at 20 Hz interpolates |
| Two-person separation fails at 3 nodes | Low | Identity swaps | AETHER re-ID recovers; increase to 4-6 nodes |
| Coherence gate false-triggers | Low | Missed updates | Gate on environmental coherence only, not body-motion subcarriers |

---

## 10. Related ADRs

| ADR | Relationship |
|-----|-------------|
| ADR-012 | **Extended**: RuvSense adds TDMA multistatic to the single-AP mesh |
| ADR-014 | **Used**: All 6 SOTA algorithms applied per-link |
| ADR-016 | **Extended**: New ruvector integration points for multi-link fusion |
| ADR-017 | **Extended**: Coherence gating adds a temporal stability layer |
| ADR-018 | **Modified**: Firmware gains channel hopping, TDMA schedule, HT40 |
| ADR-022 | **Complementary**: RuvSense is the ESP32 equivalent of Windows multi-BSSID |
| ADR-024 | **Used**: AETHER embeddings for person re-identification |
| ADR-026 | **Reused**: Kalman + lifecycle infrastructure lifted into RuvSense |
| ADR-027 | **Used**: GeometryEncoder, HardwareNormalizer, FiLM conditioning |

---

## 11. References

1. IEEE 802.11bf-2024. "WLAN Sensing." IEEE Standards Association.
2. Geng, J., Huang, D., De la Torre, F. (2023). "DensePose From WiFi." arXiv:2301.00250.
3. Yan, K. et al. (2024). "Person-in-WiFi 3D." CVPR 2024, pp. 969-978.
4. Chen, L. et al. (2026). "PerceptAlign: Geometry-Aware WiFi Sensing." arXiv:2601.12252.
5. Kotaru, M. et al. (2015). "SpotFi: Decimeter Level Localization Using WiFi." SIGCOMM.
6. Zheng, Y. et al. (2019). "Zero-Effort Cross-Domain Gesture Recognition with Wi-Fi." MobiSys.
7. Zeng, Y. et al. (2019). "FarSense: Pushing the Range Limit of WiFi-based Respiration Sensing." MobiCom.
8. AM-FM (2026). "A Foundation Model for Ambient Intelligence Through WiFi." arXiv:2602.11200.
9. Espressif ESP-CSI. https://github.com/espressif/esp-csi

---

docs/adr/ADR-030-ruvsense-persistent-field-model.md

# ADR-030: RuvSense Persistent Field Model — Longitudinal Drift Detection and Exotic Sensing Tiers

| Field | Value |
|-------|-------|
| **Status** | Proposed |
| **Date** | 2026-03-02 |
| **Deciders** | ruv |
| **Codename** | **RuvSense Field** — Persistent Electromagnetic World Model |
| **Relates to** | ADR-029 (RuvSense Multistatic), ADR-005 (SONA Self-Learning), ADR-024 (AETHER Embeddings), ADR-016 (RuVector Integration), ADR-026 (Survivor Track Lifecycle), ADR-027 (MERIDIAN Generalization) |

---

## 1. Context

### 1.1 Beyond Pose Estimation

ADR-029 establishes RuvSense as a sensing-first multistatic mesh achieving 20 Hz DensePose with <30mm jitter. That treats WiFi as a **momentary pose estimator**. The next leap: treat the electromagnetic field as a **persistent world model** that remembers, predicts, and explains.

The most exotic capabilities come from this shift in abstraction level:

- The room is the model, not the person
- People are structured perturbations to a baseline
- Changes are deltas from a known state, not raw measurements
- Time is a first-class dimension — the system remembers days, not frames

### 1.2 The Seven Capability Tiers

| Tier | Capability | Foundation |
|------|-----------|-----------|
| 1 | **Field Normal Modes** — Room electromagnetic eigenstructure | Baseline calibration + SVD |
| 2 | **Coarse RF Tomography** — 3D occupancy volume from link attenuations | Sparse tomographic inversion |
| 3 | **Intention Lead Signals** — Pre-movement prediction (200-500ms lead) | Temporal embedding trajectory analysis |
| 4 | **Longitudinal Biomechanics Drift** — Personal baseline deviation over days | Welford statistics + HNSW memory |
| 5 | **Cross-Room Continuity** — Identity persistence across spaces without optics | Environment fingerprinting + transition graph |
| 6 | **Invisible Interaction Layer** — Multi-user gesture control through walls/darkness | Per-person CSI perturbation classification |
| 7 | **Adversarial Detection** — Physically impossible signal identification | Multi-link consistency + field model constraints |

### 1.3 Signals, Not Diagnoses

RF sensing detects **biophysical proxies**, not medical conditions:

| Detectable Signal | Not Detectable |
|-------------------|---------------|
| Breathing rate variability | COPD diagnosis |
| Gait asymmetry shift (18% over 14 days) | Parkinson's disease |
| Posture instability increase | Neurological condition |
| Micro-tremor onset | Specific tremor etiology |
| Activity level decline | Depression or pain diagnosis |

The output is: "Your movement symmetry has shifted 18 percent over 14 days." That is actionable without being diagnostic. The evidence chain (stored embeddings, drift statistics, coherence scores) is fully traceable.

### 1.4 Acceptance Tests

**Tier 0 (ADR-029):** Two people, 20 Hz, 10 min stable tracks, zero ID swaps, <30mm torso jitter.

**Tiers 1-4 (this ADR):** Seven-day run, no manual tuning. The system flags one real environmental change and one real human drift event, and produces a traceable explanation using stored embeddings plus graph constraints.

**Tiers 5-7 (appliance):** Thirty-day local run, no camera. Detects meaningful drift with a <5% false alarm rate.

---

## 2. Decision

### 2.1 Implement Field Normal Modes as the Foundation

Add a `field_model` module to `wifi-densepose-signal/src/ruvsense/` that learns the room's electromagnetic baseline during unoccupied periods and decomposes all subsequent observations into environmental drift + body perturbation.

```
wifi-densepose-signal/src/ruvsense/
├── mod.rs           // (existing, extend)
├── field_model.rs   // NEW: Field normal mode computation + perturbation extraction
├── tomography.rs    // NEW: Coarse RF tomography from link attenuations
├── longitudinal.rs  // NEW: Personal baseline + drift detection
├── intention.rs     // NEW: Pre-movement lead signal detector
├── cross_room.rs    // NEW: Cross-room identity continuity
├── gesture.rs       // NEW: Gesture classification from CSI perturbations
├── adversarial.rs   // NEW: Physically impossible signal detection
└── (existing files...)
```

### 2.2 Core Architecture: The Persistent Field Model

```
Time
  │
  ▼
┌────────────────────────────────┐
│  Field Normal Modes (Tier 1)   │
│  Room baseline + SVD modes     │
│  ruvector-solver               │
└────────────┬───────────────────┘
             │ Body perturbation (environmental drift removed)
             │
     ┌───────┴───────┐
     │               │
     ▼               ▼
┌──────────┐   ┌──────────────┐
│  Pose    │   │ RF Tomography│
│ (ADR-029)│   │  (Tier 2)    │
│  20 Hz   │   │ Occupancy vol│
└────┬─────┘   └──────────────┘
     │
     ▼
┌──────────────────────────────┐
│  AETHER Embedding (ADR-024)  │
│  128-dim contrastive vector  │
└────────────┬─────────────────┘
             │
     ┌───────┼───────┐
     │       │       │
     ▼       ▼       ▼
┌─────────┐ ┌─────┐ ┌──────────┐
│Intention│ │Track│ │Cross-Room│
│Lead     │ │Re-ID│ │Continuity│
│(Tier 3) │ │     │ │(Tier 5)  │
└─────────┘ └──┬──┘ └──────────┘
               │
               ▼
┌──────────────────────────────┐
│ RuVector Longitudinal Memory │
│ HNSW + graph + Welford stats │
│ (Tier 4)                     │
└──────────────┬───────────────┘
               │
       ┌───────┴───────┐
       │               │
       ▼               ▼
┌──────────────┐ ┌──────────────┐
│ Drift Reports│ │ Adversarial  │
│ (Level 1-3)  │ │ Detection    │
│              │ │ (Tier 7)     │
└──────────────┘ └──────────────┘
```

### 2.3 Field Normal Modes (Tier 1)

**What it is:** The room's electromagnetic eigenstructure — the stable propagation paths, reflection coefficients, and interference patterns when nobody is present.

**How it works:**

1. During quiet periods (empty room, overnight), collect 10 minutes of CSI across all links
2. Compute the per-link baseline (mean CSI vector)
3. Compute environmental variation modes via SVD (temperature, humidity, time-of-day effects)
4. Store the top-K modes (K=3-5 typically captures >95% of environmental variance)
5. At runtime: subtract the baseline, project out the environmental modes, keep the body perturbation

```rust
pub struct FieldNormalMode {
    pub baseline: Vec<Vec<Complex<f32>>>,   // [n_links × n_subcarriers]
    pub environmental_modes: Vec<Vec<f32>>, // [n_modes × n_subcarriers]
    pub mode_energies: Vec<f32>,            // eigenvalues
    pub calibrated_at: u64,
    pub geometry_hash: u64,
}
```
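
The runtime step (subtract baseline, project out modes) can be sketched over real-valued magnitude vectors, assuming unit-norm environmental modes as an SVD would return them (`body_perturbation` is an illustrative name, not production API):

```rust
/// Remove the empty-room baseline and the learned environmental modes
/// from an observation, leaving the body-induced perturbation.
fn body_perturbation(obs: &[f32], baseline: &[f32], modes: &[Vec<f32>]) -> Vec<f32> {
    // Delta from the empty-room baseline.
    let mut delta: Vec<f32> = obs.iter().zip(baseline).map(|(o, b)| o - b).collect();
    // Project out the component along each (unit-norm) environmental mode.
    for mode in modes {
        let coeff: f32 = delta.iter().zip(mode).map(|(d, m)| d * m).sum();
        for (d, m) in delta.iter_mut().zip(mode) {
            *d -= coeff * m;
        }
    }
    delta
}
```

An observation that is exactly baseline plus environmental drift projects to (near) zero, which is what lets slow drift pass through without being mistaken for a person.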

**RuVector integration:**

- `ruvector-solver` → Low-rank SVD for mode extraction
- `ruvector-temporal-tensor` → Compressed baseline history storage
- `ruvector-attn-mincut` → Identify which subcarriers belong to which mode

### 2.4 Longitudinal Drift Detection (Tier 4)

**The defensible pipeline:**

```
RF → AETHER contrastive embedding
   → RuVector longitudinal memory (HNSW + graph)
   → Coherence-gated drift detection (Welford statistics)
   → Risk flag with traceable evidence
```

**Three monitoring levels:**

| Level | Signal Type | Example Output |
|-------|------------|----------------|
| **1: Physiological** | Raw biophysical metrics | "Breathing rate: 18.3 BPM today, 7-day avg: 16.1" |
| **2: Drift** | Personal baseline deviation | "Gait symmetry shifted 18% over 14 days" |
| **3: Risk correlation** | Pattern-matched concern | "Pattern consistent with increased fall risk" |

**Storage model:**

```rust
pub struct PersonalBaseline {
    pub person_id: PersonId,
    pub gait_symmetry: WelfordStats,
    pub stability_index: WelfordStats,
    pub breathing_regularity: WelfordStats,
    pub micro_tremor: WelfordStats,
    pub activity_level: WelfordStats,
    pub embedding_centroid: Vec<f32>, // [128]
    pub observation_days: u32,
    pub updated_at: u64,
}
```
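
Welford's algorithm gives the O(1), numerically stable running mean and variance that `PersonalBaseline` relies on. A minimal sketch, with a sigma-threshold drift check added for illustration (this `WelfordStats` is a stand-in for the production type, and `drifted` is an assumed helper):

```rust
/// Welford running statistics: single-pass mean and variance.
struct WelfordStats {
    n: u64,
    mean: f64,
    m2: f64, // sum of squared deviations from the running mean
}

impl WelfordStats {
    fn new() -> Self {
        Self { n: 0, mean: 0.0, m2: 0.0 }
    }

    /// Incorporate one observation in O(1).
    fn update(&mut self, x: f64) {
        self.n += 1;
        let d = x - self.mean;
        self.mean += d / self.n as f64;
        self.m2 += d * (x - self.mean);
    }

    /// Sample standard deviation (0 until two observations exist).
    fn std_dev(&self) -> f64 {
        if self.n < 2 { 0.0 } else { (self.m2 / (self.n - 1) as f64).sqrt() }
    }

    /// Does `x` deviate more than `k` sigma from the personal baseline?
    fn drifted(&self, x: f64, k: f64) -> bool {
        self.n >= 2 && (x - self.mean).abs() > k * self.std_dev()
    }
}
```

In the drift pipeline, one such accumulator per metric (gait symmetry, stability, breathing regularity, ...) is updated daily, and a flag fires when a value stays beyond the threshold for several consecutive days.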

**RuVector integration:**

- `ruvector-temporal-tensor` → Compressed daily summaries (50-75% memory savings)
- HNSW → Embedding similarity search across the longitudinal record
- `ruvector-attention` → Per-metric drift significance weighting
- `ruvector-mincut` → Temporal segmentation (detect changepoints in metric series)

### 2.5 Regulatory Classification

| Classification | What You Claim | Regulatory Path |
|---------------|---------------|-----------------|
| **Consumer wellness** (recommended first) | Activity metrics, breathing rate, stability score | Self-certification, FCC Part 15 |
| **Clinical decision support** (future) | Fall risk alert, respiratory pattern concern | FDA Class II 510(k) or De Novo |
| **Regulated medical device** (requires clinical partner) | Diagnostic claims for specific conditions | FDA Class II/III + clinical trials |

**Decision: Start as consumer wellness.** Build 12+ months of real-world longitudinal data. The dataset itself becomes the asset for future regulatory submissions.

---

## 3. Appliance Product Categories

### 3.1 Invisible Guardian

Wall-mounted wellness monitor for elderly care and independent living. No camera, no microphone, no reconstructable data. Stores embeddings and structural deltas only.

| Spec | Value |
|------|-------|
| Nodes | 4 ESP32-S3 pucks per room |
| Processing | Central hub (RPi 5 or x86) |
| Power | PoE or USB-C |
| Output | Risk flags, drift alerts, occupancy timeline |
| BOM | $73-91 (ESP32 mesh) + $35-80 (hub) |
| Validation | 30-day autonomous run, <5% false alarm rate |

### 3.2 Spatial Digital Twin Node

Live electromagnetic room model for smart buildings and workplace analytics.

| Spec | Value |
|------|-------|
| Output | Occupancy heatmap, flow vectors, dwell time, anomaly events |
| Integration | MQTT/REST API for BMS and CAFM |
| Retention | 30-day rolling, GDPR-compliant |
| Vertical | Smart buildings, retail, workspace optimization |

### 3.3 RF Interaction Surface
|
||||
|
||||
Multi-user gesture interface. No cameras. Works in darkness, smoke, through clothing.
|
||||
|
||||
| Spec | Value |
|
||||
|------|-------|
|
||||
| Gestures | Wave, point, beckon, push, circle + custom |
|
||||
| Users | Up to 4 simultaneous |
|
||||
| Latency | <100ms gesture recognition |
|
||||
| Vertical | Smart home, hospitality, accessibility |
|
||||
|
||||
### 3.4 Pre-Incident Drift Monitor

Longitudinal biomechanics tracker for rehabilitation and occupational health.

| Spec | Value |
|------|-------|
| Baseline | 7-day calibration per person |
| Alert | Metric drift >2σ for >3 days |
| Evidence | Stored embedding trajectory + statistical report |
| Vertical | Elderly care, rehab, occupational health |
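The alert rule above (metric drift >2σ sustained for >3 days, after a 7-day calibration window) can be sketched with Welford's online mean/variance algorithm, which §8 of this ADR cites. A minimal sketch only: the class name, return convention, and the gait-metric example are illustrative, not the shipped `longitudinal.rs` API.

```python
import math

class DriftMonitor:
    """Per-person baseline via Welford's online mean/variance, with a
    '>2 sigma for >3 consecutive days' alert rule.

    Illustrative sketch: thresholds mirror the spec table, not production code.
    """

    def __init__(self, sigma_threshold=2.0, days_required=3, calibration_days=7):
        self.n = 0              # in-band days absorbed into the baseline
        self.mean = 0.0
        self.m2 = 0.0           # running sum of squared deviations (Welford)
        self.sigma_threshold = sigma_threshold
        self.days_required = days_required
        self.calibration_days = calibration_days
        self.consecutive = 0    # consecutive out-of-band days seen so far

    def update(self, daily_metric):
        """Feed one daily summary metric; returns True when a drift alert fires."""
        if self.n >= self.calibration_days:
            std = math.sqrt(self.m2 / (self.n - 1))
            z = abs(daily_metric - self.mean) / std if std > 0 else 0.0
            if z > self.sigma_threshold:
                # Out-of-band day: count it, but do NOT absorb it into the
                # baseline, so sustained drift cannot mask itself.
                self.consecutive += 1
                return self.consecutive >= self.days_required
            self.consecutive = 0
        # In-band (or calibration) day: standard Welford update.
        self.n += 1
        delta = daily_metric - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (daily_metric - self.mean)
        return False

monitor = DriftMonitor()
calibration = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1]  # 7-day calibration
drift = [12.5, 12.6, 12.7]                               # sustained metric shift
alerts = [monitor.update(x) for x in calibration + drift]
print(alerts)  # alert fires only on the 3rd consecutive out-of-band day
```

Freezing the baseline on out-of-band days is the key design choice: without it, each drifted day inflates the running σ and can suppress the alert on the following day.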
### 3.5 Vertical Recommendation for First Hardware SKU

**Invisible Guardian** — the elderly care wellness monitor. Rationale:

1. Largest addressable market with immediate revenue (aging population, care facility demand)
2. Lowest regulatory bar (consumer wellness, no diagnostic claims)
3. Privacy advantage over cameras is a selling point, not a limitation
4. 30-day autonomous operation validates all tiers (field model, drift detection, coherence gating)
5. $108-171 BOM allows $299-499 retail with healthy margins

---
## 4. RuVector Integration Map (Extended)

All five crates are exercised across the exotic tiers:

| Tier | Crate | API | Role |
|------|-------|-----|------|
| 1 (Field) | `ruvector-solver` | `NeumannSolver` + SVD | Environmental mode decomposition |
| 1 (Field) | `ruvector-temporal-tensor` | `TemporalTensorCompressor` | Baseline history storage |
| 1 (Field) | `ruvector-attn-mincut` | `attn_mincut` | Mode-subcarrier assignment |
| 2 (Tomo) | `ruvector-solver` | `NeumannSolver` (L1) | Sparse tomographic inversion |
| 3 (Intent) | `ruvector-attention` | `ScaledDotProductAttention` | Temporal trajectory weighting |
| 3 (Intent) | `ruvector-temporal-tensor` | `CompressedCsiBuffer` | 2-second embedding history |
| 4 (Drift) | `ruvector-temporal-tensor` | `TemporalTensorCompressor` | Daily summary compression |
| 4 (Drift) | `ruvector-attention` | `ScaledDotProductAttention` | Metric drift significance |
| 4 (Drift) | `ruvector-mincut` | `DynamicMinCut` | Temporal changepoint detection |
| 5 (Cross-Room) | `ruvector-attention` | HNSW | Room and person fingerprint matching |
| 5 (Cross-Room) | `ruvector-mincut` | `MinCutBuilder` | Transition graph partitioning |
| 6 (Gesture) | `ruvector-attention` | `ScaledDotProductAttention` | Gesture template matching |
| 7 (Adversarial) | `ruvector-solver` | `NeumannSolver` | Physical plausibility verification |
| 7 (Adversarial) | `ruvector-attn-mincut` | `attn_mincut` | Multi-link consistency check |
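The `ScaledDotProductAttention` rows above (trajectory weighting, drift significance, gesture matching) all reduce to the same core operation. A minimal dependency-free sketch of weighting an embedding history against a current query; this illustrates the math, not the `ruvector-attention` Rust API:

```python
import math

def scaled_dot_product_attention(query, keys, values):
    """softmax(q . k / sqrt(d)) weighted sum over values; one query vector."""
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]   # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    # Blend the value vectors by their attention weights.
    out = [sum(w * v[i] for w, v in zip(weights, values)) for i in range(len(values[0]))]
    return out, weights

# A 2-second embedding history at 20 Hz would be 40 frames; 4 toy frames shown.
history = [[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.0, 1.0]]
query = [0.0, 1.0]                             # current pose embedding
ctx, w = scaled_dot_product_attention(query, history, history)
assert w[3] > w[0]   # frames matching the query dominate the weighting
```

In Tier 3 the history frames act as keys and values and the latest embedding as the query, so the returned context vector emphasizes the trajectory segments most consistent with the current motion.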
---

## 5. Implementation Priority

| Priority | Tier | Module | Weeks | Dependency |
|----------|------|--------|-------|------------|
| P0 | 1 | `field_model.rs` | 2 | ADR-029 multistatic mesh operational |
| P0 | 4 | `longitudinal.rs` | 2 | Tier 1 baseline + AETHER embeddings |
| P1 | 2 | `tomography.rs` | 1 | Tier 1 perturbation extraction |
| P1 | 3 | `intention.rs` | 2 | Tier 1 + temporal embedding history |
| P2 | 5 | `cross_room.rs` | 2 | Tier 4 person profiles + multi-room deployment |
| P2 | 6 | `gesture.rs` | 1 | Tier 1 perturbation + per-person separation |
| P3 | 7 | `adversarial.rs` | 1 | Tier 1 field model + multi-link consistency |

**Total exotic tier: ~11 weeks after ADR-029 acceptance test passes.**

---
## 6. Consequences

### 6.1 Positive

- **Room becomes self-sensing**: Field normal modes provide a persistent baseline that explains change as structured deltas
- **7-day autonomous operation**: Coherence gating + SONA adaptation + longitudinal memory eliminate manual tuning
- **Privacy by design**: No images, no audio, no reconstructable data — only embeddings and statistical summaries
- **Traceable evidence**: Every drift alert links to stored embeddings, timestamps, and graph constraints
- **Multiple product categories**: Same software stack, different packaging — Guardian, Twin, Interaction, Drift Monitor
- **Regulatory clarity**: Consumer wellness first, clinical decision support later with accumulated dataset
- **Security primitive**: Coherence gating detects adversarial injection, not just quality issues
### 6.2 Negative

- **7-day calibration** required for personal baselines (system is less useful during the initial period)
- **Empty-room calibration** needed for field normal modes (may not always be available)
- **Storage growth**: Longitudinal memory grows ~1 KB/person/day (manageable but non-zero)
- **Statistical power**: Drift detection requires 14+ days of data for meaningful z-scores
- **Multi-room**: Cross-room continuity requires hardware in all rooms (cost scales linearly)
### 6.3 Risks

| Risk | Probability | Impact | Mitigation |
|------|-------------|--------|------------|
| Field modes drift faster than expected | Medium | False perturbation detections | Reduce mode update interval from 24h to 4h |
| Personal baselines too variable | Medium | High false alarm rate for drift | Widen threshold from 2σ to 3σ; require 5+ days |
| Cross-room matching fails for similar body types | Low | Identity confusion | Require temporal proximity (<60s) plus spatial adjacency |
| Insufficient SNR for gesture recognition | Medium | <80% accuracy | Restrict to near-field (<2m) initially |
| Adversarial injection via coordinated WiFi transmitters | Very Low | Spoofed occupancy | Multi-link consistency check makes single-link spoofing detectable |
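The cross-room mitigation (fingerprint agreement gated by <60s temporal proximity) can be made concrete. A hedged sketch: the function and field names and the 0.85 similarity threshold are illustrative, and the spatial-adjacency check is omitted for brevity.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def match_cross_room(departed, appeared, sim_threshold=0.85, max_gap_s=60.0):
    """Link a track that left room A to one that appeared in room B only when
    the person fingerprints agree AND the appearance is temporally plausible."""
    gap = appeared["t_first_seen"] - departed["t_last_seen"]
    if not (0.0 <= gap <= max_gap_s):
        return False   # temporal proximity gate: too late (or before departure)
    return cosine(departed["embedding"], appeared["embedding"]) >= sim_threshold

a = {"t_last_seen": 100.0, "embedding": [0.6, 0.8, 0.1]}
b = {"t_first_seen": 112.0, "embedding": [0.58, 0.81, 0.12]}   # 12s later, same person
c = {"t_first_seen": 500.0, "embedding": [0.6, 0.8, 0.1]}      # same body type, 400s later
print(match_cross_room(a, b), match_cross_room(a, c))  # True False
```

The temporal gate is what defuses the "similar body types" risk: an identical fingerprint appearing far outside the plausible transit window is rejected regardless of similarity score.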
---

## 7. Related ADRs

| ADR | Relationship |
|-----|-------------|
| ADR-029 | **Prerequisite**: Multistatic mesh is the sensing substrate for all exotic tiers |
| ADR-005 (SONA) | **Extended**: SONA recalibration triggered by coherence gate → now also by drift events |
| ADR-016 (RuVector) | **Extended**: All 5 crates exercised across 7 exotic tiers |
| ADR-024 (AETHER) | **Critical dependency**: Embeddings are the representation for all longitudinal memory |
| ADR-026 (Tracking) | **Extended**: Track lifecycle now spans days (not minutes) for drift detection |
| ADR-027 (MERIDIAN) | **Used**: Room geometry encoding for field normal mode conditioning |

---
## 8. References

1. IEEE 802.11bf-2024. "WLAN Sensing." IEEE Standards Association.
2. FDA. "General Wellness: Policy for Low Risk Devices." Guidance Document, 2019.
3. EU MDR 2017/745. "Medical Device Regulation." Official Journal of the European Union.
4. Welford, B.P. (1962). "Note on a Method for Calculating Corrected Sums of Squares." Technometrics.
5. Chen, L. et al. (2026). "PerceptAlign: Geometry-Aware WiFi Sensing." arXiv:2601.12252.
6. AM-FM (2026). "A Foundation Model for Ambient Intelligence Through WiFi." arXiv:2602.11200.
7. Geng, J. et al. (2023). "DensePose From WiFi." arXiv:2301.00250.
1027 docs/ddd/ruvsense-domain-model.md (new file; diff suppressed because it is too large)
1495 docs/research/ruvsense-multistatic-fidelity-architecture.md (new file; diff suppressed because it is too large)
@@ -1 +0,0 @@
EXPO_PUBLIC_DEFAULT_SERVER_URL=http://192.168.1.100:8080
@@ -1,26 +0,0 @@
module.exports = {
  root: true,
  parser: '@typescript-eslint/parser',
  parserOptions: {
    ecmaVersion: 'latest',
    sourceType: 'module',
    ecmaFeatures: {
      jsx: true,
    },
  },
  plugins: ['@typescript-eslint', 'react', 'react-hooks'],
  extends: [
    'eslint:recommended',
    'plugin:react/recommended',
    'plugin:react-hooks/recommended',
    'plugin:@typescript-eslint/recommended',
  ],
  settings: {
    react: {
      version: 'detect',
    },
  },
  rules: {
    'react/react-in-jsx-scope': 'off',
  },
};
41 mobile/.gitignore (vendored)
@@ -1,41 +0,0 @@
|
||||
# Learn more https://docs.github.com/en/get-started/getting-started-with-git/ignoring-files
|
||||
|
||||
# dependencies
|
||||
node_modules/
|
||||
|
||||
# Expo
|
||||
.expo/
|
||||
dist/
|
||||
web-build/
|
||||
expo-env.d.ts
|
||||
|
||||
# Native
|
||||
.kotlin/
|
||||
*.orig.*
|
||||
*.jks
|
||||
*.p8
|
||||
*.p12
|
||||
*.key
|
||||
*.mobileprovision
|
||||
|
||||
# Metro
|
||||
.metro-health-check*
|
||||
|
||||
# debug
|
||||
npm-debug.*
|
||||
yarn-debug.*
|
||||
yarn-error.*
|
||||
|
||||
# macOS
|
||||
.DS_Store
|
||||
*.pem
|
||||
|
||||
# local env files
|
||||
.env*.local
|
||||
|
||||
# typescript
|
||||
*.tsbuildinfo
|
||||
|
||||
# generated native folders
|
||||
/ios
|
||||
/android
|
||||
@@ -1,4 +0,0 @@
{
  "singleQuote": true,
  "trailingComma": "all"
}
@@ -1,38 +0,0 @@
import { useEffect } from 'react';
import { NavigationContainer, DarkTheme } from '@react-navigation/native';
import { GestureHandlerRootView } from 'react-native-gesture-handler';
import { StatusBar } from 'expo-status-bar';
import { SafeAreaProvider } from 'react-native-safe-area-context';
import { ThemeProvider } from './src/theme/ThemeContext';
import { RootNavigator } from './src/navigation/RootNavigator';

export default function App() {
  useEffect(() => {
    (globalThis as { __appStartTime?: number }).__appStartTime = Date.now();
  }, []);

  const navigationTheme = {
    ...DarkTheme,
    colors: {
      ...DarkTheme.colors,
      background: '#0A0E1A',
      card: '#0D1117',
      text: '#E2E8F0',
      border: '#1E293B',
      primary: '#32B8C6',
    },
  };

  return (
    <GestureHandlerRootView style={{ flex: 1 }}>
      <SafeAreaProvider>
        <ThemeProvider>
          <NavigationContainer theme={navigationTheme}>
            <RootNavigator />
          </NavigationContainer>
        </ThemeProvider>
      </SafeAreaProvider>
      <StatusBar style="light" />
    </GestureHandlerRootView>
  );
}
@@ -1,12 +0,0 @@
export default {
  name: 'WiFi-DensePose',
  slug: 'wifi-densepose',
  version: '1.0.0',
  ios: {
    bundleIdentifier: 'com.ruvnet.wifidensepose',
  },
  android: {
    package: 'com.ruvnet.wifidensepose',
  },
  // Use expo-env and app-level defaults from the project configuration when available.
};
@@ -1,30 +0,0 @@
{
  "expo": {
    "name": "mobile",
    "slug": "mobile",
    "version": "1.0.0",
    "orientation": "portrait",
    "icon": "./assets/icon.png",
    "userInterfaceStyle": "light",
    "splash": {
      "image": "./assets/splash-icon.png",
      "resizeMode": "contain",
      "backgroundColor": "#ffffff"
    },
    "ios": {
      "supportsTablet": true
    },
    "android": {
      "adaptiveIcon": {
        "backgroundColor": "#E6F4FE",
        "foregroundImage": "./assets/android-icon-foreground.png",
        "backgroundImage": "./assets/android-icon-background.png",
        "monochromeImage": "./assets/android-icon-monochrome.png"
      },
      "predictiveBackGestureEnabled": false
    },
    "web": {
      "favicon": "./assets/favicon.png"
    }
  }
}
Binary file not shown. (Before: 17 KiB)
Binary file not shown. (Before: 77 KiB)
Binary file not shown. (Before: 4.0 KiB)
Binary file not shown. (Before: 1.1 KiB)
Binary file not shown. (Before: 384 KiB)
Binary file not shown. (Before: 17 KiB)
@@ -1,9 +0,0 @@
module.exports = function (api) {
  api.cache(true);
  return {
    presets: ['babel-preset-expo'],
    plugins: [
      'react-native-reanimated/plugin'
    ]
  };
};
@@ -1,17 +0,0 @@
{
  "cli": {
    "version": ">= 4.0.0"
  },
  "build": {
    "development": {
      "developmentClient": true,
      "distribution": "internal"
    },
    "preview": {
      "distribution": "internal"
    },
    "production": {
      "autoIncrement": true
    }
  }
}
@@ -1,4 +0,0 @@
import { registerRootComponent } from 'expo';
import App from './App';

registerRootComponent(App);
@@ -1,8 +0,0 @@
module.exports = {
  preset: 'jest-expo',
  setupFilesAfterEnv: ['<rootDir>/jest.setup.ts'],
  testPathIgnorePatterns: ['/src/__tests__/'],
  transformIgnorePatterns: [
    'node_modules/(?!(expo|expo-.+|react-native|@react-native|react-native-webview|react-native-reanimated|react-native-svg|react-native-safe-area-context|react-native-screens|@react-navigation|@expo|@unimodules|expo-modules-core)/)',
  ],
};
@@ -1,11 +0,0 @@
jest.mock('@react-native-async-storage/async-storage', () =>
  require('@react-native-async-storage/async-storage/jest/async-storage-mock')
);

jest.mock('react-native-wifi-reborn', () => ({
  loadWifiList: jest.fn(async () => []),
}));

jest.mock('react-native-reanimated', () =>
  require('react-native-reanimated/mock')
);
16327 mobile/package-lock.json (generated; diff suppressed because it is too large)
@@ -1,49 +0,0 @@
{
  "name": "mobile",
  "version": "1.0.0",
  "main": "index.ts",
  "scripts": {
    "start": "expo start",
    "android": "expo start --android",
    "ios": "expo start --ios",
    "web": "expo start --web",
    "test": "jest",
    "lint": "eslint ."
  },
  "dependencies": {
    "@expo/vector-icons": "^15.0.2",
    "@react-native-async-storage/async-storage": "2.2.0",
    "@react-navigation/bottom-tabs": "^7.15.3",
    "@react-navigation/native": "^7.1.31",
    "axios": "^1.13.6",
    "expo": "~55.0.4",
    "expo-status-bar": "~55.0.4",
    "react": "19.2.0",
    "react-native": "0.83.2",
    "react-native-gesture-handler": "~2.30.0",
    "react-native-reanimated": "4.2.1",
    "react-native-safe-area-context": "~5.6.2",
    "react-native-screens": "~4.23.0",
    "react-native-svg": "15.15.3",
    "react-native-webview": "13.16.0",
    "react-native-wifi-reborn": "^4.13.6",
    "victory-native": "^41.20.2",
    "zustand": "^5.0.11"
  },
  "devDependencies": {
    "@testing-library/jest-native": "^5.4.3",
    "@testing-library/react-native": "^13.3.3",
    "@types/jest": "^30.0.0",
    "@types/react": "~19.2.2",
    "@typescript-eslint/eslint-plugin": "^8.56.1",
    "@typescript-eslint/parser": "^8.56.1",
    "babel-preset-expo": "^55.0.10",
    "eslint": "^10.0.2",
    "jest": "^30.2.0",
    "jest-expo": "^55.0.9",
    "prettier": "^3.8.1",
    "react-native-worklets": "^0.7.4",
    "typescript": "~5.9.2"
  },
  "private": true
}
@@ -1,5 +0,0 @@
describe('placeholder', () => {
  it('passes', () => {
    expect(true).toBe(true);
  });
});
@@ -1,585 +0,0 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8" />
  <meta
    name="viewport"
    content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no"
  />
  <meta
    http-equiv="Content-Security-Policy"
    content="default-src 'self'; script-src 'self' https://cdnjs.cloudflare.com https://cdn.jsdelivr.net; style-src 'self' 'unsafe-inline'; img-src 'self' data:; connect-src 'self'"
  />
  <title>WiFi DensePose Splat Viewer</title>
  <style>
    html,
    body,
    #gaussian-splat-root {
      margin: 0;
      width: 100%;
      height: 100%;
      overflow: hidden;
      background: #0a0e1a;
      touch-action: none;
    }

    #gaussian-splat-root {
      position: relative;
    }
  </style>
</head>
<body>
  <div id="gaussian-splat-root"></div>

  <script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/r165/three.min.js"></script>
  <script src="https://cdn.jsdelivr.net/npm/three@0.165.0/examples/js/controls/OrbitControls.js"></script>

  <script>
    (function () {
      const postMessageToRN = (message) => {
        if (!window.ReactNativeWebView || typeof window.ReactNativeWebView.postMessage !== 'function') {
          return;
        }

        try {
          window.ReactNativeWebView.postMessage(JSON.stringify(message));
        } catch (error) {
          console.error('Failed to post RN message', error);
        }
      };

      const postError = (message) => {
        postMessageToRN({
          type: 'ERROR',
          payload: {
            message: typeof message === 'string' ? message : 'Unknown bridge error',
          },
        });
      };

      // Use global THREE from CDN
      const getThree = () => window.THREE;

      // ---- Custom Splat Shaders --------------------------------------------

      const SPLAT_VERTEX = `
        attribute float splatSize;
        attribute vec3 splatColor;
        attribute float splatOpacity;

        varying vec3 vColor;
        varying float vOpacity;

        void main() {
          vColor = splatColor;
          vOpacity = splatOpacity;

          vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
          gl_PointSize = splatSize * (300.0 / -mvPosition.z);
          gl_Position = projectionMatrix * mvPosition;
        }
      `;

      const SPLAT_FRAGMENT = `
        varying vec3 vColor;
        varying float vOpacity;

        void main() {
          // Circular soft-edge disc
          float dist = length(gl_PointCoord - vec2(0.5));
          if (dist > 0.5) discard;
          float alpha = smoothstep(0.5, 0.2, dist) * vOpacity;
          gl_FragColor = vec4(vColor, alpha);
        }
      `;

      // ---- Color helpers ---------------------------------------------------

      /** Map a scalar 0-1 to blue -> green -> red gradient */
      function valueToColor(v) {
        const clamped = Math.max(0, Math.min(1, v));
        // blue(0) -> cyan(0.25) -> green(0.5) -> yellow(0.75) -> red(1)
        let r;
        let g;
        let b;
        if (clamped < 0.5) {
          const t = clamped * 2;
          r = 0;
          g = t;
          b = 1 - t;
        } else {
          const t = (clamped - 0.5) * 2;
          r = t;
          g = 1 - t;
          b = 0;
        }
        return [r, g, b];
      }

      // ---- GaussianSplatRenderer -------------------------------------------

      class GaussianSplatRenderer {
        /** @param {HTMLElement} container - DOM element to attach the renderer to */
        constructor(container, opts = {}) {
          const THREE = getThree();
          if (!THREE) {
            throw new Error('Three.js not loaded');
          }

          this.container = container;
          this.width = opts.width || container.clientWidth || 800;
          this.height = opts.height || 500;

          // Scene
          this.scene = new THREE.Scene();
          this.scene.background = new THREE.Color(0x0a0e1a);

          // Camera — perspective looking down at the room
          this.camera = new THREE.PerspectiveCamera(45, this.width / this.height, 0.1, 200);
          this.camera.position.set(0, 10, 12);
          this.camera.lookAt(0, 0, 0);

          // Renderer
          this.renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
          this.renderer.setSize(this.width, this.height);
          this.renderer.setPixelRatio(Math.min(window.devicePixelRatio, 2));
          container.appendChild(this.renderer.domElement);

          // Lights
          const ambient = new THREE.AmbientLight(0x9ec7ff, 0.35);
          this.scene.add(ambient);

          const directional = new THREE.DirectionalLight(0x9ec7ff, 0.65);
          directional.position.set(4, 10, 6);
          directional.castShadow = false;
          this.scene.add(directional);

          // Grid & room
          this._createRoom(THREE);

          // Signal field splats (20x20 = 400 points on the floor plane)
          this.gridSize = 20;
          this._createFieldSplats(THREE);

          // Node markers (ESP32 / router positions)
          this._createNodeMarkers(THREE);

          // Body disruption blob
          this._createBodyBlob(THREE);

          // Orbit controls for drag + pinch zoom
          this.controls = new THREE.OrbitControls(this.camera, this.renderer.domElement);
          this.controls.target.set(0, 0, 0);
          this.controls.minDistance = 6;
          this.controls.maxDistance = 40;
          this.controls.enableDamping = true;
          this.controls.dampingFactor = 0.08;
          this.controls.update();

          // Animation state
          this._animFrame = null;
          this._lastData = null;
          this._fpsFrames = [];
          this._lastFpsReport = 0;

          // Start render loop
          this._animate();
        }

        // ---- Scene setup ---------------------------------------------------

        _createRoom(THREE) {
          // Floor grid (on y = 0), 20 units
          const grid = new THREE.GridHelper(20, 20, 0x1a3a4a, 0x0d1f28);
          grid.position.y = 0;
          this.scene.add(grid);

          // Room boundary wireframe
          const boxGeo = new THREE.BoxGeometry(20, 6, 20);
          const edges = new THREE.EdgesGeometry(boxGeo);
          const line = new THREE.LineSegments(
            edges,
            new THREE.LineBasicMaterial({ color: 0x1a4a5a, opacity: 0.3, transparent: true }),
          );
          line.position.y = 3;
          this.scene.add(line);
        }

        _createFieldSplats(THREE) {
          const count = this.gridSize * this.gridSize;

          const positions = new Float32Array(count * 3);
          const sizes = new Float32Array(count);
          const colors = new Float32Array(count * 3);
          const opacities = new Float32Array(count);

          // Lay splats on the floor plane (y = 0.05 to sit just above grid)
          for (let iz = 0; iz < this.gridSize; iz++) {
            for (let ix = 0; ix < this.gridSize; ix++) {
              const idx = iz * this.gridSize + ix;
              positions[idx * 3 + 0] = (ix - this.gridSize / 2) + 0.5; // x
              positions[idx * 3 + 1] = 0.05; // y
              positions[idx * 3 + 2] = (iz - this.gridSize / 2) + 0.5; // z

              sizes[idx] = 1.5;
              colors[idx * 3] = 0.1;
              colors[idx * 3 + 1] = 0.2;
              colors[idx * 3 + 2] = 0.6;
              opacities[idx] = 0.15;
            }
          }

          const geo = new THREE.BufferGeometry();
          geo.setAttribute('position', new THREE.BufferAttribute(positions, 3));
          geo.setAttribute('splatSize', new THREE.BufferAttribute(sizes, 1));
          geo.setAttribute('splatColor', new THREE.BufferAttribute(colors, 3));
          geo.setAttribute('splatOpacity', new THREE.BufferAttribute(opacities, 1));

          const mat = new THREE.ShaderMaterial({
            vertexShader: SPLAT_VERTEX,
            fragmentShader: SPLAT_FRAGMENT,
            transparent: true,
            depthWrite: false,
            blending: THREE.AdditiveBlending,
          });

          this.fieldPoints = new THREE.Points(geo, mat);
          this.scene.add(this.fieldPoints);
        }

        _createNodeMarkers(THREE) {
          // Router at center — green sphere
          const routerGeo = new THREE.SphereGeometry(0.3, 16, 16);
          const routerMat = new THREE.MeshBasicMaterial({ color: 0x00ff88, transparent: true, opacity: 0.8 });
          this.routerMarker = new THREE.Mesh(routerGeo, routerMat);
          this.routerMarker.position.set(0, 0.5, 0);
          this.scene.add(this.routerMarker);

          // ESP32 node — cyan sphere (default position, updated from data)
          const nodeGeo = new THREE.SphereGeometry(0.25, 16, 16);
          const nodeMat = new THREE.MeshBasicMaterial({ color: 0x00ccff, transparent: true, opacity: 0.8 });
          this.nodeMarker = new THREE.Mesh(nodeGeo, nodeMat);
          this.nodeMarker.position.set(2, 0.5, 1.5);
          this.scene.add(this.nodeMarker);
        }

        _createBodyBlob(THREE) {
          // A cluster of splats representing body disruption
          const count = 64;
          const positions = new Float32Array(count * 3);
          const sizes = new Float32Array(count);
          const colors = new Float32Array(count * 3);
          const opacities = new Float32Array(count);

          for (let i = 0; i < count; i++) {
            // Random sphere distribution
            const theta = Math.random() * Math.PI * 2;
            const phi = Math.acos(2 * Math.random() - 1);
            const r = Math.random() * 1.5;
            positions[i * 3] = r * Math.sin(phi) * Math.cos(theta);
            positions[i * 3 + 1] = r * Math.cos(phi) + 2;
            positions[i * 3 + 2] = r * Math.sin(phi) * Math.sin(theta);

            sizes[i] = 2 + Math.random() * 3;
            colors[i * 3] = 0.2;
            colors[i * 3 + 1] = 0.8;
            colors[i * 3 + 2] = 0.3;
            opacities[i] = 0.0; // hidden until presence detected
          }

          const geo = new THREE.BufferGeometry();
          geo.setAttribute('position', new THREE.BufferAttribute(positions, 3));
          geo.setAttribute('splatSize', new THREE.BufferAttribute(sizes, 1));
          geo.setAttribute('splatColor', new THREE.BufferAttribute(colors, 3));
          geo.setAttribute('splatOpacity', new THREE.BufferAttribute(opacities, 1));

          const mat = new THREE.ShaderMaterial({
            vertexShader: SPLAT_VERTEX,
            fragmentShader: SPLAT_FRAGMENT,
            transparent: true,
            depthWrite: false,
            blending: THREE.AdditiveBlending,
          });

          this.bodyBlob = new THREE.Points(geo, mat);
          this.scene.add(this.bodyBlob);
        }

        // ---- Data update --------------------------------------------------

        /**
         * Update the visualization with new sensing data.
         * @param {object} data - sensing_update JSON from ws_server
         */
        update(data) {
          this._lastData = data;
          if (!data) return;

          const features = data.features || {};
          const classification = data.classification || {};
          const signalField = data.signal_field || {};
          const nodes = data.nodes || [];

          // -- Update signal field splats ------------------------------------
          if (signalField.values && this.fieldPoints) {
            const geo = this.fieldPoints.geometry;
            const clr = geo.attributes.splatColor.array;
            const sizes = geo.attributes.splatSize.array;
            const opac = geo.attributes.splatOpacity.array;
            const vals = signalField.values;
            const count = Math.min(vals.length, this.gridSize * this.gridSize);

            for (let i = 0; i < count; i++) {
              const v = vals[i];
              const [r, g, b] = valueToColor(v);
              clr[i * 3] = r;
              clr[i * 3 + 1] = g;
              clr[i * 3 + 2] = b;
              sizes[i] = 1.0 + v * 4.0;
              opac[i] = 0.1 + v * 0.6;
            }

            geo.attributes.splatColor.needsUpdate = true;
            geo.attributes.splatSize.needsUpdate = true;
            geo.attributes.splatOpacity.needsUpdate = true;
          }

          // -- Update body blob ----------------------------------------------
          if (this.bodyBlob) {
            const bGeo = this.bodyBlob.geometry;
            const bOpac = bGeo.attributes.splatOpacity.array;
            const bClr = bGeo.attributes.splatColor.array;
            const bSize = bGeo.attributes.splatSize.array;

            const presence = classification.presence || false;
            const motionLvl = classification.motion_level || 'absent';
            const confidence = classification.confidence || 0;
            const breathing = features.breathing_band_power || 0;

            // Breathing pulsation
            const breathPulse = 1.0 + Math.sin(Date.now() * 0.004) * Math.min(breathing * 3, 0.4);

            for (let i = 0; i < bOpac.length; i++) {
              if (presence) {
                bOpac[i] = confidence * 0.4;

                // Color by motion level
                if (motionLvl === 'active') {
                  bClr[i * 3] = 1.0;
                  bClr[i * 3 + 1] = 0.2;
                  bClr[i * 3 + 2] = 0.1;
                } else {
                  bClr[i * 3] = 0.1;
                  bClr[i * 3 + 1] = 0.8;
                  bClr[i * 3 + 2] = 0.4;
                }

                bSize[i] = (2 + Math.random() * 2) * breathPulse;
              } else {
                bOpac[i] = 0.0;
              }
            }

            bGeo.attributes.splatOpacity.needsUpdate = true;
            bGeo.attributes.splatColor.needsUpdate = true;
            bGeo.attributes.splatSize.needsUpdate = true;
          }

          // -- Update node positions -----------------------------------------
          if (nodes.length > 0 && nodes[0].position && this.nodeMarker) {
            const pos = nodes[0].position;
            this.nodeMarker.position.set(pos[0], 0.5, pos[2]);
          }
        }

        // ---- Render loop -------------------------------------------------

        _animate() {
          this._animFrame = requestAnimationFrame(() => this._animate());

          const now = performance.now();

          // Gentle router glow pulse
          if (this.routerMarker) {
            const pulse = 0.6 + 0.3 * Math.sin(now * 0.003);
            this.routerMarker.material.opacity = pulse;
          }

          this.controls.update();
          this.renderer.render(this.scene, this.camera);

          this._fpsFrames.push(now);
          while (this._fpsFrames.length > 0 && this._fpsFrames[0] < now - 1000) {
            this._fpsFrames.shift();
          }

          if (now - this._lastFpsReport >= 1000) {
            const fps = this._fpsFrames.length;
            this._lastFpsReport = now;
            postMessageToRN({
              type: 'FPS_TICK',
              payload: { fps },
            });
          }
        }

        // ---- Resize / cleanup --------------------------------------------

        resize(width, height) {
          if (!width || !height) return;
          this.width = width;
          this.height = height;
          this.camera.aspect = width / height;
this.camera.updateProjectionMatrix();
|
||||
this.renderer.setSize(width, height);
|
||||
}
|
||||
|
||||
dispose() {
|
||||
if (this._animFrame) {
|
||||
cancelAnimationFrame(this._animFrame);
|
||||
}
|
||||
|
||||
this.controls?.dispose();
|
||||
this.renderer.dispose();
|
||||
if (this.renderer.domElement.parentNode) {
|
||||
this.renderer.domElement.parentNode.removeChild(this.renderer.domElement);
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Expose renderer constructor for debugging/interop
|
||||
window.GaussianSplatRenderer = GaussianSplatRenderer;
|
||||
|
||||
let renderer = null;
|
||||
let pendingFrame = null;
|
||||
let pendingResize = null;
|
||||
|
||||
const postSafeReady = () => {
|
||||
postMessageToRN({ type: 'READY' });
|
||||
};
|
||||
|
||||
const routeMessage = (event) => {
|
||||
let raw = event.data;
|
||||
if (typeof raw === 'object' && raw != null && 'data' in raw) {
|
||||
raw = raw.data;
|
||||
}
|
||||
|
||||
let message = raw;
|
||||
if (typeof raw === 'string') {
|
||||
try {
|
||||
message = JSON.parse(raw);
|
||||
} catch (err) {
|
||||
postError('Failed to parse RN message payload');
|
||||
return;
|
||||
}
|
||||
}
|
||||
|
||||
if (!message || typeof message !== 'object') {
|
||||
return;
|
||||
}
|
||||
|
||||
if (message.type === 'FRAME_UPDATE') {
|
||||
const payload = message.payload || null;
|
||||
if (!payload) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (!renderer) {
|
||||
pendingFrame = payload;
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
renderer.update(payload);
|
||||
} catch (error) {
|
||||
postError((error && error.message) || 'Failed to update frame');
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
if (message.type === 'RESIZE') {
|
||||
const dims = message.payload || {};
|
||||
const w = Number(dims.width);
|
||||
const h = Number(dims.height);
|
||||
if (!Number.isFinite(w) || !Number.isFinite(h) || !w || !h) {
|
||||
return;
|
||||
}
|
||||
|
||||
if (!renderer) {
|
||||
pendingResize = { width: w, height: h };
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
renderer.resize(w, h);
|
||||
} catch (error) {
|
||||
postError((error && error.message) || 'Failed to resize renderer');
|
||||
}
|
||||
return;
|
||||
}
|
||||
|
||||
if (message.type === 'DISPOSE') {
|
||||
if (!renderer) {
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
renderer.dispose();
|
||||
} catch (error) {
|
||||
postError((error && error.message) || 'Failed to dispose renderer');
|
||||
}
|
||||
renderer = null;
|
||||
return;
|
||||
}
|
||||
};
|
||||
|
||||
const buildRenderer = () => {
|
||||
const container = document.getElementById('gaussian-splat-root');
|
||||
if (!container) {
|
||||
return;
|
||||
}
|
||||
|
||||
try {
|
||||
renderer = new GaussianSplatRenderer(container, {
|
||||
width: container.clientWidth || window.innerWidth,
|
||||
height: container.clientHeight || window.innerHeight,
|
||||
});
|
||||
|
||||
if (pendingFrame) {
|
||||
renderer.update(pendingFrame);
|
||||
pendingFrame = null;
|
||||
}
|
||||
|
||||
if (pendingResize) {
|
||||
renderer.resize(pendingResize.width, pendingResize.height);
|
||||
pendingResize = null;
|
||||
}
|
||||
|
||||
postSafeReady();
|
||||
} catch (error) {
|
||||
renderer = null;
|
||||
postError((error && error.message) || 'Failed to initialize renderer');
|
||||
}
|
||||
};
|
||||
|
||||
if (document.readyState === 'loading') {
|
||||
document.addEventListener('DOMContentLoaded', buildRenderer);
|
||||
} else {
|
||||
buildRenderer();
|
||||
}
|
||||
|
||||
window.addEventListener('message', routeMessage);
|
||||
window.addEventListener('resize', () => {
|
||||
if (!renderer) {
|
||||
pendingResize = {
|
||||
width: window.innerWidth,
|
||||
height: window.innerHeight,
|
||||
};
|
||||
return;
|
||||
}
|
||||
renderer.resize(window.innerWidth, window.innerHeight);
|
||||
});
|
||||
})();
|
||||
</script>
|
||||
</body>
|
||||
</html>
|
||||
@@ -1,70 +0,0 @@
import { StyleSheet, View } from 'react-native';
import { ThemedText } from './ThemedText';

type ConnectionState = 'connected' | 'simulated' | 'disconnected';

type ConnectionBannerProps = {
  status: ConnectionState;
};

const resolveState = (status: ConnectionState) => {
  if (status === 'connected') {
    return {
      label: 'LIVE STREAM',
      backgroundColor: '#0F6B2A',
      textColor: '#E2FFEA',
    };
  }

  if (status === 'disconnected') {
    return {
      label: 'DISCONNECTED',
      backgroundColor: '#8A1E2A',
      textColor: '#FFE3E7',
    };
  }

  return {
    label: 'SIMULATED DATA',
    backgroundColor: '#9A5F0C',
    textColor: '#FFF3E1',
  };
};

export const ConnectionBanner = ({ status }: ConnectionBannerProps) => {
  const state = resolveState(status);

  return (
    <View
      style={[
        styles.banner,
        {
          backgroundColor: state.backgroundColor,
          borderBottomColor: state.textColor,
        },
      ]}
    >
      <ThemedText preset="labelMd" style={[styles.text, { color: state.textColor }]}>
        {state.label}
      </ThemedText>
    </View>
  );
};

const styles = StyleSheet.create({
  banner: {
    position: 'absolute',
    left: 0,
    right: 0,
    top: 0,
    zIndex: 100,
    paddingVertical: 6,
    borderBottomWidth: 2,
    alignItems: 'center',
    justifyContent: 'center',
  },
  text: {
    letterSpacing: 2,
    fontWeight: '700',
  },
});
@@ -1,66 +0,0 @@
import { Component, ErrorInfo, ReactNode } from 'react';
import { Button, StyleSheet, View } from 'react-native';
import { ThemedText } from './ThemedText';
import { ThemedView } from './ThemedView';

type ErrorBoundaryProps = {
  children: ReactNode;
};

type ErrorBoundaryState = {
  hasError: boolean;
  error?: Error;
};

export class ErrorBoundary extends Component<ErrorBoundaryProps, ErrorBoundaryState> {
  constructor(props: ErrorBoundaryProps) {
    super(props);
    this.state = { hasError: false };
  }

  static getDerivedStateFromError(error: Error): ErrorBoundaryState {
    return { hasError: true, error };
  }

  componentDidCatch(error: Error, errorInfo: ErrorInfo) {
    console.error('ErrorBoundary caught an error', error, errorInfo);
  }

  handleRetry = () => {
    this.setState({ hasError: false, error: undefined });
  };

  render() {
    if (this.state.hasError) {
      return (
        <ThemedView style={styles.container}>
          <ThemedText preset="displayMd">Something went wrong</ThemedText>
          <ThemedText preset="bodySm" style={styles.message}>
            {this.state.error?.message ?? 'An unexpected error occurred.'}
          </ThemedText>
          <View style={styles.buttonWrap}>
            <Button title="Retry" onPress={this.handleRetry} />
          </View>
        </ThemedView>
      );
    }

    return this.props.children;
  }
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    padding: 20,
    gap: 12,
  },
  message: {
    textAlign: 'center',
  },
  buttonWrap: {
    marginTop: 8,
  },
});
@@ -1,96 +0,0 @@
import { useEffect } from 'react';
import { StyleSheet, View } from 'react-native';
import Animated, { useAnimatedProps, useSharedValue, withTiming } from 'react-native-reanimated';
import Svg, { Circle, G, Text as SvgText } from 'react-native-svg';

type GaugeArcProps = {
  value: number;
  max: number;
  label: string;
  unit: string;
  color: string;
  size?: number;
};

const AnimatedCircle = Animated.createAnimatedComponent(Circle);

export const GaugeArc = ({ value, max, label, unit, color, size = 140 }: GaugeArcProps) => {
  const radius = (size - 20) / 2;
  const circumference = 2 * Math.PI * radius;
  const arcLength = circumference * 0.75;
  const strokeWidth = 12;
  const progress = useSharedValue(0);

  const normalized = Math.max(0, Math.min(max > 0 ? value / max : 0, 1));
  const displayText = `${value.toFixed(1)} ${unit}`;

  useEffect(() => {
    progress.value = withTiming(normalized, { duration: 600 });
  }, [normalized, progress]);

  const animatedStroke = useAnimatedProps(() => {
    const dashOffset = arcLength - arcLength * progress.value;
    return {
      strokeDashoffset: dashOffset,
    };
  });

  return (
    <View style={styles.wrapper}>
      <Svg width={size} height={size} viewBox={`0 0 ${size} ${size}`}>
        <G transform={`rotate(-135 ${size / 2} ${size / 2})`}>
          <Circle
            cx={size / 2}
            cy={size / 2}
            r={radius}
            strokeWidth={strokeWidth}
            stroke="#1E293B"
            fill="none"
            strokeDasharray={`${arcLength} ${circumference}`}
            strokeLinecap="round"
          />
          <AnimatedCircle
            cx={size / 2}
            cy={size / 2}
            r={radius}
            strokeWidth={strokeWidth}
            stroke={color}
            fill="none"
            strokeDasharray={`${arcLength} ${circumference}`}
            strokeLinecap="round"
            animatedProps={animatedStroke}
          />
        </G>
        <SvgText
          x={size / 2}
          y={size / 2 - 4}
          fill="#E2E8F0"
          fontSize={18}
          fontFamily="Courier New"
          fontWeight="700"
          textAnchor="middle"
        >
          {displayText}
        </SvgText>
        <SvgText
          x={size / 2}
          y={size / 2 + 16}
          fill="#94A3B8"
          fontSize={10}
          fontFamily="Courier New"
          textAnchor="middle"
          letterSpacing="0.6"
        >
          {label}
        </SvgText>
      </Svg>
    </View>
  );
};

const styles = StyleSheet.create({
  wrapper: {
    alignItems: 'center',
    justifyContent: 'center',
  },
});
@@ -1,60 +0,0 @@
import { useEffect } from 'react';
import { StyleSheet, ViewStyle } from 'react-native';
import Animated, { Easing, useAnimatedStyle, useSharedValue, withRepeat, withTiming } from 'react-native-reanimated';
import Svg, { Circle } from 'react-native-svg';
import { colors } from '../theme/colors';

type LoadingSpinnerProps = {
  size?: number;
  color?: string;
  style?: ViewStyle;
};

export const LoadingSpinner = ({ size = 36, color = colors.accent, style }: LoadingSpinnerProps) => {
  const rotation = useSharedValue(0);
  const strokeWidth = Math.max(4, size * 0.14);
  const center = size / 2;
  const radius = center - strokeWidth;
  const circumference = 2 * Math.PI * radius;

  useEffect(() => {
    rotation.value = withRepeat(withTiming(360, { duration: 900, easing: Easing.linear }), -1);
  }, [rotation]);

  const animatedStyle = useAnimatedStyle(() => ({
    transform: [{ rotateZ: `${rotation.value}deg` }],
  }));

  return (
    <Animated.View style={[styles.container, { width: size, height: size }, style, animatedStyle]} pointerEvents="none">
      <Svg width={size} height={size} viewBox={`0 0 ${size} ${size}`}>
        <Circle
          cx={center}
          cy={center}
          r={radius}
          stroke="rgba(255,255,255,0.2)"
          strokeWidth={strokeWidth}
          fill="none"
        />
        <Circle
          cx={center}
          cy={center}
          r={radius}
          stroke={color}
          strokeWidth={strokeWidth}
          fill="none"
          strokeLinecap="round"
          strokeDasharray={`${circumference * 0.3} ${circumference * 0.7}`}
          strokeDashoffset={circumference * 0.2}
        />
      </Svg>
    </Animated.View>
  );
};

const styles = StyleSheet.create({
  container: {
    alignItems: 'center',
    justifyContent: 'center',
  },
});
@@ -1,71 +0,0 @@
import { StyleSheet } from 'react-native';
import { ThemedText } from './ThemedText';
import { colors } from '../theme/colors';

type Mode = 'CSI' | 'RSSI' | 'SIM' | 'LIVE';

const modeStyle: Record<
  Mode,
  {
    background: string;
    border: string;
    color: string;
  }
> = {
  CSI: {
    background: 'rgba(50, 184, 198, 0.25)',
    border: colors.accent,
    color: colors.accent,
  },
  RSSI: {
    background: 'rgba(255, 165, 2, 0.2)',
    border: colors.warn,
    color: colors.warn,
  },
  SIM: {
    background: 'rgba(255, 71, 87, 0.18)',
    border: colors.simulated,
    color: colors.simulated,
  },
  LIVE: {
    background: 'rgba(46, 213, 115, 0.18)',
    border: colors.connected,
    color: colors.connected,
  },
};

type ModeBadgeProps = {
  mode: Mode;
};

export const ModeBadge = ({ mode }: ModeBadgeProps) => {
  const style = modeStyle[mode];

  return (
    <ThemedText
      preset="labelMd"
      style={[
        styles.badge,
        {
          backgroundColor: style.background,
          borderColor: style.border,
          color: style.color,
        },
      ]}
    >
      {mode}
    </ThemedText>
  );
};

const styles = StyleSheet.create({
  badge: {
    paddingHorizontal: 10,
    paddingVertical: 4,
    borderRadius: 999,
    borderWidth: 1,
    overflow: 'hidden',
    letterSpacing: 1,
    textAlign: 'center',
  },
});
@@ -1,147 +0,0 @@
import { useEffect, useMemo, useRef } from 'react';
import { StyleProp, ViewStyle } from 'react-native';
import Animated, { interpolateColor, useAnimatedProps, useSharedValue, withTiming, type SharedValue } from 'react-native-reanimated';
import Svg, { Circle, G, Rect } from 'react-native-svg';
import { colors } from '../theme/colors';

type Point = {
  x: number;
  y: number;
};

type OccupancyGridProps = {
  values: number[];
  personPositions?: Point[];
  size?: number;
  style?: StyleProp<ViewStyle>;
};

const GRID_DIMENSION = 20;
const CELLS = GRID_DIMENSION * GRID_DIMENSION;

const toColor = (value: number): string => {
  const clamped = Math.max(0, Math.min(1, value));
  let r: number;
  let g: number;
  let b: number;

  if (clamped < 0.5) {
    const t = clamped * 2;
    r = Math.round(255 * 0);
    g = Math.round(255 * t);
    b = Math.round(255 * (1 - t));
  } else {
    const t = (clamped - 0.5) * 2;
    r = Math.round(255 * t);
    g = Math.round(255 * (1 - t));
    b = 0;
  }

  return `rgb(${r}, ${g}, ${b})`;
};

const AnimatedRect = Animated.createAnimatedComponent(Rect);

const normalizeValues = (values: number[]) => {
  const normalized = new Array(CELLS).fill(0);
  for (let i = 0; i < CELLS; i += 1) {
    const value = values?.[i] ?? 0;
    normalized[i] = Number.isFinite(value) ? Math.max(0, Math.min(1, value)) : 0;
  }
  return normalized;
};

type CellProps = {
  index: number;
  size: number;
  progress: SharedValue<number>;
  previousColors: string[];
  nextColors: string[];
};

const Cell = ({ index, size, progress, previousColors, nextColors }: CellProps) => {
  const col = index % GRID_DIMENSION;
  const row = Math.floor(index / GRID_DIMENSION);
  const cellSize = size / GRID_DIMENSION;
  const x = col * cellSize;
  const y = row * cellSize;

  const animatedProps = useAnimatedProps(() => ({
    fill: interpolateColor(
      progress.value,
      [0, 1],
      [previousColors[index] ?? colors.surfaceAlt, nextColors[index] ?? colors.surfaceAlt],
    ),
  }));

  return (
    <AnimatedRect
      x={x}
      y={y}
      width={cellSize}
      height={cellSize}
      rx={1}
      animatedProps={animatedProps}
    />
  );
};

export const OccupancyGrid = ({
  values,
  personPositions = [],
  size = 320,
  style,
}: OccupancyGridProps) => {
  const normalizedValues = useMemo(() => normalizeValues(values), [values]);
  const previousColors = useRef<string[]>(normalizedValues.map(toColor));
  const nextColors = useRef<string[]>(normalizedValues.map(toColor));
  const progress = useSharedValue(1);

  useEffect(() => {
    const next = normalizeValues(values);
    previousColors.current = normalizedValues.map(toColor);
    nextColors.current = next.map(toColor);
    progress.value = 0;
    progress.value = withTiming(1, { duration: 500 });
  }, [values, normalizedValues, progress]);

  const markers = useMemo(() => {
    const cellSize = size / GRID_DIMENSION;
    return personPositions.map(({ x, y }, idx) => {
      const clampedX = Math.max(0, Math.min(GRID_DIMENSION - 1, Math.round(x)));
      const clampedY = Math.max(0, Math.min(GRID_DIMENSION - 1, Math.round(y)));
      const cx = (clampedX + 0.5) * cellSize;
      const cy = (clampedY + 0.5) * cellSize;
      const markerRadius = Math.max(3, cellSize * 0.25);
      return (
        <Circle
          key={`person-${idx}`}
          cx={cx}
          cy={cy}
          r={markerRadius}
          fill={colors.accent}
          stroke={colors.textPrimary}
          strokeWidth={1}
        />
      );
    });
  }, [personPositions, size]);

  return (
    <Svg width={size} height={size} style={style} viewBox={`0 0 ${size} ${size}`}>
      <G>
        {Array.from({ length: CELLS }).map((_, index) => (
          <Cell
            key={index}
            index={index}
            size={size}
            progress={progress}
            previousColors={previousColors.current}
            nextColors={nextColors.current}
          />
        ))}
      </G>
      {markers}
    </Svg>
  );
};
@@ -1,62 +0,0 @@
import { useEffect } from 'react';
import { StyleSheet, View } from 'react-native';
import Animated, { useAnimatedStyle, useSharedValue, withTiming } from 'react-native-reanimated';
import { ThemedText } from './ThemedText';
import { colors } from '../theme/colors';

type SignalBarProps = {
  value: number;
  label: string;
  color?: string;
};

const clamp01 = (value: number) => Math.max(0, Math.min(1, value));

export const SignalBar = ({ value, label, color = colors.accent }: SignalBarProps) => {
  const progress = useSharedValue(clamp01(value));

  useEffect(() => {
    progress.value = withTiming(clamp01(value), { duration: 250 });
  }, [value, progress]);

  const animatedFill = useAnimatedStyle(() => ({
    width: `${progress.value * 100}%`,
  }));

  return (
    <View style={styles.container}>
      <ThemedText preset="bodySm" style={styles.label}>
        {label}
      </ThemedText>
      <View style={styles.track}>
        <Animated.View style={[styles.fill, { backgroundColor: color }, animatedFill]} />
      </View>
      <ThemedText preset="bodySm" style={styles.percent}>
        {Math.round(clamp01(value) * 100)}%
      </ThemedText>
    </View>
  );
};

const styles = StyleSheet.create({
  container: {
    gap: 6,
  },
  label: {
    marginBottom: 4,
  },
  track: {
    height: 8,
    borderRadius: 4,
    backgroundColor: colors.surfaceAlt,
    overflow: 'hidden',
  },
  fill: {
    height: '100%',
    borderRadius: 4,
  },
  percent: {
    textAlign: 'right',
    color: colors.textSecondary,
  },
});
@@ -1,64 +0,0 @@
import { useMemo } from 'react';
import { View, ViewStyle } from 'react-native';
import { colors } from '../theme/colors';

type SparklineChartProps = {
  data: number[];
  color?: string;
  height?: number;
  style?: ViewStyle;
};

const defaultHeight = 72;

export const SparklineChart = ({
  data,
  color = colors.accent,
  height = defaultHeight,
  style,
}: SparklineChartProps) => {
  const normalizedData = data.length > 0 ? data : [0];

  const chartData = useMemo(
    () =>
      normalizedData.map((value, index) => ({
        x: index,
        y: value,
      })),
    [normalizedData],
  );

  const yValues = normalizedData.map((value) => Number(value) || 0);
  const yMin = Math.min(...yValues);
  const yMax = Math.max(...yValues);
  const yPadding = yMax - yMin === 0 ? 1 : (yMax - yMin) * 0.2;

  return (
    <View style={style}>
      <View
        accessibilityRole="image"
        style={{
          height,
          width: '100%',
          borderRadius: 4,
          borderWidth: 1,
          borderColor: color,
          opacity: 0.2,
          backgroundColor: 'transparent',
        }}
      >
        <View
          style={{
            flex: 1,
            justifyContent: 'center',
            alignItems: 'center',
          }}
        >
          {chartData.map((point) => (
            <View key={point.x} style={{ position: 'absolute', left: `${(point.x / Math.max(normalizedData.length - 1, 1)) * 100}%` }} />
          ))}
        </View>
      </View>
    </View>
  );
};
@@ -1,83 +0,0 @@
import { useEffect } from 'react';
import { StyleSheet, ViewStyle } from 'react-native';
import Animated, {
  cancelAnimation,
  Easing,
  useAnimatedStyle,
  useSharedValue,
  withRepeat,
  withSequence,
  withTiming,
} from 'react-native-reanimated';
import { colors } from '../theme/colors';

type StatusType = 'connected' | 'simulated' | 'disconnected' | 'connecting';

type StatusDotProps = {
  status: StatusType;
  size?: number;
  style?: ViewStyle;
};

const resolveColor = (status: StatusType): string => {
  if (status === 'connecting') return colors.warn;
  return colors[status];
};

export const StatusDot = ({ status, size = 10, style }: StatusDotProps) => {
  const scale = useSharedValue(1);
  const opacity = useSharedValue(1);
  const isConnecting = status === 'connecting';

  useEffect(() => {
    if (isConnecting) {
      scale.value = withRepeat(
        withSequence(
          withTiming(1.35, { duration: 800, easing: Easing.out(Easing.cubic) }),
          withTiming(1, { duration: 800, easing: Easing.in(Easing.cubic) }),
        ),
        -1,
      );
      opacity.value = withRepeat(
        withSequence(
          withTiming(0.4, { duration: 800, easing: Easing.out(Easing.quad) }),
          withTiming(1, { duration: 800, easing: Easing.in(Easing.quad) }),
        ),
        -1,
      );
      return;
    }

    cancelAnimation(scale);
    cancelAnimation(opacity);
    scale.value = 1;
    opacity.value = 1;
  }, [isConnecting, opacity, scale]);

  const animatedStyle = useAnimatedStyle(() => ({
    transform: [{ scale: scale.value }],
    opacity: opacity.value,
  }));

  return (
    <Animated.View
      style={[
        styles.dot,
        {
          width: size,
          height: size,
          backgroundColor: resolveColor(status),
          borderRadius: size / 2,
        },
        animatedStyle,
        style,
      ]}
    />
  );
};

const styles = StyleSheet.create({
  dot: {
    borderRadius: 999,
  },
});
@@ -1,28 +0,0 @@
import { ComponentPropsWithoutRef } from 'react';
import { StyleProp, Text, TextStyle } from 'react-native';
import { useTheme } from '../hooks/useTheme';
import { colors } from '../theme/colors';
import { typography } from '../theme/typography';

type TextPreset = keyof typeof typography;
type ColorKey = keyof typeof colors;

type ThemedTextProps = Omit<ComponentPropsWithoutRef<typeof Text>, 'style'> & {
  preset?: TextPreset;
  color?: ColorKey;
  style?: StyleProp<TextStyle>;
};

export const ThemedText = ({
  preset = 'bodyMd',
  color = 'textPrimary',
  style,
  ...props
}: ThemedTextProps) => {
  const { colors, typography } = useTheme();

  const presetStyle = (typography as Record<TextPreset, TextStyle>)[preset];
  const colorStyle = { color: colors[color] };

  return <Text {...props} style={[presetStyle, colorStyle, style]} />;
};
@@ -1,24 +0,0 @@
import { PropsWithChildren, forwardRef } from 'react';
import { View, ViewProps } from 'react-native';
import { useTheme } from '../hooks/useTheme';

type ThemedViewProps = PropsWithChildren<ViewProps>;

export const ThemedView = forwardRef<View, ThemedViewProps>(({ children, style, ...props }, ref) => {
  const { colors } = useTheme();

  return (
    <View
      ref={ref}
      {...props}
      style={[
        {
          backgroundColor: colors.bg,
        },
        style,
      ]}
    >
      {children}
    </View>
  );
});
@@ -1,14 +0,0 @@
export const API_ROOT = '/api/v1';

export const API_POSE_STATUS_PATH = '/api/v1/pose/status';
export const API_POSE_FRAMES_PATH = '/api/v1/pose/frames';
export const API_POSE_ZONES_PATH = '/api/v1/pose/zones';
export const API_POSE_CURRENT_PATH = '/api/v1/pose/current';
export const API_STREAM_STATUS_PATH = '/api/v1/stream/status';
export const API_STREAM_POSE_PATH = '/api/v1/stream/pose';
export const API_MAT_EVENTS_PATH = '/api/v1/mat/events';

export const API_HEALTH_PATH = '/health';
export const API_HEALTH_SYSTEM_PATH = '/health/health';
export const API_HEALTH_READY_PATH = '/health/ready';
export const API_HEALTH_LIVE_PATH = '/health/live';
@@ -1,20 +0,0 @@
export const SIMULATION_TICK_INTERVAL_MS = 500;
export const SIMULATION_GRID_SIZE = 20;

export const RSSI_BASE_DBM = -45;
export const RSSI_AMPLITUDE_DBM = 3;

export const VARIANCE_BASE = 1.5;
export const VARIANCE_AMPLITUDE = 1.0;

export const MOTION_BAND_MIN = 0.05;
export const MOTION_BAND_AMPLITUDE = 0.15;
export const BREATHING_BAND_MIN = 0.03;
export const BREATHING_BAND_AMPLITUDE = 0.08;

export const SIGNAL_FIELD_PRESENCE_LEVEL = 0.8;

export const BREATHING_BPM_MIN = 12;
export const BREATHING_BPM_MAX = 24;
export const HEART_BPM_MIN = 58;
export const HEART_BPM_MAX = 96;
@@ -1,3 +0,0 @@
export const WS_PATH = '/api/v1/stream/pose';
export const RECONNECT_DELAYS = [1000, 2000, 4000, 8000, 16000];
export const MAX_RECONNECT_ATTEMPTS = 10;
@@ -1,31 +0,0 @@
import { useEffect } from 'react';
import { wsService } from '@/services/ws.service';
import { usePoseStore } from '@/stores/poseStore';
import { useSettingsStore } from '@/stores/settingsStore';

export interface UsePoseStreamResult {
  connectionStatus: ReturnType<typeof usePoseStore.getState>['connectionStatus'];
  lastFrame: ReturnType<typeof usePoseStore.getState>['lastFrame'];
  isSimulated: boolean;
}

export function usePoseStream(): UsePoseStreamResult {
  const serverUrl = useSettingsStore((state) => state.serverUrl);
  const connectionStatus = usePoseStore((state) => state.connectionStatus);
  const lastFrame = usePoseStore((state) => state.lastFrame);
  const isSimulated = usePoseStore((state) => state.isSimulated);

  useEffect(() => {
    const unsubscribe = wsService.subscribe((frame) => {
      usePoseStore.getState().handleFrame(frame);
    });
    wsService.connect(serverUrl);

    return () => {
      unsubscribe();
      wsService.disconnect();
    };
  }, [serverUrl]);

  return { connectionStatus, lastFrame, isSimulated };
}
@@ -1,31 +0,0 @@
import { useEffect, useState } from 'react';
import { rssiService, type WifiNetwork } from '@/services/rssi.service';
import { useSettingsStore } from '@/stores/settingsStore';

export function useRssiScanner(): { networks: WifiNetwork[]; isScanning: boolean } {
  const enabled = useSettingsStore((state) => state.rssiScanEnabled);
  const [networks, setNetworks] = useState<WifiNetwork[]>([]);
  const [isScanning, setIsScanning] = useState(false);

  useEffect(() => {
    if (!enabled) {
      rssiService.stopScanning();
      setIsScanning(false);
      return;
    }

    const unsubscribe = rssiService.subscribe((result) => {
      setNetworks(result);
    });
    rssiService.startScanning(2000);
    setIsScanning(true);

    return () => {
      unsubscribe();
      rssiService.stopScanning();
      setIsScanning(false);
    };
  }, [enabled]);

  return { networks, isScanning };
}
@@ -1,52 +0,0 @@
import { useEffect, useState } from 'react';
import { apiService } from '@/services/api.service';

interface ServerReachability {
  reachable: boolean;
  latencyMs: number | null;
}

const POLL_MS = 10000;

export function useServerReachability(): ServerReachability {
  const [state, setState] = useState<ServerReachability>({
    reachable: false,
    latencyMs: null,
  });

  useEffect(() => {
    let active = true;

    const check = async () => {
      const started = Date.now();
      try {
        await apiService.getStatus();
        if (!active) {
          return;
        }
        setState({
          reachable: true,
          latencyMs: Date.now() - started,
        });
      } catch {
        if (!active) {
          return;
        }
        setState({
          reachable: false,
          latencyMs: null,
        });
      }
    };

    void check();
    const timer = setInterval(check, POLL_MS);

    return () => {
      active = false;
      clearInterval(timer);
    };
  }, []);

  return state;
}
@@ -1,4 +0,0 @@
import { useContext } from 'react';
import { ThemeContext, ThemeContextValue } from '../theme/ThemeContext';

export const useTheme = (): ThemeContextValue => useContext(ThemeContext);
@@ -1,162 +0,0 @@
import React, { Suspense, useEffect, useState } from 'react';
import { ActivityIndicator } from 'react-native';
import { createBottomTabNavigator } from '@react-navigation/bottom-tabs';
import { Ionicons } from '@expo/vector-icons';
import { ThemedText } from '../components/ThemedText';
import { ThemedView } from '../components/ThemedView';
import { colors } from '../theme/colors';
import { MainTabsParamList } from './types';

const createPlaceholder = (label: string) => {
  const Placeholder = () => (
    <ThemedView style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
      <ThemedText preset="bodyLg">{label} screen not implemented yet</ThemedText>
      <ThemedText preset="bodySm" color="textSecondary">
        Placeholder shell
      </ThemedText>
    </ThemedView>
  );
  const LazyPlaceholder = React.lazy(async () => ({ default: Placeholder }));

  const Wrapped = () => (
    <Suspense
      fallback={
        <ThemedView style={{ flex: 1, alignItems: 'center', justifyContent: 'center' }}>
          <ActivityIndicator color={colors.accent} />
          <ThemedText preset="bodySm" color="textSecondary" style={{ marginTop: 8 }}>
            Loading {label}
          </ThemedText>
        </ThemedView>
      }
    >
      <LazyPlaceholder />
    </Suspense>
  );

  return Wrapped;
};

const loadScreen = (path: string, label: string) => {
  const fallback = createPlaceholder(label);
  return React.lazy(async () => {
    try {
      const module = (await import(path)) as { default: React.ComponentType };
      if (module?.default) {
        return module;
      }
    } catch {
      // keep fallback for shell-only screens
    }
    return { default: fallback } as { default: React.ComponentType };
  });
};

const LiveScreen = loadScreen('../screens/LiveScreen', 'Live');
const VitalsScreen = loadScreen('../screens/VitalsScreen', 'Vitals');
const ZonesScreen = loadScreen('../screens/ZonesScreen', 'Zones');
const MATScreen = loadScreen('../screens/MATScreen', 'MAT');
const SettingsScreen = loadScreen('../screens/SettingsScreen', 'Settings');

const toIconName = (routeName: keyof MainTabsParamList) => {
  switch (routeName) {
    case 'Live':
      return 'wifi';
    case 'Vitals':
      return 'heart';
    case 'Zones':
      return 'grid';
    case 'MAT':
      return 'shield-checkmark';
    case 'Settings':
      return 'settings';
    default:
      return 'ellipse';
  }
};

const getMatAlertCount = async (): Promise<number> => {
  try {
    const mod = (await import('../stores/matStore')) as Record<string, unknown>;
    const candidates = [mod.useMatStore, mod.useStore].filter((candidate) => {
      return (
        !!candidate &&
        typeof candidate === 'function' &&
        typeof (candidate as { getState?: () => unknown }).getState === 'function'
      );
    }) as Array<{ getState: () => { alerts?: unknown[] } }>;

    for (const store of candidates) {
      const alerts = store.getState().alerts;
      if (Array.isArray(alerts)) {
        return alerts.length;
      }
    }
  } catch {
    return 0;
  }
  return 0;
};

const screens: ReadonlyArray<{ name: keyof MainTabsParamList; component: React.ComponentType }> = [
  { name: 'Live', component: LiveScreen },
  { name: 'Vitals', component: VitalsScreen },
  { name: 'Zones', component: ZonesScreen },
  { name: 'MAT', component: MATScreen },
  { name: 'Settings', component: SettingsScreen },
];

const Tab = createBottomTabNavigator<MainTabsParamList>();

const Suspended = ({ component: Component }: { component: React.ComponentType }) => (
  <Suspense fallback={<ActivityIndicator color={colors.accent} />}>
    <Component />
  </Suspense>
);

export const MainTabs = () => {
  const [matAlertCount, setMatAlertCount] = useState(0);

  useEffect(() => {
    const readCount = async () => {
      const count = await getMatAlertCount();
      setMatAlertCount(count);
    };

    void readCount();
    const timer = setInterval(readCount, 2000);
    return () => clearInterval(timer);
  }, []);

  return (
    <Tab.Navigator
      screenOptions={({ route }) => ({
        headerShown: false,
        tabBarActiveTintColor: colors.accent,
        tabBarInactiveTintColor: colors.textSecondary,
        tabBarStyle: {
          backgroundColor: '#0D1117',
          borderTopColor: colors.border,
          borderTopWidth: 1,
        },
        tabBarIcon: ({ color, size }) => <Ionicons name={toIconName(route.name)} size={size} color={color} />,
        tabBarLabelStyle: {
          fontFamily: 'Courier New',
          textTransform: 'uppercase',
          fontSize: 10,
        },
        tabBarLabel: ({ children, color }) => <ThemedText style={{ color }}>{children}</ThemedText>,
      })}
    >
      {screens.map(({ name, component }) => (
        <Tab.Screen
          key={name}
          name={name}
          options={{
            tabBarBadge: name === 'MAT' ? (matAlertCount > 0 ? matAlertCount : undefined) : undefined,
          }}
          component={() => <Suspended component={component} />}
        />
      ))}
    </Tab.Navigator>
  );
};
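The `tabBarBadge` ternary in `MainTabs` reduces to a small pure rule: only the MAT tab ever shows a badge, and only when there is at least one alert. A minimal sketch of that rule (the `matBadge` helper name is illustrative, not part of the original code):

```typescript
// Hypothetical helper mirroring the badge expression in MainTabs:
// a badge count for the MAT tab when alerts exist, otherwise no badge.
const matBadge = (name: string, alertCount: number): number | undefined =>
  name === 'MAT' && alertCount > 0 ? alertCount : undefined;
```

Returning `undefined` (rather than `0`) matters: React Navigation hides the badge entirely when the option is `undefined`.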
@@ -1,5 +0,0 @@
import { MainTabs } from './MainTabs';

export const RootNavigator = () => {
  return <MainTabs />;
};
@@ -1,11 +0,0 @@
export type RootStackParamList = {
  MainTabs: undefined;
};

export type MainTabsParamList = {
  Live: undefined;
  Vitals: undefined;
  Zones: undefined;
  MAT: undefined;
  Settings: undefined;
};
@@ -1,41 +0,0 @@
import { LayoutChangeEvent, StyleSheet } from 'react-native';
import type { RefObject } from 'react';
import { WebView, type WebViewMessageEvent } from 'react-native-webview';
import GAUSSIAN_SPLATS_HTML from '@/assets/webview/gaussian-splats.html';

type GaussianSplatWebViewProps = {
  onMessage: (event: WebViewMessageEvent) => void;
  onError: () => void;
  webViewRef: RefObject<WebView | null>;
  onLayout?: (event: LayoutChangeEvent) => void;
};

export const GaussianSplatWebView = ({
  onMessage,
  onError,
  webViewRef,
  onLayout,
}: GaussianSplatWebViewProps) => {
  const html = typeof GAUSSIAN_SPLATS_HTML === 'string' ? GAUSSIAN_SPLATS_HTML : '';

  return (
    <WebView
      ref={webViewRef}
      source={{ html }}
      originWhitelist={['*']}
      allowFileAccess={false}
      javaScriptEnabled
      onMessage={onMessage}
      onError={onError}
      onLayout={onLayout}
      style={styles.webView}
    />
  );
};

const styles = StyleSheet.create({
  webView: {
    flex: 1,
    backgroundColor: '#0A0E1A',
  },
});
@@ -1,164 +0,0 @@
import { Pressable, StyleSheet, View } from 'react-native';
import { memo, useCallback, useState } from 'react';
import Animated, { useAnimatedStyle, useSharedValue, withTiming } from 'react-native-reanimated';
import { StatusDot } from '@/components/StatusDot';
import { ModeBadge } from '@/components/ModeBadge';
import { ThemedText } from '@/components/ThemedText';
import { formatConfidence, formatRssi } from '@/utils/formatters';
import { colors, spacing } from '@/theme';
import type { ConnectionStatus } from '@/types/sensing';

type LiveMode = 'LIVE' | 'SIM' | 'RSSI';

type LiveHUDProps = {
  rssi?: number;
  connectionStatus: ConnectionStatus;
  fps: number;
  confidence: number;
  personCount: number;
  mode: LiveMode;
};

const statusTextMap: Record<ConnectionStatus, string> = {
  connected: 'Connected',
  simulated: 'Simulated',
  connecting: 'Connecting',
  disconnected: 'Disconnected',
};

const statusDotStatusMap: Record<ConnectionStatus, 'connected' | 'simulated' | 'disconnected' | 'connecting'> = {
  connected: 'connected',
  simulated: 'simulated',
  connecting: 'connecting',
  disconnected: 'disconnected',
};

export const LiveHUD = memo(
  ({ rssi, connectionStatus, fps, confidence, personCount, mode }: LiveHUDProps) => {
    const [panelVisible, setPanelVisible] = useState(true);
    const panelAlpha = useSharedValue(1);

    const togglePanel = useCallback(() => {
      const next = !panelVisible;
      setPanelVisible(next);
      panelAlpha.value = withTiming(next ? 1 : 0, { duration: 220 });
    }, [panelAlpha, panelVisible]);

    const animatedPanelStyle = useAnimatedStyle(() => ({
      opacity: panelAlpha.value,
    }));

    const statusText = statusTextMap[connectionStatus];

    return (
      <Pressable style={StyleSheet.absoluteFill} onPress={togglePanel}>
        <Animated.View pointerEvents="none" style={[StyleSheet.absoluteFill, animatedPanelStyle]}>
          {/* App title */}
          <View style={styles.topLeft}>
            <ThemedText preset="labelLg" style={styles.appTitle}>
              WiFi-DensePose
            </ThemedText>
          </View>

          {/* Status + FPS */}
          <View style={styles.topRight}>
            <View style={styles.row}>
              <StatusDot status={statusDotStatusMap[connectionStatus]} size={10} />
              <ThemedText preset="labelMd" style={styles.statusText}>
                {statusText}
              </ThemedText>
            </View>
            {fps > 0 && (
              <View style={styles.row}>
                <ThemedText preset="labelMd">{fps} FPS</ThemedText>
              </View>
            )}
          </View>

          {/* Bottom panel */}
          <View style={styles.bottomPanel}>
            <View style={styles.bottomCell}>
              <ThemedText preset="bodySm">RSSI</ThemedText>
              <ThemedText preset="displayMd" style={styles.bigValue}>
                {formatRssi(rssi)}
              </ThemedText>
            </View>

            <View style={styles.bottomCell}>
              <ModeBadge mode={mode} />
            </View>

            <View style={styles.bottomCellRight}>
              <ThemedText preset="bodySm">Confidence</ThemedText>
              <ThemedText preset="bodyMd" style={styles.metaText}>
                {formatConfidence(confidence)}
              </ThemedText>
              <ThemedText preset="bodySm">People: {personCount}</ThemedText>
            </View>
          </View>
        </Animated.View>
      </Pressable>
    );
  },
);

const styles = StyleSheet.create({
  topLeft: {
    position: 'absolute',
    top: spacing.md,
    left: spacing.md,
  },
  appTitle: {
    color: colors.textPrimary,
  },
  topRight: {
    position: 'absolute',
    top: spacing.md,
    right: spacing.md,
    alignItems: 'flex-end',
    gap: 4,
  },
  row: {
    flexDirection: 'row',
    alignItems: 'center',
    gap: spacing.sm,
  },
  statusText: {
    color: colors.textPrimary,
  },
  bottomPanel: {
    position: 'absolute',
    left: spacing.sm,
    right: spacing.sm,
    bottom: spacing.sm,
    minHeight: 72,
    borderRadius: 12,
    backgroundColor: 'rgba(10,14,26,0.72)',
    borderWidth: 1,
    borderColor: 'rgba(50,184,198,0.35)',
    paddingHorizontal: spacing.md,
    paddingVertical: spacing.sm,
    flexDirection: 'row',
    justifyContent: 'space-between',
    alignItems: 'center',
  },
  bottomCell: {
    flex: 1,
    alignItems: 'center',
  },
  bottomCellRight: {
    flex: 1,
    alignItems: 'flex-end',
  },
  bigValue: {
    color: colors.accent,
    marginTop: 2,
    marginBottom: 2,
  },
  metaText: {
    color: colors.textPrimary,
    marginBottom: 4,
  },
});

LiveHUD.displayName = 'LiveHUD';
@@ -1,215 +0,0 @@
import { useCallback, useEffect, useRef, useState } from 'react';
import { Button, LayoutChangeEvent, StyleSheet, View } from 'react-native';
import type { WebView } from 'react-native-webview';
import type { WebViewMessageEvent } from 'react-native-webview';
import { ErrorBoundary } from '@/components/ErrorBoundary';
import { LoadingSpinner } from '@/components/LoadingSpinner';
import { ThemedText } from '@/components/ThemedText';
import { ThemedView } from '@/components/ThemedView';
import { usePoseStream } from '@/hooks/usePoseStream';
import { colors, spacing } from '@/theme';
import type { ConnectionStatus, SensingFrame } from '@/types/sensing';
import { useGaussianBridge } from './useGaussianBridge';
import { GaussianSplatWebView } from './GaussianSplatWebView';
import { LiveHUD } from './LiveHUD';

type LiveMode = 'LIVE' | 'SIM' | 'RSSI';

const getMode = (
  status: ConnectionStatus,
  isSimulated: boolean,
  frame: SensingFrame | null,
): LiveMode => {
  if (isSimulated || frame?.source === 'simulated') {
    return 'SIM';
  }

  if (status === 'connected') {
    return 'LIVE';
  }

  return 'RSSI';
};

const dispatchWebViewMessage = (webViewRef: { current: WebView | null }, message: unknown) => {
  const webView = webViewRef.current;
  if (!webView) {
    return;
  }

  const payload = JSON.stringify(message);
  webView.injectJavaScript(
    `window.dispatchEvent(new MessageEvent('message', { data: ${JSON.stringify(payload)} })); true;`,
  );
};

export const LiveScreen = () => {
  const webViewRef = useRef<WebView | null>(null);
  const { lastFrame, connectionStatus, isSimulated } = usePoseStream();
  const bridge = useGaussianBridge(webViewRef);

  const [webError, setWebError] = useState<string | null>(null);
  const [viewerKey, setViewerKey] = useState(0);
  const sendTimeoutRef = useRef<ReturnType<typeof setTimeout> | null>(null);
  const pendingFrameRef = useRef<SensingFrame | null>(null);
  const lastSentAtRef = useRef(0);

  const clearSendTimeout = useCallback(() => {
    if (!sendTimeoutRef.current) {
      return;
    }
    clearTimeout(sendTimeoutRef.current);
    sendTimeoutRef.current = null;
  }, []);

  useEffect(() => {
    if (!lastFrame) {
      return;
    }

    pendingFrameRef.current = lastFrame;
    const now = Date.now();

    const flush = () => {
      if (!bridge.isReady || !pendingFrameRef.current) {
        return;
      }

      bridge.sendFrame(pendingFrameRef.current);
      lastSentAtRef.current = Date.now();
      pendingFrameRef.current = null;
    };

    const waitMs = Math.max(0, 500 - (now - lastSentAtRef.current));

    if (waitMs <= 0) {
      flush();
      return;
    }

    clearSendTimeout();
    sendTimeoutRef.current = setTimeout(() => {
      sendTimeoutRef.current = null;
      flush();
    }, waitMs);

    return () => {
      clearSendTimeout();
    };
  }, [bridge.isReady, lastFrame, bridge.sendFrame, clearSendTimeout]);

  useEffect(() => {
    return () => {
      dispatchWebViewMessage(webViewRef, { type: 'DISPOSE' });
      clearSendTimeout();
      pendingFrameRef.current = null;
    };
  }, [clearSendTimeout]);

  const onMessage = useCallback(
    (event: WebViewMessageEvent) => {
      bridge.onMessage(event);
    },
    [bridge],
  );

  const onLayout = useCallback((event: LayoutChangeEvent) => {
    const { width, height } = event.nativeEvent.layout;
    if (width <= 0 || height <= 0 || Number.isNaN(width) || Number.isNaN(height)) {
      return;
    }

    dispatchWebViewMessage(webViewRef, {
      type: 'RESIZE',
      payload: {
        width: Math.max(1, Math.floor(width)),
        height: Math.max(1, Math.floor(height)),
      },
    });
  }, []);

  const handleWebError = useCallback(() => {
    setWebError('Live renderer failed to initialize');
  }, []);

  const handleRetry = useCallback(() => {
    setWebError(null);
    bridge.reset();
    setViewerKey((value) => value + 1);
  }, [bridge]);

  const rssi = lastFrame?.features?.mean_rssi;
  const personCount = lastFrame?.classification?.presence ? 1 : 0;
  const mode = getMode(connectionStatus, isSimulated, lastFrame);

  if (webError || bridge.error) {
    return (
      <ThemedView style={styles.fallbackWrap}>
        <ThemedText preset="bodyLg">Live visualization failed</ThemedText>
        <ThemedText preset="bodySm" color="textSecondary" style={styles.errorText}>
          {webError ?? bridge.error}
        </ThemedText>
        <Button title="Retry" onPress={handleRetry} />
      </ThemedView>
    );
  }

  return (
    <ErrorBoundary>
      <View style={styles.container}>
        <GaussianSplatWebView
          key={viewerKey}
          webViewRef={webViewRef}
          onMessage={onMessage}
          onError={handleWebError}
          onLayout={onLayout}
        />

        <LiveHUD
          connectionStatus={connectionStatus}
          fps={bridge.fps}
          rssi={rssi}
          confidence={lastFrame?.classification?.confidence ?? 0}
          personCount={personCount}
          mode={mode}
        />

        {!bridge.isReady && (
          <View style={styles.loadingWrap}>
            <LoadingSpinner />
            <ThemedText preset="bodyMd" style={styles.loadingText}>
              Loading live renderer
            </ThemedText>
          </View>
        )}
      </View>
    </ErrorBoundary>
  );
};

const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: colors.bg,
  },
  loadingWrap: {
    ...StyleSheet.absoluteFillObject,
    backgroundColor: colors.bg,
    alignItems: 'center',
    justifyContent: 'center',
    gap: spacing.md,
  },
  loadingText: {
    color: colors.textSecondary,
  },
  fallbackWrap: {
    flex: 1,
    alignItems: 'center',
    justifyContent: 'center',
    gap: spacing.md,
    padding: spacing.lg,
  },
  errorText: {
    textAlign: 'center',
  },
});
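The frame-forwarding effect in `LiveScreen` implements a leading-edge throttle: a frame is sent immediately when at least 500 ms have passed since the last send; otherwise the effect waits out the remainder, and only the newest pending frame is flushed when the timer fires. A minimal, standalone sketch of the wait computation (the `waitBeforeSend` helper name is illustrative, not part of the original code):

```typescript
// Throttle window used by LiveScreen when forwarding frames to the WebView.
const THROTTLE_MS = 500;

// How long to wait before the next send, given the current time and the
// timestamp of the last send. Zero means "send now".
const waitBeforeSend = (now: number, lastSentAt: number): number =>
  Math.max(0, THROTTLE_MS - (now - lastSentAt));
```

Because `pendingFrameRef` is overwritten on every incoming frame, intermediate frames arriving inside the window are dropped rather than queued, which keeps the WebView from falling behind the stream.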
@@ -1,97 +0,0 @@
import { useCallback, useState } from 'react';
import type { RefObject } from 'react';
import type { WebViewMessageEvent } from 'react-native-webview';
import { WebView } from 'react-native-webview';
import type { SensingFrame } from '@/types/sensing';

export type GaussianBridgeMessageType = 'READY' | 'FPS_TICK' | 'ERROR';

type BridgeMessage = {
  type: GaussianBridgeMessageType;
  payload?: {
    fps?: number;
    message?: string;
  };
};

const toJsonScript = (message: unknown): string => {
  const serialized = JSON.stringify(message);
  return `window.dispatchEvent(new MessageEvent('message', { data: ${JSON.stringify(serialized)} })); true;`;
};

export const useGaussianBridge = (webViewRef: RefObject<WebView | null>) => {
  const [isReady, setIsReady] = useState(false);
  const [fps, setFps] = useState(0);
  const [error, setError] = useState<string | null>(null);

  const send = useCallback((message: unknown) => {
    const webView = webViewRef.current;
    if (!webView) {
      return;
    }

    webView.injectJavaScript(toJsonScript(message));
  }, [webViewRef]);

  const sendFrame = useCallback(
    (frame: SensingFrame) => {
      send({
        type: 'FRAME_UPDATE',
        payload: frame,
      });
    },
    [send],
  );

  const onMessage = useCallback((event: WebViewMessageEvent) => {
    let parsed: BridgeMessage | null = null;
    const raw = event.nativeEvent.data;

    if (typeof raw === 'string') {
      try {
        parsed = JSON.parse(raw) as BridgeMessage;
      } catch {
        setError('Invalid bridge message format');
        return;
      }
    } else if (typeof raw === 'object' && raw !== null) {
      parsed = raw as BridgeMessage;
    }

    if (!parsed) {
      return;
    }

    if (parsed.type === 'READY') {
      setIsReady(true);
      setError(null);
      return;
    }

    if (parsed.type === 'FPS_TICK') {
      const fpsValue = parsed.payload?.fps;
      if (typeof fpsValue === 'number' && Number.isFinite(fpsValue)) {
        setFps(Math.max(0, Math.floor(fpsValue)));
      }
      return;
    }

    if (parsed.type === 'ERROR') {
      setError(parsed.payload?.message ?? 'Unknown bridge error');
      setIsReady(false);
    }
  }, []);

  return {
    sendFrame,
    onMessage,
    isReady,
    fps,
    error,
    reset: () => {
      setIsReady(false);
      setFps(0);
      setError(null);
    },
  };
};
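`toJsonScript` stringifies twice on purpose: the first `JSON.stringify` serializes the message to JSON text, and the second escapes that text into a valid JavaScript string literal, so the injected script carries the payload safely even when it contains quotes or backslashes. The WebView side then `JSON.parse`s the `data` string back into an object. A standalone sketch of the same construction:

```typescript
// Build an injected script the same way the bridge does: the inner
// stringify serializes the message, the outer one produces a JS string
// literal embedding that JSON text.
const toInjectedScript = (message: unknown): string => {
  const serialized = JSON.stringify(message);
  return `window.dispatchEvent(new MessageEvent('message', { data: ${JSON.stringify(serialized)} })); true;`;
};
```

Without the outer `JSON.stringify`, a payload containing a `"` would break out of the string literal and corrupt the injected script.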
Some files were not shown because too many files have changed in this diff.