Squashed 'vendor/ruvector/' content from commit b64c2172
git-subtree-dir: vendor/ruvector
git-subtree-split: b64c21726f2bb37286d9ee36a7869fef60cc6900

examples/exo-ai-2025/test-templates/README.md (new file)
@@ -0,0 +1,295 @@
# EXO-AI 2025 Test Templates

## Purpose

This directory contains comprehensive test templates for all EXO-AI 2025 crates. These templates are ready to be copied into the actual crate directories once the implementation code is written.

## Directory Structure

```
test-templates/
├── exo-core/
│   └── tests/
│       └── core_traits_test.rs        # Core trait and type tests
├── exo-manifold/
│   └── tests/
│       └── manifold_engine_test.rs    # Manifold engine tests
├── exo-hypergraph/
│   └── tests/
│       └── hypergraph_test.rs         # Hypergraph substrate tests
├── exo-temporal/
│   └── tests/
│       └── temporal_memory_test.rs    # Temporal memory tests
├── exo-federation/
│   └── tests/
│       └── federation_test.rs         # Federation and consensus tests
├── exo-backend-classical/
│   └── tests/
│       └── classical_backend_test.rs  # ruvector integration tests
├── integration/
│   ├── manifold_hypergraph_test.rs    # Cross-crate integration
│   ├── temporal_federation_test.rs    # Distributed memory
│   └── full_stack_test.rs             # Complete system tests
└── README.md                          # This file
```
## How to Use

### 1. When Crates Are Created

Once a coder agent creates a crate (e.g., `crates/exo-core/`), copy the corresponding test template:

```bash
# Example for exo-core
cp test-templates/exo-core/tests/core_traits_test.rs \
   crates/exo-core/tests/

# Uncomment the use statements and imports
# Remove placeholder comments
# Run tests
cd crates/exo-core
cargo test
```
### 2. Activation Checklist

For each test file:

- [ ] Copy to the actual crate directory
- [ ] Uncomment `use` statements
- [ ] Remove placeholder comments
- [ ] Add `#[cfg(test)]` if not present
- [ ] Run `cargo test` to verify
- [ ] Fix any compilation errors
- [ ] Ensure tests pass or fail appropriately (TDD)
### 3. Test Categories Covered

Each crate has tests for:

#### exo-core
- ✅ Pattern construction and validation
- ✅ TopologicalQuery variants
- ✅ SubstrateTime operations
- ✅ Error handling
- ✅ Filter types

#### exo-manifold
- ✅ Gradient descent retrieval
- ✅ Manifold deformation
- ✅ Strategic forgetting
- ✅ SIREN network operations
- ✅ Fourier features
- ✅ Tensor Train compression (feature-gated)
- ✅ Edge cases (NaN, infinity, etc.)

#### exo-hypergraph
- ✅ Hyperedge creation and query
- ✅ Persistent homology (0D, 1D, 2D)
- ✅ Betti numbers
- ✅ Sheaf consistency (feature-gated)
- ✅ Simplicial complex operations
- ✅ Entity and relation indexing

#### exo-temporal
- ✅ Causal cone queries (past, future, light-cone)
- ✅ Memory consolidation
- ✅ Salience computation
- ✅ Anticipatory pre-fetch
- ✅ Causal graph operations
- ✅ Temporal knowledge graph
- ✅ Short-term buffer management

#### exo-federation
- ✅ Post-quantum key exchange (Kyber)
- ✅ Byzantine fault tolerance
- ✅ CRDT reconciliation
- ✅ Onion routing
- ✅ Federation handshake
- ✅ Raft consensus
- ✅ Encrypted channels

#### exo-backend-classical
- ✅ ruvector-core integration
- ✅ ruvector-graph integration
- ✅ ruvector-gnn integration
- ✅ SubstrateBackend implementation
- ✅ Performance tests
- ✅ Concurrency tests
### 4. Integration Tests

Integration tests in `integration/` should be placed in `crates/tests/` at the workspace root:

```bash
# Create workspace integration test directory
mkdir -p crates/tests

# Copy integration tests
cp test-templates/integration/*.rs crates/tests/
```

### 5. Running Tests

```bash
# Run all tests in workspace
cargo test --all-features

# Run tests for a specific crate
cargo test -p exo-manifold

# Run a specific test file
cargo test -p exo-manifold --test manifold_engine_test

# Run with coverage
cargo tarpaulin --all-features

# Run integration tests only
cargo test --test '*'

# Run benchmarks
cargo bench
```
### 6. Test-Driven Development Workflow

1. **Copy template** to crate directory
2. **Uncomment imports** and test code
3. **Run tests** - they will fail (RED)
4. **Implement code** to make tests pass
5. **Run tests** again - they should pass (GREEN)
6. **Refactor** code while keeping tests green
7. **Repeat** for next test
### 7. Feature Gates

Some tests are feature-gated:

```rust
#[test]
#[cfg(feature = "tensor-train")]
fn test_tensor_train_compression() {
    // Only runs with --features tensor-train
}

#[test]
#[cfg(feature = "sheaf-consistency")]
fn test_sheaf_consistency() {
    // Only runs with --features sheaf-consistency
}

#[test]
#[cfg(feature = "post-quantum")]
fn test_kyber_key_exchange() {
    // Only runs with --features post-quantum
}
```

Run with features:

```bash
cargo test --features tensor-train
cargo test --all-features
```
### 8. Async Tests

Federation and temporal tests use `tokio::test`:

```rust
#[tokio::test]
async fn test_async_operation() {
    // Async test code
}
```

Ensure `tokio` is in dev-dependencies:

```toml
[dev-dependencies]
tokio = { version = "1.0", features = ["full", "test-util"] }
```
### 9. Test Data and Fixtures

Common test utilities should be placed in:

```
crates/test-utils/
├── src/
│   ├── fixtures.rs    # Test data generators
│   ├── mocks.rs       # Mock implementations
│   └── helpers.rs     # Test helper functions
```
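As a concrete starting point for `fixtures.rs`, a deterministic vector generator can be sketched without any external dependency. The function name, signature, and the small LCG below are assumptions for illustration, not part of the EXO-AI codebase:

```rust
/// Hypothetical sketch of `crates/test-utils/src/fixtures.rs`: a reproducible
/// pseudo-random vector generator using a small LCG, so tests need no
/// external `rand` dependency.
pub fn generate_vector(seed: u64, dim: usize) -> Vec<f32> {
    let mut state = seed;
    (0..dim)
        .map(|_| {
            // Knuth's MMIX LCG constants.
            state = state
                .wrapping_mul(6364136223846793005)
                .wrapping_add(1442695040888963407);
            // Map the top 24 bits into [0, 1).
            ((state >> 40) as f32) / (1u32 << 24) as f32
        })
        .collect()
}

fn main() {
    let v = generate_vector(42, 128);
    assert_eq!(v.len(), 128);
    assert!(v.iter().all(|x| (0.0..1.0).contains(x)));
    // Same seed, same vector: fixtures stay deterministic across runs.
    assert_eq!(v, generate_vector(42, 128));
    println!("fixtures ok");
}
```

Determinism matters here: a seeded generator lets a failing similarity-search test be replayed with exactly the same data.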
### 10. Coverage Reports

Generate coverage reports:

```bash
# Install tarpaulin
cargo install cargo-tarpaulin

# Generate coverage
cargo tarpaulin --all-features --out Html --output-dir coverage/

# View report
open coverage/index.html
```
### 11. Continuous Integration

Tests should be run in CI:

```yaml
# .github/workflows/test.yml
name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: dtolnay/rust-toolchain@stable
      - run: cargo test --all-features
      - run: cargo test --test '*' # Integration tests
```
## Test Metrics

### Coverage Targets
- **Unit Tests**: 85%+ statement coverage
- **Integration Tests**: 70%+ coverage
- **E2E Tests**: Key user scenarios

### Performance Targets

| Operation | Target Latency | Target Throughput |
|-----------|----------------|-------------------|
| Manifold Retrieve (k=10) | <10ms | >1000 qps |
| Hyperedge Creation | <1ms | >10000 ops/s |
| Causal Query | <20ms | >500 qps |
| Byzantine Commit | <100ms | >100 commits/s |
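Latency targets like the ones above can be checked in plain unit tests with `std::time::Instant`, without a benchmark harness. In this sketch the closure is a stand-in for `backend.similarity_search(&query, 10, None)`, which does not exist yet:

```rust
use std::time::Instant;

fn main() {
    // Stand-in for the real backend search call.
    let search = || -> Vec<usize> { (0..10).collect() };

    // Warm up once, then time a batch to smooth out per-call jitter.
    let _ = search();
    let iters: u32 = 1_000;
    let start = Instant::now();
    for _ in 0..iters {
        assert_eq!(search().len(), 10);
    }
    let per_call = start.elapsed() / iters;

    // Manifold Retrieve (k=10) target from the table above: <10ms.
    assert!(per_call.as_millis() < 10);
    println!("avg latency per call: {:?}", per_call);
}
```

For the throughput targets, `cargo bench` with a proper harness is the better tool; a wall-clock assertion like this is only a coarse regression guard.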
## Next Steps

1. ✅ **Test strategy created** (`docs/TEST_STRATEGY.md`)
2. ✅ **Test templates created** (this directory)
3. ⏳ **Wait for coder to create crates**
4. ⏳ **Copy templates to crates**
5. ⏳ **Uncomment and activate tests**
6. ⏳ **Run tests (TDD: RED phase)**
7. ⏳ **Implement code to pass tests**
8. ⏳ **Achieve GREEN phase**
9. ⏳ **Refactor and optimize**

## References

- **Test Strategy**: `../docs/TEST_STRATEGY.md`
- **Architecture**: `../architecture/ARCHITECTURE.md`
- **Specification**: `../specs/SPECIFICATION.md`
- **Pseudocode**: `../architecture/PSEUDOCODE.md`

## Contact

For questions about test implementation:
- Check `docs/TEST_STRATEGY.md` for comprehensive guidance
- Review template files for examples
- Ensure the TDD workflow is followed
examples/exo-ai-2025/test-templates/exo-backend-classical/tests/classical_backend_test.rs (new file)
@@ -0,0 +1,362 @@
//! Unit tests for exo-backend-classical (ruvector integration)

#[cfg(test)]
mod substrate_backend_impl_tests {
    use super::*;
    // use exo_backend_classical::*;
    // use exo_core::{SubstrateBackend, Pattern, Filter};

    #[test]
    fn test_classical_backend_construction() {
        // Test creating the classical backend
        // let config = ClassicalBackendConfig {
        //     hnsw_m: 16,
        //     hnsw_ef_construction: 200,
        //     dimension: 128,
        // };
        //
        // let backend = ClassicalBackend::new(config).unwrap();
        //
        // assert!(backend.is_initialized());
    }

    #[test]
    fn test_similarity_search_basic() {
        // Test basic similarity search
        // let backend = setup_backend();
        //
        // // Insert some vectors
        // for i in 0..100 {
        //     let vector = generate_random_vector(128);
        //     backend.insert(&vector, &metadata(i)).unwrap();
        // }
        //
        // let query = generate_random_vector(128);
        // let results = backend.similarity_search(&query, 10, None).unwrap();
        //
        // assert_eq!(results.len(), 10);
    }

    #[test]
    fn test_similarity_search_with_filter() {
        // Test similarity search with a metadata filter
        // let backend = setup_backend();
        //
        // let filter = Filter::new("category", "test");
        // let results = backend.similarity_search(&query, 10, Some(&filter)).unwrap();
        //
        // // All results should match the filter
        // assert!(results.iter().all(|r| r.metadata.get("category") == Some("test")));
    }

    #[test]
    fn test_similarity_search_empty_index() {
        // Test search on an empty index
        // let backend = ClassicalBackend::new(config).unwrap();
        // let query = vec![0.1, 0.2, 0.3];
        //
        // let results = backend.similarity_search(&query, 10, None).unwrap();
        //
        // assert!(results.is_empty());
    }

    #[test]
    fn test_similarity_search_k_larger_than_index() {
        // Test requesting more results than available
        // let backend = setup_backend();
        //
        // // Insert only 5 vectors
        // for i in 0..5 {
        //     backend.insert(&vector(i), &metadata(i)).unwrap();
        // }
        //
        // // Request 10
        // let results = backend.similarity_search(&query, 10, None).unwrap();
        //
        // assert_eq!(results.len(), 5); // Should return only what's available
    }
}
#[cfg(test)]
mod manifold_deform_tests {
    use super::*;

    #[test]
    fn test_manifold_deform_as_insert() {
        // Test that manifold_deform performs a discrete insert on the classical backend
        // let backend = setup_backend();
        //
        // let pattern = Pattern {
        //     embedding: vec![0.1, 0.2, 0.3],
        //     metadata: Metadata::default(),
        //     timestamp: SubstrateTime::now(),
        //     antecedents: vec![],
        // };
        //
        // let delta = backend.manifold_deform(&pattern, 0.5).unwrap();
        //
        // match delta {
        //     ManifoldDelta::DiscreteInsert { id } => {
        //         assert!(backend.contains(id));
        //     }
        //     _ => panic!("Expected DiscreteInsert"),
        // }
    }

    #[test]
    fn test_manifold_deform_ignores_learning_rate() {
        // The classical backend should ignore the learning_rate parameter
        // let backend = setup_backend();
        //
        // let delta1 = backend.manifold_deform(&pattern, 0.1).unwrap();
        // let delta2 = backend.manifold_deform(&pattern, 0.9).unwrap();
        //
        // // Both should perform the same insert operation
    }
}
#[cfg(test)]
mod hyperedge_query_tests {
    use super::*;

    #[test]
    fn test_hyperedge_query_not_supported() {
        // Test that advanced topological queries return NotSupported
        // let backend = setup_backend();
        //
        // let query = TopologicalQuery::SheafConsistency {
        //     local_sections: vec![],
        // };
        //
        // let result = backend.hyperedge_query(&query).unwrap();
        //
        // assert!(matches!(result, HyperedgeResult::NotSupported));
    }

    #[test]
    fn test_hyperedge_query_basic_support() {
        // Test basic hyperedge operations if supported
        // May use ruvector-graph hyperedge features
    }
}
#[cfg(test)]
mod ruvector_core_integration_tests {
    use super::*;

    #[test]
    fn test_ruvector_core_hnsw() {
        // Test integration with the ruvector-core HNSW index
        // let backend = ClassicalBackend::new(config).unwrap();
        //
        // // Verify HNSW parameters were applied
        // assert_eq!(backend.hnsw_config().m, 16);
        // assert_eq!(backend.hnsw_config().ef_construction, 200);
    }

    #[test]
    fn test_ruvector_core_metadata() {
        // Test metadata storage via ruvector-core
    }

    #[test]
    fn test_ruvector_core_persistence() {
        // Test save/load via ruvector-core
    }
}
#[cfg(test)]
mod ruvector_graph_integration_tests {
    use super::*;

    #[test]
    fn test_ruvector_graph_database() {
        // Test GraphDatabase integration
        // let backend = setup_backend_with_graph();
        //
        // // Create entities and edges
        // let e1 = backend.graph_db.add_node(data1);
        // let e2 = backend.graph_db.add_node(data2);
        // backend.graph_db.add_edge(e1, e2, relation);
        //
        // // Query the graph
        // let neighbors = backend.graph_db.neighbors(e1);
        // assert!(neighbors.contains(&e2));
    }

    #[test]
    fn test_ruvector_graph_hyperedge() {
        // Test ruvector-graph hyperedge support
    }
}
#[cfg(test)]
mod ruvector_gnn_integration_tests {
    use super::*;

    #[test]
    fn test_ruvector_gnn_layer() {
        // Test GNN layer integration
        // let backend = setup_backend_with_gnn();
        //
        // // Apply a GNN layer
        // let embeddings = backend.gnn_layer.forward(&graph);
        //
        // assert!(!embeddings.is_empty());
    }

    #[test]
    fn test_ruvector_gnn_message_passing() {
        // Test message passing via GNN
    }
}
#[cfg(test)]
mod error_handling_tests {
    use super::*;

    #[test]
    fn test_error_conversion() {
        // Test ruvector error conversion to SubstrateBackend::Error
        // let backend = setup_backend();
        //
        // // Trigger a ruvector error (e.g., invalid dimension)
        // let invalid_vector = vec![0.1]; // Wrong dimension
        // let result = backend.similarity_search(&invalid_vector, 10, None);
        //
        // assert!(result.is_err());
    }

    #[test]
    fn test_error_display() {
        // Test the error Display implementation
    }
}
#[cfg(test)]
mod performance_tests {
    use super::*;

    #[test]
    fn test_search_latency() {
        // Test that search latency meets targets
        // let backend = setup_large_backend(100_000);
        //
        // let start = Instant::now();
        // backend.similarity_search(&query, 10, None).unwrap();
        // let duration = start.elapsed();
        //
        // assert!(duration.as_millis() < 10); // <10ms target
    }

    #[test]
    fn test_insert_throughput() {
        // Test insert throughput
        // let backend = setup_backend();
        //
        // let start = Instant::now();
        // for i in 0..10_000 {
        //     backend.manifold_deform(&pattern(i), 0.5).unwrap();
        // }
        // let duration = start.elapsed();
        //
        // let throughput = 10_000.0 / duration.as_secs_f64();
        // assert!(throughput > 10_000.0); // >10k ops/s target
    }
}
#[cfg(test)]
mod memory_tests {
    use super::*;

    #[test]
    fn test_memory_usage() {
        // Test memory footprint
        // let backend = setup_backend();
        //
        // let initial_mem = current_memory_usage();
        //
        // // Insert vectors
        // for i in 0..100_000 {
        //     backend.manifold_deform(&pattern(i), 0.5).unwrap();
        // }
        //
        // let final_mem = current_memory_usage();
        // let mem_per_vector = (final_mem - initial_mem) / 100_000;
        //
        // // Should be a reasonable per-vector overhead
        // assert!(mem_per_vector < 1024); // <1KB per vector
    }
}
#[cfg(test)]
mod concurrency_tests {
    use super::*;

    #[test]
    fn test_concurrent_searches() {
        // Test concurrent search operations
        // let backend = Arc::new(setup_backend());
        //
        // let handles: Vec<_> = (0..10).map(|_| {
        //     let backend = backend.clone();
        //     std::thread::spawn(move || {
        //         backend.similarity_search(&random_query(), 10, None).unwrap()
        //     })
        // }).collect();
        //
        // for handle in handles {
        //     let results = handle.join().unwrap();
        //     assert_eq!(results.len(), 10);
        // }
    }

    #[test]
    fn test_concurrent_inserts() {
        // Test concurrent insert operations
    }
}
#[cfg(test)]
mod edge_cases_tests {
    use super::*;

    #[test]
    fn test_zero_dimension() {
        // Test error on zero-dimension vectors
        // let config = ClassicalBackendConfig {
        //     dimension: 0,
        //     ..Default::default()
        // };
        //
        // let result = ClassicalBackend::new(config);
        // assert!(result.is_err());
    }

    #[test]
    fn test_extreme_k_values() {
        // Test with k=0 and k=usize::MAX
        // let backend = setup_backend();
        //
        // let results_zero = backend.similarity_search(&query, 0, None).unwrap();
        // assert!(results_zero.is_empty());
        //
        // let results_max = backend.similarity_search(&query, usize::MAX, None).unwrap();
        // // Should return all available results
    }

    #[test]
    fn test_nan_in_query() {
        // Test handling of NaN in the query vector
        // let backend = setup_backend();
        // let query_with_nan = vec![f32::NAN, 0.2, 0.3];
        //
        // let result = backend.similarity_search(&query_with_nan, 10, None);
        // assert!(result.is_err());
    }

    #[test]
    fn test_infinity_in_query() {
        // Test handling of infinity in the query vector
    }
}
examples/exo-ai-2025/test-templates/exo-core/tests/core_traits_test.rs (new file)
@@ -0,0 +1,126 @@
//! Unit tests for exo-core traits and types

#[cfg(test)]
mod substrate_backend_tests {
    use super::*;
    // use exo_core::*; // Uncomment when the crate exists

    #[test]
    fn test_pattern_construction() {
        // Test Pattern construction with valid data
        // let pattern = Pattern {
        //     embedding: vec![0.1, 0.2, 0.3, 0.4],
        //     metadata: Metadata::default(),
        //     timestamp: SubstrateTime::from_unix(1000),
        //     antecedents: vec![],
        // };
        // assert_eq!(pattern.embedding.len(), 4);
    }

    #[test]
    fn test_pattern_with_antecedents() {
        // Test Pattern with causal antecedents
        // let parent_id = PatternId::new();
        // let pattern = Pattern {
        //     embedding: vec![0.1, 0.2, 0.3],
        //     metadata: Metadata::default(),
        //     timestamp: SubstrateTime::now(),
        //     antecedents: vec![parent_id],
        // };
        // assert_eq!(pattern.antecedents.len(), 1);
    }

    #[test]
    fn test_topological_query_persistent_homology() {
        // Test PersistentHomology variant construction
        // let query = TopologicalQuery::PersistentHomology {
        //     dimension: 1,
        //     epsilon_range: (0.0, 1.0),
        // };
        // match query {
        //     TopologicalQuery::PersistentHomology { dimension, .. } => {
        //         assert_eq!(dimension, 1);
        //     }
        //     _ => panic!("Wrong variant"),
        // }
    }

    #[test]
    fn test_topological_query_betti_numbers() {
        // Test BettiNumbers variant
        // let query = TopologicalQuery::BettiNumbers { max_dimension: 3 };
        // match query {
        //     TopologicalQuery::BettiNumbers { max_dimension } => {
        //         assert_eq!(max_dimension, 3);
        //     }
        //     _ => panic!("Wrong variant"),
        // }
    }

    #[test]
    fn test_topological_query_sheaf_consistency() {
        // Test SheafConsistency variant
        // let sections = vec![SectionId::new(), SectionId::new()];
        // let query = TopologicalQuery::SheafConsistency {
        //     local_sections: sections.clone(),
        // };
        // match query {
        //     TopologicalQuery::SheafConsistency { local_sections } => {
        //         assert_eq!(local_sections.len(), 2);
        //     }
        //     _ => panic!("Wrong variant"),
        // }
    }
}
#[cfg(test)]
mod temporal_context_tests {
    use super::*;

    #[test]
    fn test_substrate_time_ordering() {
        // Test SubstrateTime comparison
        // let t1 = SubstrateTime::from_unix(1000);
        // let t2 = SubstrateTime::from_unix(2000);
        // assert!(t1 < t2);
    }

    #[test]
    fn test_substrate_time_now() {
        // Test current-time generation
        // let now = SubstrateTime::now();
        // let later = SubstrateTime::now();
        // assert!(later >= now);
    }
}
#[cfg(test)]
mod error_handling_tests {
    use super::*;

    #[test]
    fn test_error_trait_bounds() {
        // Verify error types implement std::error::Error
        // This ensures SubstrateBackend::Error is properly bounded
    }

    #[test]
    fn test_error_display() {
        // Test the error Display implementation
    }
}
#[cfg(test)]
mod filter_tests {
    use super::*;

    #[test]
    fn test_filter_construction() {
        // Test Filter construction
    }

    #[test]
    fn test_filter_metadata_matching() {
        // Test metadata filter application
    }
}
examples/exo-ai-2025/test-templates/exo-federation/tests/federation_test.rs (new file)
@@ -0,0 +1,394 @@
//! Unit tests for the exo-federation distributed cognitive mesh

#[cfg(test)]
mod post_quantum_crypto_tests {
    use super::*;
    // use exo_federation::*;

    #[test]
    #[cfg(feature = "post-quantum")]
    fn test_kyber_keypair_generation() {
        // Test CRYSTALS-Kyber keypair generation
        // let keypair = PostQuantumKeypair::generate();
        //
        // assert_eq!(keypair.public.len(), 1184); // Kyber768 public key size
        // assert_eq!(keypair.secret.len(), 2400); // Kyber768 secret key size
    }

    #[test]
    #[cfg(feature = "post-quantum")]
    fn test_kyber_encapsulation() {
        // Test key encapsulation
        // let keypair = PostQuantumKeypair::generate();
        // let (ciphertext, shared_secret1) = encapsulate(&keypair.public).unwrap();
        //
        // assert_eq!(ciphertext.len(), 1088); // Kyber768 ciphertext size
        // assert_eq!(shared_secret1.len(), 32); // 256-bit shared secret
    }

    #[test]
    #[cfg(feature = "post-quantum")]
    fn test_kyber_decapsulation() {
        // Test key decapsulation
        // let keypair = PostQuantumKeypair::generate();
        // let (ciphertext, shared_secret1) = encapsulate(&keypair.public).unwrap();
        //
        // let shared_secret2 = decapsulate(&ciphertext, &keypair.secret).unwrap();
        //
        // assert_eq!(shared_secret1, shared_secret2); // Should match
    }

    #[test]
    #[cfg(feature = "post-quantum")]
    fn test_key_derivation() {
        // Test deriving encryption keys from the shared secret
        // let shared_secret = [0u8; 32];
        // let (encrypt_key, mac_key) = derive_keys(&shared_secret);
        //
        // assert_eq!(encrypt_key.len(), 32);
        // assert_eq!(mac_key.len(), 32);
        // assert_ne!(encrypt_key, mac_key); // Should be different
    }
}
#[cfg(test)]
mod federation_handshake_tests {
    use super::*;

    #[tokio::test]
    async fn test_join_federation_success() {
        // Test a successful federation join
        // let mut node1 = FederatedMesh::new(config1);
        // let node2 = FederatedMesh::new(config2);
        //
        // let token = node1.join_federation(&node2.address()).await.unwrap();
        //
        // assert!(token.is_valid());
        // assert!(!token.is_expired());
    }

    #[tokio::test]
    async fn test_join_federation_timeout() {
        // Test handshake timeout
    }

    #[tokio::test]
    async fn test_join_federation_invalid_peer() {
        // Test joining with an invalid peer address
    }

    #[tokio::test]
    async fn test_federation_token_expiry() {
        // Test token expiration
        // let token = FederationToken {
        //     expires: SubstrateTime::now() - 1000,
        //     ..Default::default()
        // };
        //
        // assert!(token.is_expired());
    }

    #[tokio::test]
    async fn test_capability_negotiation() {
        // Test capability exchange and negotiation
    }
}
#[cfg(test)]
mod byzantine_consensus_tests {
    use super::*;

    #[tokio::test]
    async fn test_byzantine_commit_sufficient_votes() {
        // Test consensus with 2f+1 agreement (n = 3f+1)
        // let federation = setup_federation(10); // n=10 => f=3, need 7 votes
        //
        // let update = StateUpdate::new("test_update");
        // let proof = federation.byzantine_commit(&update).await.unwrap();
        //
        // assert!(proof.votes.len() >= 7);
        // assert!(proof.is_valid());
    }

    #[tokio::test]
    async fn test_byzantine_commit_insufficient_votes() {
        // Test consensus failure with fewer than 2f+1 votes
        // let federation = setup_federation_with_failures(10, 4); // 4 failures > f=3
        //
        // let update = StateUpdate::new("test_update");
        // let result = federation.byzantine_commit(&update).await;
        //
        // assert!(matches!(result, Err(Error::InsufficientConsensus)));
    }

    #[tokio::test]
    async fn test_byzantine_three_phase_commit() {
        // Test the Pre-prepare -> Prepare -> Commit phases
    }

    #[tokio::test]
    async fn test_byzantine_malicious_proposal() {
        // Test rejection of invalid proposals
    }

    #[tokio::test]
    async fn test_byzantine_view_change() {
        // Test leader change on timeout
    }
}
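// The quorum arithmetic the tests above rely on (standard BFT bounds, not an
// exo-federation API; `bft_quorum` is an illustrative helper): with n = 3f + 1
// nodes, up to f Byzantine faults are tolerated and a commit needs 2f + 1
// matching votes.
fn bft_quorum(n: usize) -> Option<usize> {
    // Only cluster sizes of the form 3f + 1 (f >= 1) fit the model.
    if n < 4 || (n - 1) % 3 != 0 {
        return None;
    }
    let f = (n - 1) / 3;
    Some(2 * f + 1)
}

#[test]
fn bft_quorum_matches_template_comments() {
    // n = 10 => f = 3, quorum = 7, as assumed in the tests above.
    assert_eq!(bft_quorum(10), Some(7));
    assert_eq!(bft_quorum(4), Some(3));
    // n = 9 does not have the 3f + 1 shape.
    assert_eq!(bft_quorum(9), None);
}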
#[cfg(test)]
mod crdt_reconciliation_tests {
    use super::*;

    #[test]
    fn test_crdt_gset_merge() {
        // Test G-Set (grow-only set) reconciliation
        // let mut set1 = GSet::new();
        // set1.add("item1");
        // set1.add("item2");
        //
        // let mut set2 = GSet::new();
        // set2.add("item2");
        // set2.add("item3");
        //
        // let merged = set1.merge(set2);
        //
        // assert_eq!(merged.len(), 3);
        // assert!(merged.contains("item1"));
        // assert!(merged.contains("item2"));
        // assert!(merged.contains("item3"));
    }

    #[test]
    fn test_crdt_lww_register() {
        // Test LWW-Register (last-writer-wins)
        // let mut reg1 = LWWRegister::new();
        // reg1.set("value1", 1000); // timestamp 1000
        //
        // let mut reg2 = LWWRegister::new();
        // reg2.set("value2", 2000); // later timestamp
        //
        // let merged = reg1.merge(reg2);
        //
        // assert_eq!(merged.get(), "value2"); // Latest write wins
    }

    #[test]
    fn test_crdt_lww_map() {
        // Test LWW-Map reconciliation
    }

    #[test]
    fn test_crdt_reconcile_federated_results() {
        // Test reconciling federated query results
        // let responses = vec![
        //     FederatedResponse { results: vec![r1, r2], rankings: ... },
        //     FederatedResponse { results: vec![r2, r3], rankings: ... },
        // ];
        //
        // let reconciled = reconcile_crdt(responses, local_state);
        //
        // // Should contain the union of results with reconciled rankings
    }
}
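// The G-Set merge semantics assumed by the module above can be written out as
// a minimal self-contained sketch. `SketchGSet` is illustrative only; the real
// exo-federation `GSet` API may differ.
#[derive(Default)]
struct SketchGSet {
    items: std::collections::HashSet<String>,
}

impl SketchGSet {
    fn add(&mut self, item: &str) {
        self.items.insert(item.to_string());
    }

    // Merge is set union: adds commute and are idempotent, so replicas
    // converge regardless of delivery order.
    fn merge(mut self, other: SketchGSet) -> SketchGSet {
        self.items.extend(other.items);
        self
    }

    fn contains(&self, item: &str) -> bool {
        self.items.contains(item)
    }
}

#[test]
fn sketch_gset_merge_is_union() {
    let mut a = SketchGSet::default();
    a.add("item1");
    a.add("item2");
    let mut b = SketchGSet::default();
    b.add("item2");
    b.add("item3");
    let merged = a.merge(b);
    assert_eq!(merged.items.len(), 3);
    assert!(merged.contains("item1") && merged.contains("item3"));
}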
#[cfg(test)]
mod onion_routing_tests {
    use super::*;

    #[tokio::test]
    async fn test_onion_wrap_basic() {
        // Test onion wrapping with a relay chain
        // let relays = vec![relay1, relay2, relay3];
        // let query = Query::new("test");
        //
        // let wrapped = onion_wrap(&query, &relays);
        //
        // // Should have a layer for each relay
        // assert_eq!(wrapped.num_layers(), relays.len());
    }

    #[tokio::test]
    async fn test_onion_routing_privacy() {
        // Test that intermediate nodes cannot decrypt the payload
        // let wrapped = onion_wrap(&query, &relays);
        //
        // // An intermediate relay should not be able to see the final query
        // let relay1_view = relays[1].decrypt_layer(wrapped);
        // assert!(!relay1_view.contains_plaintext_query());
    }

    #[tokio::test]
    async fn test_onion_unwrap() {
        // Test unwrapping onion layers
        // let wrapped = onion_wrap(&query, &relays);
        // let response = send_through_onion(wrapped).await;
        //
        // let unwrapped = onion_unwrap(response, &local_keys, &relays);
        //
        // assert_eq!(unwrapped, expected_response);
    }

    #[tokio::test]
    async fn test_onion_routing_failure() {
        // Test handling of relay failure
    }
}
#[cfg(test)]
mod federated_query_tests {
    use super::*;

    #[tokio::test]
    async fn test_federated_query_local_scope() {
        // Test a query with local-only scope
        // let federation = setup_federation();
        // let results = federation.federated_query(&query, FederationScope::Local).await;
        //
        // // Should only return local results
        // assert!(results.iter().all(|r| r.source.is_local()));
    }

    #[tokio::test]
    async fn test_federated_query_global_scope() {
        // Test query broadcast to all peers
        // let federation = setup_federation_with_peers(5);
        // let results = federation.federated_query(&query, FederationScope::Global).await;
        //
        // // Should have results from multiple peers
        // let sources: HashSet<_> = results.iter().map(|r| r.source).collect();
        // assert!(sources.len() > 1);
    }

    #[tokio::test]
    async fn test_federated_query_scoped() {
        // Test a query with a specific peer scope
    }

    #[tokio::test]
    async fn test_federated_query_timeout() {
        // Test handling of slow or unresponsive peers
    }
}
#[cfg(test)]
|
||||
mod raft_consensus_tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_raft_leader_election() {
|
||||
// Test Raft leader election
|
||||
// let cluster = setup_raft_cluster(5);
|
||||
//
|
||||
// // Wait for leader election
|
||||
// tokio::time::sleep(Duration::from_millis(1000)).await;
|
||||
//
|
||||
// let leaders: Vec<_> = cluster.nodes.iter()
|
||||
// .filter(|n| n.is_leader())
|
||||
// .collect();
|
||||
//
|
||||
// assert_eq!(leaders.len(), 1); // Exactly one leader
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_raft_log_replication() {
|
||||
// Test log replication
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_raft_commit() {
|
||||
// Test entry commitment
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod encrypted_channel_tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_encrypted_channel_send() {
|
||||
// Test sending encrypted message
|
||||
// let channel = EncryptedChannel::new(peer, encrypt_key, mac_key);
|
||||
// channel.send(message).await.unwrap();
|
||||
//
|
||||
// // Message should be encrypted
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_encrypted_channel_receive() {
|
||||
// Test receiving encrypted message
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_encrypted_channel_mac_verification() {
|
||||
// Test MAC verification on receive
|
||||
// Should reject messages with invalid MAC
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_encrypted_channel_replay_attack() {
|
||||
// Test replay attack prevention
|
||||
}
|
||||
}
|
||||
|
||||
#[cfg(test)]
|
||||
mod edge_cases_tests {
|
||||
use super::*;
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_single_node_federation() {
|
||||
// Test federation with single node
|
||||
// let federation = FederatedMesh::new(config);
|
||||
//
|
||||
// // Should handle queries locally
|
||||
// let results = federation.federated_query(&query, FederationScope::Global).await;
|
||||
// assert!(!results.is_empty());
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_network_partition() {
|
||||
// Test handling of network partition
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_byzantine_fault_tolerance_limit() {
|
||||
// Test f < n/3 Byzantine fault tolerance limit
|
||||
// With n=10, can tolerate f=3 faulty nodes
|
||||
// With f=4, consensus should fail
|
||||
}
|
||||
|
||||
#[test]
|
||||
#[tokio::test]
|
||||
async fn test_concurrent_commits() {
|
||||
// Test concurrent state updates
|
||||
}
|
||||
}
|
||||
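The f < n/3 limit exercised above can be pinned down with a small helper based on the standard BFT requirement n >= 3f + 1. The helper name is hypothetical, not part of the crate's API; it is a sketch that a concrete `test_byzantine_fault_tolerance_limit` could build on.

```rust
// Hypothetical helper sketching the standard BFT bound (n >= 3f + 1):
// the largest f a cluster of n nodes can tolerate is floor((n - 1) / 3).
fn max_tolerable_faults(n: usize) -> usize {
    n.saturating_sub(1) / 3
}

#[test]
fn test_max_tolerable_faults_bound() {
    assert_eq!(max_tolerable_faults(10), 3); // n = 10 tolerates f = 3
    assert!(max_tolerable_faults(10) < 4);   // f = 4 exceeds the bound
    assert_eq!(max_tolerable_faults(4), 1);  // smallest non-trivial BFT cluster
}
```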
@@ -0,0 +1,310 @@
//! Unit tests for the exo-hypergraph substrate

#[cfg(test)]
mod hyperedge_creation_tests {
    use super::*;
    // use exo_hypergraph::*;

    #[test]
    fn test_create_basic_hyperedge() {
        // Test creating a hyperedge with 3 entities
        // let mut substrate = HypergraphSubstrate::new();
        //
        // let e1 = EntityId::new();
        // let e2 = EntityId::new();
        // let e3 = EntityId::new();
        //
        // let relation = Relation::new("connects");
        // let hyperedge_id = substrate.create_hyperedge(
        //     &[e1, e2, e3],
        //     &relation
        // ).unwrap();
        //
        // assert!(substrate.hyperedge_exists(hyperedge_id));
    }

    #[test]
    fn test_create_hyperedge_2_entities() {
        // Test creating a hyperedge with 2 entities (edge case)
    }

    #[test]
    fn test_create_hyperedge_many_entities() {
        // Test creating hyperedges with many entities (10+)
        // for n in [10, 50, 100] {
        //     let entities: Vec<_> = (0..n).map(|_| EntityId::new()).collect();
        //     let result = substrate.create_hyperedge(&entities, &relation);
        //     assert!(result.is_ok());
        // }
    }

    #[test]
    fn test_create_hyperedge_invalid_entity() {
        // Test error when an entity doesn't exist
        // let mut substrate = HypergraphSubstrate::new();
        // let nonexistent = EntityId::new();
        //
        // let result = substrate.create_hyperedge(&[nonexistent], &relation);
        // assert!(result.is_err());
    }

    #[test]
    fn test_create_hyperedge_duplicate_entities() {
        // Test handling of duplicate entities in the set
        // let e1 = EntityId::new();
        // let result = substrate.create_hyperedge(&[e1, e1], &relation);
        // // Should either deduplicate or return an error
    }
}

#[cfg(test)]
mod hyperedge_query_tests {
    use super::*;

    #[test]
    fn test_query_hyperedges_by_entity() {
        // Test finding all hyperedges containing an entity
        // let mut substrate = HypergraphSubstrate::new();
        // let e1 = substrate.add_entity("entity_1");
        //
        // let h1 = substrate.create_hyperedge(&[e1, e2], &r1).unwrap();
        // let h2 = substrate.create_hyperedge(&[e1, e3], &r2).unwrap();
        //
        // let containing_e1 = substrate.hyperedges_containing(e1);
        // assert_eq!(containing_e1.len(), 2);
        // assert!(containing_e1.contains(&h1));
        // assert!(containing_e1.contains(&h2));
    }

    #[test]
    fn test_query_hyperedges_by_relation() {
        // Test finding hyperedges by relation type
    }

    #[test]
    fn test_query_hyperedges_by_entity_set() {
        // Test finding hyperedges spanning a specific entity set
    }
}

#[cfg(test)]
mod persistent_homology_tests {
    use super::*;

    #[test]
    fn test_persistent_homology_0d() {
        // Test 0-dimensional homology (connected components)
        // let substrate = build_test_hypergraph();
        //
        // let diagram = substrate.persistent_homology(0, (0.0, 1.0));
        //
        // // Verify the number of connected components
        // assert_eq!(diagram.num_features(), expected_components);
    }

    #[test]
    fn test_persistent_homology_1d() {
        // Test 1-dimensional homology (cycles/loops)
        // Create a hypergraph with known cycle structure
        // let substrate = build_cycle_hypergraph();
        //
        // let diagram = substrate.persistent_homology(1, (0.0, 1.0));
        //
        // // Verify cycle detection
        // assert!(diagram.has_persistent_features());
    }

    #[test]
    fn test_persistent_homology_2d() {
        // Test 2-dimensional homology (voids)
    }

    #[test]
    fn test_persistence_diagram_birth_death() {
        // Test birth-death times in the persistence diagram
        // let diagram = substrate.persistent_homology(1, (0.0, 2.0));
        //
        // for feature in diagram.features() {
        //     assert!(feature.birth < feature.death);
        //     assert!(feature.birth >= 0.0);
        //     assert!(feature.death <= 2.0);
        // }
    }

    #[test]
    fn test_persistence_diagram_essential_features() {
        // Test detection of essential (infinite-persistence) features
    }
}

#[cfg(test)]
mod betti_numbers_tests {
    use super::*;

    #[test]
    fn test_betti_numbers_simple_complex() {
        // Test Betti numbers for a simple simplicial complex
        // let substrate = build_simple_complex();
        // let betti = substrate.betti_numbers(2);
        //
        // // For a sphere: b0 = 1, b1 = 0, b2 = 1
        // assert_eq!(betti[0], 1); // One connected component
        // assert_eq!(betti[1], 0); // No holes
        // assert_eq!(betti[2], 1); // One void
    }

    #[test]
    fn test_betti_numbers_torus() {
        // Test Betti numbers for a torus-like structure
        // Torus: b0 = 1, b1 = 2, b2 = 1
    }

    #[test]
    fn test_betti_numbers_disconnected() {
        // Test with multiple connected components
        // let substrate = build_disconnected_complex();
        // let betti = substrate.betti_numbers(0);
        //
        // assert_eq!(betti[0], num_components);
    }
}

#[cfg(test)]
mod sheaf_consistency_tests {
    use super::*;

    #[test]
    #[cfg(feature = "sheaf-consistency")]
    fn test_sheaf_consistency_check_consistent() {
        // Test sheaf consistency on a consistent structure
        // let substrate = build_consistent_sheaf();
        // let sections = vec![section1, section2];
        //
        // let result = substrate.check_sheaf_consistency(&sections);
        //
        // assert!(matches!(result, SheafConsistencyResult::Consistent));
    }

    #[test]
    #[cfg(feature = "sheaf-consistency")]
    fn test_sheaf_consistency_check_inconsistent() {
        // Test detection of inconsistency
        // let substrate = build_inconsistent_sheaf();
        // let sections = vec![section1, section2];
        //
        // let result = substrate.check_sheaf_consistency(&sections);
        //
        // match result {
        //     SheafConsistencyResult::Inconsistent(inconsistencies) => {
        //         assert!(!inconsistencies.is_empty());
        //     }
        //     _ => panic!("Expected inconsistency"),
        // }
    }

    #[test]
    #[cfg(feature = "sheaf-consistency")]
    fn test_sheaf_restriction_maps() {
        // Test restriction map operations
    }
}

#[cfg(test)]
mod simplicial_complex_tests {
    use super::*;

    #[test]
    fn test_add_simplex_0d() {
        // Test adding 0-simplices (vertices)
    }

    #[test]
    fn test_add_simplex_1d() {
        // Test adding 1-simplices (edges)
    }

    #[test]
    fn test_add_simplex_2d() {
        // Test adding 2-simplices (triangles)
    }

    #[test]
    fn test_add_simplex_invalid() {
        // Test adding a simplex with non-existent vertices
    }

    #[test]
    fn test_simplex_boundary() {
        // Test the boundary operator
    }
}

#[cfg(test)]
mod hyperedge_index_tests {
    use super::*;

    #[test]
    fn test_entity_index_update() {
        // Test the entity -> hyperedges inverted index
        // let mut substrate = HypergraphSubstrate::new();
        // let e1 = substrate.add_entity("e1");
        //
        // let h1 = substrate.create_hyperedge(&[e1], &r1).unwrap();
        //
        // let containing = substrate.entity_index.get(&e1);
        // assert!(containing.contains(&h1));
    }

    #[test]
    fn test_relation_index_update() {
        // Test the relation -> hyperedges index
    }

    #[test]
    fn test_concurrent_index_access() {
        // Test DashMap concurrent access
    }
}

#[cfg(test)]
mod integration_with_ruvector_graph_tests {
    use super::*;

    #[test]
    fn test_ruvector_graph_integration() {
        // Test integration with the ruvector-graph base
        // Verify the hypergraph extends ruvector-graph properly
    }

    #[test]
    fn test_graph_database_queries() {
        // Test using the base GraphDatabase for queries
    }
}

#[cfg(test)]
mod edge_cases_tests {
    use super::*;

    #[test]
    fn test_empty_hypergraph() {
        // Test operations on an empty hypergraph
        // let substrate = HypergraphSubstrate::new();
        // let betti = substrate.betti_numbers(2);
        // assert_eq!(betti[0], 0); // No components
    }

    #[test]
    fn test_single_entity() {
        // Test a hypergraph with a single entity
    }

    #[test]
    fn test_large_hypergraph() {
        // Test scalability with large numbers of entities/edges
        // for size in [1000, 10000, 100000] {
        //     let substrate = build_large_hypergraph(size);
        //     // Verify operations complete in reasonable time
        // }
    }
}
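The Betti-number tests above have a cheap internal consistency check: the Euler characteristic computed from simplex counts must equal the alternating sum of the Betti numbers. A minimal sketch (the helper name and the count layout are assumptions for illustration, not crate API):

```rust
// Euler characteristic from simplex counts: chi = sum_k (-1)^k * n_k.
// For any finite complex, chi also equals b0 - b1 + b2 - ..., which gives
// a quick cross-check on Betti numbers computed by the substrate.
fn euler_characteristic(simplex_counts: &[i64]) -> i64 {
    simplex_counts
        .iter()
        .enumerate()
        .map(|(k, &n)| if k % 2 == 0 { n } else { -n })
        .sum()
}

#[test]
fn test_euler_characteristic_octahedron_sphere() {
    // Octahedron triangulation of the sphere: 6 vertices, 12 edges, 8 faces.
    let chi = euler_characteristic(&[6, 12, 8]);
    // Sphere Betti numbers b0 = 1, b1 = 0, b2 = 1 give chi = 1 - 0 + 1 = 2.
    assert_eq!(chi, 2);
}
```

The same identity flags a torus-like complex (b0 = 1, b1 = 2, b2 = 1) by its chi = 0.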
@@ -0,0 +1,249 @@
//! Unit tests for the exo-manifold learned manifold engine

#[cfg(test)]
mod manifold_retrieval_tests {
    use super::*;
    // use exo_manifold::*;
    // use burn::backend::NdArray;

    #[test]
    fn test_manifold_retrieve_basic() {
        // Test the basic retrieval operation
        // let backend = NdArray::<f32>::default();
        // let config = ManifoldConfig::default();
        // let engine = ManifoldEngine::<NdArray<f32>>::new(config);
        //
        // let query = Tensor::from_floats([0.1, 0.2, 0.3, 0.4]);
        // let results = engine.retrieve(query, 5);
        //
        // assert_eq!(results.len(), 5);
    }

    #[test]
    fn test_manifold_retrieve_convergence() {
        // Test that gradient descent converges
        // let engine = setup_test_engine();
        // let query = random_query();
        //
        // let results = engine.retrieve(query.clone(), 10);
        //
        // // Verify convergence (gradient norm below threshold)
        // assert!(engine.last_gradient_norm() < 1e-4);
    }

    #[test]
    fn test_manifold_retrieve_different_k() {
        // Test retrieval with different k values
        // for k in [1, 5, 10, 50, 100] {
        //     let results = engine.retrieve(query.clone(), k);
        //     assert_eq!(results.len(), k);
        // }
    }

    #[test]
    fn test_manifold_retrieve_empty() {
        // Test retrieval from an empty manifold
        // let engine = ManifoldEngine::new(config);
        // let results = engine.retrieve(query, 10);
        // assert!(results.is_empty());
    }
}

#[cfg(test)]
mod manifold_deformation_tests {
    use super::*;

    #[test]
    fn test_manifold_deform_basic() {
        // Test the basic deformation operation
        // let mut engine = setup_test_engine();
        // let pattern = sample_pattern();
        //
        // engine.deform(pattern, 0.8);
        //
        // // Verify the manifold was updated
        // assert!(engine.has_been_deformed());
    }

    #[test]
    fn test_manifold_deform_salience() {
        // Test deformation with different salience values
        // let mut engine = setup_test_engine();
        //
        // let high_salience = sample_pattern();
        // engine.deform(high_salience, 0.9);
        //
        // let low_salience = sample_pattern();
        // engine.deform(low_salience, 0.1);
        //
        // // Verify high salience has the stronger influence
    }

    #[test]
    fn test_manifold_deform_gradient_update() {
        // Test that deformation updates network weights
        // let mut engine = setup_test_engine();
        // let initial_params = engine.network_parameters().clone();
        //
        // engine.deform(sample_pattern(), 0.5);
        //
        // let updated_params = engine.network_parameters();
        // assert_ne!(initial_params, updated_params);
    }

    #[test]
    fn test_manifold_deform_smoothness_regularization() {
        // Test that the smoothness loss is applied
        // Verify the manifold doesn't overfit to single patterns
    }
}

#[cfg(test)]
mod strategic_forgetting_tests {
    use super::*;

    #[test]
    fn test_forget_low_salience_regions() {
        // Test the forgetting mechanism
        // let mut engine = setup_test_engine();
        //
        // // Populate with low-salience patterns
        // for i in 0..10 {
        //     engine.deform(low_salience_pattern(i), 0.1);
        // }
        //
        // // Apply forgetting
        // let region = engine.identify_low_salience_regions(0.2);
        // engine.forget(&region, 0.5);
        //
        // // Verify patterns are less retrievable
    }

    #[test]
    fn test_forget_preserves_high_salience() {
        // Test that forgetting doesn't affect high-salience regions
        // let mut engine = setup_test_engine();
        //
        // engine.deform(high_salience_pattern(), 0.9);
        // let before = engine.retrieve(query, 1);
        //
        // engine.forget(&low_salience_region, 0.5);
        //
        // let after = engine.retrieve(query, 1);
        // assert_similar(before, after);
    }

    #[test]
    fn test_forget_kernel_application() {
        // Test the Gaussian smoothing kernel
    }
}

#[cfg(test)]
mod siren_network_tests {
    use super::*;

    #[test]
    fn test_siren_forward_pass() {
        // Test SIREN network forward propagation
        // let network = LearnedManifold::new(config);
        // let input = Tensor::from_floats([0.5, 0.5]);
        // let output = network.forward(input);
        //
        // assert!(output.dims()[0] > 0);
    }

    #[test]
    fn test_siren_backward_pass() {
        // Test gradient computation through SIREN layers
    }

    #[test]
    fn test_siren_sinusoidal_activation() {
        // Test that SIREN applies sinusoidal activations correctly
    }
}

#[cfg(test)]
mod fourier_features_tests {
    use super::*;

    #[test]
    fn test_fourier_encoding() {
        // Test the Fourier feature transformation
        // let encoding = FourierEncoding::new(config);
        // let input = Tensor::from_floats([0.1, 0.2]);
        // let features = encoding.encode(input);
        //
        // // Verify feature dimensionality
        // assert_eq!(features.dims()[1], config.num_fourier_features);
    }

    #[test]
    fn test_fourier_frequency_spectrum() {
        // Test frequency spectrum configuration
    }
}

#[cfg(test)]
mod tensor_train_tests {
    use super::*;

    #[test]
    #[cfg(feature = "tensor-train")]
    fn test_tensor_train_decomposition() {
        // Test Tensor Train compression
        // let engine = setup_engine_with_tt();
        //
        // // Verify the compression ratio
        // let original_size = engine.uncompressed_size();
        // let compressed_size = engine.compressed_size();
        //
        // assert!(compressed_size < original_size / 10); // >10x compression
    }

    #[test]
    #[cfg(feature = "tensor-train")]
    fn test_tensor_train_accuracy() {
        // Test that TT compression preserves accuracy
    }
}

#[cfg(test)]
mod edge_cases_tests {
    use super::*;

    #[test]
    fn test_nan_handling() {
        // Test handling of NaN values in embeddings
        // let mut engine = setup_test_engine();
        // let pattern_with_nan = Pattern {
        //     embedding: vec![f32::NAN, 0.2, 0.3],
        //     ..Default::default()
        // };
        //
        // let result = engine.deform(pattern_with_nan, 0.5);
        // assert!(result.is_err());
    }

    #[test]
    fn test_infinity_handling() {
        // Test handling of infinite values
    }

    #[test]
    fn test_zero_dimension_embedding() {
        // Test an empty embedding vector
        // let pattern = Pattern {
        //     embedding: vec![],
        //     ..Default::default()
        // };
        //
        // assert!(engine.deform(pattern, 0.5).is_err());
    }

    #[test]
    fn test_max_iterations_reached() {
        // Test gradient descent timeout
    }
}
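The Gaussian smoothing kernel referenced by `test_forget_kernel_application` can be sketched directly. This is an illustrative stand-in, not the crate's actual implementation: the assumption is that a forgetting pass attenuates stored activations by a weight that peaks at the forgotten region's center and decays with distance.

```rust
// Illustrative Gaussian kernel weight: w(d) = exp(-d^2 / (2 * sigma^2)).
// A forgetting pass could scale each activation by (1 - strength * w(d)),
// so points nearest the region's center are attenuated most.
fn gaussian_weight(distance: f32, sigma: f32) -> f32 {
    (-(distance * distance) / (2.0 * sigma * sigma)).exp()
}

#[test]
fn test_gaussian_weight_shape() {
    let sigma = 1.0;
    assert!((gaussian_weight(0.0, sigma) - 1.0).abs() < 1e-6); // Peak at center
    assert!(gaussian_weight(1.0, sigma) > gaussian_weight(2.0, sigma)); // Monotone decay
    assert!(gaussian_weight(5.0, sigma) < 1e-4); // Distant points barely touched
}
```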
@@ -0,0 +1,391 @@
//! Unit tests for the exo-temporal memory coordinator

#[cfg(test)]
mod causal_cone_query_tests {
    use super::*;
    // use exo_temporal::*;

    #[test]
    fn test_causal_query_past_cone() {
        // Test querying the past causal cone
        // let mut memory = TemporalMemory::new();
        //
        // let now = SubstrateTime::now();
        // let past1 = memory.store(pattern_at(now - 1000), &[]).unwrap();
        // let past2 = memory.store(pattern_at(now - 500), &[past1]).unwrap();
        // let future1 = memory.store(pattern_at(now + 500), &[]).unwrap();
        //
        // let results = memory.causal_query(
        //     &query,
        //     now,
        //     CausalConeType::Past
        // );
        //
        // assert!(results.iter().all(|r| r.timestamp <= now));
        // assert!(results.iter().any(|r| r.id == past1));
        // assert!(results.iter().any(|r| r.id == past2));
        // assert!(!results.iter().any(|r| r.id == future1));
    }

    #[test]
    fn test_causal_query_future_cone() {
        // Test querying the future causal cone
        // let results = memory.causal_query(
        //     &query,
        //     reference_time,
        //     CausalConeType::Future
        // );
        //
        // assert!(results.iter().all(|r| r.timestamp >= reference_time));
    }

    #[test]
    fn test_causal_query_light_cone() {
        // Test the light-cone constraint (relativistic causality)
        // let velocity = 1.0; // Speed of light
        // let results = memory.causal_query(
        //     &query,
        //     reference_time,
        //     CausalConeType::LightCone { velocity }
        // );
        //
        // // Verify |delta_x| <= c * |delta_t|
        // for result in results {
        //     let dt = (result.timestamp - reference_time).abs();
        //     let dx = distance(result.position, query.position);
        //     assert!(dx <= velocity * dt);
        // }
    }

    #[test]
    fn test_causal_distance_calculation() {
        // Test causal distance in the causal graph
        // let p1 = memory.store(pattern1, &[]).unwrap();
        // let p2 = memory.store(pattern2, &[p1]).unwrap();
        // let p3 = memory.store(pattern3, &[p2]).unwrap();
        //
        // let distance = memory.causal_graph.distance(p1, p3);
        // assert_eq!(distance, 2); // Two hops
    }
}

#[cfg(test)]
mod memory_consolidation_tests {
    use super::*;

    #[test]
    fn test_short_term_to_long_term() {
        // Test memory consolidation
        // let mut memory = TemporalMemory::new();
        //
        // // Fill the short-term buffer
        // for i in 0..100 {
        //     memory.store(pattern(i), &[]).unwrap();
        // }
        //
        // assert!(memory.short_term.should_consolidate());
        //
        // // Trigger consolidation
        // memory.consolidate();
        //
        // // Verify the short-term buffer is cleared
        // assert!(memory.short_term.is_empty());
        //
        // // Verify salient patterns moved to long-term storage
        // assert!(memory.long_term.size() > 0);
    }

    #[test]
    fn test_salience_filtering() {
        // Test that only salient patterns are consolidated
        // let mut memory = TemporalMemory::new();
        //
        // let high_salience = pattern_with_salience(0.9);
        // let low_salience = pattern_with_salience(0.1);
        //
        // memory.store(high_salience.clone(), &[]).unwrap();
        // memory.store(low_salience.clone(), &[]).unwrap();
        //
        // memory.consolidate();
        //
        // // High salience should be in long-term storage
        // assert!(memory.long_term.contains(&high_salience));
        //
        // // Low salience should not be
        // assert!(!memory.long_term.contains(&low_salience));
    }

    #[test]
    fn test_salience_computation() {
        // Test salience scoring
        // let memory = setup_test_memory();
        //
        // let pattern = sample_pattern();
        // let salience = memory.compute_salience(&pattern);
        //
        // // Salience should be between 0 and 1
        // assert!((0.0..=1.0).contains(&salience));
    }

    #[test]
    fn test_salience_access_frequency() {
        // Test the access-frequency component of salience
        // let mut memory = setup_test_memory();
        // let p_id = memory.store(pattern, &[]).unwrap();
        //
        // // Access multiple times
        // for _ in 0..10 {
        //     memory.retrieve(p_id);
        // }
        //
        // let salience = memory.compute_salience_for(p_id);
        // assert!(salience > baseline_salience);
    }

    #[test]
    fn test_salience_recency() {
        // Test the recency component
    }

    #[test]
    fn test_salience_causal_importance() {
        // Test the causal-importance component
        // Patterns with many dependents should have higher salience
    }

    #[test]
    fn test_salience_surprise() {
        // Test the surprise component
    }
}

#[cfg(test)]
mod anticipation_tests {
    use super::*;

    #[test]
    fn test_anticipate_sequential_pattern() {
        // Test predictive pre-fetch from sequential patterns
        // let mut memory = setup_test_memory();
        //
        // // Establish the pattern: A -> B -> C
        // memory.store_sequence([pattern_a, pattern_b, pattern_c]);
        //
        // // Query A, then B
        // memory.query(&pattern_a);
        // memory.query(&pattern_b);
        //
        // // Anticipate should predict C
        // let hints = vec![AnticipationHint::SequentialPattern];
        // memory.anticipate(&hints);
        //
        // // Verify C is pre-fetched into the cache
        // assert!(memory.prefetch_cache.contains_key(&hash(pattern_c)));
    }

    #[test]
    fn test_anticipate_temporal_cycle() {
        // Test time-of-day pattern anticipation
    }

    #[test]
    fn test_anticipate_causal_chain() {
        // Test causal dependency prediction
        // If A causes B and C, querying A should pre-fetch B and C
    }

    #[test]
    fn test_anticipate_cache_hit() {
        // Test that anticipated queries hit the cache
        // let mut memory = setup_test_memory_with_anticipation();
        //
        // // Trigger anticipation
        // memory.anticipate(&hints);
        //
        // // Query an anticipated item
        // let start = now();
        // let result = memory.query(&anticipated_query);
        // let duration = now() - start;
        //
        // // Should be faster due to the cache hit
        // assert!(duration < baseline_duration / 2);
    }
}

#[cfg(test)]
mod causal_graph_tests {
    use super::*;

    #[test]
    fn test_causal_graph_add_edge() {
        // Test adding a causal edge
        // let mut graph = CausalGraph::new();
        // let p1 = PatternId::new();
        // let p2 = PatternId::new();
        //
        // graph.add_edge(p1, p2);
        //
        // assert!(graph.has_edge(p1, p2));
    }

    #[test]
    fn test_causal_graph_forward_edges() {
        // Test the forward edge index (cause -> effects)
        // graph.add_edge(p1, p2);
        // graph.add_edge(p1, p3);
        //
        // let effects = graph.forward.get(&p1);
        // assert_eq!(effects.len(), 2);
    }

    #[test]
    fn test_causal_graph_backward_edges() {
        // Test the backward edge index (effect -> causes)
        // graph.add_edge(p1, p3);
        // graph.add_edge(p2, p3);
        //
        // let causes = graph.backward.get(&p3);
        // assert_eq!(causes.len(), 2);
    }

    #[test]
    fn test_causal_graph_shortest_path() {
        // Test shortest-path calculation
    }

    #[test]
    fn test_causal_graph_out_degree() {
        // Test out-degree as a proxy for causal importance
    }
}

#[cfg(test)]
mod temporal_knowledge_graph_tests {
    use super::*;

    #[test]
    fn test_tkg_add_temporal_fact() {
        // Test adding a temporal fact to the TKG
        // let mut tkg = TemporalKnowledgeGraph::new();
        // let fact = TemporalFact {
        //     subject: entity1,
        //     predicate: relation,
        //     object: entity2,
        //     timestamp: SubstrateTime::now(),
        // };
        //
        // tkg.add_fact(fact);
        //
        // assert!(tkg.has_fact(&fact));
    }

    #[test]
    fn test_tkg_temporal_query() {
        // Test querying facts within a time range
    }

    #[test]
    fn test_tkg_temporal_relations() {
        // Test temporal relation inference
    }
}

#[cfg(test)]
mod short_term_buffer_tests {
    use super::*;

    #[test]
    fn test_short_term_insert() {
        // Test inserting into the short-term buffer
        // let mut buffer = ShortTermBuffer::new(100); // capacity
        // let id = buffer.insert(pattern);
        // assert!(buffer.contains(id));
    }

    #[test]
    fn test_short_term_capacity() {
        // Test buffer capacity limits
        // let mut buffer = ShortTermBuffer::new(10); // capacity
        //
        // for i in 0..20 {
        //     buffer.insert(pattern(i));
        // }
        //
        // assert_eq!(buffer.len(), 10); // Should not exceed capacity
    }

    #[test]
    fn test_short_term_eviction() {
        // Test the eviction policy (FIFO or LRU)
    }

    #[test]
    fn test_short_term_should_consolidate() {
        // Test the consolidation trigger
        // let mut buffer = ShortTermBuffer::new(100); // capacity
        //
        // for i in 0..80 {
        //     buffer.insert(pattern(i));
        // }
        //
        // assert!(buffer.should_consolidate()); // > 75% full
    }
}

#[cfg(test)]
mod long_term_store_tests {
    use super::*;

    #[test]
    fn test_long_term_integrate() {
        // Test integrating a pattern into long-term storage
    }

    #[test]
    fn test_long_term_search() {
        // Test search in long-term storage
    }

    #[test]
    fn test_long_term_decay() {
        // Test strategic decay of low-salience patterns
        // let mut store = LongTermStore::new();
        //
        // store.integrate(high_salience_pattern(), 0.9);
        // store.integrate(low_salience_pattern(), 0.1);
        //
        // store.decay_low_salience(0.2); // Threshold
        //
        // // High salience should remain;
        // // low salience should be decayed
    }
}

#[cfg(test)]
mod edge_cases_tests {
    use super::*;

    #[test]
    fn test_empty_antecedents() {
        // Test storing a pattern with no causal antecedents
        // let mut memory = TemporalMemory::new();
        // let id = memory.store(pattern, &[]).unwrap();
        // assert!(memory.causal_graph.backward.get(&id).is_none());
    }

    #[test]
    fn test_circular_causality() {
        // Test detecting/handling circular causal dependencies
        // Decide whether cycles should be allowed or rejected
    }

    #[test]
    fn test_time_travel_query() {
        // Test querying with a reference_time in the future
    }

    #[test]
    fn test_concurrent_consolidation() {
        // Test concurrent access during consolidation
    }
}
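The light-cone constraint checked in `test_causal_query_light_cone` reduces to a single predicate; the sketch below uses a hypothetical free function (the crate presumably embeds this inside `CausalConeType::LightCone` handling).

```rust
// An event at spatial separation dx and temporal separation dt from the
// reference event lies inside (or on) the light cone iff |dx| <= velocity * |dt|.
fn within_light_cone(dx: f64, dt: f64, velocity: f64) -> bool {
    dx.abs() <= velocity * dt.abs()
}

#[test]
fn test_within_light_cone() {
    let c = 1.0; // Speed of light in substrate units
    assert!(within_light_cone(0.5, 1.0, c));  // Timelike: causally reachable
    assert!(within_light_cone(1.0, 1.0, c));  // Lightlike: on the cone boundary
    assert!(!within_light_cone(2.0, 1.0, c)); // Spacelike: causally disconnected
}
```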
//! Full-stack integration tests: all components together

#[cfg(test)]
mod full_stack_integration {
    use super::*;
    // use exo_core::*;
    // use exo_manifold::*;
    // use exo_hypergraph::*;
    // use exo_temporal::*;
    // use exo_federation::*;
    // use exo_backend_classical::*;

    #[tokio::test]
    async fn test_complete_cognitive_substrate() {
        // Test the complete system: manifold + hypergraph + temporal + federation
        //
        // // Setup
        // let backend = ClassicalBackend::new(config);
        // let manifold = ManifoldEngine::new(backend.clone());
        // let hypergraph = HypergraphSubstrate::new(backend.clone());
        // let temporal = TemporalMemory::new();
        // let federation = FederatedMesh::new(fed_config);
        //
        // // Scenario: multi-agent collaborative memory
        // // 1. Store patterns with temporal context
        // let p1 = temporal.store(pattern1, &[]).unwrap();
        //
        // // 2. Deform the manifold
        // manifold.deform(&pattern1, 0.8);
        //
        // // 3. Create hypergraph relationships
        // hypergraph.create_hyperedge(&[p1, p2], &relation).unwrap();
        //
        // // 4. Query with causal constraints
        // let results = temporal.causal_query(&query, now, CausalConeType::Past);
        //
        // // 5. Federate the query
        // let fed_results = federation.federated_query(&query, FederationScope::Global).await;
        //
        // // Verify all components work together
        // assert!(!results.is_empty());
        // assert!(!fed_results.is_empty());
    }

    #[tokio::test]
    async fn test_agent_memory_lifecycle() {
        // Test the complete memory lifecycle:
        // storage -> consolidation -> retrieval -> forgetting -> federation
    }

    #[tokio::test]
    async fn test_cross_component_consistency() {
        // Test that all components maintain consistent state
    }
}
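Step 4 of the scenario queries within a past causal cone. The core of such a query is a reachability walk over the backward (antecedent) edges; a self-contained sketch, assuming the causal graph is an adjacency map as in the pseudocode above:

```rust
use std::collections::{HashMap, HashSet, VecDeque};

/// Past causal cone of `start`: every pattern reachable by repeatedly
/// following antecedent (backward) edges. A query restricted to
/// `CausalConeType::Past` would only consider ids in this set.
fn past_cone(backward: &HashMap<u64, Vec<u64>>, start: u64) -> HashSet<u64> {
    let mut seen = HashSet::new();
    let mut queue = VecDeque::new();
    queue.push_back(start);
    while let Some(id) = queue.pop_front() {
        // `insert` returns false if we already visited this id,
        // which also terminates cleanly on cyclic graphs.
        if seen.insert(id) {
            if let Some(antecedents) = backward.get(&id) {
                queue.extend(antecedents.iter().copied());
            }
        }
    }
    seen
}

fn main() {
    // Causal chain 0 -> 1 -> 2, with pattern 3 unrelated.
    let mut backward: HashMap<u64, Vec<u64>> = HashMap::new();
    backward.insert(1, vec![0]);
    backward.insert(2, vec![1]);

    let cone = past_cone(&backward, 2);
    assert!(cone.contains(&0) && cone.contains(&1) && cone.contains(&2));
    assert!(!cone.contains(&3)); // unrelated pattern is outside the cone
    println!("ok");
}
```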
//! Integration tests: Manifold Engine + Hypergraph Substrate

#[cfg(test)]
mod manifold_hypergraph_integration {
    use super::*;
    // use exo_manifold::*;
    // use exo_hypergraph::*;
    // use exo_backend_classical::ClassicalBackend;

    #[test]
    fn test_manifold_with_hypergraph_structure() {
        // Test querying the manifold with hypergraph topological constraints
        // let backend = ClassicalBackend::new(config);
        // let mut manifold = ManifoldEngine::new(backend.clone());
        // let mut hypergraph = HypergraphSubstrate::new(backend);
        //
        // // Store patterns in the manifold
        // let p1 = manifold.deform(pattern1, 0.8);
        // let p2 = manifold.deform(pattern2, 0.7);
        // let p3 = manifold.deform(pattern3, 0.9);
        //
        // // Create hyperedges linking the patterns
        // let relation = Relation::new("semantic_cluster");
        // hypergraph.create_hyperedge(&[p1, p2, p3], &relation).unwrap();
        //
        // // Query the manifold and verify the hypergraph structure
        // let results = manifold.retrieve(query, 10);
        //
        // // Verify results respect hypergraph topology
        // for result in results {
        //     let edges = hypergraph.hyperedges_containing(result.id);
        //     assert!(!edges.is_empty()); // Should be connected
        // }
    }

    #[test]
    fn test_persistent_homology_on_manifold() {
        // Test computing persistent homology on the learned manifold
        // let manifold = setup_manifold_with_patterns();
        // let hypergraph = setup_hypergraph_from_manifold(&manifold);
        //
        // let diagram = hypergraph.persistent_homology(1, (0.0, 1.0));
        //
        // // Verify topological features are detected
        // assert!(diagram.num_features() > 0);
    }

    #[test]
    fn test_hypergraph_guided_retrieval() {
        // Test using hypergraph structure to guide manifold retrieval:
        // retrieve patterns, then expand via hyperedge traversal
    }
}
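The connectivity check above relies on `create_hyperedge` and `hyperedges_containing`. A minimal hypergraph sketch with just those two operations (names follow the pseudocode; the membership-index representation is an assumption, not the real `exo-hypergraph` layout):

```rust
use std::collections::HashMap;

/// Minimal hypergraph: an edge is a set of member pattern ids, and a
/// reverse index records which edges each pattern participates in.
struct HypergraphSubstrate {
    edges: Vec<Vec<u64>>,                 // edge index -> member pattern ids
    membership: HashMap<u64, Vec<usize>>, // pattern id -> edge indices
}

impl HypergraphSubstrate {
    fn new() -> Self {
        Self { edges: Vec::new(), membership: HashMap::new() }
    }

    /// Link a set of patterns with one hyperedge; returns the edge index.
    fn create_hyperedge(&mut self, members: &[u64]) -> usize {
        let idx = self.edges.len();
        self.edges.push(members.to_vec());
        for &m in members {
            self.membership.entry(m).or_default().push(idx);
        }
        idx
    }

    /// All hyperedges a pattern participates in (empty if disconnected).
    fn hyperedges_containing(&self, id: u64) -> &[usize] {
        self.membership.get(&id).map(|v| v.as_slice()).unwrap_or(&[])
    }
}

fn main() {
    let mut hg = HypergraphSubstrate::new();
    let e = hg.create_hyperedge(&[1, 2, 3]);
    // Every member of the edge is "connected"; pattern 9 is not.
    assert_eq!(hg.hyperedges_containing(2), &[e][..]);
    assert!(hg.hyperedges_containing(9).is_empty());
    println!("ok");
}
```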
//! Integration tests: Temporal Memory + Federation

#[cfg(test)]
mod temporal_federation_integration {
    use super::*;
    // use exo_temporal::*;
    // use exo_federation::*;

    #[tokio::test]
    async fn test_federated_temporal_query() {
        // Test temporal queries across the federation
        // let node1 = setup_federated_node_with_temporal(config1);
        // let node2 = setup_federated_node_with_temporal(config2);
        //
        // // Join the federation
        // node1.join_federation(&node2.address()).await.unwrap();
        //
        // // Store temporal patterns on node1
        // let p1 = node1.temporal_memory.store(pattern1, &[]).unwrap();
        // let p2 = node1.temporal_memory.store(pattern2, &[p1]).unwrap();
        //
        // // Query from node2 with causal constraints
        // let query = Query::new("test");
        // let results = node2.federated_temporal_query(
        //     &query,
        //     SubstrateTime::now(),
        //     CausalConeType::Past,
        //     FederationScope::Global
        // ).await;
        //
        // // Should receive results from node1
        // assert!(!results.is_empty());
    }

    #[tokio::test]
    async fn test_distributed_memory_consolidation() {
        // Test memory consolidation across federated nodes
    }

    #[tokio::test]
    async fn test_causal_graph_federation() {
        // Test a causal graph spanning multiple nodes
    }
}
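At its core, a federated query fans out to every node and merges the per-node matches. A synchronous, stdlib-only sketch of that merge step (the real templates assume an async `federated_query` over the network; `Node` and `FederatedMesh` internals here are hypothetical):

```rust
/// One node's local pattern store.
struct Node {
    patterns: Vec<String>,
}

/// A mesh of nodes; a federated query visits each node and merges matches.
struct FederatedMesh {
    nodes: Vec<Node>,
}

impl FederatedMesh {
    /// Fan out a substring query to every node and concatenate the hits.
    fn federated_query(&self, needle: &str) -> Vec<&str> {
        self.nodes
            .iter()
            .flat_map(|n| n.patterns.iter())
            .filter(|p| p.contains(needle))
            .map(|p| p.as_str())
            .collect()
    }
}

fn main() {
    let mesh = FederatedMesh {
        nodes: vec![
            Node { patterns: vec!["alpha test".into(), "beta".into()] },
            Node { patterns: vec!["gamma test".into()] },
        ],
    };
    let results = mesh.federated_query("test");
    // Matches from both nodes are merged into one result set.
    assert_eq!(results.len(), 2);
    println!("ok");
}
```

In the real system each fan-out leg would be an awaited RPC and the merge would deduplicate and rank, but the test only needs results from remote nodes to appear in the combined set.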