minor updates

CHANGELOG.md (new file, +72)
@@ -0,0 +1,72 @@
# Changelog

All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [1.1.0] - 2025-06-07

### Added
- Multi-column table of contents in README.md for improved navigation
- Enhanced documentation structure with better organization
- Improved visual layout for better user experience

### Changed
- Updated README.md table of contents to use a two-column layout
- Reorganized documentation sections for better logical flow
- Enhanced readability of navigation structure

### Documentation
- Restructured table of contents for better accessibility
- Improved visual hierarchy in documentation
- Enhanced user experience for documentation navigation

## [1.0.0] - 2024-12-01

### Added
- Initial release of WiFi DensePose
- Real-time WiFi-based human pose estimation using CSI data
- DensePose neural network integration
- RESTful API with comprehensive endpoints
- WebSocket streaming for real-time data
- Multi-person tracking capabilities
- Fall detection and activity recognition
- Healthcare, fitness, smart home, and security domain configurations
- Comprehensive CLI interface
- Docker and Kubernetes deployment support
- 100% test coverage
- Production-ready monitoring and logging
- Hardware abstraction layer for multiple WiFi devices
- Phase sanitization and signal processing
- Authentication and rate limiting
- Background task management
- Database integration with PostgreSQL and Redis
- Prometheus metrics and Grafana dashboards
- Comprehensive documentation and examples

### Features
- Privacy-preserving pose detection without cameras
- Sub-50ms latency with 30 FPS processing
- Support for up to 10 simultaneous person tracking
- Enterprise-grade security and scalability
- Cross-platform compatibility (Linux, macOS, Windows)
- GPU acceleration support
- Real-time analytics and alerting
- Configurable confidence thresholds
- Zone-based occupancy monitoring
- Historical data analysis
- Performance optimization tools
- Load testing capabilities
- Infrastructure as Code (Terraform, Ansible)
- CI/CD pipeline integration
- Comprehensive error handling and logging

### Documentation
- Complete user guide and API reference
- Deployment and troubleshooting guides
- Hardware setup and calibration instructions
- Performance benchmarks and optimization tips
- Contributing guidelines and code standards
- Security best practices
- Example configurations and use cases
README.md (+123)
@@ -24,60 +24,75 @@ A cutting-edge WiFi-based human pose estimation system that leverages Channel St
 ## 📋 Table of Contents
 
-1. [🚀 Key Features](#-key-features)
-2. [🏗️ System Architecture](#️-system-architecture)
-   - [Core Components](#core-components)
-3. [📦 Installation](#-installation)
-   - [Using pip (Recommended)](#using-pip-recommended)
-   - [From Source](#from-source)
-   - [Using Docker](#using-docker)
-   - [System Requirements](#system-requirements)
-4. [🚀 Quick Start](#-quick-start)
-   - [Basic Setup](#1-basic-setup)
-   - [Start the System](#2-start-the-system)
-   - [Using the REST API](#3-using-the-rest-api)
-   - [Real-time Streaming](#4-real-time-streaming)
-5. [🖥️ CLI Usage](#️-cli-usage)
-   - [Installation](#cli-installation)
-   - [Basic Commands](#basic-commands)
-   - [Configuration Commands](#configuration-commands)
-   - [Monitoring Commands](#monitoring-commands)
-   - [Examples](#cli-examples)
-6. [📚 Documentation](#-documentation)
-   - [Core Documentation](#-core-documentation)
-   - [Quick Links](#-quick-links)
-   - [API Overview](#-api-overview)
-7. [🔧 Hardware Setup](#-hardware-setup)
-   - [Supported Hardware](#supported-hardware)
-   - [Physical Setup](#physical-setup)
-   - [Network Configuration](#network-configuration)
-   - [Environment Calibration](#environment-calibration)
-8. [⚙️ Configuration](#️-configuration)
-   - [Environment Variables](#environment-variables)
-   - [Domain-Specific Configurations](#domain-specific-configurations)
-   - [Advanced Configuration](#advanced-configuration)
-9. [🧪 Testing](#-testing)
-   - [Running Tests](#running-tests)
-   - [Test Categories](#test-categories)
-   - [Mock Testing](#mock-testing)
-   - [Continuous Integration](#continuous-integration)
-10. [🚀 Deployment](#-deployment)
-    - [Production Deployment](#production-deployment)
-    - [Infrastructure as Code](#infrastructure-as-code)
-    - [Monitoring and Logging](#monitoring-and-logging)
-11. [📊 Performance Metrics](#-performance-metrics)
-    - [Benchmark Results](#benchmark-results)
-    - [Performance Optimization](#performance-optimization)
-    - [Load Testing](#load-testing)
-12. [🤝 Contributing](#-contributing)
-    - [Development Setup](#development-setup)
-    - [Code Standards](#code-standards)
-    - [Contribution Process](#contribution-process)
-    - [Code Review Checklist](#code-review-checklist)
-    - [Issue Templates](#issue-templates)
-13. [📄 License](#-license)
-14. [🙏 Acknowledgments](#-acknowledgments)
-15. [📞 Support](#-support)
+<table>
+<tr>
+<td width="50%">
+
+**🚀 Getting Started**
+- [Key Features](#-key-features)
+- [System Architecture](#️-system-architecture)
+- [Installation](#-installation)
+  - [Using pip (Recommended)](#using-pip-recommended)
+  - [From Source](#from-source)
+  - [Using Docker](#using-docker)
+  - [System Requirements](#system-requirements)
+- [Quick Start](#-quick-start)
+  - [Basic Setup](#1-basic-setup)
+  - [Start the System](#2-start-the-system)
+  - [Using the REST API](#3-using-the-rest-api)
+  - [Real-time Streaming](#4-real-time-streaming)
+
+**🖥️ Usage & Configuration**
+- [CLI Usage](#️-cli-usage)
+  - [Installation](#cli-installation)
+  - [Basic Commands](#basic-commands)
+  - [Configuration Commands](#configuration-commands)
+  - [Examples](#cli-examples)
+- [Documentation](#-documentation)
+  - [Core Documentation](#-core-documentation)
+  - [Quick Links](#-quick-links)
+  - [API Overview](#-api-overview)
+- [Hardware Setup](#-hardware-setup)
+  - [Supported Hardware](#supported-hardware)
+  - [Physical Setup](#physical-setup)
+  - [Network Configuration](#network-configuration)
+  - [Environment Calibration](#environment-calibration)
+
+</td>
+<td width="50%">
+
+**⚙️ Advanced Topics**
+- [Configuration](#️-configuration)
+  - [Environment Variables](#environment-variables)
+  - [Domain-Specific Configurations](#domain-specific-configurations)
+  - [Advanced Configuration](#advanced-configuration)
+- [Testing](#-testing)
+  - [Running Tests](#running-tests)
+  - [Test Categories](#test-categories)
+  - [Mock Testing](#mock-testing)
+  - [Continuous Integration](#continuous-integration)
+- [Deployment](#-deployment)
+  - [Production Deployment](#production-deployment)
+  - [Infrastructure as Code](#infrastructure-as-code)
+  - [Monitoring and Logging](#monitoring-and-logging)
+
+**📊 Performance & Community**
+- [Performance Metrics](#-performance-metrics)
+  - [Benchmark Results](#benchmark-results)
+  - [Performance Optimization](#performance-optimization)
+  - [Load Testing](#load-testing)
+- [Contributing](#-contributing)
+  - [Development Setup](#development-setup)
+  - [Code Standards](#code-standards)
+  - [Contribution Process](#contribution-process)
+  - [Code Review Checklist](#code-review-checklist)
+- [License](#-license)
+- [Acknowledgments](#-acknowledgments)
+- [Support](#-support)
+
+</td>
+</tr>
+</table>
 
 ## 🏗️ System Architecture
alembic.ini (new file, +112)
@@ -0,0 +1,112 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = src/database/migrations

# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s
# Uncomment the line below if you want the files to be prepended with date and time
# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# defaults to the current working directory.
prepend_sys_path = .

# timezone to use when rendering the date within the migration file
# as well as the filename.
# If specified, requires the python-dateutil library that can be
# installed by adding `alembic[tz]` to the pip requirements
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version number format
version_num_format = %04d

# version path separator; As mentioned above, this is the character used to split
# version_locations. The default within new alembic.ini files is "os", which uses
# os.pathsep. If this key is omitted entirely, it falls back to the legacy
# behavior of splitting on spaces and/or commas.
# Valid values for version_path_separator are:
#
# version_path_separator = :
# version_path_separator = ;
# version_path_separator = space
version_path_separator = os

# set to 'true' to search source files recursively
# in each "version_locations" directory
# new in Alembic version 1.10
# recursive_version_locations = false

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = sqlite:///./data/wifi_densepose_fallback.db


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks = black
# black.type = console_scripts
# black.entrypoint = black
# black.options = -l 79 REVISION_SCRIPT_FILENAME

# lint with attempts to fix using "ruff" - use the exec runner, execute a binary
# hooks = ruff
# ruff.type = exec
# ruff.executable = %(here)s/.venv/bin/ruff
# ruff.options = --fix REVISION_SCRIPT_FILENAME

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
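The keys above are plain INI, so they can be inspected with the standard library before running migrations (Alembic itself reads them via `alembic.config.Config`). A minimal sketch, with a fragment of the file embedded as a string so it runs standalone:

```python
import configparser

# Fragment repeating values from the alembic.ini above
ALEMBIC_INI = """
[alembic]
script_location = src/database/migrations
prepend_sys_path = .
version_path_separator = os
sqlalchemy.url = sqlite:///./data/wifi_densepose_fallback.db
"""

parser = configparser.ConfigParser()
parser.read_string(ALEMBIC_INI)

# The same keys Alembic resolves when loading the config
script_location = parser.get("alembic", "script_location")
db_url = parser.get("alembic", "sqlalchemy.url")
print(script_location)  # src/database/migrations
print(db_url)           # sqlite:///./data/wifi_densepose_fallback.db
```

Note the `%%` escapes in the real file exist because configparser-style interpolation treats a bare `%` as special.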
data/wifi_densepose_fallback.db (new file, empty)

example.env (+32)
@@ -41,18 +41,40 @@ CORS_ORIGINS=* # Use specific origins in production: https://example.com,https:
 # =============================================================================
 
 # Database connection (optional - defaults to SQLite in development)
-# DATABASE_URL=postgresql://user:password@localhost:5432/wifi_densepose
-# DATABASE_POOL_SIZE=10
-# DATABASE_MAX_OVERFLOW=20
+# For PostgreSQL (recommended for production):
+DATABASE_URL=postgresql://wifi_user:wifi_password@localhost:5432/wifi_densepose
+DATABASE_POOL_SIZE=10
+DATABASE_MAX_OVERFLOW=20
+
+# Alternative: Individual database connection parameters
+# DB_HOST=localhost
+# DB_PORT=5432
+# DB_NAME=wifi_densepose
+# DB_USER=wifi_user
+# DB_PASSWORD=wifi_password
+
+# Database failsafe settings
+ENABLE_DATABASE_FAILSAFE=true
+SQLITE_FALLBACK_PATH=./data/wifi_densepose_fallback.db
 
 # =============================================================================
 # REDIS SETTINGS (Optional - for caching and rate limiting)
 # =============================================================================
 
 # Redis connection (optional - defaults to localhost in development)
-# REDIS_URL=redis://localhost:6379/0
+REDIS_URL=redis://localhost:6379/0
 # REDIS_PASSWORD=your-redis-password
-# REDIS_DB=0
+REDIS_DB=0
+REDIS_ENABLED=true
+REDIS_REQUIRED=false
+ENABLE_REDIS_FAILSAFE=true
+
+# Redis connection settings
+REDIS_HOST=localhost
+REDIS_PORT=6379
+REDIS_MAX_CONNECTIONS=10
+REDIS_SOCKET_TIMEOUT=5
+REDIS_CONNECT_TIMEOUT=5
 
 # =============================================================================
 # HARDWARE SETTINGS
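Settings like these are typically loaded as flat KEY=VALUE strings. A hedged sketch of how such an env file could be parsed (`parse_env` is a hypothetical helper, not part of the project, and it deliberately ignores blanks and full-line comments; it would not strip inline comments like the one on `CORS_ORIGINS`):

```python
def parse_env(text: str) -> dict:
    """Parse a simple KEY=VALUE env-file string into a dict (hypothetical helper)."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and full-line comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

# Sample mirroring the failsafe settings above
sample = """
# Database failsafe settings
ENABLE_DATABASE_FAILSAFE=true
SQLITE_FALLBACK_PATH=./data/wifi_densepose_fallback.db
DATABASE_POOL_SIZE=10
REDIS_REQUIRED=false
"""

env = parse_env(sample)
print(env["SQLITE_FALLBACK_PATH"])  # ./data/wifi_densepose_fallback.db
```

Everything arrives as a string; the application layer (here, pydantic settings) is what coerces `"true"`/`"10"` into `bool`/`int`.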
@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"
 [project]
 name = "wifi-densepose"
-version = "1.0.0"
+version = "1.1.0"
 description = "WiFi-based human pose estimation using CSI data and DensePose neural networks"
 readme = "README.md"
 license = "MIT"
@@ -21,6 +21,16 @@ python-jose[cryptography]>=3.3.0
 python-multipart>=0.0.6
 passlib[bcrypt]>=1.7.4
+
+# Database dependencies
+sqlalchemy>=2.0.0
+asyncpg>=0.28.0
+aiosqlite>=0.19.0
+redis>=4.5.0
+
+# CLI dependencies
+click>=8.0.0
+alembic>=1.10.0
 
 # Hardware interface dependencies
 asyncio-mqtt>=0.11.0
 aiohttp>=3.8.0
@@ -29,7 +29,7 @@ Author: WiFi-DensePose Team
 License: MIT
 """
 
-__version__ = "1.0.0"
+__version__ = "1.1.0"
 __author__ = "WiFi-DensePose Team"
 __email__ = "team@wifi-densepose.com"
 __license__ = "MIT"
src/app.py (+17)
@@ -310,4 +310,19 @@ def setup_root_endpoints(app: FastAPI, settings: Settings):
     except Exception as e:
         logger.error(f"Error resetting services: {e}")
         return {"error": str(e)}
+
+
+# Create default app instance for uvicorn
+def get_app() -> FastAPI:
+    """Get the default application instance."""
+    from src.config.settings import get_settings
+    from src.services.orchestrator import ServiceOrchestrator
+
+    settings = get_settings()
+    orchestrator = ServiceOrchestrator(settings)
+    return create_app(settings, orchestrator)
+
+
+# Default app instance for uvicorn
+app = get_app()
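The module-level `app = get_app()` exists so that `uvicorn src.app:app` can import the module and find a ready-made instance. A minimal sketch of that wiring with stand-in classes (FastAPI, Settings, and ServiceOrchestrator are stubbed here; only the construction order is illustrated):

```python
# Stand-ins for the project's real types, for illustration only
class FakeSettings:
    pass

class FakeOrchestrator:
    def __init__(self, settings):
        self.settings = settings

def create_app(settings, orchestrator):
    # stands in for the project's create_app(settings, orchestrator)
    return {"settings": settings, "orchestrator": orchestrator}

def get_app():
    """Build the default application instance, mirroring the diff above."""
    settings = FakeSettings()
    orchestrator = FakeOrchestrator(settings)
    return create_app(settings, orchestrator)

# Module-level instance: the ASGI server imports the module and finds this name.
app = get_app()
print(app["orchestrator"].settings is app["settings"])  # True
```

The trade-off of this eager pattern is that the app (and its service wiring) is built at import time; uvicorn's `--factory` flag is the usual alternative when lazy construction is preferred.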
src/cli.py (+109)
@@ -212,6 +212,7 @@ def init(ctx, url: Optional[str]):
     from src.database.connection import get_database_manager
     from alembic.config import Config
     from alembic import command
+    import os
 
     # Get settings
     settings = get_settings_with_config(ctx.obj.get('config_file'))
@@ -228,10 +229,19 @@ def init(ctx, url: Optional[str]):
 
         asyncio.run(init_db())
 
-        # Run migrations
-        alembic_cfg = Config("alembic.ini")
-        command.upgrade(alembic_cfg, "head")
-        logger.info("Database migrations applied successfully")
+        # Run migrations if alembic.ini exists
+        alembic_ini_path = "alembic.ini"
+        if os.path.exists(alembic_ini_path):
+            try:
+                alembic_cfg = Config(alembic_ini_path)
+                # Set the database URL in the config
+                alembic_cfg.set_main_option("sqlalchemy.url", settings.get_database_url())
+                command.upgrade(alembic_cfg, "head")
+                logger.info("Database migrations applied successfully")
+            except Exception as migration_error:
+                logger.warning(f"Migration failed, but database is initialized: {migration_error}")
+        else:
+            logger.info("No alembic.ini found, skipping migrations")
 
     except Exception as e:
         logger.error(f"Failed to initialize database: {e}")
@@ -493,6 +503,97 @@ def validate(ctx):
         sys.exit(1)
+
+
+@config.command()
+@click.option(
+    '--format',
+    type=click.Choice(['text', 'json']),
+    default='text',
+    help='Output format (default: text)'
+)
+@click.pass_context
+def failsafe(ctx, format: str):
+    """Show failsafe status and configuration."""
+    try:
+        import json
+        from src.database.connection import get_database_manager
+
+        # Get settings
+        settings = get_settings_with_config(ctx.obj.get('config_file'))
+
+        async def check_failsafe_status():
+            db_manager = get_database_manager(settings)
+
+            # Initialize database to check current state
+            try:
+                await db_manager.initialize()
+            except Exception as e:
+                logger.warning(f"Database initialization failed: {e}")
+
+            # Collect failsafe status
+            failsafe_status = {
+                "database": {
+                    "failsafe_enabled": settings.enable_database_failsafe,
+                    "using_sqlite_fallback": db_manager.is_using_sqlite_fallback(),
+                    "sqlite_fallback_path": settings.sqlite_fallback_path,
+                    "primary_database_url": settings.get_database_url() if not db_manager.is_using_sqlite_fallback() else None,
+                },
+                "redis": {
+                    "failsafe_enabled": settings.enable_redis_failsafe,
+                    "redis_enabled": settings.redis_enabled,
+                    "redis_required": settings.redis_required,
+                    "redis_available": db_manager.is_redis_available(),
+                    "redis_url": settings.get_redis_url() if settings.redis_enabled else None,
+                },
+                "overall_status": "healthy"
+            }
+
+            # Determine overall status
+            if failsafe_status["database"]["using_sqlite_fallback"] or not failsafe_status["redis"]["redis_available"]:
+                failsafe_status["overall_status"] = "degraded"
+
+            # Output results
+            if format == 'json':
+                click.echo(json.dumps(failsafe_status, indent=2))
+            else:
+                click.echo("=== Failsafe Status ===\n")
+
+                # Database status
+                click.echo("Database:")
+                if failsafe_status["database"]["using_sqlite_fallback"]:
+                    click.echo("  ⚠️ Using SQLite fallback database")
+                    click.echo(f"  Path: {failsafe_status['database']['sqlite_fallback_path']}")
+                else:
+                    click.echo("  ✓ Using primary database (PostgreSQL)")
+
+                click.echo(f"  Failsafe enabled: {'Yes' if failsafe_status['database']['failsafe_enabled'] else 'No'}")
+
+                # Redis status
+                click.echo("\nRedis:")
+                if not failsafe_status["redis"]["redis_enabled"]:
+                    click.echo("  - Redis disabled")
+                elif not failsafe_status["redis"]["redis_available"]:
+                    click.echo("  ⚠️ Redis unavailable (failsafe active)")
+                else:
+                    click.echo("  ✓ Redis available")
+
+                click.echo(f"  Failsafe enabled: {'Yes' if failsafe_status['redis']['failsafe_enabled'] else 'No'}")
+                click.echo(f"  Required: {'Yes' if failsafe_status['redis']['redis_required'] else 'No'}")
+
+                # Overall status
+                status_icon = "✓" if failsafe_status["overall_status"] == "healthy" else "⚠️"
+                click.echo(f"\nOverall Status: {status_icon} {failsafe_status['overall_status'].upper()}")
+
+                if failsafe_status["overall_status"] == "degraded":
+                    click.echo("\nNote: System is running in degraded mode using failsafe configurations.")
+
+        asyncio.run(check_failsafe_status())
+
+    except Exception as e:
+        logger.error(f"Failed to check failsafe status: {e}")
+        sys.exit(1)
 
 
 @cli.command()
 def version():
     """Show version information."""
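The `failsafe` command above reduces its report to a single overall status from two facts: whether the SQLite fallback is active and whether Redis is reachable. That rule can be restated as a pure function (the function name is mine, not the project's):

```python
def overall_status(using_sqlite_fallback: bool, redis_available: bool) -> str:
    """Restate the CLI's rule: degraded if either failsafe path is in effect."""
    if using_sqlite_fallback or not redis_available:
        return "degraded"
    return "healthy"

print(overall_status(False, True))   # healthy
print(overall_status(True, True))    # degraded
print(overall_status(False, False))  # degraded
```

One subtlety of the rule as written: a deployment that disables Redis entirely still reports "degraded", since `redis_available` is false in that case.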
@@ -63,6 +63,15 @@ class Settings(BaseSettings):
     redis_enabled: bool = Field(default=True, description="Enable Redis")
     redis_host: str = Field(default="localhost", description="Redis host")
     redis_port: int = Field(default=6379, description="Redis port")
+    redis_required: bool = Field(default=False, description="Require Redis connection (fail if unavailable)")
+    redis_max_connections: int = Field(default=10, description="Maximum Redis connections")
+    redis_socket_timeout: int = Field(default=5, description="Redis socket timeout in seconds")
+    redis_connect_timeout: int = Field(default=5, description="Redis connection timeout in seconds")
+
+    # Failsafe settings
+    enable_database_failsafe: bool = Field(default=True, description="Enable automatic SQLite failsafe when PostgreSQL unavailable")
+    enable_redis_failsafe: bool = Field(default=True, description="Enable automatic Redis failsafe (disable when unavailable)")
+    sqlite_fallback_path: str = Field(default="./data/wifi_densepose_fallback.db", description="SQLite fallback database path")
 
     # Hardware settings
     wifi_interface: str = Field(default="wlan0", description="WiFi interface name")
@@ -88,6 +97,7 @@ class Settings(BaseSettings):
         description="Log format"
     )
     log_file: Optional[str] = Field(default=None, description="Log file path")
+    log_directory: str = Field(default="./logs", description="Log directory path")
     log_max_size: int = Field(default=10485760, description="Max log file size in bytes (10MB)")
     log_backup_count: int = Field(default=5, description="Number of log backup files")
@@ -103,6 +113,7 @@ class Settings(BaseSettings):
     data_storage_path: str = Field(default="./data", description="Data storage directory")
     model_storage_path: str = Field(default="./models", description="Model storage directory")
     temp_storage_path: str = Field(default="./temp", description="Temporary storage directory")
+    backup_directory: str = Field(default="./backups", description="Backup storage directory")
     max_storage_size_gb: int = Field(default=100, description="Maximum storage size in GB")
 
     # API settings
@@ -241,8 +252,16 @@ class Settings(BaseSettings):
         if self.is_development:
             return f"sqlite:///{self.data_storage_path}/wifi_densepose.db"
+
+        # SQLite failsafe for production if enabled
+        if self.enable_database_failsafe:
+            return f"sqlite:///{self.sqlite_fallback_path}"
+
         raise ValueError("Database URL must be configured for non-development environments")
+
+    def get_sqlite_fallback_url(self) -> str:
+        """Get SQLite fallback database URL."""
+        return f"sqlite:///{self.sqlite_fallback_path}"
 
     def get_redis_url(self) -> Optional[str]:
         """Get Redis URL with fallback."""
         if not self.redis_enabled:
@@ -334,6 +353,8 @@ class Settings(BaseSettings):
             self.data_storage_path,
             self.model_storage_path,
             self.temp_storage_path,
+            self.log_directory,
+            self.backup_directory,
         ]
 
         for directory in directories:
|||||||
@@ -8,7 +8,7 @@ from typing import Optional, Dict, Any, AsyncGenerator
|
|||||||
from contextlib import asynccontextmanager
|
from contextlib import asynccontextmanager
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
|
|
||||||
from sqlalchemy import create_engine, event, pool
|
from sqlalchemy import create_engine, event, pool, text
|
||||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||||
from sqlalchemy.orm import sessionmaker, Session
|
from sqlalchemy.orm import sessionmaker, Session
|
||||||
 from sqlalchemy.pool import QueuePool, NullPool

@@ -65,12 +65,35 @@ class DatabaseManager:
             raise DatabaseConnectionError(f"Database initialization failed: {e}")
 
     async def _initialize_postgresql(self):
-        """Initialize PostgreSQL connections."""
+        """Initialize PostgreSQL connections with SQLite failsafe."""
+        postgresql_failed = False
+
+        try:
+            # Try PostgreSQL first
+            await self._initialize_postgresql_primary()
+            logger.info("PostgreSQL connections initialized")
+            return
+        except Exception as e:
+            postgresql_failed = True
+            logger.error(f"PostgreSQL initialization failed: {e}")
+
+            if not self.settings.enable_database_failsafe:
+                raise DatabaseConnectionError(f"PostgreSQL connection failed and failsafe disabled: {e}")
+
+            logger.warning("Falling back to SQLite database")
+
+        # Fallback to SQLite if PostgreSQL failed and failsafe is enabled
+        if postgresql_failed and self.settings.enable_database_failsafe:
+            await self._initialize_sqlite_fallback()
+            logger.info("SQLite fallback database initialized")
+
+    async def _initialize_postgresql_primary(self):
+        """Initialize primary PostgreSQL connections."""
         # Build database URL
-        if self.settings.database_url:
+        if self.settings.database_url and "postgresql" in self.settings.database_url:
             db_url = self.settings.database_url
             async_db_url = self.settings.database_url.replace("postgresql://", "postgresql+asyncpg://")
-        else:
+        elif self.settings.db_host and self.settings.db_name and self.settings.db_user:
             db_url = (
                 f"postgresql://{self.settings.db_user}:{self.settings.db_password}"
                 f"@{self.settings.db_host}:{self.settings.db_port}/{self.settings.db_name}"
@@ -79,6 +102,8 @@ class DatabaseManager:
                 f"postgresql+asyncpg://{self.settings.db_user}:{self.settings.db_password}"
                 f"@{self.settings.db_host}:{self.settings.db_port}/{self.settings.db_name}"
             )
+        else:
+            raise ValueError("PostgreSQL connection parameters not configured")
 
         # Create async engine (don't specify poolclass for async engines)
         self._async_engine = create_async_engine(
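The hunk above derives a sync URL and an asyncpg URL from the same settings, with an explicit `database_url` taking precedence over individual host/name/user fields. A minimal standalone sketch of that branch logic (the function name and parameters are illustrative, not the project's actual helper):

```python
def build_postgres_urls(database_url=None, host=None, port=5432,
                        name=None, user=None, password=None):
    """Derive (sync_url, async_url) the way the diff above branches."""
    if database_url and "postgresql" in database_url:
        # An explicit URL wins; derive the asyncpg variant from it
        return database_url, database_url.replace("postgresql://", "postgresql+asyncpg://")
    elif host and name and user:
        sync_url = f"postgresql://{user}:{password}@{host}:{port}/{name}"
        return sync_url, sync_url.replace("postgresql://", "postgresql+asyncpg://")
    # Mirrors the new else branch: fail loudly when nothing is configured
    raise ValueError("PostgreSQL connection parameters not configured")
```

The guard on `"postgresql" in database_url` is what lets a SQLite URL fall through to the failsafe path instead of being mangled by the asyncpg replacement.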
@@ -122,11 +147,65 @@ class DatabaseManager:
 
         # Test connections
         await self._test_postgresql_connection()
-        logger.info("PostgreSQL connections initialized")
+
+    async def _initialize_sqlite_fallback(self):
+        """Initialize SQLite fallback database."""
+        import os
+
+        # Ensure directory exists
+        sqlite_path = self.settings.sqlite_fallback_path
+        os.makedirs(os.path.dirname(sqlite_path), exist_ok=True)
+
+        # Build SQLite URLs
+        db_url = f"sqlite:///{sqlite_path}"
+        async_db_url = f"sqlite+aiosqlite:///{sqlite_path}"
+
+        # Create async engine for SQLite
+        self._async_engine = create_async_engine(
+            async_db_url,
+            echo=self.settings.db_echo,
+            future=True,
+        )
+
+        # Create sync engine for SQLite
+        self._sync_engine = create_engine(
+            db_url,
+            poolclass=NullPool,  # SQLite doesn't need connection pooling
+            echo=self.settings.db_echo,
+            future=True,
+        )
+
+        # Create session factories
+        self._async_session_factory = async_sessionmaker(
+            self._async_engine,
+            class_=AsyncSession,
+            expire_on_commit=False,
+        )
+
+        self._sync_session_factory = sessionmaker(
+            self._sync_engine,
+            expire_on_commit=False,
+        )
+
+        # Add connection event listeners
+        self._setup_connection_events()
+
+        # Test SQLite connection
+        await self._test_sqlite_connection()
+
+    async def _test_sqlite_connection(self):
+        """Test SQLite connection."""
+        try:
+            async with self._async_engine.begin() as conn:
+                result = await conn.execute(text("SELECT 1"))
+                result.fetchone()  # Don't await this - fetchone() is not async
+            logger.debug("SQLite connection test successful")
+        except Exception as e:
+            logger.error(f"SQLite connection test failed: {e}")
+            raise DatabaseConnectionError(f"SQLite connection test failed: {e}")
 
     async def _initialize_redis(self):
-        """Initialize Redis connection."""
+        """Initialize Redis connection with failsafe."""
        if not self.settings.redis_enabled:
            logger.info("Redis disabled, skipping initialization")
            return
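The `_initialize_sqlite_fallback` method added above does two small but load-bearing things before creating engines: it creates the fallback file's parent directory, and it builds a plain `sqlite:///` URL for the sync engine plus a `sqlite+aiosqlite:///` URL for the async one (the latter assumes the `aiosqlite` driver is installed). A stdlib-only sketch of just that setup step, with an illustrative function name:

```python
import os

def sqlite_fallback_urls(sqlite_path):
    """Create the fallback DB directory and build (sync_url, async_url),
    mirroring the directory/URL setup in _initialize_sqlite_fallback above."""
    # SQLite won't create missing directories on connect, so do it here
    os.makedirs(os.path.dirname(sqlite_path), exist_ok=True)
    return f"sqlite:///{sqlite_path}", f"sqlite+aiosqlite:///{sqlite_path}"
```

Note that `os.path.dirname` of a bare filename is an empty string, so a configured `sqlite_fallback_path` is expected to include at least one directory component.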
@@ -160,10 +239,15 @@ class DatabaseManager:
 
         except Exception as e:
             logger.error(f"Failed to initialize Redis: {e}")
 
             if self.settings.redis_required:
-                raise
+                raise DatabaseConnectionError(f"Redis connection failed and is required: {e}")
+            elif self.settings.enable_redis_failsafe:
+                logger.warning("Redis initialization failed, continuing without Redis (failsafe enabled)")
+                self._redis_client = None
             else:
                 logger.warning("Redis initialization failed but not required, continuing without Redis")
+                self._redis_client = None
 
     def _setup_connection_events(self):
         """Setup database connection event listeners."""
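The Redis hunk above establishes a three-tier failure policy: required Redis raises a typed error, failsafe-enabled Redis degrades to `None` with a warning, and optional Redis also degrades. The decision logic can be sketched as a pure function (names and the generic `ConnectionError` are illustrative; the project raises its own `DatabaseConnectionError`):

```python
def handle_redis_failure(error, redis_required, enable_redis_failsafe):
    """Decide the outcome of a failed Redis init, mirroring the branches above.

    Returns None (meaning: continue without a Redis client) unless Redis
    is mandatory, in which case the failure is escalated.
    """
    if redis_required:
        raise ConnectionError(f"Redis connection failed and is required: {error}")
    # Failsafe-enabled and plain-optional both degrade gracefully;
    # only the log message differs in the real code.
    return None
```

Callers would assign the return value to their Redis client slot, so every later use can guard on `client is not None`.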
@@ -195,8 +279,8 @@ class DatabaseManager:
         """Test PostgreSQL connection."""
         try:
             async with self._async_engine.begin() as conn:
-                result = await conn.execute("SELECT 1")
-                await result.fetchone()
+                result = await conn.execute(text("SELECT 1"))
+                result.fetchone()  # Don't await this - fetchone() is not async
             logger.debug("PostgreSQL connection test successful")
         except Exception as e:
             logger.error(f"PostgreSQL connection test failed: {e}")
@@ -265,31 +349,48 @@ class DatabaseManager:
     async def health_check(self) -> Dict[str, Any]:
         """Perform database health check."""
         health_status = {
-            "postgresql": {"status": "unknown", "details": {}},
+            "database": {"status": "unknown", "details": {}},
             "redis": {"status": "unknown", "details": {}},
             "overall": "unknown"
         }
 
-        # Check PostgreSQL
+        # Check Database (PostgreSQL or SQLite)
         try:
             start_time = datetime.utcnow()
             async with self.get_async_session() as session:
-                result = await session.execute("SELECT 1")
-                await result.fetchone()
+                result = await session.execute(text("SELECT 1"))
+                result.fetchone()  # Don't await this - fetchone() is not async
 
             response_time = (datetime.utcnow() - start_time).total_seconds()
 
-            health_status["postgresql"] = {
-                "status": "healthy",
-                "details": {
-                    "response_time_ms": round(response_time * 1000, 2),
+            # Determine database type and status
+            is_sqlite = self.is_using_sqlite_fallback()
+            db_type = "sqlite_fallback" if is_sqlite else "postgresql"
+
+            details = {
+                "type": db_type,
+                "response_time_ms": round(response_time * 1000, 2),
+            }
+
+            # Add pool info for PostgreSQL
+            if not is_sqlite and hasattr(self._async_engine, 'pool'):
+                details.update({
                     "pool_size": self._async_engine.pool.size(),
                     "checked_out": self._async_engine.pool.checkedout(),
                     "overflow": self._async_engine.pool.overflow(),
-                }
+                })
+
+            # Add failsafe info
+            if is_sqlite:
+                details["failsafe_active"] = True
+                details["fallback_path"] = self.settings.sqlite_fallback_path
+
+            health_status["database"] = {
+                "status": "healthy",
+                "details": details
             }
         except Exception as e:
-            health_status["postgresql"] = {
+            health_status["database"] = {
                 "status": "unhealthy",
                 "details": {"error": str(e)}
             }
@@ -324,15 +425,22 @@ class DatabaseManager:
             }
 
         # Determine overall status
-        postgresql_healthy = health_status["postgresql"]["status"] == "healthy"
+        database_healthy = health_status["database"]["status"] == "healthy"
         redis_healthy = (
             health_status["redis"]["status"] in ["healthy", "disabled"] or
             not self.settings.redis_required
         )
 
-        if postgresql_healthy and redis_healthy:
-            health_status["overall"] = "healthy"
-        elif postgresql_healthy:
+        # Check if using failsafe modes
+        using_sqlite_fallback = self.is_using_sqlite_fallback()
+        redis_unavailable = not self.is_redis_available() and self.settings.redis_enabled
+
+        if database_healthy and redis_healthy:
+            if using_sqlite_fallback or redis_unavailable:
+                health_status["overall"] = "degraded"  # Working but using failsafe
+            else:
+                health_status["overall"] = "healthy"
+        elif database_healthy:
             health_status["overall"] = "degraded"
         else:
             health_status["overall"] = "unhealthy"
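The key behavioral change in this hunk is that a system running on a failsafe path (SQLite fallback, or Redis silently absent) now reports "degraded" rather than "healthy", even though every component check passes. That aggregation rule can be isolated as a pure function (a sketch; the real code computes these flags from engine state and settings):

```python
def overall_status(database_healthy, redis_healthy,
                   using_sqlite_fallback=False, redis_unavailable=False):
    """Collapse component health into one overall status, as in health_check above."""
    if database_healthy and redis_healthy:
        if using_sqlite_fallback or redis_unavailable:
            return "degraded"  # working, but on a failsafe path
        return "healthy"
    elif database_healthy:
        return "degraded"
    return "unhealthy"
```

This distinction matters for monitoring: a "degraded" status keeps the service in rotation while still flagging that the primary PostgreSQL or Redis dependency needs attention.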
@@ -394,6 +502,36 @@ class DatabaseManager:
         self._initialized = False
         logger.info("Database connections closed")
 
+    def is_using_sqlite_fallback(self) -> bool:
+        """Check if currently using SQLite fallback database."""
+        if not self._async_engine:
+            return False
+        return "sqlite" in str(self._async_engine.url)
+
+    def is_redis_available(self) -> bool:
+        """Check if Redis is available."""
+        return self._redis_client is not None
+
+    async def test_connection(self) -> bool:
+        """Test database connection for CLI validation."""
+        try:
+            if not self._initialized:
+                await self.initialize()
+
+            # Test database connection (PostgreSQL or SQLite)
+            async with self.get_async_session() as session:
+                result = await session.execute(text("SELECT 1"))
+                result.fetchone()  # Don't await this - fetchone() is not async
+
+            # Test Redis connection if enabled
+            if self._redis_client:
+                await self._redis_client.ping()
+
+            return True
+        except Exception as e:
+            logger.error(f"Database connection test failed: {e}")
+            return False
+
     async def reset_connections(self):
         """Reset all database connections."""
         logger.info("Resetting database connections")
@@ -438,8 +576,8 @@ class DatabaseHealthCheck:
         try:
             start_time = datetime.utcnow()
             async with self.db_manager.get_async_session() as session:
-                result = await session.execute("SELECT version()")
-                version = (await result.fetchone())[0]
+                result = await session.execute(text("SELECT version()"))
+                version = result.fetchone()[0]  # Don't await this - fetchone() is not async
 
             response_time = (datetime.utcnow() - start_time).total_seconds()
src/database/migrations/env.py (new file, 109 lines)
@@ -0,0 +1,109 @@
+"""Alembic environment configuration for WiFi-DensePose API."""
+
+import asyncio
+import os
+import sys
+from logging.config import fileConfig
+from pathlib import Path
+
+from sqlalchemy import pool
+from sqlalchemy.engine import Connection
+from sqlalchemy.ext.asyncio import async_engine_from_config
+
+from alembic import context
+
+# Add the project root to the Python path
+project_root = Path(__file__).parent.parent.parent.parent
+sys.path.insert(0, str(project_root))
+
+# Import the models and settings
+from src.database.models import Base
+from src.config.settings import get_settings
+
+# this is the Alembic Config object, which provides
+# access to the values within the .ini file in use.
+config = context.config
+
+# Interpret the config file for Python logging.
+# This line sets up loggers basically.
+if config.config_file_name is not None:
+    fileConfig(config.config_file_name)
+
+# add your model's MetaData object here
+# for 'autogenerate' support
+target_metadata = Base.metadata
+
+# other values from the config, defined by the needs of env.py,
+# can be acquired:
+# my_important_option = config.get_main_option("my_important_option")
+# ... etc.
+
+
+def get_database_url():
+    """Get the database URL from settings."""
+    try:
+        settings = get_settings()
+        return settings.get_database_url()
+    except Exception:
+        # Fallback to SQLite if settings can't be loaded
+        return "sqlite:///./data/wifi_densepose_fallback.db"
+
+
+def run_migrations_offline() -> None:
+    """Run migrations in 'offline' mode.
+
+    This configures the context with just a URL
+    and not an Engine, though an Engine is acceptable
+    here as well. By skipping the Engine creation
+    we don't even need a DBAPI to be available.
+
+    Calls to context.execute() here emit the given string to the
+    script output.
+    """
+    url = get_database_url()
+    context.configure(
+        url=url,
+        target_metadata=target_metadata,
+        literal_binds=True,
+        dialect_opts={"paramstyle": "named"},
+    )
+
+    with context.begin_transaction():
+        context.run_migrations()
+
+
+def do_run_migrations(connection: Connection) -> None:
+    """Run migrations with a database connection."""
+    context.configure(connection=connection, target_metadata=target_metadata)
+
+    with context.begin_transaction():
+        context.run_migrations()
+
+
+async def run_async_migrations() -> None:
+    """Run migrations in async mode."""
+    configuration = config.get_section(config.config_ini_section)
+    configuration["sqlalchemy.url"] = get_database_url()
+
+    connectable = async_engine_from_config(
+        configuration,
+        prefix="sqlalchemy.",
+        poolclass=pool.NullPool,
+    )
+
+    async with connectable.connect() as connection:
+        await connection.run_sync(do_run_migrations)
+
+    await connectable.dispose()
+
+
+def run_migrations_online() -> None:
+    """Run migrations in 'online' mode."""
+    asyncio.run(run_async_migrations())
+
+
+if context.is_offline_mode():
+    run_migrations_offline()
+else:
+    run_migrations_online()
src/database/migrations/script.py.mako (new file, 26 lines)
@@ -0,0 +1,26 @@
+"""${message}
+
+Revision ID: ${up_revision}
+Revises: ${down_revision | comma,n}
+Create Date: ${create_date}
+
+"""
+from alembic import op
+import sqlalchemy as sa
+${imports if imports else ""}
+
+# revision identifiers, used by Alembic.
+revision = ${repr(up_revision)}
+down_revision = ${repr(down_revision)}
+branch_labels = ${repr(branch_labels)}
+depends_on = ${repr(depends_on)}
+
+
+def upgrade() -> None:
+    """Upgrade database schema."""
+    ${upgrades if upgrades else "pass"}
+
+
+def downgrade() -> None:
+    """Downgrade database schema."""
+    ${downgrades if downgrades else "pass"}
@@ -160,7 +160,7 @@ class Session(Base, UUIDMixin, TimestampMixin):
 
     # Metadata
     tags = Column(ARRAY(String), nullable=True)
-    metadata = Column(JSON, nullable=True)
+    meta_data = Column(JSON, nullable=True)
 
     # Statistics
     total_frames = Column(Integer, default=0, nullable=False)
@@ -191,7 +191,7 @@ class Session(Base, UUIDMixin, TimestampMixin):
             "config": self.config,
             "device_id": str(self.device_id),
             "tags": self.tags,
-            "metadata": self.metadata,
+            "metadata": self.meta_data,
             "total_frames": self.total_frames,
             "processed_frames": self.processed_frames,
             "error_count": self.error_count,
@@ -240,7 +240,7 @@ class CSIData(Base, UUIDMixin, TimestampMixin):
     is_valid = Column(Boolean, default=True, nullable=False)
 
     # Metadata
-    metadata = Column(JSON, nullable=True)
+    meta_data = Column(JSON, nullable=True)
 
     # Constraints and indexes
     __table_args__ = (
@@ -278,7 +278,7 @@ class CSIData(Base, UUIDMixin, TimestampMixin):
             "processed_at": self.processed_at.isoformat() if self.processed_at else None,
             "quality_score": self.quality_score,
             "is_valid": self.is_valid,
-            "metadata": self.metadata,
+            "metadata": self.meta_data,
             "created_at": self.created_at.isoformat() if self.created_at else None,
             "updated_at": self.updated_at.isoformat() if self.updated_at else None,
         }
@@ -317,7 +317,7 @@ class PoseDetection(Base, UUIDMixin, TimestampMixin):
     is_valid = Column(Boolean, default=True, nullable=False)
 
     # Metadata
-    metadata = Column(JSON, nullable=True)
+    meta_data = Column(JSON, nullable=True)
 
     # Constraints and indexes
     __table_args__ = (
@@ -350,7 +350,7 @@ class PoseDetection(Base, UUIDMixin, TimestampMixin):
             "image_quality": self.image_quality,
             "pose_quality": self.pose_quality,
             "is_valid": self.is_valid,
-            "metadata": self.metadata,
+            "metadata": self.meta_data,
             "created_at": self.created_at.isoformat() if self.created_at else None,
             "updated_at": self.updated_at.isoformat() if self.updated_at else None,
         }
@@ -378,7 +378,7 @@ class SystemMetric(Base, UUIDMixin, TimestampMixin):
 
     # Metadata
     description = Column(Text, nullable=True)
-    metadata = Column(JSON, nullable=True)
+    meta_data = Column(JSON, nullable=True)
 
     # Constraints and indexes
     __table_args__ = (
@@ -402,7 +402,7 @@ class SystemMetric(Base, UUIDMixin, TimestampMixin):
             "source": self.source,
             "component": self.component,
             "description": self.description,
-            "metadata": self.metadata,
+            "metadata": self.meta_data,
             "created_at": self.created_at.isoformat() if self.created_at else None,
             "updated_at": self.updated_at.isoformat() if self.updated_at else None,
         }
@@ -437,7 +437,7 @@ class AuditLog(Base, UUIDMixin, TimestampMixin):
     error_message = Column(Text, nullable=True)
 
     # Metadata
-    metadata = Column(JSON, nullable=True)
+    meta_data = Column(JSON, nullable=True)
     tags = Column(ARRAY(String), nullable=True)
 
     # Constraints and indexes
@@ -467,7 +467,7 @@ class AuditLog(Base, UUIDMixin, TimestampMixin):
             "changes": self.changes,
             "success": self.success,
             "error_message": self.error_message,
-            "metadata": self.metadata,
+            "metadata": self.meta_data,
             "tags": self.tags,
             "created_at": self.created_at.isoformat() if self.created_at else None,
             "updated_at": self.updated_at.isoformat() if self.updated_at else None,
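These hunks rename every `metadata` column attribute to `meta_data` while keeping `"metadata"` as the external key in `to_dict()`. The rename is necessary because `metadata` is a reserved attribute on SQLAlchemy Declarative models (`Base.metadata` holds the table registry), so a column by that name collides with it. The attribute/key split can be illustrated with a toy stand-in (a plain dataclass, not the project's actual model):

```python
from dataclasses import dataclass, field

@dataclass
class SessionRecord:
    """Toy stand-in for the Session model's rename pattern."""
    # Attribute is named meta_data to avoid SQLAlchemy's reserved
    # Declarative attribute `metadata`; the serialized key is unchanged.
    tags: list = field(default_factory=list)
    meta_data: dict = field(default_factory=dict)

    def to_dict(self):
        # API consumers still see "metadata", so the rename is invisible
        # outside the ORM layer.
        return {"tags": self.tags, "metadata": self.meta_data}
```

Because the serialized key is preserved, clients of the REST API need no changes; only call sites constructing models (like the `MonitoringTask` hunk below this one) must pass `meta_data=` instead of `metadata=`.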
@@ -128,9 +128,8 @@ class HardwareService:
                 mock_mode=self.settings.mock_hardware
             )
 
-            # Connect to router
-            if not self.settings.mock_hardware:
-                await router_interface.connect()
+            # Connect to router (always connect, even in mock mode)
+            await router_interface.connect()
 
             self.router_interfaces[router_id] = router_interface
             self.logger.info(f"Router interface initialized: {router_id}")
@@ -58,7 +58,7 @@ class MonitoringTask:
                 source=metric_data.get("source", self.name),
                 component=metric_data.get("component"),
                 description=metric_data.get("description"),
-                metadata=metric_data.get("metadata"),
+                meta_data=metric_data.get("metadata"),
             )
             session.add(metric)