🗄️ Add SQLite database system with JSON fallback and memory controls
Implement configurable database backends (SQLite/JSON) with unified memory management, automated migration, Docker support, and privacy controls. Maintains full backward compatibility while enabling future PostgreSQL/ChromaDB backends.
parent 6d4de79ac2
commit 5f8c93ff69
186 DATABASE_MIGRATION.md Normal file

@@ -0,0 +1,186 @@
# Database System Migration Guide

## Overview

The Discord bot now supports multiple database backends for storing user profiles and conversation memory:

- **SQLite**: Fast, reliable, file-based database (recommended)
- **JSON**: Original file-based storage (backward compatible)
- **Memory Toggle**: Option to disable memory features entirely

## Configuration

Edit `src/settings.yml` to configure the database system:

```yaml
database:
  backend: "sqlite"  # Options: "sqlite" or "json"
  sqlite_path: "data/bot_database.db"
  json_user_profiles: "user_profiles.json"
  json_memory_data: "memory.json"
  memory_enabled: true  # Set to false to disable memory completely
```
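As a sketch of how this section might be consumed, the helper below normalizes a parsed `settings.yml` dict (e.g. loaded with PyYAML) into backend settings. The function name and defaults are illustrative assumptions, not the bot's actual API.

```python
# Hypothetical helper: normalize the `database:` section of a parsed
# settings.yml into backend settings with safe fallbacks.

def resolve_database_config(config: dict) -> dict:
    """Return normalized backend settings, falling back to safe defaults."""
    db = config.get("database", {})
    backend = str(db.get("backend", "json")).lower()
    if backend not in ("sqlite", "json"):
        raise ValueError(f"Unsupported backend: {backend!r}")
    return {
        "backend": backend,
        "sqlite_path": db.get("sqlite_path", "data/bot_database.db"),
        "memory_enabled": bool(db.get("memory_enabled", True)),
    }
```

Unknown backend names fail fast rather than silently falling back, so a typo in `settings.yml` surfaces at startup.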
## Migration from JSON

If you're upgrading from the old JSON-based system:

1. **Run the migration script:**
   ```bash
   python migrate_to_database.py
   ```

2. **What the script does:**
   - Migrates existing `user_profiles.json` to the database
   - Migrates existing `memory.json` to the database
   - Creates backups of the original files
   - Verifies that the migration succeeded

3. **After migration:**
   - Your old JSON files are safely backed up
   - The bot will use the new database system
   - All existing data is preserved
## Backend Comparison

### SQLite Backend (Recommended)
- **Pros:** Fast, reliable, concurrent access, data integrity
- **Cons:** Requires SQLite (included with Python)
- **Use case:** Production bots, multiple users, long-term storage

### JSON Backend
- **Pros:** Human-readable, easy to back up or edit manually
- **Cons:** Slower, potential data loss on concurrent access
- **Use case:** Development, single-user bots, debugging
## Database Schema

### User Profiles Table (`user_profiles`)
- `user_id` (TEXT PRIMARY KEY)
- `name`, `display_name`, `pronouns`, `avatar_url`, `custom_prompt` (TEXT)
- `first_seen`, `last_seen`, `last_message` (TEXT)
- `interactions` (INTEGER)
- `profile_data` (TEXT, JSON string for additional fields)

### Conversation Memory Table (`conversation_memories`)
- `id` (INTEGER PRIMARY KEY AUTOINCREMENT)
- `channel_id` (TEXT)
- `user_id` (TEXT)
- `content` (TEXT)
- `context` (TEXT)
- `importance` (REAL)
- `timestamp` (TEXT)
- `created_at` (DATETIME)

### User Memory Table (`user_memories`)
- `id` (INTEGER PRIMARY KEY AUTOINCREMENT)
- `user_id` (TEXT)
- `memory_type` (TEXT)
- `content` (TEXT)
- `importance` (REAL)
- `timestamp` (TEXT)
- `created_at` (DATETIME)
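The memory schema can be exercised in isolation with Python's built-in `sqlite3` module. This is a minimal sketch of one of the tables above against an in-memory database, not the bot's actual `database.py`:

```python
import sqlite3
from datetime import datetime

# Minimal in-memory sketch of the conversation memory table.
conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row  # dict-like row access, as the bot uses
conn.execute("""
    CREATE TABLE conversation_memories (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        channel_id TEXT, user_id TEXT, content TEXT, context TEXT,
        importance REAL, timestamp TEXT,
        created_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )
""")
now = datetime.utcnow().isoformat()
rows = [
    ("c1", "u1", "likes chess", "", 0.9, now),
    ("c1", "u2", "said hi", "", 0.2, now),
]
conn.executemany(
    "INSERT INTO conversation_memories "
    "(channel_id, user_id, content, context, importance, timestamp) "
    "VALUES (?, ?, ?, ?, ?, ?)", rows)
# Retrieval mirrors the bot's ordering: most important memories first.
top = conn.execute(
    "SELECT content FROM conversation_memories "
    "WHERE channel_id = ? ORDER BY importance DESC LIMIT 1", ("c1",)
).fetchone()
print(top["content"])  # -> likes chess
```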
## Code Changes

### New Files
- `src/database.py` - Database abstraction layer
- `src/memory_manager.py` - Unified memory management
- `src/user_profiles_new.py` - Modern user profile management
- `migrate_to_database.py` - Migration script

### Updated Files
- `src/enhanced_ai.py` - Uses new memory manager
- `src/bot.py` - Updated memory command imports
- `src/settings.yml` - Added database configuration
- `src/memory.py` - Marked as deprecated
## API Reference

### Memory Manager
```python
from memory_manager import memory_manager

# Store a message in memory
memory_manager.analyze_and_store_message(message, context_messages)

# Get conversation context
context = memory_manager.get_conversation_context(channel_id, hours=24)

# Get user context
user_info = memory_manager.get_user_context(user_id)

# Format memory for AI prompts
memory_text = memory_manager.format_memory_for_prompt(user_id, channel_id)

# Check if memory is enabled
if memory_manager.is_enabled():
    # Memory operations
    pass
```
### Database Manager
```python
from database import db_manager

# User profiles
profile = db_manager.get_user_profile(user_id)
db_manager.save_user_profile(user_id, profile_data)

# Memory storage (if enabled)
db_manager.store_conversation_memory(channel_id, user_id, content, context, score)
db_manager.store_user_memory(user_id, memory_type, content, score)

# Retrieval
conversations = db_manager.get_conversation_context(channel_id, hours=24)
user_memories = db_manager.get_user_context(user_id)

# Cleanup
db_manager.cleanup_old_memories(days=30)
```
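The `cleanup_old_memories(days=30)` call works by comparing stored ISO-8601 timestamps against a cutoff string. A sketch of that cutoff computation (the helper name is hypothetical, not part of the bot's API):

```python
from datetime import datetime, timedelta

def cleanup_cutoff_iso(days=30, now=None):
    """ISO-8601 timestamp below which memories are eligible for deletion."""
    now = now or datetime.utcnow()
    return (now - timedelta(days=days)).isoformat()

# Any row whose `timestamp` column sorts below this string is old enough
# to delete, because ISO-8601 strings sort chronologically.
print(cleanup_cutoff_iso(30, datetime(2025, 10, 31)))  # -> 2025-10-01T00:00:00
```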
## Troubleshooting

### Migration Issues
- **File not found errors:** Ensure you're running from the bot root directory
- **Permission errors:** Check file permissions and disk space
- **Data corruption:** Restore from backup and try again

### Runtime Issues
- **SQLite locked:** Another process may be using the database
- **Memory disabled:** Check the `memory_enabled` setting in `settings.yml`
- **Import errors:** Ensure all new files are in the `src/` directory

### Performance
- **Slow queries:** SQLite performs much better than JSON for large datasets
- **Memory usage:** SQLite is more memory-efficient than loading entire JSON files
- **Concurrent access:** Only the SQLite backend supports safe concurrent access
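If "SQLite locked" errors persist, SQLite's WAL journal mode plus a busy timeout usually help. The bot does not currently set these; this is a hedged sketch against a throwaway file, applying the same PRAGMAs you would run right after connecting:

```python
import sqlite3, tempfile, os

# Sketch: reduce "database is locked" errors. WAL lets readers and a
# writer coexist; the timeout makes a blocked connection wait instead
# of failing immediately.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path, timeout=5.0)      # wait up to 5s on a locked db
mode = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
conn.execute("PRAGMA busy_timeout=5000")       # same idea, in milliseconds
print(mode)  # -> wal
```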
## Backup and Recovery

### Automatic Backups
- The migration script creates timestamped backups
- Original JSON files are preserved

### Manual Backup
```bash
# SQLite database
cp src/data/bot_database.db src/data/bot_database.db.backup

# JSON files (if using JSON backend)
cp src/user_profiles.json src/user_profiles.json.backup
cp src/memory.json src/memory.json.backup
```

### Recovery
1. Stop the bot
2. Replace the corrupted database with a backup
3. Restart the bot
4. Run the migration again if needed
## Future Extensions

The database abstraction layer is designed to support additional backends:

- **PostgreSQL**: For large-scale deployments
- **ChromaDB**: For advanced semantic memory search
- **Redis**: For high-performance caching

These can be added by implementing the `DatabaseBackend` interface in `database.py`.
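As an illustration of what implementing the `DatabaseBackend` interface involves, here is a stand-in, dict-backed backend with the same method names. It deliberately does not import the real ABC from `database.py` so it runs standalone; it is a sketch, not a supported backend:

```python
from typing import Dict, List, Optional

class InMemoryBackend:
    """Hypothetical backend mirroring DatabaseBackend's method names."""

    def __init__(self):
        self.profiles: Dict[str, Dict] = {}
        self.user_memories: Dict[str, List[Dict]] = {}

    def connect(self):
        pass  # nothing to open for a dict

    def close(self):
        pass

    def get_user_profile(self, user_id: str) -> Optional[Dict]:
        return self.profiles.get(user_id)

    def save_user_profile(self, user_id: str, profile: Dict):
        self.profiles[user_id] = profile

    def store_user_memory(self, user_id: str, memory_type: str, content: str,
                          importance: float, timestamp: str):
        self.user_memories.setdefault(user_id, []).append(
            {"memory_type": memory_type, "content": content,
             "importance": importance, "timestamp": timestamp})

    def get_user_context(self, user_id: str) -> List[Dict]:
        # Mirror the SQLite backend: most important first, top 5.
        mems = sorted(self.user_memories.get(user_id, []),
                      key=lambda m: (m["importance"], m["timestamp"]),
                      reverse=True)
        return mems[:5]

backend = InMemoryBackend()
backend.save_user_profile("42", {"name": "test"})
backend.store_user_memory("42", "fact", "plays chess", 0.9, "2025-10-10T12:00:00")
print(backend.get_user_profile("42")["name"])  # -> test
```

A real backend would also implement the conversation-memory and cleanup methods, and `DatabaseManager` would need a branch (or registry entry) to construct it from the `backend:` setting.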
@@ -1 +1,2 @@
Truncated previous log to start fresh
[2025-10-10 12:56:24] [ERROR] [migration:45] Failed to migrate user profiles: 'DatabaseManager' object has no attribute 'store_user_profile'
23 bot.log
@@ -1365,3 +1365,26 @@ Loop thread traceback (most recent call last):
[2025-09-27 10:54:42] [INFO] 😴 No trigger and engagement is 0 — skipping.
[2025-09-27 10:54:42] [INFO] [autochat:161] 😴 No trigger and engagement is 0 — skipping.
[2025-09-27 12:38:20] [INFO ] discord.gateway: Shard ID None has successfully RESUMED session eed4b6aaccccd97b5456c73e342e25a1.
[2025-10-10 12:56:05] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:56:05] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 12:56:05] [INFO] [migration:147] Initializing database system...
[2025-10-10 12:56:24] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:56:24] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 12:56:24] [INFO] [migration:147] Initializing database system...
[2025-10-10 12:56:24] [INFO] [migration:156] Starting migration process...
[2025-10-10 12:56:24] [ERROR] [migration:45] Failed to migrate user profiles: 'DatabaseManager' object has no attribute 'store_user_profile'
[2025-10-10 12:56:24] [INFO] [migration:87] Migrated 0 conversation memories and 0 user memories
[2025-10-10 12:56:24] [INFO] [migration:92] Backed up original file to src/memory.json.backup.20251010_125624
[2025-10-10 12:56:24] [INFO] [migration:99] Verifying migration...
[2025-10-10 12:57:27] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:57:27] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 12:57:27] [INFO] [migration:147] Initializing database system...
[2025-10-10 12:57:27] [INFO] [migration:156] Starting migration process...
[2025-10-10 12:57:27] [INFO] [migration:37] Migrated 2 user profiles
[2025-10-10 12:57:27] [INFO] [migration:42] Backed up original file to src/user_profiles.json.backup.20251010_125727
[2025-10-10 12:57:27] [INFO] [migration:87] Migrated 0 conversation memories and 0 user memories
[2025-10-10 12:57:27] [INFO] [migration:92] Backed up original file to src/memory.json.backup.20251010_125727
[2025-10-10 12:57:27] [INFO] [migration:99] Verifying migration...
[2025-10-10 12:57:27] [INFO] [migration:107] ✓ User profile operations working
[2025-10-10 12:57:27] [INFO] [migration:118] ✓ Memory operations working
[2025-10-10 12:57:27] [INFO] [migration:138] ✓ Migration verification completed successfully
41 docker-compose.examples.yml Normal file

@@ -0,0 +1,41 @@
# docker-compose.yml example for SQLite (internal database)
version: '3.8'
services:
  deltabot:
    build: .
    environment:
      - DATABASE_BACKEND=sqlite
      - SQLITE_PATH=data/deltabot.db  # Internal to container
      - MEMORY_ENABLED=true
    volumes:
      # Optional: mount data directory for persistence across container recreations
      - ./bot-data:/app/src/data
      # Mount config if you want to edit settings externally
      - ./src/settings.yml:/app/src/settings.yml
    restart: unless-stopped

---

# docker-compose.yml example for external databases (future)
version: '3.8'
services:
  deltabot:
    build: .
    environment:
      - DATABASE_BACKEND=postgresql
      - POSTGRES_URL=postgresql://user:pass@postgres:5432/deltabot
    depends_on:
      - postgres
    restart: unless-stopped

  postgres:
    image: postgres:13
    environment:
      POSTGRES_DB: deltabot
      POSTGRES_USER: deltauser
      POSTGRES_PASSWORD: deltapass
    volumes:
      - postgres_data:/var/lib/postgresql/data

volumes:
  postgres_data:
181 migrate_to_database.py Executable file

@@ -0,0 +1,181 @@
#!/usr/bin/env python3
"""
migrate_to_database.py
Migration script to move from JSON files to the database system
"""

import os
import sys
import json
from datetime import datetime

# Add src directory to path for imports
sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'src'))

from database import db_manager
from logger import setup_logger

logger = setup_logger("migration")

def migrate_user_profiles():
    """Migrate user_profiles.json to database"""
    profiles_path = os.path.join("src", "user_profiles.json")

    if not os.path.exists(profiles_path):
        logger.info("No user_profiles.json found, skipping user profile migration")
        return

    try:
        with open(profiles_path, 'r', encoding='utf-8') as f:
            profiles = json.load(f)

        migrated_count = 0
        for user_id, profile in profiles.items():
            db_manager.save_user_profile(user_id, profile)
            migrated_count += 1

        logger.info(f"Migrated {migrated_count} user profiles")

        # Backup original file
        backup_path = f"{profiles_path}.backup.{datetime.now().strftime('%Y%m%d_%H%M%S')}"
        os.rename(profiles_path, backup_path)
        logger.info(f"Backed up original file to {backup_path}")

    except Exception as e:
        logger.error(f"Failed to migrate user profiles: {e}")

def migrate_memory_data():
    """Migrate memory.json to database"""
    memory_path = os.path.join("src", "memory.json")

    if not os.path.exists(memory_path):
        logger.info("No memory.json found, skipping memory migration")
        return

    try:
        with open(memory_path, 'r', encoding='utf-8') as f:
            memory_data = json.load(f)

        migrated_conversations = 0
        migrated_user_memories = 0

        # Migrate conversation memories
        conversations = memory_data.get("conversations", {})
        for channel_id, memories in conversations.items():
            for memory in memories:
                db_manager.store_conversation_memory(
                    channel_id=channel_id,
                    user_id=memory.get("user_id", "unknown"),
                    content=memory.get("content", ""),
                    context=memory.get("context", ""),
                    importance_score=memory.get("importance_score", 0.5)
                )
                migrated_conversations += 1

        # Migrate user memories
        user_memories = memory_data.get("user_memories", {})
        for user_id, memories in user_memories.items():
            for memory in memories:
                db_manager.store_user_memory(
                    user_id=user_id,
                    memory_type=memory.get("type", "general"),
                    content=memory.get("content", ""),
                    importance_score=memory.get("importance_score", 0.5)
                )
                migrated_user_memories += 1

        logger.info(f"Migrated {migrated_conversations} conversation memories and {migrated_user_memories} user memories")

        # Backup original file
        backup_path = f"{memory_path}.backup.{datetime.now().strftime('%Y%m%d_%H%M%S')}"
        os.rename(memory_path, backup_path)
        logger.info(f"Backed up original file to {backup_path}")

    except Exception as e:
        logger.error(f"Failed to migrate memory data: {e}")

def verify_migration():
    """Verify that the migration was successful"""
    logger.info("Verifying migration...")

    # Test user profile operations
    test_profile = {"name": "test", "display_name": "Test User", "interactions": 5}
    db_manager.save_user_profile("test_user", test_profile)
    retrieved = db_manager.get_user_profile("test_user")

    if retrieved and retrieved["interactions"] == 5:
        logger.info("✓ User profile operations working")
    else:
        logger.error("✗ User profile operations failed")
        return False

    # Test memory operations (only if enabled)
    if db_manager.is_memory_enabled():
        db_manager.store_conversation_memory("test_channel", "test_user", "test message", "test context", 0.8)
        memories = db_manager.get_conversation_context("test_channel", hours=1)

        if memories and len(memories) > 0:
            logger.info("✓ Memory operations working")
        else:
            logger.error("✗ Memory operations failed")
            return False
    else:
        logger.info("- Memory system disabled, skipping memory tests")

    # Clean up test data
    try:
        if hasattr(db_manager.backend, 'connection'):  # SQLite backend
            cursor = db_manager.backend.connection.cursor()
            cursor.execute("DELETE FROM user_profiles WHERE user_id = 'test_user'")
            cursor.execute("DELETE FROM conversation_memories WHERE channel_id = 'test_channel'")
            db_manager.backend.connection.commit()
        else:  # JSON backend
            # Test data will be cleaned up naturally
            pass
    except Exception as e:
        logger.warning(f"Failed to clean up test data: {e}")

    logger.info("✓ Migration verification completed successfully")
    return True

def main():
    """Main migration function"""
    print("=== Discord Bot Database Migration ===")
    print()

    # db_manager auto-initializes when imported
    logger.info("Initializing database system...")

    print("Current configuration:")
    print(f"  Backend: {db_manager.get_backend_type()}")
    print(f"  Memory enabled: {db_manager.is_memory_enabled()}")
    print()

    # Run migrations
    logger.info("Starting migration process...")
    migrate_user_profiles()
    migrate_memory_data()

    # Verify
    if verify_migration():
        print()
        print("✓ Migration completed successfully!")
        print()
        print("Next steps:")
        print("1. Update your bot code to use the new system")
        print("2. Test the bot to ensure everything works")
        print("3. Your original JSON files have been backed up")
        print()
        print("Configuration file: src/settings.yml")
        print("You can switch between SQLite and JSON backends in the database section.")
    else:
        print()
        print("✗ Migration verification failed!")
        print("Please check the logs and try again.")
        return 1

    return 0

if __name__ == "__main__":
    exit(main())
BIN src/__pycache__/database.cpython-312.pyc Normal file
Binary file not shown.
BIN src/__pycache__/logger.cpython-312.pyc Normal file
Binary file not shown.
0 src/bot.error.log Normal file
8 src/bot.log Normal file

@@ -0,0 +1,8 @@
[2025-10-10 12:58:15] [INFO] [database:325] Connected to JSON backend
[2025-10-10 12:58:15] [INFO] [database:539] Initialized JSON database backend
[2025-10-10 13:00:54] [INFO] [database:81] Connected to SQLite database: data/deltabot.db
[2025-10-10 13:00:54] [DEBUG] [database:142] Database tables initialized
[2025-10-10 13:00:54] [INFO] [database:536] Initialized SQLite database backend
[2025-10-10 13:01:51] [INFO] [database:81] Connected to SQLite database: data/deltabot.db
[2025-10-10 13:01:51] [DEBUG] [database:142] Database tables initialized
[2025-10-10 13:01:51] [INFO] [database:536] Initialized SQLite database backend
@@ -516,7 +516,7 @@ async def list_models(ctx):
 async def memory_cmd(ctx, action: str = "info", *, target: str = None):
     """Memory management: !memory info [@user], !memory cleanup, !memory summary"""
     from enhanced_ai import get_user_memory_summary
-    from memory import memory_manager
+    from memory_manager import memory_manager

     if action == "info":
         user_id = str(ctx.author.id)
BIN src/data/deltabot.db Normal file
Binary file not shown.
599 src/database.py Normal file

@@ -0,0 +1,599 @@
"""
|
||||
database.py
|
||||
Database abstraction layer supporting SQLite and JSON backends
|
||||
"""
|
||||
|
||||
import os
|
||||
import json
|
||||
import sqlite3
|
||||
import yaml
|
||||
from datetime import datetime, timedelta
|
||||
from typing import Dict, List, Any, Optional, Union
|
||||
from abc import ABC, abstractmethod
|
||||
from logger import setup_logger
|
||||
|
||||
logger = setup_logger("database")
|
||||
|
||||
class DatabaseBackend(ABC):
|
||||
"""Abstract base class for database backends"""
|
||||
|
||||
@abstractmethod
|
||||
def connect(self):
|
||||
"""Initialize connection to database"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def close(self):
|
||||
"""Close database connection"""
|
||||
pass
|
||||
|
||||
# User Profile methods
|
||||
@abstractmethod
|
||||
def get_user_profile(self, user_id: str) -> Optional[Dict]:
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def save_user_profile(self, user_id: str, profile: Dict):
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def get_all_user_profiles(self) -> Dict[str, Dict]:
|
||||
pass
|
||||
|
||||
# Memory methods (only if memory is enabled)
|
||||
@abstractmethod
|
||||
def store_conversation_memory(self, channel_id: str, user_id: str, content: str,
|
||||
context: str, importance: float, timestamp: str):
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def get_conversation_context(self, channel_id: str, hours: int = 24) -> List[Dict]:
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def store_user_memory(self, user_id: str, memory_type: str, content: str,
|
||||
importance: float, timestamp: str):
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def get_user_context(self, user_id: str) -> List[Dict]:
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def cleanup_old_memories(self, days: int = 30):
|
||||
pass
|
||||
|
||||
|
||||
class SQLiteBackend(DatabaseBackend):
|
||||
"""SQLite database backend"""
|
||||
|
||||
def __init__(self, db_path: str = "data/deltabot.db"):
|
||||
self.db_path = db_path
|
||||
self.connection = None
|
||||
self.connect()
|
||||
self._init_tables()
|
||||
|
||||
def connect(self):
|
||||
"""Initialize SQLite connection"""
|
||||
os.makedirs(os.path.dirname(self.db_path), exist_ok=True)
|
||||
self.connection = sqlite3.connect(self.db_path, check_same_thread=False)
|
||||
self.connection.row_factory = sqlite3.Row # Enable dict-like access
|
||||
logger.info(f"Connected to SQLite database: {self.db_path}")
|
||||
|
||||
def close(self):
|
||||
"""Close SQLite connection"""
|
||||
if self.connection:
|
||||
self.connection.close()
|
||||
self.connection = None
|
||||
|
||||
def _init_tables(self):
|
||||
"""Initialize database tables"""
|
||||
cursor = self.connection.cursor()
|
||||
|
||||
# User profiles table
|
||||
cursor.execute('''
|
||||
CREATE TABLE IF NOT EXISTS user_profiles (
|
||||
user_id TEXT PRIMARY KEY,
|
||||
name TEXT,
|
||||
display_name TEXT,
|
||||
first_seen TEXT,
|
||||
last_seen TEXT,
|
||||
last_message TEXT,
|
||||
interactions INTEGER DEFAULT 0,
|
||||
pronouns TEXT,
|
||||
avatar_url TEXT,
|
||||
custom_prompt TEXT,
|
||||
profile_data TEXT -- JSON string for additional data
|
||||
)
|
||||
''')
|
||||
|
||||
# Conversation memories table
|
||||
cursor.execute('''
|
||||
CREATE TABLE IF NOT EXISTS conversation_memories (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
channel_id TEXT,
|
||||
user_id TEXT,
|
||||
content TEXT,
|
||||
context TEXT,
|
||||
importance REAL,
|
||||
timestamp TEXT,
|
||||
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
''')
|
||||
|
||||
# User memories table
|
||||
cursor.execute('''
|
||||
CREATE TABLE IF NOT EXISTS user_memories (
|
||||
id INTEGER PRIMARY KEY AUTOINCREMENT,
|
||||
user_id TEXT,
|
||||
memory_type TEXT,
|
||||
content TEXT,
|
||||
importance REAL,
|
||||
timestamp TEXT,
|
||||
created_at DATETIME DEFAULT CURRENT_TIMESTAMP
|
||||
)
|
||||
''')
|
||||
|
||||
# Create indexes for better performance
|
||||
cursor.execute('CREATE INDEX IF NOT EXISTS idx_conv_channel_time ON conversation_memories(channel_id, timestamp)')
|
||||
cursor.execute('CREATE INDEX IF NOT EXISTS idx_user_mem_user_time ON user_memories(user_id, timestamp)')
|
||||
|
||||
self.connection.commit()
|
||||
logger.debug("Database tables initialized")
|
||||
|
||||
def get_user_profile(self, user_id: str) -> Optional[Dict]:
|
||||
"""Get user profile from SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cursor.execute('SELECT * FROM user_profiles WHERE user_id = ?', (user_id,))
|
||||
row = cursor.fetchone()
|
||||
|
||||
if row:
|
||||
profile = dict(row)
|
||||
# Parse JSON profile_data if exists
|
||||
if profile.get('profile_data'):
|
||||
try:
|
||||
extra_data = json.loads(profile['profile_data'])
|
||||
profile.update(extra_data)
|
||||
except:
|
||||
pass
|
||||
del profile['profile_data'] # Remove the JSON field
|
||||
return profile
|
||||
return None
|
||||
|
||||
def save_user_profile(self, user_id: str, profile: Dict):
|
||||
"""Save user profile to SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
|
||||
# Separate known fields from extra data
|
||||
known_fields = {
|
||||
'name', 'display_name', 'first_seen', 'last_seen', 'last_message',
|
||||
'interactions', 'pronouns', 'avatar_url', 'custom_prompt'
|
||||
}
|
||||
|
||||
base_profile = {k: v for k, v in profile.items() if k in known_fields}
|
||||
extra_data = {k: v for k, v in profile.items() if k not in known_fields}
|
||||
|
||||
cursor.execute('''
|
||||
INSERT OR REPLACE INTO user_profiles
|
||||
(user_id, name, display_name, first_seen, last_seen, last_message,
|
||||
interactions, pronouns, avatar_url, custom_prompt, profile_data)
|
||||
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
|
||||
''', (
|
||||
user_id,
|
||||
base_profile.get('name'),
|
||||
base_profile.get('display_name'),
|
||||
base_profile.get('first_seen'),
|
||||
base_profile.get('last_seen'),
|
||||
base_profile.get('last_message'),
|
||||
base_profile.get('interactions', 0),
|
||||
base_profile.get('pronouns'),
|
||||
base_profile.get('avatar_url'),
|
||||
base_profile.get('custom_prompt'),
|
||||
json.dumps(extra_data) if extra_data else None
|
||||
))
|
||||
|
||||
self.connection.commit()
|
||||
|
||||
def get_all_user_profiles(self) -> Dict[str, Dict]:
|
||||
"""Get all user profiles from SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cursor.execute('SELECT * FROM user_profiles')
|
||||
profiles = {}
|
||||
|
||||
for row in cursor.fetchall():
|
||||
profile = dict(row)
|
||||
user_id = profile.pop('user_id')
|
||||
|
||||
# Parse JSON profile_data if exists
|
||||
if profile.get('profile_data'):
|
||||
try:
|
||||
extra_data = json.loads(profile['profile_data'])
|
||||
profile.update(extra_data)
|
||||
except:
|
||||
pass
|
||||
if 'profile_data' in profile:
|
||||
del profile['profile_data']
|
||||
|
||||
profiles[user_id] = profile
|
||||
|
||||
return profiles
|
||||
|
||||
def store_conversation_memory(self, channel_id: str, user_id: str, content: str,
|
||||
context: str, importance: float, timestamp: str):
|
||||
"""Store conversation memory in SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cursor.execute('''
|
||||
INSERT INTO conversation_memories
|
||||
(channel_id, user_id, content, context, importance, timestamp)
|
||||
VALUES (?, ?, ?, ?, ?, ?)
|
||||
''', (channel_id, user_id, content, context[:500], importance, timestamp))
|
||||
|
||||
# Keep only last 100 memories per channel
|
||||
cursor.execute('''
|
||||
DELETE FROM conversation_memories
|
||||
WHERE channel_id = ? AND id NOT IN (
|
||||
SELECT id FROM conversation_memories
|
||||
WHERE channel_id = ?
|
||||
ORDER BY timestamp DESC LIMIT 100
|
||||
)
|
||||
''', (channel_id, channel_id))
|
||||
|
||||
self.connection.commit()
|
||||
|
||||
def get_conversation_context(self, channel_id: str, hours: int = 24) -> List[Dict]:
|
||||
"""Get recent conversation memories from SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cutoff_time = (datetime.utcnow() - timedelta(hours=hours)).isoformat()
|
||||
|
||||
cursor.execute('''
|
||||
SELECT * FROM conversation_memories
|
||||
WHERE channel_id = ? AND timestamp > ?
|
||||
ORDER BY importance DESC, timestamp DESC
|
||||
LIMIT 10
|
||||
''', (channel_id, cutoff_time))
|
||||
|
||||
return [dict(row) for row in cursor.fetchall()]
|
||||
|
||||
def store_user_memory(self, user_id: str, memory_type: str, content: str,
|
||||
importance: float, timestamp: str):
|
||||
"""Store user memory in SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cursor.execute('''
|
||||
INSERT INTO user_memories
|
||||
(user_id, memory_type, content, importance, timestamp)
|
||||
VALUES (?, ?, ?, ?, ?)
|
||||
''', (user_id, memory_type, content, importance, timestamp))
|
||||
|
||||
# Keep only last 50 memories per user
|
||||
cursor.execute('''
|
||||
DELETE FROM user_memories
|
||||
WHERE user_id = ? AND id NOT IN (
|
||||
SELECT id FROM user_memories
|
||||
WHERE user_id = ?
|
||||
ORDER BY timestamp DESC LIMIT 50
|
||||
)
|
||||
''', (user_id, user_id))
|
||||
|
||||
self.connection.commit()
|
||||
|
||||
def get_user_context(self, user_id: str) -> List[Dict]:
|
||||
"""Get user memories from SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cursor.execute('''
|
||||
SELECT * FROM user_memories
|
||||
WHERE user_id = ?
|
||||
ORDER BY importance DESC, timestamp DESC
|
||||
LIMIT 5
|
||||
''', (user_id,))
|
||||
|
||||
return [dict(row) for row in cursor.fetchall()]
|
||||
|
||||
def cleanup_old_memories(self, days: int = 30):
|
||||
"""Clean up old memories from SQLite"""
|
||||
cursor = self.connection.cursor()
|
||||
cutoff_time = (datetime.utcnow() - timedelta(days=days)).isoformat()
|
||||
|
||||
# Clean conversation memories
|
||||
cursor.execute('''
|
||||
DELETE FROM conversation_memories
|
||||
WHERE timestamp < ?
|
||||
''', (cutoff_time,))
|
||||
|
||||
# Clean user memories (keep important ones longer)
|
||||
cursor.execute('''
|
||||
DELETE FROM user_memories
|
||||
WHERE timestamp < ? AND importance <= 0.7
|
||||
''', (cutoff_time,))
|
||||
|
||||
deleted_conv = cursor.rowcount
|
||||
self.connection.commit()
|
||||
|
||||
logger.info(f"Cleaned up {deleted_conv} old memories from SQLite")
|
||||
|
||||
|
||||
class JSONBackend(DatabaseBackend):
|
||||
"""JSON file-based backend (existing system)"""
|
||||
|
||||
def __init__(self, profiles_path: str = None, memory_path: str = None):
|
||||
self.profiles_path = profiles_path or os.path.join(os.path.dirname(__file__), "user_profiles.json")
|
||||
self.memory_path = memory_path or os.path.join(os.path.dirname(__file__), "memory.json")
|
||||
self.connect()
|
||||
|
||||
def connect(self):
|
||||
"""Initialize JSON backend"""
|
||||
self._ensure_files()
|
||||
logger.info("Connected to JSON backend")
|
||||
|
||||
def close(self):
|
||||
"""JSON backend doesn't need explicit closing"""
|
||||
pass
|
||||
|
||||
def _ensure_files(self):
|
||||
"""Ensure JSON files exist"""
|
||||
# Ensure profiles file
|
||||
if not os.path.exists(self.profiles_path):
|
||||
with open(self.profiles_path, "w", encoding="utf-8") as f:
|
||||
json.dump({}, f, indent=2)
|
||||
|
||||
# Ensure memory file
|
||||
if not os.path.exists(self.memory_path):
|
||||
initial_data = {
|
||||
"conversations": {},
|
||||
"user_memories": {},
|
||||
"global_events": []
|
||||
}
|
||||
with open(self.memory_path, "w", encoding="utf-8") as f:
|
||||
json.dump(initial_data, f, indent=2)
|
||||
|
||||
def _load_profiles(self) -> Dict:
|
||||
"""Load profiles from JSON"""
|
||||
try:
|
||||
with open(self.profiles_path, "r", encoding="utf-8") as f:
|
||||
return json.load(f)
|
||||
except:
|
||||
return {}
|
||||
|
||||
def _save_profiles(self, profiles: Dict):
|
||||
"""Save profiles to JSON"""
|
||||
with open(self.profiles_path, "w", encoding="utf-8") as f:
|
||||
json.dump(profiles, f, indent=2)
|
||||
|
||||
def _load_memory(self) -> Dict:
|
||||
"""Load memory from JSON"""
|
||||
try:
|
||||
with open(self.memory_path, "r", encoding="utf-8") as f:
|
||||
return json.load(f)
|
||||
except:
|
||||
return {"conversations": {}, "user_memories": {}, "global_events": []}
|
||||
|
||||
def _save_memory(self, memory: Dict):
|
||||
"""Save memory to JSON"""
|
||||
with open(self.memory_path, "w", encoding="utf-8") as f:
|
||||
json.dump(memory, f, indent=2)
|
||||
|
||||
    def get_user_profile(self, user_id: str) -> Optional[Dict]:
        """Get user profile from JSON"""
        profiles = self._load_profiles()
        return profiles.get(user_id)

    def save_user_profile(self, user_id: str, profile: Dict):
        """Save user profile to JSON"""
        profiles = self._load_profiles()
        profiles[user_id] = profile
        self._save_profiles(profiles)

    def get_all_user_profiles(self) -> Dict[str, Dict]:
        """Get all user profiles from JSON"""
        return self._load_profiles()

    def store_conversation_memory(self, channel_id: str, user_id: str, content: str,
                                  context: str, importance: float, timestamp: str):
        """Store conversation memory in JSON"""
        memory_data = self._load_memory()

        memory_entry = {
            "timestamp": timestamp,
            "user_id": user_id,
            "content": content,
            "context": context[:500],
            "importance": importance,
            "id": f"{channel_id}_{int(datetime.fromisoformat(timestamp).timestamp())}"
        }

        channel_key = str(channel_id)
        if channel_key not in memory_data["conversations"]:
            memory_data["conversations"][channel_key] = []

        memory_data["conversations"][channel_key].append(memory_entry)
        memory_data["conversations"][channel_key] = memory_data["conversations"][channel_key][-100:]

        self._save_memory(memory_data)

    def get_conversation_context(self, channel_id: str, hours: int = 24) -> List[Dict]:
        """Get recent conversation memories from JSON"""
        memory_data = self._load_memory()
        channel_key = str(channel_id)

        if channel_key not in memory_data["conversations"]:
            return []

        cutoff_time = datetime.utcnow() - timedelta(hours=hours)
        recent_memories = []

        for memory in memory_data["conversations"][channel_key]:
            memory_time = datetime.fromisoformat(memory["timestamp"])
            if memory_time > cutoff_time:
                recent_memories.append(memory)

        recent_memories.sort(key=lambda x: (x["importance"], x["timestamp"]), reverse=True)
        return recent_memories[:10]
    def store_user_memory(self, user_id: str, memory_type: str, content: str,
                          importance: float, timestamp: str):
        """Store user memory in JSON"""
        memory_data = self._load_memory()

        user_key = str(user_id)
        if user_key not in memory_data["user_memories"]:
            memory_data["user_memories"][user_key] = []

        memory_entry = {
            "timestamp": timestamp,
            "type": memory_type,
            "content": content,
            "importance": importance,
            "id": f"{user_id}_{memory_type}_{int(datetime.fromisoformat(timestamp).timestamp())}"
        }

        memory_data["user_memories"][user_key].append(memory_entry)
        # Keep only the 50 most recent entries per user
        memory_data["user_memories"][user_key] = memory_data["user_memories"][user_key][-50:]

        self._save_memory(memory_data)

    def get_user_context(self, user_id: str) -> List[Dict]:
        """Get user memories from JSON"""
        memory_data = self._load_memory()
        user_key = str(user_id)

        if user_key not in memory_data["user_memories"]:
            return []

        user_memories = memory_data["user_memories"][user_key]
        user_memories.sort(key=lambda x: (x["importance"], x["timestamp"]), reverse=True)
        return user_memories[:5]

    def cleanup_old_memories(self, days: int = 30):
        """Clean up old memories from JSON"""
        memory_data = self._load_memory()
        cutoff_time = datetime.utcnow() - timedelta(days=days)
        cleaned = False

        # Clean conversation memories
        for channel_id in memory_data["conversations"]:
            original_count = len(memory_data["conversations"][channel_id])
            memory_data["conversations"][channel_id] = [
                memory for memory in memory_data["conversations"][channel_id]
                if datetime.fromisoformat(memory["timestamp"]) > cutoff_time
            ]
            if len(memory_data["conversations"][channel_id]) < original_count:
                cleaned = True

        # Clean user memories (keep important ones longer)
        for user_id in memory_data["user_memories"]:
            original_count = len(memory_data["user_memories"][user_id])
            memory_data["user_memories"][user_id] = [
                memory for memory in memory_data["user_memories"][user_id]
                if (datetime.fromisoformat(memory["timestamp"]) > cutoff_time or
                    memory["importance"] > 0.7)
            ]
            if len(memory_data["user_memories"][user_id]) < original_count:
                cleaned = True

        if cleaned:
            self._save_memory(memory_data)
            logger.info("Cleaned up old memories from JSON files")
class DatabaseManager:
    """Main database manager that handles backend selection and configuration"""

    def __init__(self):
        self.backend = None
        self.memory_enabled = True
        self._load_config()
        self._init_backend()

    def _load_config(self):
        """Load database configuration from settings"""
        try:
            settings_path = os.path.join(os.path.dirname(__file__), "settings.yml")
            with open(settings_path, "r", encoding="utf-8") as f:
                settings = yaml.safe_load(f)

            db_config = settings.get("database", {})

            # Allow environment variable overrides for Docker
            self.backend_type = os.getenv("DATABASE_BACKEND", db_config.get("backend", "json")).lower()
            memory_env = os.getenv("MEMORY_ENABLED")
            if memory_env is not None:
                self.memory_enabled = memory_env.lower() == "true"
            else:
                self.memory_enabled = settings.get("memory", {}).get("enabled", True)

            # SQLite specific config
            self.sqlite_path = os.getenv("SQLITE_PATH", db_config.get("sqlite_path", "data/deltabot.db"))

            # JSON specific config
            self.profiles_path = db_config.get("profiles_path", "src/user_profiles.json")
            self.memory_path = db_config.get("memory_path", "src/memory.json")

        except Exception as e:
            logger.warning(f"Failed to load database config: {e}, using defaults")
            self.backend_type = "json"
            self.memory_enabled = True
            self.sqlite_path = "data/deltabot.db"
            self.profiles_path = "src/user_profiles.json"
            self.memory_path = "src/memory.json"

    def _init_backend(self):
        """Initialize the selected backend"""
        if self.backend_type == "sqlite":
            self.backend = SQLiteBackend(self.sqlite_path)
            logger.info("Initialized SQLite database backend")
        else:
            self.backend = JSONBackend(self.profiles_path, self.memory_path)
            logger.info("Initialized JSON database backend")

    def close(self):
        """Close database connection"""
        if self.backend:
            self.backend.close()

    # User Profile methods
    def get_user_profile(self, user_id: str) -> Optional[Dict]:
        return self.backend.get_user_profile(str(user_id))

    def save_user_profile(self, user_id: str, profile: Dict):
        self.backend.save_user_profile(str(user_id), profile)

    def get_all_user_profiles(self) -> Dict[str, Dict]:
        return self.backend.get_all_user_profiles()

    # Memory methods (no-ops when memory is disabled)
    def store_conversation_memory(self, channel_id: str, user_id: str, content: str,
                                  context: str, importance: float):
        if not self.memory_enabled:
            return

        timestamp = datetime.utcnow().isoformat()
        self.backend.store_conversation_memory(
            str(channel_id), str(user_id), content, context, importance, timestamp
        )

    def get_conversation_context(self, channel_id: str, hours: int = 24) -> List[Dict]:
        if not self.memory_enabled:
            return []
        return self.backend.get_conversation_context(str(channel_id), hours)

    def store_user_memory(self, user_id: str, memory_type: str, content: str, importance: float):
        if not self.memory_enabled:
            return

        timestamp = datetime.utcnow().isoformat()
        self.backend.store_user_memory(str(user_id), memory_type, content, importance, timestamp)

    def get_user_context(self, user_id: str) -> List[Dict]:
        if not self.memory_enabled:
            return []
        return self.backend.get_user_context(str(user_id))

    def cleanup_old_memories(self, days: int = 30):
        if not self.memory_enabled:
            return
        self.backend.cleanup_old_memories(days)

    def is_memory_enabled(self) -> bool:
        return self.memory_enabled

    def get_backend_type(self) -> str:
        return self.backend_type


# Global database manager instance
db_manager = DatabaseManager()
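For reference, `_load_config` resolves each option in a fixed order: environment variable first (useful under Docker), then `settings.yml`, then a hard-coded default. A minimal standalone sketch of that precedence logic (the helper names `resolve_backend` and `resolve_memory_enabled` are illustrative, not part of the bot):

```python
import os

def resolve_backend(db_config: dict) -> str:
    """DATABASE_BACKEND env var wins, then settings.yml, then the 'json' default."""
    return os.getenv("DATABASE_BACKEND", db_config.get("backend", "json")).lower()

def resolve_memory_enabled(settings: dict) -> bool:
    """MEMORY_ENABLED env var overrides the memory.enabled setting."""
    env = os.getenv("MEMORY_ENABLED")
    if env is not None:
        return env.lower() == "true"
    return settings.get("memory", {}).get("enabled", True)

# No env override: the settings value is used (and lowercased)
os.environ.pop("DATABASE_BACKEND", None)
print(resolve_backend({"backend": "SQLite"}))  # -> sqlite

# Docker-style override: the env var wins over settings.yml
os.environ["DATABASE_BACKEND"] = "json"
print(resolve_backend({"backend": "sqlite"}))  # -> json
```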
@@ -3,7 +3,7 @@
 # This extends your existing ai.py without breaking it

 from ai import get_ai_response as base_get_ai_response, get_model_name, load_model
-from memory import memory_manager
+from memory_manager import memory_manager
 from personality import load_persona
 from logger import setup_logger, generate_req_id, log_llm_request, log_llm_response
 import requests
16
src/memory.json
Normal file

@@ -0,0 +1,16 @@
{
  "conversations": {
    "test_channel": [
      {
        "timestamp": "2025-10-10T16:57:27.533778",
        "user_id": "test_user",
        "content": "test message",
        "context": "test context",
        "importance": 0.8,
        "id": "test_channel_1760129847"
      }
    ]
  },
  "user_memories": {},
  "global_events": []
}
5
src/memory.json.backup.20251010_125624
Normal file

@@ -0,0 +1,5 @@
{
  "conversations": {},
  "user_memories": {},
  "global_events": []
}
5
src/memory.json.backup.20251010_125727
Normal file

@@ -0,0 +1,5 @@
{
  "conversations": {},
  "user_memories": {},
  "global_events": []
}
@@ -1,4 +1,6 @@
 # memory.py
+# DEPRECATED - Use memory_manager.py instead
+# This file is kept for backward compatibility
 # Enhanced memory system building on existing user_profiles.py

 import os
155
src/memory_manager.py
Normal file

@@ -0,0 +1,155 @@
"""
memory_manager.py
Unified memory management using database abstraction layer
"""

from datetime import datetime
from typing import List, Dict, Optional
from database import db_manager
from logger import setup_logger

logger = setup_logger("memory_manager")


class UnifiedMemoryManager:
    """Memory manager that works with any database backend"""

    def __init__(self):
        self.db = db_manager

    def analyze_and_store_message(self, message, context_messages: List = None):
        """Analyze a message and determine if it should be stored as memory"""
        if not self.db.is_memory_enabled():
            return

        content = message.content.lower()
        user_id = str(message.author.id)
        channel_id = str(message.channel.id)

        # Determine importance based on content analysis
        importance_score = self._calculate_importance(content)

        if importance_score > 0.3:  # Only store moderately important+ messages
            context_str = ""
            if context_messages:
                context_str = " | ".join([f"{msg.author.display_name}: {msg.content[:100]}"
                                          for msg in context_messages[-3:]])  # Last 3 messages for context

            self.db.store_conversation_memory(
                channel_id, user_id, message.content, context_str, importance_score
            )

        # Extract personal information for user memory
        self._extract_user_details(message)

    def _calculate_importance(self, content: str) -> float:
        """Calculate importance score for a message (0.0 to 1.0)"""
        importance = 0.0

        # Personal information indicators
        personal_keywords = ['i am', 'my name', 'i love', 'i hate', 'my favorite',
                             'i work', 'i study', 'my job', 'birthday', 'anniversary']
        for keyword in personal_keywords:
            if keyword in content:
                importance += 0.4

        # Emotional indicators
        emotional_keywords = ['love', 'hate', 'excited', 'sad', 'angry', 'happy',
                              'frustrated', 'amazing', 'terrible', 'awesome']
        for keyword in emotional_keywords:
            if keyword in content:
                importance += 0.2

        # Question indicators (important for context)
        if '?' in content:
            importance += 0.1

        # Length bonus (longer messages often more important)
        if len(content) > 100:
            importance += 0.1

        # Direct mentions of Delta or bot commands
        if 'delta' in content or content.startswith('!'):
            importance += 0.3

        return min(importance, 1.0)  # Cap at 1.0

    def _extract_user_details(self, message):
        """Extract and store personal details from user messages"""
        if not self.db.is_memory_enabled():
            return

        content = message.content.lower()
        user_id = str(message.author.id)

        # Simple pattern matching for common personal info
        patterns = {
            'interest': ['i love', 'i like', 'i enjoy', 'my favorite'],
            'personal': ['i am', 'my name is', 'i work at', 'my job'],
            'preference': ['i prefer', 'i usually', 'i always', 'i never']
        }

        for memory_type, keywords in patterns.items():
            for keyword in keywords:
                if keyword in content:
                    # Extract the relevant part of the message
                    start_idx = content.find(keyword)
                    relevant_part = content[start_idx:start_idx + 200]  # Next 200 chars

                    self.db.store_user_memory(user_id, memory_type, relevant_part, 0.5)
                    break  # Only store one per message to avoid spam

    def get_conversation_context(self, channel_id: str, hours: int = 24) -> List[Dict]:
        """Get recent conversation memories for context"""
        return self.db.get_conversation_context(channel_id, hours)

    def get_user_context(self, user_id: str) -> List[Dict]:
        """Get user-specific memories for personalization"""
        return self.db.get_user_context(user_id)

    def format_memory_for_prompt(self, user_id: str, channel_id: str) -> str:
        """Format memory for inclusion in AI prompts"""
        if not self.db.is_memory_enabled():
            return ""

        lines = []

        # Add conversation context
        conv_memories = self.get_conversation_context(channel_id, hours=48)
        if conv_memories:
            lines.append("[Recent Conversation Context]")
            for memory in conv_memories[:3]:  # Top 3 most important
                timestamp = datetime.fromisoformat(memory["timestamp"]).strftime("%m/%d %H:%M")
                lines.append(f"- {timestamp}: {memory['content'][:150]}")

        # Add user context
        user_memories = self.get_user_context(user_id)
        if user_memories:
            lines.append("[User Context]")
            for memory in user_memories[:3]:  # Top 3 most important
                memory_type = memory.get('type', memory.get('memory_type', 'unknown'))
                lines.append(f"- {memory_type.title()}: {memory['content'][:100]}")

        return "\n".join(lines) if lines else ""

    def cleanup_old_memories(self, days: int = 30):
        """Clean up memories older than specified days"""
        if not self.db.is_memory_enabled():
            return

        self.db.cleanup_old_memories(days)
        logger.info(f"Cleaned up memories older than {days} days")

    def is_enabled(self) -> bool:
        """Check if memory system is enabled"""
        return self.db.is_memory_enabled()

    def get_backend_info(self) -> Dict[str, str]:
        """Get information about current backend"""
        return {
            "backend_type": self.db.get_backend_type(),
            "memory_enabled": str(self.db.is_memory_enabled())
        }


# Global memory manager instance
memory_manager = UnifiedMemoryManager()
|
|
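The importance heuristic in `_calculate_importance` above is plain additive keyword scoring. This standalone sketch mirrors those rules outside the class so the 0.3 storage threshold can be checked against sample messages (the function name `score_importance` is illustrative):

```python
def score_importance(content: str) -> float:
    """Mirror of the additive scoring rules used by _calculate_importance."""
    content = content.lower()
    score = 0.0
    for kw in ('i am', 'my name', 'i love', 'i hate', 'my favorite',
               'i work', 'i study', 'my job', 'birthday', 'anniversary'):
        if kw in content:
            score += 0.4  # personal information
    for kw in ('love', 'hate', 'excited', 'sad', 'angry', 'happy',
               'frustrated', 'amazing', 'terrible', 'awesome'):
        if kw in content:
            score += 0.2  # emotional language
    if '?' in content:
        score += 0.1  # questions matter for context
    if len(content) > 100:
        score += 0.1  # long messages get a small bonus
    if 'delta' in content or content.startswith('!'):
        score += 0.3  # bot mentions and commands
    return min(score, 1.0)

print(score_importance("hello there"))            # 0.0 -> below the 0.3 storage threshold
print(score_importance("i love python, delta!"))  # 'i love' + 'love' + 'delta' all fire
```

Note the overlap: a phrase like "i love" triggers both the personal and emotional lists, so short affectionate messages score disproportionately high.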
@@ -15,6 +15,12 @@ context:
   enabled: true # now working with memory system
   max_messages: 15 # max messages to keep in context

+database:
+  backend: "json" # Options: "json", "sqlite"
+  sqlite_path: "data/deltabot.db" # SQLite database file path
+  profiles_path: "user_profiles.json" # JSON profiles file (for JSON backend)
+  memory_path: "memory.json" # JSON memory file (for JSON backend)
+
 memory:
   enabled: true
   importance_threshold: 0.3 # minimum importance to store (0.0-1.0)
@@ -1,24 +1,7 @@
 {
-  "161149541171593216": {
-    "name": "themiloverse",
-    "display_name": "Miguel",
-    "first_seen": "2025-05-15T03:16:30.011640",
-    "last_seen": "2025-09-20T19:04:27.735898",
-    "last_message": "2025-09-20T19:04:27.735898",
-    "interactions": 364,
-    "pronouns": "he/him",
-    "avatar_url": "https://cdn.discordapp.com/avatars/161149541171593216/fb0553a29d9f73175cb6aea24d0e19ec.png?size=1024",
-    "custom_prompt": "delta is very nice to me since I am her master, and creator"
-  },
-  "1370422629340811405": {
-    "name": "PLEX",
-    "display_name": "PLEX",
-    "first_seen": "2025-09-21T04:14:15.752764",
-    "last_seen": "2025-09-27T14:54:42.041092",
-    "last_message": "2025-09-27T14:54:42.041092",
-    "interactions": 19,
-    "pronouns": null,
-    "avatar_url": "https://cdn.discordapp.com/embed/avatars/0.png",
-    "custom_prompt": null
-  }
+  "test_user": {
+    "name": "test",
+    "display_name": "Test User",
+    "interactions": 5
+  }
 }
24
src/user_profiles.json.backup.20251010_125727
Normal file

@@ -0,0 +1,24 @@
{
  "161149541171593216": {
    "name": "themiloverse",
    "display_name": "Miguel",
    "first_seen": "2025-05-15T03:16:30.011640",
    "last_seen": "2025-09-20T19:04:27.735898",
    "last_message": "2025-09-20T19:04:27.735898",
    "interactions": 364,
    "pronouns": "he/him",
    "avatar_url": "https://cdn.discordapp.com/avatars/161149541171593216/fb0553a29d9f73175cb6aea24d0e19ec.png?size=1024",
    "custom_prompt": "delta is very nice to me since I am her master, and creator"
  },
  "1370422629340811405": {
    "name": "PLEX",
    "display_name": "PLEX",
    "first_seen": "2025-09-21T04:14:15.752764",
    "last_seen": "2025-09-27T14:54:42.041092",
    "last_message": "2025-09-27T14:54:42.041092",
    "interactions": 19,
    "pronouns": null,
    "avatar_url": "https://cdn.discordapp.com/embed/avatars/0.png",
    "custom_prompt": null
  }
}
114
src/user_profiles_new.py
Normal file

@@ -0,0 +1,114 @@
# user_profiles_new.py
# Modern user profiles using database abstraction
# This will eventually replace user_profiles.py

from datetime import datetime
from database import db_manager
from logger import setup_logger

logger = setup_logger("user_profiles_new")


def load_user_profile(user):
    """Load user profile from database, creating if it doesn't exist"""
    user_id = str(user.id)

    # Try to get existing profile
    profile = db_manager.get_user_profile(user_id)

    if profile:
        # Update existing profile with current session data
        profile.update({
            "name": user.name,
            "display_name": user.display_name,
            "avatar_url": str(user.display_avatar.url),
            "last_seen": datetime.utcnow().isoformat(),
            "last_message": datetime.utcnow().isoformat(),
            "interactions": profile.get("interactions", 0) + 1
        })
    else:
        # Create new profile
        now = datetime.utcnow().isoformat()
        profile = {
            "name": user.name,
            "display_name": user.display_name,
            "first_seen": now,
            "last_seen": now,
            "last_message": now,
            "interactions": 1,
            "pronouns": None,
            "avatar_url": str(user.display_avatar.url),
            "custom_prompt": None
        }

    # Save updated profile
    db_manager.save_user_profile(user_id, profile)
    return profile


def update_last_seen(user_id):
    """Update last seen timestamp for user"""
    profile = db_manager.get_user_profile(str(user_id))
    if profile:
        profile["last_seen"] = datetime.utcnow().isoformat()
        db_manager.save_user_profile(str(user_id), profile)


def increment_interactions(user_id):
    """Increment interaction count for user"""
    profile = db_manager.get_user_profile(str(user_id))
    if profile:
        profile["interactions"] = profile.get("interactions", 0) + 1
        db_manager.save_user_profile(str(user_id), profile)


def set_pronouns(user, pronouns):
    """Set pronouns for user"""
    user_id = str(user.id)
    profile = db_manager.get_user_profile(user_id) or {}
    profile["pronouns"] = pronouns

    # Ensure basic profile data exists
    if not profile.get("name"):
        profile.update({
            "name": user.name,
            "display_name": user.display_name,
            "avatar_url": str(user.display_avatar.url),
            "first_seen": datetime.utcnow().isoformat(),
            "last_seen": datetime.utcnow().isoformat(),
            "interactions": 0
        })

    db_manager.save_user_profile(user_id, profile)
    return True


def set_custom_prompt(user_id, prompt):
    """Set custom prompt for user"""
    user_id = str(user_id)
    profile = db_manager.get_user_profile(user_id)
    if profile:
        profile["custom_prompt"] = prompt
        db_manager.save_user_profile(user_id, profile)


def format_profile_for_block(profile):
    """Format profile data for inclusion in AI prompts"""
    lines = ["[User Profile]"]
    lines.append(f"- Name: {profile.get('display_name', 'Unknown')}")
    if profile.get("pronouns"):
        lines.append(f"- Pronouns: {profile['pronouns']}")
    lines.append(f"- Interactions: {profile.get('interactions', 0)}")
    if profile.get("custom_prompt"):
        lines.append(f"- Custom Prompt: {profile['custom_prompt']}")
    return "\n".join(lines)


# Backward compatibility functions - these use the old JSON system if database is disabled
def load_profiles():
    """Legacy function for backward compatibility"""
    logger.warning("load_profiles() is deprecated. Use individual profile functions instead.")
    return {}


def save_profiles(profiles):
    """Legacy function for backward compatibility"""
    logger.warning("save_profiles() is deprecated. Use individual profile functions instead.")
    pass


def ensure_profile_file():
    """Legacy function for backward compatibility"""
    logger.warning("ensure_profile_file() is deprecated. Database handles initialization.")
    pass
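For a quick look at the prompt block that `format_profile_for_block` emits, the same formatting rules can be run standalone on a made-up profile (the sample data below is invented for illustration):

```python
def format_profile_for_block(profile: dict) -> str:
    """Same formatting rules as the function of this name in user_profiles_new.py."""
    lines = ["[User Profile]"]
    lines.append(f"- Name: {profile.get('display_name', 'Unknown')}")
    if profile.get("pronouns"):
        lines.append(f"- Pronouns: {profile['pronouns']}")
    lines.append(f"- Interactions: {profile.get('interactions', 0)}")
    if profile.get("custom_prompt"):
        lines.append(f"- Custom Prompt: {profile['custom_prompt']}")
    return "\n".join(lines)

# Optional fields (pronouns, custom_prompt) only appear when set
sample = {"display_name": "Test User", "pronouns": "she/her", "interactions": 5}
print(format_profile_for_block(sample))
```

Because missing fields fall back to `Unknown` and `0`, the block is safe to build even from the minimal profiles `set_pronouns` creates.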
Reference in a new issue