Docker compose include #181
Tweaks to get all memory providers working
removed openmemory to make submodule
Walkthrough

This PR introduces Mycelia memory provider integration, a setup wizard for multi-environment configuration, Docker Compose modularization, extensive documentation for deployment and setup, and frontend UI enhancements including timeline visualizations. It adds Git submodules for Mycelia and OpenMemory, restructures the backend memory service architecture, and includes significant configuration and script additions for secrets management, Tailscale integration, and SSL/TLS setup.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Areas requiring extra attention:
@RabbitAI review
Actionable comments posted: 0
Note: Due to the large number of review comments, Critical severity comments were prioritized as inline comments.
Caution: Some comments are outside the diff and can't be posted inline due to platform limitations.
⚠️ Outside diff range comments (6)
backends/advanced/src/advanced_omi_backend/config.py (1)
134-140: Add missing `min_duration` to return value.

The function doesn't return `min_duration` even though it's defined in `DEFAULT_SPEECH_DETECTION_SETTINGS` (line 35). This creates an inconsistency where:

- `conversation_utils.py` expects `min_duration` in the settings dict (line 104)
- The value will always fall back to the hardcoded `10.0` instead of using the configured default
- No environment variable support for `SPEECH_DETECTION_MIN_DURATION`

This breaks the configuration pattern and prevents runtime configurability.

Apply this diff:

```diff
 def get_speech_detection_settings():
     """Get speech detection settings from environment or defaults."""
     return {
         "min_words": int(os.getenv("SPEECH_DETECTION_MIN_WORDS", DEFAULT_SPEECH_DETECTION_SETTINGS["min_words"])),
         "min_confidence": float(os.getenv("SPEECH_DETECTION_MIN_CONFIDENCE", DEFAULT_SPEECH_DETECTION_SETTINGS["min_confidence"])),
+        "min_duration": float(os.getenv("SPEECH_DETECTION_MIN_DURATION", DEFAULT_SPEECH_DETECTION_SETTINGS["min_duration"])),
     }
```

backends/advanced/src/advanced_omi_backend/services/memory/prompts.py (1)
180-180: Stale date in prompt: `datetime.now()` captured at import time.

The `datetime.now()` call in the f-string is evaluated once when the module is imported, so the "current date" in the prompt becomes stale and incorrect as time passes. This affects the LLM's temporal reasoning for fact extraction.

Apply this diff to make the date dynamic:

```diff
-Remember the following:
-- Today's date is {datetime.now().strftime("%Y-%m-%d")}.
-- Do not return anything from the custom few shot example prompts provided above.
+Remember the following:
+- Today's date is {{current_date}}.
+- Do not return anything from the custom few shot example prompts provided above.
```

Then update the usage sites to format the prompt with the current date when it's actually used. Alternatively, convert `FACT_RETRIEVAL_PROMPT` to a function that generates the prompt with the current date, similar to `build_temporal_extraction_prompt`.

backends/advanced/src/advanced_omi_backend/workers/memory_jobs.py (1)
145-151: Remove duplicate import.

`get_memory_service` is already imported at line 49. This redundant import inside the try block is unnecessary.

```diff
 # Determine memory provider from memory service
 memory_provider = conversation_model.MemoryProvider.FRIEND_LITE  # Default
 try:
-    from advanced_omi_backend.services.memory import get_memory_service
     memory_service_obj = get_memory_service()
     provider_name = memory_service_obj.__class__.__name__
     if "OpenMemory" in provider_name:
         memory_provider = conversation_model.MemoryProvider.OPENMEMORY_MCP
 except Exception:
     pass
```

backends/advanced/src/advanced_omi_backend/services/memory/providers/openmemory_mcp.py (1)
369-388: Unused `user_id` and `user_email` parameters in delete_memory.

The `delete_memory` method accepts `user_id` and `user_email` parameters but doesn't use them. The method uses `self.mcp_client.delete_memory(memory_id)` without passing user context. This appears inconsistent with other methods that switch user context.

Consider either:

- Removing the unused parameters
- Applying user context switching like other methods

```diff
-    async def delete_memory(self, memory_id: str, user_id: Optional[str] = None, user_email: Optional[str] = None) -> bool:
+    async def delete_memory(self, memory_id: str, user_id: Optional[str] = None, user_email: Optional[str] = None) -> bool:
         """Delete a specific memory by ID. ... """
         if not self._initialized:
             await self.initialize()

+        # Update MCP client user context for this operation
+        original_user_id = self.mcp_client.user_id
+        self.mcp_client.user_id = user_id or self.user_id
+
         try:
             success = await self.mcp_client.delete_memory(memory_id)
             if success:
                 memory_logger.info(f"🗑️ Deleted memory {memory_id} via MCP")
             return success
         except Exception as e:
-            memory_logger.error(f"Delete memory failed: {e}")
+            memory_logger.exception(f"Delete memory failed: {e}")
             return False
+        finally:
+            self.mcp_client.user_id = original_user_id
```

backends/advanced/src/advanced_omi_backend/services/memory/providers/mcp_client.py (1)
343-345: Move `import re` to the top of the file.

Per coding guidelines, all imports must be at the top of the file. The `re` import inside `delete_all_memories` violates this rule.

```diff
 import logging
 import uuid
+import re
 from typing import List, Dict, Any, Optional

 import httpx
```

Then remove line 343:

```diff
-        import re
```

As per coding guidelines, NEVER import modules in the middle of functions.
Makefile (1)
339-344: Use `uv run python` instead of `python3` for Python scripts.

Per the coding guidelines, Python scripts should be executed using `uv run python` instead of direct `python` or `python3` commands.

```diff
 config-docker: ## Generate Docker Compose configuration files
 	@echo "🐳 Generating Docker Compose configuration files..."
-	@CONFIG_FILE=config.env.dev python3 scripts/generate-docker-configs.py
+	@CONFIG_FILE=config.env.dev uv run python scripts/generate-docker-configs.py
 	@echo "✅ Docker Compose configuration files generated"

 config-k8s: ## Generate Kubernetes configuration files (ConfigMap/Secret only - no .env files)
 	@echo "☸️ Generating Kubernetes configuration files..."
-	@python3 scripts/generate-k8s-configs.py
+	@uv run python scripts/generate-k8s-configs.py
```

As per coding guidelines.
🟠 Major comments (25)
backends/advanced/src/advanced_omi_backend/utils/conversation_utils.py-103-111 (1)
103-111: Fix: `min_duration` missing from `get_speech_detection_settings()` return value.

The duration check logic is correct, but `settings.get("min_duration", 10.0)` will always use the fallback value because `get_speech_detection_settings()` in `config.py` (lines 134-140) doesn't return `min_duration`. This means:

- The hardcoded fallback `10.0` will always be used
- Environment variable `SPEECH_DETECTION_MIN_DURATION` (if added) would be ignored
- Runtime configuration changes won't take effect

This breaks the configuration pattern used for `min_words` and `min_confidence`.

The fix should be applied in `backends/advanced/src/advanced_omi_backend/config.py`:

```diff
 def get_speech_detection_settings():
     """Get speech detection settings from environment or defaults."""
     return {
         "min_words": int(os.getenv("SPEECH_DETECTION_MIN_WORDS", DEFAULT_SPEECH_DETECTION_SETTINGS["min_words"])),
         "min_confidence": float(os.getenv("SPEECH_DETECTION_MIN_CONFIDENCE", DEFAULT_SPEECH_DETECTION_SETTINGS["min_confidence"])),
+        "min_duration": float(os.getenv("SPEECH_DETECTION_MIN_DURATION", DEFAULT_SPEECH_DETECTION_SETTINGS["min_duration"])),
     }
```

Committable suggestion skipped: line range outside the PR's diff.
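The override pattern this review describes, a defaults dict with a per-key environment override and type coercion, can be sketched independently of the codebase. The names mirror the review's quotes; the default values below are illustrative, not the project's real ones:

```python
import os

# Illustrative defaults following the pattern quoted in the review.
DEFAULT_SPEECH_DETECTION_SETTINGS = {
    "min_words": 3,
    "min_confidence": 0.5,
    "min_duration": 10.0,
}

def get_speech_detection_settings() -> dict:
    """Build settings from environment variables, falling back to defaults.

    Every key in the defaults dict must appear here; otherwise a caller's
    settings.get(key, fallback) silently pins its hardcoded fallback, which
    is exactly the bug the review flags for min_duration.
    """
    d = DEFAULT_SPEECH_DETECTION_SETTINGS
    return {
        "min_words": int(os.getenv("SPEECH_DETECTION_MIN_WORDS", d["min_words"])),
        "min_confidence": float(os.getenv("SPEECH_DETECTION_MIN_CONFIDENCE", d["min_confidence"])),
        "min_duration": float(os.getenv("SPEECH_DETECTION_MIN_DURATION", d["min_duration"])),
    }

# With no env var set, the default flows through; an env var overrides one key.
os.environ.pop("SPEECH_DETECTION_MIN_DURATION", None)
print(get_speech_detection_settings()["min_duration"])  # 10.0
os.environ["SPEECH_DETECTION_MIN_DURATION"] = "2.5"
print(get_speech_detection_settings()["min_duration"])  # 2.5
```

Because `os.getenv` returns strings, the `int()`/`float()` coercion at the boundary is what makes the override and the default behave identically downstream.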
backends/advanced/src/advanced_omi_backend/workers/transcription_jobs.py-502-505 (1)
502-505: Move import to the top of the file.

The `check_job_alive` import is placed inside the `while` loop, which is inefficient and violates coding guidelines.

Add at the top of the file with other imports:

```python
from advanced_omi_backend.utils.job_utils import check_job_alive
```

Then simplify the loop:

```diff
 while True:
     # Check if job still exists in Redis (detect zombie state)
-    from advanced_omi_backend.utils.job_utils import check_job_alive
     if not await check_job_alive(redis_client, current_job):
         break
```

As per coding guidelines, ALL imports must be at the top of the file.
backends/advanced/src/advanced_omi_backend/workers/audio_jobs.py-263-273 (1)
263-273: Move imports to the top of the file and ensure proper zombie detection.

The imports for `get_current_job` and `check_job_alive` are placed inside the function body. Per coding guidelines, all imports should be at the top of the file.

Add at the top of the file (around line 19):

```python
from rq import get_current_job

from advanced_omi_backend.utils.job_utils import check_job_alive
```

Then simplify the function body:

```diff
-    # Get current job for zombie detection
-    from rq import get_current_job
-    from advanced_omi_backend.utils.job_utils import check_job_alive
-    current_job = get_current_job()
+    # Get current job for zombie detection
+    current_job = get_current_job()
```

The resource cleanup logic (closing `file_sink` before breaking) is correct and important for preventing resource leaks.

As per coding guidelines, ALL imports must be at the top of the file.
backends/advanced/src/advanced_omi_backend/workers/transcription_jobs.py-196-221 (1)
196-221: Move imports to the top of the file.

Multiple imports are placed inside the function body at lines 197, 215-216, and 221. Per coding guidelines, all imports must be at the top of the file.

Add these imports at the top of the file with other imports:

```python
from rq.job import Job

from advanced_omi_backend.utils.conversation_utils import analyze_speech, mark_conversation_deleted
```

Note: `get_current_job` is already imported at line 407, so consider moving that import to the top as well for consistency.

As per coding guidelines, ALL imports must be at the top of the file.
backends/advanced/src/advanced_omi_backend/workers/conversation_jobs.py-284-287 (1)
284-287: Move import to the top of the file.

The `check_job_alive` import is placed inside the `while` loop, which violates the coding guidelines and is inefficient (re-imported on every iteration). Move this import to the top of the file with the other imports.

Add at the top of the file (around line 27):

```python
from advanced_omi_backend.utils.job_utils import check_job_alive
```

Then simplify the loop:

```diff
 while True:
-    # Check if job still exists in Redis (detect zombie state)
-    from advanced_omi_backend.utils.job_utils import check_job_alive
-    if not await check_job_alive(redis_client, current_job):
+    # Check if job still exists in Redis (detect zombie state)
+    if not await check_job_alive(redis_client, current_job):
         break
```

As per coding guidelines, ALL imports must be at the top of the file after the docstring.
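A note on why these function-level imports are "inefficient" rather than incorrect: Python caches modules in `sys.modules`, so re-executing an `import` statement on every loop iteration does not reload the module, it just repeats a cache lookup and a name binding. A minimal demonstration (using the stdlib `json` module as a stand-in for the project imports):

```python
import sys

def uses_local_import() -> str:
    """Re-executes the import statement on every call.

    This is correct - Python returns the cached module from sys.modules -
    but it still pays a per-call lookup and, per the guidelines this review
    quotes, hides the function's dependencies from readers and tooling.
    """
    import json  # the guideline violation: import buried in a function
    return json.dumps({"ok": True})

import json as json_top  # the same module, imported at "top level"

print(uses_local_import())                  # {"ok": true}
print(sys.modules["json"] is json_top)      # True: one cached module object
```

So moving the imports to the top of the file is a readability and convention fix; behavior in the loop is unchanged.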
backends/advanced/webui/nginx.conf-29-31 (1)
29-31: Sanitize `X-Forwarded-Prefix` before injecting into HTML to mitigate potential XSS

You're directly reflecting `$http_x_forwarded_prefix` into an HTML attribute via `sub_filter`. If this nginx instance is ever reachable by clients who can set arbitrary HTTP headers (or if upstream proxies don't strip/validate `X-Forwarded-Prefix`), a malicious client can craft this header with quotes or angle brackets and get it echoed into `<base href="...">`, creating a reflected XSS vector.

Recommended mitigation:

Add a sanitized prefix variable at the `http` level in your main nginx config:

```nginx
map $http_x_forwarded_prefix $ui_prefix {
    default "/";
    ~^/[0-9A-Za-z_/-]*$ $http_x_forwarded_prefix;
}
```

Then update the `sub_filter` directive to use the sanitized variable:

```diff
-    sub_filter '<head>' '<head><base href="$http_x_forwarded_prefix/">';
+    sub_filter '<head>' '<head><base href="$ui_prefix/">';
```

This ensures that only path-like values matching the regex are used; malformed or dangerous input falls back safely to `"/"`.

Note: `sub_filter` is provided by `ngx_http_sub_module`. This module is not compiled into nginx by default when building from source (requires `--with-http_sub_module`), but it is typically included in pre-built distributions (Debian, Ubuntu, official nginx packages). If your Docker image uses a minimal or custom nginx build, verify the module is available; otherwise the container will fail to start with `unknown directive "sub_filter"`. You can check inside the container with `nginx -V 2>&1 | grep http_sub_module`.

backends/advanced/scripts/create_mycelia_api_key.py-265-270 (1)
265-270: Remove duplicate `import base64` statement.

This is a duplicate import inside the function. Move it to the top with other imports as noted above.

```diff
     # Create API key document (matches Mycelia's format)
-    import base64
     api_key_doc = {
```

Committable suggestion skipped: line range outside the PR's diff.
backends/advanced/scripts/create_mycelia_api_key.py-23-31 (1)
23-31: Move `import base64` to the top of the file.

Per coding guidelines, imports should never be in the middle of functions. The `base64` import at line 26 and again at line 266 should be moved to the top of the file with other imports.

```diff
 import os
 import sys
 import secrets
 import hashlib
+import base64
 from pymongo import MongoClient
 from bson import ObjectId
 from datetime import datetime
 ...
 def hash_api_key_with_salt(api_key: str, salt: bytes) -> str:
     """Hash API key with salt (matches Mycelia's hashApiKey function)."""
     # SHA256(salt + apiKey) in base64
-    import base64
     h = hashlib.sha256()
     h.update(salt)
     h.update(api_key.encode('utf-8'))
     return base64.b64encode(h.digest()).decode('utf-8')  # Use base64 like Mycelia
```

Also remove the duplicate import at line 266.
backends/advanced/src/advanced_omi_backend/app_factory.py-33-34 (1)
33-34: Move Mycelia sync import to module top and prefer `logging.exception` in new block

Functionality looks good (non-fatal Mycelia OAuth admin sync on startup), but two small changes would better match project guidelines:

1. Avoid mid-function imports

Current code:

```python
try:
    from advanced_omi_backend.services.mycelia_sync import sync_admin_on_startup
    await sync_admin_on_startup()
```

Per the Python guidelines here, imports should live at the top of the file unless there's a hard circular-import constraint. You can move the import up with the other `advanced_omi_backend` imports and just await the function inside `lifespan`:

```diff
-from advanced_omi_backend.client_manager import get_client_manager
-from advanced_omi_backend.services.memory import get_memory_service, shutdown_memory_service
+from advanced_omi_backend.client_manager import get_client_manager
+from advanced_omi_backend.services.memory import get_memory_service, shutdown_memory_service
+from advanced_omi_backend.services.mycelia_sync import sync_admin_on_startup
@@
-    # Sync admin user with Mycelia OAuth (if using Mycelia memory provider)
-    try:
-        from advanced_omi_backend.services.mycelia_sync import sync_admin_on_startup
-        await sync_admin_on_startup()
-    except Exception as e:
-        application_logger.error(f"Failed to sync admin with Mycelia OAuth: {e}")
+    # Sync admin user with Mycelia OAuth (if using Mycelia memory provider)
+    try:
+        await sync_admin_on_startup()
+    except Exception:
+        application_logger.exception("Failed to sync admin with Mycelia OAuth")
         # Don't raise here as this is not critical for startup
```

2. Use `logging.exception` for the new except block

Using `application_logger.exception(...)` here logs the full traceback, which matches the preferred pattern for new code in this repo and addresses the TRY400 hint. Based on learnings, this also improves debuggability for startup issues involving Mycelia.

Catching bare `Exception` is probably acceptable here (you explicitly don't want startup to fail), but if you know the likely failure modes, narrowing the exception types would further reduce noise.

Also applies to: 76-82
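The `logging.exception` recommendation that recurs throughout this review is easy to demonstrate in isolation: called inside an `except` block, it logs at ERROR level and appends the active traceback automatically, which `logging.error` does not.

```python
import io
import logging

# Route log output to a string buffer so we can inspect it.
buf = io.StringIO()
logging.basicConfig(stream=buf, level=logging.ERROR, force=True)
log = logging.getLogger("demo")

try:
    1 / 0
except ZeroDivisionError:
    log.error("with error(): divide failed")       # message only, no traceback
    log.exception("with exception(): divide failed")  # message + full traceback

out = buf.getvalue()
print("Traceback" in out)              # True: only exception() captured it
print(out.count("ZeroDivisionError"))  # 1: appears once, from exception()
```

`log.error(..., exc_info=True)` is equivalent; `log.exception(...)` is simply the conventional spelling inside handlers.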
backends/advanced/src/advanced_omi_backend/models/job.py-48-52 (1)
48-52: Fix error handling logic for database initialization.

The current code attempts a fallback assignment but then immediately re-raises the exception, making the fallback ineffective. The `get_default_database()` call requires the connection string to include a default database name. When it fails with `ConfigurationError`, the fallback to `client[mongodb_database]` should be used to continue initialization, not raise.

Apply this diff to fix the error handling:

```diff
 client = AsyncIOMotorClient(mongodb_uri)
 try:
     database = client.get_default_database(mongodb_database)
 except ConfigurationError:
+    # Fallback: URI doesn't specify a default DB, use explicit name
     database = client[mongodb_database]
-    raise
```

Based on learnings, when re-raising is necessary, use `raise ... from e` to preserve context. However, in this case, the fallback should handle the error gracefully rather than re-raising.

DOCKER-COMPOSE.md-79-83 (1)
79-83: Qdrant port numbers in the docs don't match the compose configuration

Here the docs say Qdrant runs on `6033`/`6034`, but `backends/advanced/docker-compose-test.yml` explicitly maps:

```yaml
# gRPC - avoid conflict with dev 6333
- "6337:6333"
# HTTP - avoid conflict with dev 6334
- "6338:6334"
```

which implies the dev stack uses 6333 (gRPC) and 6334 (HTTP), not 6033/6034. To avoid confusion when debugging connectivity, please either:

- Update this guide to refer to 6333/6334, or
- Align the dev compose to actually expose 6033/6034 and adjust the test comments accordingly.

Right now the mismatch makes it hard to know which ports to hit.

Also applies to: 241-242
backends/advanced/src/advanced_omi_backend/services/memory/providers/openmemory_mcp.py-362-367 (1)
362-367: Use `logging.exception()` in update_memory error handler.

```diff
 except Exception as e:
-    memory_logger.error(f"Failed to update memory: {e}")
+    memory_logger.exception(f"Failed to update memory: {e}")
     return False
```

Based on learnings, this captures the full stack trace.
backends/advanced/src/advanced_omi_backend/services/memory/providers/mcp_client.py-437-442 (1)
437-442: Use `logging.exception()` in update_memory error handlers.

```diff
 except httpx.HTTPStatusError as e:
-    memory_logger.error(f"HTTP error updating memory: {e.response.status_code}")
+    memory_logger.exception(f"HTTP error updating memory: {e.response.status_code}")
     return False
 except Exception as e:
-    memory_logger.error(f"Error updating memory: {e}")
+    memory_logger.exception(f"Error updating memory: {e}")
     return False
```

Based on learnings, this captures the full stack trace.
backends/advanced/src/advanced_omi_backend/services/memory/providers/openmemory_mcp.py-230-235 (1)
230-235: Use `logging.exception()` for better stack traces.

Per project conventions, prefer `logging.exception()` inside except blocks to automatically log the full stack trace.

```diff
 except MCPError as e:
-    memory_logger.error(f"Search memories failed: {e}")
+    memory_logger.exception(f"Search memories failed: {e}")
     return []
 except Exception as e:
-    memory_logger.error(f"Search memories failed: {e}")
+    memory_logger.exception(f"Search memories failed: {e}")
     return []
```

Based on learnings, this improves debuggability across Python files.
backends/advanced/src/advanced_omi_backend/services/memory/providers/openmemory_mcp.py-317-322 (1)
317-322: Use `logging.exception()` in get_memory error handler.

```diff
 except Exception as e:
-    memory_logger.error(f"Failed to get memory: {e}")
+    memory_logger.exception(f"Failed to get memory: {e}")
     return None
```

Based on learnings, this captures the full stack trace.
backends/advanced/src/advanced_omi_backend/services/memory/providers/mcp_client.py-388-395 (1)
388-395: Use `logging.exception()` in get_memory error handlers.

```diff
 except httpx.HTTPStatusError as e:
     if e.response.status_code == 404:
         return None
-    memory_logger.error(f"HTTP error getting memory: {e}")
+    memory_logger.exception(f"HTTP error getting memory: {e}")
     return None
 except Exception as e:
-    memory_logger.error(f"Error getting memory: {e}")
+    memory_logger.exception(f"Error getting memory: {e}")
     return None
```

Based on learnings, this captures the full stack trace.
backends/advanced/webui/src/pages/ConversationsTimeline.tsx-219-230 (1)
219-230: Operator precedence issue in timezone detection logic.

The condition on line 221 has unclear operator precedence that may not behave as intended. The `&&` and `||` operators need explicit grouping to correctly detect ISO 8601 timestamps with timezone offsets.

```diff
 const formatDate = (timestamp: number | string): Date => {
   if (typeof timestamp === 'string') {
-    const isoString = timestamp.endsWith('Z') || timestamp.includes('+') || timestamp.includes('T') && timestamp.split('T')[1].includes('-')
+    const hasTimezone = timestamp.endsWith('Z') ||
+      timestamp.includes('+') ||
+      (timestamp.includes('T') && timestamp.split('T')[1]?.includes('-'))
+    const isoString = hasTimezone
       ? timestamp
       : timestamp + 'Z'
     return new Date(isoString)
   }
```

The current logic evaluates as `A || B || (C && D)`, but the intent appears to be checking if any timezone indicator is present.

backends/advanced/src/advanced_omi_backend/services/memory/providers/mcp_client.py-350-352 (1)
350-352: Use `logging.exception()` for better stack traces.

```diff
 except Exception as e:
-    memory_logger.error(f"Error deleting all memories: {e}")
+    memory_logger.exception(f"Error deleting all memories: {e}")
     return 0
```

Based on learnings, prefer `logging.exception()` to automatically log the full stack trace.

backends/advanced/src/advanced_omi_backend/services/memory/providers/mycelia.py-109-117 (1)
109-117: Move imports to top of file.

Importing `generate_jwt_for_user` and `User` inside `_get_user_jwt` violates the coding guidelines requiring all imports at the top of the file.

```diff
 from ..base import MemoryEntry, MemoryServiceBase
 from ..prompts import FACT_RETRIEVAL_PROMPT, TEMPORAL_ENTITY_EXTRACTION_PROMPT, TemporalEntity
 from ..config import MemoryConfig
 from .llm_providers import _get_openai_client
+from advanced_omi_backend.auth import generate_jwt_for_user
+from advanced_omi_backend.users import User
```

Then remove the inline imports:

```diff
 async def _get_user_jwt(self, user_id: str, user_email: Optional[str] = None) -> str:
-    from advanced_omi_backend.auth import generate_jwt_for_user
-
     # If email not provided, lookup user
     if not user_email:
-        from advanced_omi_backend.users import User
         user = await User.get(user_id)
```

As per coding guidelines, imports must be at the top of the file.
backends/advanced/src/advanced_omi_backend/services/memory/providers/mycelia.py-218-223 (1)
218-223: Use `logging.exception()` and chain exceptions properly.

```diff
 except httpx.HTTPStatusError as e:
-    memory_logger.error(f"Mycelia API error: {e.response.status_code} - {e.response.text}")
-    raise RuntimeError(f"Mycelia API error: {e.response.status_code}")
+    memory_logger.exception(f"Mycelia API error: {e.response.status_code}")
+    raise RuntimeError(f"Mycelia API error: {e.response.status_code}") from e
 except Exception as e:
-    memory_logger.error(f"Failed to call Mycelia resource: {e}")
-    raise RuntimeError(f"Mycelia API call failed: {e}")
+    memory_logger.exception("Failed to call Mycelia resource")
+    raise RuntimeError(f"Mycelia API call failed: {e}") from e
```

Based on learnings.
backends/advanced/src/advanced_omi_backend/services/memory/providers/mycelia.py-820-827 (1)
820-827: `shutdown()` cannot properly close the async HTTP client.

The `shutdown` method is synchronous but `httpx.AsyncClient` requires `await client.aclose()` for proper cleanup. This could cause resource leaks.

```diff
-    def shutdown(self) -> None:
+    async def shutdown(self) -> None:
         """Shutdown Mycelia client and cleanup resources."""
         memory_logger.info("Shutting down Mycelia memory service")
         if self._client:
-            # Note: httpx AsyncClient should be closed in an async context
-            # In practice, this will be called during shutdown so we log a warning
-            memory_logger.warning("HTTP client should be closed with await client.aclose()")
+            await self._client.aclose()
         self._initialized = False
```

If the base class requires a sync method, consider implementing `__aenter__` and `__aexit__` for context manager support, or track whether proper async cleanup occurred.

Committable suggestion skipped: line range outside the PR's diff.
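The `__aenter__`/`__aexit__` suggestion can be sketched generically. The client class below is a stand-in for `httpx.AsyncClient` (whose `aclose()` coroutine the review references), and the service/method names are illustrative:

```python
import asyncio

class FakeAsyncClient:
    """Stand-in for httpx.AsyncClient: cleanup must be awaited."""
    def __init__(self):
        self.closed = False

    async def aclose(self):
        self.closed = True

class MemoryService:
    """Async context manager, so callers cannot forget the awaited cleanup."""
    def __init__(self):
        self._client = None

    async def __aenter__(self):
        self._client = FakeAsyncClient()
        return self

    async def __aexit__(self, exc_type, exc, tb):
        if self._client:
            # The awaited close that a synchronous shutdown() cannot perform.
            await self._client.aclose()

async def main() -> bool:
    async with MemoryService() as svc:
        client = svc._client
    return client.closed  # cleanup ran on exit

print(asyncio.run(main()))  # True
```

The same shape works with the real `httpx.AsyncClient`, since `async with` guarantees `__aexit__` runs even when the body raises.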
backends/advanced/src/advanced_omi_backend/services/mycelia_sync.py-79-95 (1)
79-95: MongoClient instances are never closed, causing potential resource leaks.

Multiple `MongoClient` instances are created throughout this file (lines 80, 144, 186) without being closed. Each unclosed client holds open connections to MongoDB.

Consider using a context manager or storing the client as an instance variable:

```diff
 class MyceliaSyncService:
     """Synchronize Friend-Lite users with Mycelia OAuth API keys."""

     def __init__(self):
         """Initialize the sync service."""
         # MongoDB configuration
         self.mongo_url = os.getenv("MONGODB_URI", os.getenv("MONGO_URL", "mongodb://localhost:27017"))
+        self._client: Optional[MongoClient] = None
         # ... rest of init ...

+    def _get_client(self) -> MongoClient:
+        """Get or create MongoDB client."""
+        if self._client is None:
+            self._client = MongoClient(self.mongo_url)
+        return self._client
+
+    def close(self) -> None:
+        """Close MongoDB client."""
+        if self._client:
+            self._client.close()
+            self._client = None
```

Committable suggestion skipped: line range outside the PR's diff.
backends/advanced/src/advanced_omi_backend/services/memory/providers/mycelia.py-82-94 (1)
82-94: Use `logging.exception()` and chain exceptions with `raise ... from e`.

Per retrieved learnings, use `logging.exception()` inside except blocks to automatically include stack traces, and chain exceptions with `raise ... from e` to preserve context.

```diff
 except httpx.HTTPError as e:
-    raise RuntimeError(f"Failed to connect to Mycelia service: {e}")
+    raise RuntimeError(f"Failed to connect to Mycelia service: {e}") from e

 self._initialized = True
 memory_logger.info("✅ Mycelia memory service initialized successfully")

 except Exception as e:
-    memory_logger.error(f"❌ Failed to initialize Mycelia service: {e}")
-    raise RuntimeError(f"Mycelia initialization failed: {e}")
+    memory_logger.exception("❌ Failed to initialize Mycelia service")
+    raise RuntimeError(f"Mycelia initialization failed: {e}") from e
```

Based on learnings: prefer `logging.exception()` in except blocks and chain exceptions with `raise ... from e`.

backends/advanced/src/advanced_omi_backend/services/mycelia_sync.py-148-148 (1)
148-148: Move import to top of file.

Importing `bson.ObjectId` inside `sync_user_to_mycelia` violates the coding guidelines which require all imports at the top of the file.

```diff
 import logging
 import os
 import secrets
 import hashlib
 import base64
 from typing import Optional, Tuple

 from pymongo import MongoClient
 from datetime import datetime
+from bson import ObjectId
```

Then update line 148:

```diff
-        from bson import ObjectId
         users_collection.update_one(
             {"_id": ObjectId(user_id)},
```

As per coding guidelines, imports must be at the top of the file.
Docs/setup/prerequisites.md-106-106 (1)
106-106: Wrap bare URLs in Markdown link syntax.

The static analysis tool flagged 12 bare URLs that should be wrapped in Markdown link syntax for proper rendering. Replace each bare URL with `[text](url)` format.

Apply these diffs to wrap the bare URLs:

```diff
-**Sign up**: https://platform.openai.com/signup
+**Sign up**: [https://platform.openai.com/signup](https://platform.openai.com/signup)

-1. Go to https://platform.openai.com/api-keys
+1. Go to [https://platform.openai.com/api-keys](https://platform.openai.com/api-keys)

-**Sign up**: https://console.deepgram.com/signup
+**Sign up**: [https://console.deepgram.com/signup](https://console.deepgram.com/signup)

-1. Go to https://console.deepgram.com/
+1. Go to [https://console.deepgram.com/](https://console.deepgram.com/)

-**Sign up**: https://console.mistral.ai/
+**Sign up**: [https://console.mistral.ai/](https://console.mistral.ai/)

-1. Go to https://console.mistral.ai/api-keys
+1. Go to [https://console.mistral.ai/api-keys](https://console.mistral.ai/api-keys)

-**Sign up**: https://huggingface.co/join
+**Sign up**: [https://huggingface.co/join](https://huggingface.co/join)

-1. Go to https://huggingface.co/settings/tokens
+1. Go to [https://huggingface.co/settings/tokens](https://huggingface.co/settings/tokens)

-**Sign up**: https://console.groq.com/
+**Sign up**: [https://console.groq.com/](https://console.groq.com/)

-**Sign up**: https://login.tailscale.com/start
+**Sign up**: [https://login.tailscale.com/start](https://login.tailscale.com/start)

-- Check your console dashboard at https://console.deepgram.com/
+- Check your console dashboard at [https://console.deepgram.com/](https://console.deepgram.com/)

-**GitHub Issues**: https://github.com/BasedHardware/Friend/issues
+**GitHub Issues**: [https://github.com/BasedHardware/Friend/issues](https://github.com/BasedHardware/Friend/issues)
```

Also applies to: 109-109, 123-123, 126-126, 146-146, 149-149, 159-159, 162-162, 173-173, 190-190, 333-333, 345-345
🟡 Minor comments (18)
backends/advanced/webui/src/hooks/useD3Zoom.ts-57-75 (1)
57-75: Removetransformfrom dependency array to avoid redundant effect execution.Including
transformin dependencies causes this effect to run on every zoom event (sincehandleZoomcallssetTransform). The effect's purpose is to attach the zoom behavior, not to respond to transform changes—D3 already maintains__zoominternally during zoom events.Additionally, the
as anyassertion on line 69 bypasses type safety.useEffect(() => { if (!svgRef.current) return const svg = d3.select(svgRef.current) const node = svg.node() if (node) { node.style.touchAction = 'none' - node.style.webkitUserSelect = 'none' + ;(node.style as CSSStyleDeclaration & { webkitUserSelect: string }).webkitUserSelect = 'none' node.style.userSelect = 'none' } - svg.call(zoomBehavior as any) - svg.property('__zoom', transform) + svg.call(zoomBehavior) return () => { svg.on('.zoom', null) } - }, [zoomBehavior, transform]) + }, [zoomBehavior])If you need to programmatically set the initial transform, consider doing so only on mount or when explicitly requested, rather than on every transform change.
Committable suggestion skipped: line range outside the PR's diff.
backends/advanced/webui/src/pages/MyceliaTimeline.tsx-142-148 (1)
142-148: Missing `loadMemories` in useEffect dependency array.

The `loadMemories` function is called inside this effect but not listed in the dependency array. This could cause stale closures. Either add `loadMemories` to the deps or wrap it in `useCallback`.

```diff
 useEffect(() => {
   if (!useDemoData) {
     loadMemories()
   } else {
     setMemories(getDemoMemories())
   }
-}, [user?.id, useDemoData])
+}, [user?.id, useDemoData]) // Consider wrapping loadMemories in useCallback and adding to deps
```

Committable suggestion skipped: line range outside the PR's diff.
backends/advanced/webui/src/pages/Memories.tsx-475-479 (1)
475-479: Add keyboard accessibility to the clickable memory card.

The memory card content area is clickable but lacks keyboard accessibility. Users should be able to navigate and activate the card using keyboard.

```diff
 {/* Clickable Memory Content Area */}
 <div
   onClick={() => navigate(`/memories/${memory.id}`)}
-  className="p-6 cursor-pointer"
+  className="p-6 cursor-pointer"
+  role="button"
+  tabIndex={0}
+  onKeyDown={(e) => {
+    if (e.key === 'Enter' || e.key === ' ') {
+      e.preventDefault()
+      navigate(`/memories/${memory.id}`)
+    }
+  }}
 >
```

backends/advanced/webui/public/frappe-gantt.css-1-1 (1)
1-1: Fix CSS variable usage and invalid `background-color` value

Two small correctness issues in this minified stylesheet:

1. CSS variable missing `var()`

`--g-weekend-label-color` is used directly as a color:

```css
.gantt-container .holiday-label{ ... background:--g-weekend-label-color; ... }
```

This should go through `var()`:

```diff
-.gantt-container .holiday-label{position:absolute;top:0;left:0;opacity:0;z-index:1000;background:--g-weekend-label-color;border-radius:5px;padding:2px 5px}
+.gantt-container .holiday-label{position:absolute;top:0;left:0;opacity:0;z-index:1000;background:var(--g-weekend-label-color);border-radius:5px;padding:2px 5px}
```

2. Invalid `background-color: brightness(97%)`

`brightness()` is a filter function, not a color value:

```css
.gantt-container .popup-wrapper .action-btn:hover{background-color:brightness(97%)}
```

You likely intended to dim via a filter:

```diff
-.gantt-container .popup-wrapper .action-btn:hover{background-color:brightness(97%)}
+.gantt-container .popup-wrapper .action-btn:hover{filter:brightness(0.97)}
```

These changes should clear the linter warning and ensure expected hover and holiday-label styling.

INSTALL.md-96-96 (1)
INSTALL.md (1)

96-96: Fix inconsistent directory name.

This line clones the repository to `friend-lite`, while all other examples in this document use `chronicle`. This inconsistency could confuse users following different sections.

Apply this diff to maintain consistency:

```diff
 # Clone and setup Chronicle
-git clone https://github.com/BasedHardware/Friend.git friend-lite
-cd friend-lite
+git clone https://github.com/BasedHardware/Friend.git chronicle
+cd chronicle
 make wizard
```

Committable suggestion skipped: line range outside the PR's diff.
backends/advanced/src/advanced_omi_backend/services/memory/service_factory.py (1)

56-67: Add exception chaining for better debuggability.

The Mycelia provider integration follows the established pattern correctly. However, the exception handling should chain exceptions to preserve the original context.

Based on learnings, prefer exception chaining with `raise ... from e` to preserve the original context.

Apply this diff to improve exception handling:

```diff
 try:
     from .providers.mycelia import MyceliaMemoryService
 except ImportError as e:
-    raise RuntimeError(f"Mycelia memory service not available: {e}")
+    raise RuntimeError(f"Mycelia memory service not available: {e}") from e
```
44-54: Add exception chaining for OpenMemory MCP provider.The exception handling should chain exceptions to preserve context, consistent with the coding guidelines.
Based on learnings, prefer exception chaining with 'raise ... from e'.
Apply this diff:
try: from .providers.openmemory_mcp import OpenMemoryMCPService except ImportError as e: - raise RuntimeError(f"OpenMemory MCP service not available: {e}") + raise RuntimeError(f"OpenMemory MCP service not available: {e}") from ebackends/advanced/src/advanced_omi_backend/routers/modules/system_routes.py-131-145 (1)
131-145: Align POST /admin/memory/provider docstring with actual behaviorThe route only updates MEMORY_PROVIDER and returns a message that a restart is required; it does not itself restart backend services. Consider updating the docstring to avoid implying an automatic restart. The
current_userargument being unused in the body is fine here since it exists purely to enforce admin auth viacurrent_superuser.backends/advanced/webui/src/pages/MemoryDetail.tsx-89-91 (1)
89-91: MissingloadMemoryin useEffect dependency array.The
loadMemoryfunction is called but not listed as a dependency. This violates React's exhaustive-deps rule and can cause stale closure issues.Wrap
loadMemoryinuseCallbackor inline the effect logic:+ const loadMemory = useCallback(async () => { + // ... existing implementation + }, [user?.id, id]) + useEffect(() => { loadMemory() - }, [id, user?.id]) + }, [loadMemory])Alternatively, add
// eslint-disable-next-line react-hooks/exhaustive-depswith justification if intentional.Committable suggestion skipped: line range outside the PR's diff.
WIZARD.md (1)

289-293: Add language specifier to fenced code block.

The command output example should have a language specifier.

````diff
-```
+```text
 anubis    100.83.66.30    anubis.tail12345.ts.net    linux   -
````

WIZARD.md (1)

251-267: Add language specifier to fenced code block.

The directory structure code block should have a language specifier for consistent markdown rendering.

````diff
-```
+```text
 friend-lite/
 ├── .env.secrets          # Secrets (gitignored)
````
214-220:loadMemoriesfunction should be in useEffect dependency array or wrapped withuseCallback.Similar to the issue in ReactGanttTimeline, the effect calls
loadMemoriesbut it's not in the dependency array.Docs/development/DOCKER-SETUP-SUMMARY.md-7-17 (1)
7-17: Replace user-specific absolute path with a generic placeholder.The path
/Users/stu/repos/friend-lite/is specific to one developer's machine. Use a generic path or placeholder for documentation.-/Users/stu/repos/friend-lite/ # PROJECT ROOT +/path/to/friend-lite/ # PROJECT ROOTbackends/advanced/webui/src/pages/ReactGanttTimeline.tsx-150-150 (1)
150-150: Inconsistent API call:user.idnot passed tomemoriesApi.getAll().This component calls
memoriesApi.getAll()without the user ID, whereasFrappeGanttTimeline.tsx(line 157) passesuser.id. This inconsistency may cause different data fetching behavior between the two timeline implementations.- const response = await memoriesApi.getAll() + const response = await memoriesApi.getAll(user?.id)backends/advanced/webui/src/pages/FrappeGanttTimeline.tsx-606-606 (1)
606-606: Duplicate CSS class: `focus:ring-2` appears twice.

```diff
-        className="px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-gray-100 text-sm focus:ring-2 focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
+        className="px-3 py-2 border border-gray-300 dark:border-gray-600 rounded-lg bg-white dark:bg-gray-700 text-gray-900 dark:text-gray-100 text-sm focus:ring-2 focus:ring-blue-500 focus:border-blue-500"
```
179-183: MissingfetchMemoriesWithTimeRangesin useEffect dependency array.The effect depends on
userbut callsfetchMemoriesWithTimeRanges. If the function reference changes, the effect won't re-run. Either wrap the function withuseCallbackor include it in the dependency array.useEffect(() => { if (user) { fetchMemoriesWithTimeRanges() } - }, [user]) + }, [user]) // Consider wrapping fetchMemoriesWithTimeRanges in useCallbackCommittable suggestion skipped: line range outside the PR's diff.
backends/advanced/Docs/mycelia-auth-and-ownership.md (1)

165-198: Add a language specifier to the fenced code block for the diagram.

The ASCII diagram block lacks a language specifier. While the diagram renders fine, adding a `text` or empty language identifier satisfies linting rules.

````diff
-```
+```text
 ┌─────────────────────────────────────────────────────────────┐
 │ 1. HTTP Request with JWT Token                              │
````
242-247: Convert bare URL to proper markdown link.The bare URL should be wrapped in a markdown link for better formatting and accessibility.
-Access Mycelia through Friend-Lite at **http://localhost:5173/memories** +Access Mycelia through Friend-Lite at **[http://localhost:5173/memories](http://localhost:5173/memories)**