@dhyaneesh
Summary
Issues: #178
This PR adds a first version of an export/import pipeline for Memori, plus tests and a small helper on SQLAlchemyDatabaseManager.

Changes

  • New module memori/database/export_import.py:
    • ExportManager for exporting data to JSON/SQL/SQLite (supports compression, batching, and encryption for SQLite exports).
    • ImportManager for importing data back into a target SQLAlchemyDatabaseManager, with basic failure‑mode handling.
  • DB helper SQLAlchemyDatabaseManager.get_session():
    • A thin wrapper around SessionLocal(), used by the export/import logic and by tests that need explicit transactions.
  • Tests
    • tests/export_import/test_export_import.py:
      • JSON export/import with compression.
      • SQLite export with encryption and import.
      • Large dataset export/import (streaming).
      • Resume‑token JSON import.
      • Failure modes (corrupted file, unsupported compression, etc.).
    • tests/integration/test_export_import_cross_db.py:
      • SQLite ↔ PostgreSQL export/import (skipped unless MEMORI_TEST_POSTGRES_URL is set).
  • Test helper
    • tests/conftest.py: create_simple_memory() to build ProcessedLongTermMemory objects for integration tests.
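
To illustrate the shape of the pipeline described above — rows exported in batches as compressed JSON, then read back and flattened on import — here is a minimal self-contained sketch. The function names, signatures, and row format are hypothetical stand-ins, not the actual ExportManager/ImportManager API; encryption and resume tokens are omitted.

```python
import gzip
import io
import json


def export_json(rows, fileobj, batch_size=100):
    # Hypothetical sketch: stream rows out in batches as
    # gzip-compressed JSON lines (one batch per line).
    with gzip.GzipFile(fileobj=fileobj, mode="wb") as gz:
        for i in range(0, len(rows), batch_size):
            batch = rows[i:i + batch_size]
            gz.write((json.dumps(batch) + "\n").encode("utf-8"))


def import_json(fileobj):
    # Read batches back one line at a time and flatten them,
    # so a large export never has to fit in memory at once.
    rows = []
    with gzip.GzipFile(fileobj=fileobj, mode="rb") as gz:
        for line in gz:
            rows.extend(json.loads(line))
    return rows


# Round-trip check on an in-memory buffer.
rows = [{"id": i, "text": f"memory {i}"} for i in range(5)]
buf = io.BytesIO()
export_json(rows, buf, batch_size=2)
buf.seek(0)
assert import_json(buf) == rows
```

Batching per line is what makes the "large dataset (streaming)" test case above feasible: the importer only ever holds one batch in memory while decoding.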

Testing

uv run --active pytest tests/export_import/test_export_import.py -v
uv run --active pytest tests/integration/test_export_import_cross_db.py -v  # with MEMORI_TEST_POSTGRES_URL set
uv run --active pytest -v -k "not azure_openai_env_test"
