# From Mem0
Migrate per-user memories from Mem0 into Sonzai. Each Mem0 memory becomes a text block that Sonzai re-ingests, deduplicates, and links into its constellation.
## What you're migrating
Mem0 stores memory as a flat list of short strings, one per assertion, keyed by user_id. Sonzai stores facts as a deduplicated graph with supersession (old assertions are retired when newer ones contradict them). The mapping is straightforward:
| Mem0 | Sonzai |
|---|---|
| `user_id` | `user_id` (keep it identical) |
| Memory string | text content block |
| Metadata dict | custom metadata |
| Categories | can be flattened into the type field or included in the body |
Why Sonzai re-extracts instead of treating Mem0's strings as final facts: Sonzai's dedup runs on normalized fact assertions, not free text. Passing the Mem0 strings through the extractor produces consistent shapes that dedupe correctly on subsequent imports or live chat.
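As a toy illustration of why raw strings dedupe poorly without normalization (this is not Sonzai's actual extractor, just a sketch of the idea):

```python
import re

def normalize(assertion: str) -> str:
    """Toy normalization: strip "[category]" prefixes, lowercase,
    drop punctuation. Sonzai's real extractor produces structured
    fact assertions, not cleaned strings -- this only shows why
    byte-for-byte comparison of free text misses duplicates."""
    s = re.sub(r"^\[[^\]]*\]\s*", "", assertion)  # drop "[food] " etc.
    s = re.sub(r"[^\w\s]", "", s.lower())         # drop punctuation
    return " ".join(s.split())

# Two imports of the same memory, worded slightly differently,
# only collide after normalization:
a = normalize("[food] Allergic to peanuts.")
b = normalize("Allergic to peanuts")
print(a == b)  # True; the raw strings are not equal
```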
## 1. Export from Mem0

```python
import os

from mem0 import MemoryClient

mem0 = MemoryClient(api_key=os.environ["MEM0_API_KEY"])

def export_mem0_user(user_id: str):
    memories = mem0.get_all(user_id=user_id)
    # memories is a list of dicts: {id, memory, metadata, categories, created_at, ...}
    return memories
```

## 2. Map to Sonzai's import shape
Each Mem0 memory becomes a text block. If you have categories, prepend them to the body for context — the extractor will key off them.
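For a single memory, that mapping can be written as a small helper (field names follow the Mem0 export shape from step 1):

```python
def to_content_block(memory: dict) -> dict:
    """Map one Mem0 memory dict to a Sonzai text content block,
    prepending flattened categories so the extractor can key off them."""
    cats = ",".join(memory.get("categories") or [])
    prefix = f"[{cats}] " if cats else ""
    return {"type": "text", "body": prefix + memory["memory"]}

print(to_content_block({"memory": "Allergic to peanuts.", "categories": ["food"]}))
# {'type': 'text', 'body': '[food] Allergic to peanuts.'}
```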
```json
{
  "source": "mem0",
  "users": [
    {
      "user_id": "user_123",
      "metadata": { "custom": { "mem0_user_id": "user_123" } },
      "content": [
        { "type": "text", "body": "[food] Allergic to peanuts." },
        { "type": "text", "body": "[hobbies] Loves hiking on weekends." },
        { "type": "text", "body": "[work] Lead platform engineer at Acme." }
      ]
    }
  ]
}
```

## 3. Import into Sonzai
```python
import os

from sonzai import Sonzai

sonzai = Sonzai(api_key=os.environ["SONZAI_API_KEY"])
AGENT_ID = "agent_abc"

def migrate_mem0_users(user_ids):
    users = []
    for uid in user_ids:
        memories = export_mem0_user(uid)
        content = []
        for m in memories:
            # Prepend flattened categories so the extractor can key off them
            cats = ",".join(m.get("categories") or [])
            prefix = f"[{cats}] " if cats else ""
            content.append({
                "type": "text",
                "body": prefix + m["memory"],
            })
        users.append({
            "user_id": uid,
            "metadata": {"custom": {"mem0_user_id": uid}},
            "content": content,
        })
    return sonzai.agents.priming.batch_import(
        AGENT_ID, source="mem0", users=users,
    )
```

## 4. Verify
```shell
# Job status
curl -s https://api.sonz.ai/api/v1/agents/agent_abc/users/import/$JOB_ID \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '{status,facts_stored,facts_deduped}'

# Facts for one migrated user
curl -s "https://api.sonz.ai/api/v1/agents/agent_abc/memory/facts?user_id=user_123&limit=50" \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '.facts[].content'
```

Expect `facts_stored` to be roughly the number of Mem0 memories you shipped, minus any that collapsed into duplicates.
## Tips
- **Metadata dicts from Mem0.** If a Mem0 memory had a `metadata` dict (e.g. `{"importance": "high"}`), inline it into the body as `"[importance=high] ..."`. Sonzai's `custom` metadata is user-level, not fact-level, so fact-scoped context belongs in the body.
- **Dedup against existing Sonzai memory.** If you're importing into an agent that already has facts for a user (e.g. from live chat), Sonzai will supersede rather than duplicate. The `facts_deduped` counter in the job status tells you how many collapsed.
- **Order.** Sonzai doesn't require a specific order, but if your Mem0 data has `created_at` timestamps and you care about temporal recency, send memories oldest-first: the extractor stamps its own timestamps, but content arriving later is treated as more recent.
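To get the oldest-first ordering from the last tip, sort each user's export on `created_at` before mapping it. This assumes Mem0 returns ISO-8601 timestamp strings, which sort lexicographically in time order (worth confirming against your actual export):

```python
def oldest_first(memories: list[dict]) -> list[dict]:
    """Sort Mem0 memories so the oldest is sent first.
    ISO-8601 `created_at` strings sort lexicographically in time order;
    memories without a timestamp sort to the front."""
    return sorted(memories, key=lambda m: m.get("created_at") or "")

# Illustrative data only:
ms = [
    {"memory": "Lead platform engineer at Acme.", "created_at": "2024-06-01T09:00:00Z"},
    {"memory": "Junior engineer at Acme.", "created_at": "2023-01-15T09:00:00Z"},
]
print([m["memory"] for m in oldest_first(ms)])
# ['Junior engineer at Acme.', 'Lead platform engineer at Acme.']
```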
## What's next
- Memory — how supersession works inside the constellation.
- Self-improvement — how agents use migrated memory to adapt over time.
# From Zep (getzep)
Migrate users, sessions, and extracted facts from Zep's memory graph into Sonzai. Preserve either the raw chat history, the already-extracted facts, or both.
# From Letta (MemGPT)
Migrate core memory blocks and archival memory from Letta (formerly MemGPT) agents into Sonzai. Human blocks become user metadata; archival rows become content.