# From Letta (MemGPT)
Migrate core memory blocks and archival memory from Letta (formerly MemGPT) agents into Sonzai. Human blocks become user metadata; archival rows become content.
## What you're migrating
Letta (formerly MemGPT) stores memory in two layers:
- Core memory — structured blocks always in context. Default blocks are `human` (what the agent knows about the user) and `persona` (who the agent is).
- Archival memory — an append-only store of longer recollections the agent can search on demand.
Sonzai keeps the user and the agent strictly separate — never migrate persona blocks into a user. The mapping:
| Letta | Sonzai |
|---|---|
| Core memory `human` block | User metadata + one `text` content block |
| Core memory `persona` block | Not migrated — configure in your agent personality instead |
| Archival memory rows | text content blocks on the user |
| Recall memory (message history) | chat_transcript content blocks |
| Letta user / agent IDs | custom metadata for cross-reference |
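Before wiring up the export, each mapped user can be sanity-checked against this shape. A minimal sketch, with two assumptions: `validate_user` is a hypothetical helper (not part of the Sonzai SDK), and the required keys and block types simply mirror the import shape shown in step 2 — the real API may accept more fields.

```python
REQUIRED_USER_KEYS = {"user_id", "content"}
ALLOWED_BLOCK_TYPES = {"text", "chat_transcript"}  # the two types used in this guide

def validate_user(user: dict) -> list:
    """Return a list of problems with one mapped user payload (empty list = OK).

    Hypothetical pre-flight check; it only validates the shape used in this
    guide, not the full Sonzai import schema.
    """
    errors = []
    for key in sorted(REQUIRED_USER_KEYS - user.keys()):
        errors.append(f"missing key: {key}")
    for i, block in enumerate(user.get("content", [])):
        if block.get("type") not in ALLOWED_BLOCK_TYPES:
            errors.append(f"content[{i}]: unexpected type {block.get('type')!r}")
        if not (block.get("body") or "").strip():
            errors.append(f"content[{i}]: empty body")
    return errors
```

Running it over every mapped user before the batch import catches empty archival rows and stray `persona` blocks early, instead of surfacing them as import-job errors.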
## 1. Export from Letta
```python
import os

from letta_client import Letta

letta = Letta(
    base_url=os.environ["LETTA_BASE_URL"],
    token=os.environ["LETTA_API_KEY"],
)

def export_letta_agent(agent_id: str):
    # Core memory blocks
    blocks = letta.agents.core_memory.list_blocks(agent_id=agent_id)
    human_block = next((b for b in blocks if b.label == "human"), None)

    # Archival memory (paginated)
    archival = []
    cursor = None
    while True:
        page = letta.agents.archival_memory.list(
            agent_id=agent_id, limit=100, after=cursor,
        )
        archival.extend(page)
        if len(page) < 100:
            break
        cursor = page[-1].id

    # Recall / message history
    messages = letta.agents.messages.list(agent_id=agent_id, limit=500)

    return {
        "human": human_block.value if human_block else "",
        "archival": [row.text for row in archival],
        "messages": messages,
    }
```

## 2. Map to Sonzai's import shape
One Letta agent typically represents one user relationship, so you end up with one Sonzai user per Letta agent.
```json
{
  "source": "letta",
  "users": [
    {
      "user_id": "user_123",
      "display_name": "Mia Tanaka",
      "metadata": {
        "custom": {
          "letta_agent_id": "agent-abc",
          "letta_user_id": "user-xyz"
        }
      },
      "content": [
        { "type": "text", "body": "Mia Tanaka is a platform engineer at Acme, lives in Tokyo, allergic to peanuts, loves hiking." },
        { "type": "text", "body": "Mia mentioned she's planning a trip to Hokkaido in March." },
        { "type": "chat_transcript", "body": "User: ...\nAgent: ..." }
      ]
    }
  ]
}
```

The `human` block goes in as a single `text` block — it's usually a paragraph of assertions written in natural language, which is exactly what the extractor prefers.
## 3. Import into Sonzai
```python
import os

from sonzai import Sonzai

sonzai = Sonzai(api_key=os.environ["SONZAI_API_KEY"])

AGENT_ID = "agent_abc"

def migrate_letta_agents(letta_agent_map):
    """letta_agent_map: [{letta_agent_id, user_id, display_name, letta_user_id}]"""
    users = []
    for row in letta_agent_map:
        exp = export_letta_agent(row["letta_agent_id"])

        content = []
        if exp["human"].strip():
            content.append({"type": "text", "body": exp["human"]})
        for text in exp["archival"]:
            content.append({"type": "text", "body": text})

        # Build a role-tagged transcript from Letta's messages
        if exp["messages"]:
            lines = []
            for m in exp["messages"]:
                role = "User" if m.role == "user" else "Agent"
                lines.append(f"{role}: {m.text or ''}")
            content.append({"type": "chat_transcript", "body": "\n".join(lines)})

        users.append({
            "user_id": row["user_id"],
            "display_name": row["display_name"],
            "metadata": {
                "custom": {
                    "letta_agent_id": row["letta_agent_id"],
                    "letta_user_id": row["letta_user_id"],
                },
            },
            "content": content,
        })
    return sonzai.agents.priming.batch_import(
        AGENT_ID, source="letta", users=users,
    )
```

## 4. Verify
```bash
curl -s https://api.sonz.ai/api/v1/agents/agent_abc/users/import/$JOB_ID \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '{status,facts_stored,errors}'
```
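For larger imports, that status check can be wrapped in a small polling loop. A sketch using only the standard library — note the `completed`/`failed` terminal states and the 2-second interval are assumptions, not documented values:

```python
import json
import os
import time
import urllib.request

SONZAI_BASE = "https://api.sonz.ai/api/v1"

def is_terminal(status: str) -> bool:
    # Assumed terminal states for an import job -- adjust to the real API.
    return status in ("completed", "failed")

def wait_for_import(agent_id: str, job_id: str, timeout_s: float = 300.0) -> dict:
    """Poll the import-status endpoint until the job reaches a terminal state."""
    req_headers = {"Authorization": f"Bearer {os.environ['SONZAI_API_KEY']}"}
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        req = urllib.request.Request(
            f"{SONZAI_BASE}/agents/{agent_id}/users/import/{job_id}",
            headers=req_headers,
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            job = json.load(resp)
        if is_terminal(job.get("status", "")):
            return job
        time.sleep(2)  # assumed polite polling interval
    raise TimeoutError(f"import job {job_id} still running after {timeout_s}s")
```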
```bash
curl -s "https://api.sonz.ai/api/v1/agents/agent_abc/memory/facts?user_id=user_123&limit=50" \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '.facts[].content'
```

## Tips
- Persona blocks stay behind. The `persona` core-memory block describes the agent — import it into your Sonzai agent's personality and bio via the Agents API, not into a user.
- Sleeptime blocks. Letta's sleeptime-agent output is high-quality summary text — import it as `text` blocks alongside archival rows. Sonzai will dedupe overlap with archival automatically.
- Tool-call traces in Letta's message history can be noisy. Filter out `tool_call`/`tool_return` messages before building the transcript unless the tool outcomes are semantically important (e.g. the agent booked a flight).
- One Letta agent = one Sonzai user. If you're tempted to import many Letta agents as one Sonzai user, don't — per-user mood, relationship state, and dedup all key off `user_id`. Sonzai handles multiple users per agent natively.
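The tool-message filter can be sketched as a pure function. This assumes each exported message carries a `role` and `text` field (shown here as dicts for clarity; the Letta client returns objects with the same attributes), and uses the `tool_call`/`tool_return` role names mentioned above:

```python
TOOL_ROLES = {"tool_call", "tool_return"}  # Letta message roles to drop

def build_transcript(messages, keep_tool_messages=False):
    """Build a role-tagged transcript from [{'role': ..., 'text': ...}] messages.

    Tool traffic is dropped by default; pass keep_tool_messages=True when the
    tool outcomes are semantically important (e.g. the agent booked a flight).
    """
    lines = []
    for m in messages:
        if m["role"] in TOOL_ROLES and not keep_tool_messages:
            continue  # skip noisy tool call/return traces
        speaker = "User" if m["role"] == "user" else "Agent"
        lines.append(f"{speaker}: {m.get('text') or ''}")
    return "\n".join(lines)
```

Swapping this in for the inline transcript loop in step 3 keeps the `chat_transcript` block focused on the actual conversation.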
## What's next
- Personality — where the persona block belongs in Sonzai.
- Memory — how archival and recall collapse into one constellation.
# From Mem0
Migrate per-user memories from Mem0 into Sonzai. Each Mem0 memory becomes a text block that Sonzai re-ingests, deduplicates, and links into its constellation.
# From LangChain memory
Migrate chat history from LangChain's ConversationBufferMemory, ChatMessageHistory, and persistent backing stores (Redis, Postgres, DynamoDB) into Sonzai.