# From LangChain memory

Migrate chat history from LangChain's `ConversationBufferMemory`, `ChatMessageHistory`, and persistent backing stores (Redis, Postgres, DynamoDB) into Sonzai.
## What you're migrating

LangChain's memory layer is mostly a wrapper around an ordered list of `BaseMessage` objects per session. The common classes all expose the same read surface:

- `ConversationBufferMemory` — keeps the full message list in memory (or in the attached history store).
- `ConversationSummaryMemory` / `ConversationSummaryBufferMemory` — keeps a rolling summary plus recent messages.
- `ChatMessageHistory` backed by `RedisChatMessageHistory`, `PostgresChatMessageHistory`, `DynamoDBChatMessageHistory`, etc. — the same messages persisted across restarts.
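All three read the same way. A minimal sketch with `ConversationBufferMemory` (classic `langchain.memory` import path; newer releases mark it deprecated, but the read surface is unchanged):

```python
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(return_messages=True)
memory.chat_memory.add_user_message("Where's my order?")
memory.chat_memory.add_ai_message("It shipped yesterday.")

# The read surface the migration below relies on:
for msg in memory.chat_memory.messages:  # list[BaseMessage]
    print(msg.type, msg.content)  # "human" / "ai"
```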
The mapping into Sonzai:
| LangChain | Sonzai |
|---|---|
| `session_id` (or LangGraph `thread_id`) | `session_id` — reuse it verbatim on `sessions.start` / `sessions.end` once you're live. For the bulk historical import below, each session becomes one `chat_transcript` block; the session ID is preserved as a header inside the block. |
| User identity (whatever your app keys on — `user_id` field, session owner, etc.) | `user_id` — must be stable across all of that user's sessions so Sonzai can build per-user memory. |
| `HumanMessage` / `AIMessage` list | one `chat_transcript` content block |
| Summary (from `ConversationSummaryMemory`) | `text` content block |
| `additional_kwargs` metadata | fold into transcript or drop |
If one user owns multiple LangChain sessions (common), pack them all into the same user's `content` array — one `chat_transcript` block per session — so Sonzai's extractor can dedupe facts across them. For live chat going forward, wrap each conversation in `sessions.start` / `sessions.end` and Sonzai will tag every extracted fact with its source `session_id` automatically. See Sessions.
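A hedged sketch of that live path (the `sessions.start` / `sessions.end` method names come from this guide, but the exact signatures are assumptions — see Sessions for the real API; `sonzai` and `AGENT_ID` are set up in step 3, and `run_chain` stands in for your own loop):

```python
# Hypothetical signatures -- check the Sessions page before copying.
session = sonzai.sessions.start(
    agent_id=AGENT_ID,
    user_id="user_123",               # same stable ID as the import
    session_id=langchain_session_id,  # reuse your LangChain session_id verbatim
)
try:
    run_chain()  # your existing LangChain/LangGraph turn loop, unchanged
finally:
    sonzai.sessions.end(session_id=langchain_session_id)
```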
## 1. Export from LangChain
```python
import os

from langchain_community.chat_message_histories import (
    RedisChatMessageHistory,  # swap for Postgres/DynamoDB/etc.
)

def export_langchain_session(session_id: str) -> str:
    """Return a role-tagged transcript string for one LangChain session."""
    history = RedisChatMessageHistory(
        session_id=session_id,
        url=os.environ["REDIS_URL"],
    )
    lines = []
    for msg in history.messages:
        role = "User" if msg.type == "human" else "Agent"
        lines.append(f"{role}: {msg.content}")
    return "\n".join(lines)
```

If you're on `ConversationSummaryMemory` rather than raw history, read `memory.moving_summary_buffer` and pass it as a separate `text` block:
```python
summary = memory.moving_summary_buffer  # string
```

## 2. Map to Sonzai's import shape
One user with multiple past sessions:
```json
{
  "source": "langchain",
  "users": [
    {
      "user_id": "user_123",
      "metadata": { "custom": { "langchain_session_ids": "sess-a,sess-b,sess-c" } },
      "content": [
        { "type": "chat_transcript", "body": "[session sess-a]\nUser: ...\nAgent: ..." },
        { "type": "chat_transcript", "body": "[session sess-b]\nUser: ...\nAgent: ..." },
        { "type": "text", "body": "Summary: User has been asking about migrating off LangChain for two weeks." }
      ]
    }
  ]
}
```

## 3. Import into Sonzai
```python
import os

from sonzai import Sonzai

sonzai = Sonzai(api_key=os.environ["SONZAI_API_KEY"])
AGENT_ID = "agent_abc"

def migrate_langchain_users(user_session_map):
    """user_session_map: { user_id: [session_id, ...] }"""
    users = []
    for user_id, session_ids in user_session_map.items():
        content = []
        for sid in session_ids:
            transcript = export_langchain_session(sid)
            if transcript:
                content.append({
                    "type": "chat_transcript",
                    "body": f"[session {sid}]\n{transcript}",
                })
        users.append({
            "user_id": user_id,
            "metadata": {
                "custom": {"langchain_session_ids": ",".join(session_ids)},
            },
            "content": content,
        })
    return sonzai.agents.priming.batch_import(
        AGENT_ID, source="langchain", users=users,
    )
```
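Kick it off with your own user-to-sessions mapping (the IDs below are placeholders, and the return shape of `batch_import` is an assumption — either way, keep the job ID for step 4):

```python
# Placeholder IDs -- substitute your real mapping.
job = migrate_langchain_users({
    "user_123": ["sess-a", "sess-b", "sess-c"],
    "user_456": ["sess-d"],
})
print(job)  # keep the job ID for the verification call in step 4
```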
## 4. Verify

```bash
curl -s https://api.sonz.ai/api/v1/agents/agent_abc/users/import/$JOB_ID \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '{status,facts_stored,errors}'
```
## Tips

- `SystemMessage` should be stripped. System prompts describe your chain, not the user. They're noise for the extractor. Filter `msg.type == "system"` before joining (see the sketch after this list).
- Tool-call messages (`FunctionMessage` / `ToolMessage`) depend on whether the tool outcome is a user-relevant fact. If the tool looked up the user's order history and the result made it into the assistant's reply, the reply already carries it — drop the raw tool message. If the tool output stands alone, keep it as a `text` block.
- LangGraph checkpointers (Postgres / SQLite / Redis) persist conversation state keyed by `thread_id`. The message list inside each checkpoint is the same `BaseMessage[]` — extract via `graph.get_state(config).values["messages"]` and proceed as above.
- If you're using a vector store for RAG, that's documents, not memory. Migrate those via Knowledge Base instead.
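Applying the first two tips to the step 1 exporter, a minimal variant (same Redis store as before; `"system"`, `"function"`, and `"tool"` are LangChain's own `msg.type` strings):

```python
def export_langchain_session_filtered(session_id: str) -> str:
    """Like export_langchain_session, but drops system and tool messages."""
    history = RedisChatMessageHistory(
        session_id=session_id,
        url=os.environ["REDIS_URL"],
    )
    lines = []
    for msg in history.messages:
        # Tip 1: system prompts describe the chain, not the user.
        # Tip 2: drop raw tool output unless it stands alone as a fact.
        if msg.type in ("system", "function", "tool"):
            continue
        role = "User" if msg.type == "human" else "Agent"
        lines.append(f"{role}: {msg.content}")
    return "\n".join(lines)
```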
## What's next
- Sessions — how to wrap post-migration live chat so session attribution is preserved on every extracted fact.
- Custom JSON — if your LangChain chain wraps a homegrown store that isn't one of the standard backends.
- Memory — how Sonzai handles ongoing memory after migration.
# From Letta (MemGPT)
Migrate core memory blocks and archival memory from Letta (formerly MemGPT) agents into Sonzai. Human blocks become user metadata; archival rows become content.
# From Character.AI / Replika
Migrate companion character chat exports from Character.AI, Replika, Chai, and similar companion apps into Sonzai. One character becomes one Sonzai agent; one user's conversation history becomes one Sonzai user.