From OpenAI Assistants
Migrate threads and messages from the OpenAI Assistants API into Sonzai. One thread per user, full message history preserved as chat transcripts.
What you're migrating
The OpenAI Assistants API persists conversation history inside threads. Each thread holds an ordered list of messages. Typically one thread maps to one end-user of your product, so the natural migration is:
- One OpenAI thread → one Sonzai user (keyed by your own `user_id`)
- Thread messages → one `chat_transcript` content block per thread (or several, if the thread is very long)
- Assistant instructions stay in your Sonzai agent config — they don't move with the user.
Heads up
OpenAI's Assistants API was announced for deprecation in favour of the Responses API. Thread and message endpoints remain readable for now, and the export snippets below work against either. If your app has moved to Responses, see Custom JSON instead.
1. Export threads and messages from OpenAI
You need the OpenAI thread ID and, for each thread, the end-user identifier you want to key Sonzai memory against. That mapping usually lives in your own database (a users table that stores openai_thread_id).
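As a concrete shape for that mapping, here is a minimal sketch. The `UserRow` field names and sample values are illustrative; align them with your own schema:

```typescript
// Sketch of the per-user mapping you need before exporting.
// Field names are assumptions; adapt them to your own schema.
interface UserRow {
  user_id: string;          // your stable end-user key, reused in Sonzai
  display_name: string;
  email: string;
  openai_thread_id: string; // stored when the OpenAI thread was created
}

// In practice these rows come from your database, e.g. a `users` table.
const threadMap: UserRow[] = [
  {
    user_id: "user_123",
    display_name: "Mia Tanaka",
    email: "[email protected]",
    openai_thread_id: "thread_abc123",
  },
];
```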
```typescript
import OpenAI from "openai";

const openai = new OpenAI();

// threadMap: your DB mapping of your user_id -> OpenAI thread_id
async function exportThread(threadId: string): Promise<string> {
  const all: string[] = [];
  let after: string | undefined;
  while (true) {
    const page = await openai.beta.threads.messages.list(threadId, {
      order: "asc",
      limit: 100,
      ...(after ? { after } : {}),
    });
    for (const m of page.data) {
      // Keep only text parts; image and file parts have no transcript form
      const text = m.content
        .filter((c) => c.type === "text")
        .map((c) => (c as { text: { value: string } }).text.value)
        .join("\n");
      all.push(`${m.role === "user" ? "User" : "Agent"}: ${text}`);
    }
    if (!page.has_more) break;
    after = page.data[page.data.length - 1].id;
  }
  return all.join("\n\n");
}
```

2. Map to Sonzai's batch import shape
Wrap each exported transcript in a chat_transcript content block and assemble the batch. Keep user_id stable across systems — if your app already uses user_123 as the key, use it here too.
```json
{
  "source": "openai_assistants",
  "users": [
    {
      "user_id": "user_123",
      "display_name": "Mia Tanaka",
      "metadata": {
        "email": "[email protected]",
        "custom": { "openai_thread_id": "thread_abc123" }
      },
      "content": [
        {
          "type": "chat_transcript",
          "body": "User: My name is Mia...\nAgent: Nice to meet you..."
        }
      ]
    }
  ]
}
```

The `openai_thread_id` in `custom` is optional but useful for debugging — Sonzai stores `custom` fields verbatim, so you can cross-reference back to OpenAI if needed.
3. Import into Sonzai
Batch up to a few hundred users per call. Bigger migrations should page through in batches of ~200.
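The ~200-users-per-call guidance is easy to enforce with a small chunking helper. This is a generic sketch; `chunk` is a local utility, not part of the Sonzai SDK:

```typescript
// Split an array of rows into batches of at most `size` elements,
// so each import call stays within the recommended ~200 users.
function chunk<T>(rows: T[], size = 200): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Usage with the migrateBatch function shown below:
//   for (const batch of chunk(allRows)) await migrateBatch(batch);
```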
```typescript
import { Sonzai } from "@sonzai-labs/agents";

const sonzai = new Sonzai({ apiKey: process.env.SONZAI_API_KEY! });
const AGENT_ID = "agent_abc";

// userMap comes from your DB: { user_id, display_name, email, openai_thread_id }
// Note: Promise.all fans out one OpenAI export per user; for large batches,
// consider limiting concurrency to stay within OpenAI rate limits.
async function migrateBatch(userMap: UserRow[]) {
  const users = await Promise.all(userMap.map(async (u) => ({
    user_id: u.user_id,
    display_name: u.display_name,
    metadata: {
      email: u.email,
      custom: { openai_thread_id: u.openai_thread_id },
    },
    content: [{
      type: "chat_transcript",
      body: await exportThread(u.openai_thread_id),
    }],
  })));

  const job = await sonzai.agents.priming.batchImport(AGENT_ID, {
    source: "openai_assistants",
    users,
  });
  console.log(`Queued ${job.total_users} users, facts from metadata: ${job.facts_created}`);
  return job.job_id;
}
```

4. Verify
Poll until the job completes, then spot-check a user.
```shell
curl -s https://api.sonz.ai/api/v1/agents/agent_abc/users/import/$JOB_ID \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '.status,.facts_stored,.errors'

# Spot-check extracted facts for one user
curl -s "https://api.sonz.ai/api/v1/agents/agent_abc/memory/facts?user_id=user_123&limit=20" \
  -H "Authorization: Bearer $SONZAI_API_KEY" | jq '.facts[].content'
```

A healthy job ends with `status: "completed"`, `errors: []`, and a non-zero `facts_stored`.
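If you'd rather verify from code than curl, the same health check can be sketched as a small helper. This assumes only the job fields inspected above (`status`, `facts_stored`, `errors`); the `"failed"` terminal status and the 5-second polling interval are assumptions:

```typescript
// Shape of the job status payload, per the fields inspected above.
interface ImportJobStatus {
  status: string;
  facts_stored: number;
  errors: unknown[];
}

// A job is healthy when it completed with no errors and stored some facts.
function isHealthy(job: ImportJobStatus): boolean {
  return job.status === "completed" && job.errors.length === 0 && job.facts_stored > 0;
}

// Poll the status endpoint from the curl example until the job settles.
// Assumes a "failed" terminal status in addition to "completed".
async function waitForImport(jobId: string): Promise<ImportJobStatus> {
  const url = `https://api.sonz.ai/api/v1/agents/agent_abc/users/import/${jobId}`;
  for (;;) {
    const res = await fetch(url, {
      headers: { Authorization: `Bearer ${process.env.SONZAI_API_KEY}` },
    });
    const job = (await res.json()) as ImportJobStatus;
    if (job.status === "completed" || job.status === "failed") return job;
    await new Promise((r) => setTimeout(r, 5000)); // poll every 5s
  }
}
```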
Tips
- Run steps (tool calls) aren't included by default. `messages.list` returns only the assistant's final replies and user turns, which is exactly what you want for a transcript. If you also want tool-call traces in Sonzai's memory, append them as separate `{ "type": "note", "body": "..." }` blocks.
- Long threads. If a thread exceeds ~30k tokens, split it into multiple `chat_transcript` blocks (e.g. one per week). The extractor processes them independently.
- Personas. OpenAI's assistant-level instructions describe the agent, not the user. Don't import them here — put them in your Sonzai agent's personality config instead.
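For the long-thread case, a simple splitter keeps each `chat_transcript` block under a character budget (~4 characters per token is a rough heuristic, so ~30k tokens ≈ 120k characters). This is a sketch that splits on the blank line between turns produced by `exportThread`; the budget and heuristic are assumptions:

```typescript
// Split a long transcript into multiple chat_transcript blocks by
// character budget, breaking only on the blank line between turns.
function splitTranscript(transcript: string, maxChars = 120_000) {
  const blocks: { type: "chat_transcript"; body: string }[] = [];
  let current: string[] = [];
  let size = 0;
  for (const turn of transcript.split("\n\n")) {
    if (size + turn.length > maxChars && current.length > 0) {
      blocks.push({ type: "chat_transcript", body: current.join("\n\n") });
      current = [];
      size = 0;
    }
    current.push(turn);
    size += turn.length + 2; // +2 for the "\n\n" separator
  }
  if (current.length > 0) {
    blocks.push({ type: "chat_transcript", body: current.join("\n\n") });
  }
  return blocks;
}
```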
What's next
- Custom JSON — if you're on the Responses API or have your own store.
- Conversations — how ongoing chats add to memory automatically after migration.