
Commit 499ec35

Aegisclaude authored and committed
fix: handle Workers AI non-string response in conversation-facts
Workers AI can return a string or an object depending on the model. Handle both shapes to prevent the 70% error rate seen in aegis#416.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1 parent 997bc55 commit 499ec35

1 file changed: 4 additions & 2 deletions

web/src/kernel/scheduled/conversation-facts.ts

@@ -89,8 +89,10 @@ async function askAi(
     const result = await env.ai.run(
       '@cf/meta/llama-3.3-70b-instruct-fp8-fast' as Parameters<Ai['run']>[0],
       { messages: [{ role: 'system', content: system }, { role: 'user', content: user }] },
-    ) as { response?: string; choices?: Array<{ message?: { content?: string } }> };
-    return result.choices?.[0]?.message?.content ?? result.response ?? '';
+    );
+    if (typeof result === 'string') return result;
+    const obj = result as { response?: string; choices?: Array<{ message?: { content?: string } }> };
+    return obj.choices?.[0]?.message?.content ?? obj.response ?? '';
   }
   return askGroq(env.groqApiKey, env.groqResponseModel, system, user, env.groqBaseUrl);
 }
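The normalization this commit performs can be isolated into a small standalone helper. A minimal sketch, assuming the two response shapes described in the commit message (the `AiResult` type and `extractText` name are illustrative, not part of the commit):

```typescript
// Workers AI's run() may return either a plain string or an
// OpenAI-style object, depending on the model invoked.
type AiResult =
  | string
  | { response?: string; choices?: Array<{ message?: { content?: string } }> };

// Normalize both shapes to a single string, preferring the
// chat-completion content, then the flat response field.
function extractText(result: AiResult): string {
  if (typeof result === 'string') return result;
  return result.choices?.[0]?.message?.content ?? result.response ?? '';
}
```

The `typeof` check narrows the union before any property access, so neither shape triggers a runtime error; an empty object falls through to `''`.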
