fix(#32): surface LLM errors instead of '(no response)' #33

Merged

arniesaha merged 1 commit into main from fix/issue-32-surface-llm-errors
Apr 13, 2026
Conversation

@arniesaha
Owner

Summary

  • The pi-agent-core framework never throws on LLM errors; instead it creates an AssistantMessage with stopReason: "error" and an errorMessage field. The code ignored these fields, so users saw "(no response)".
  • Add an extractErrorFromTurn() helper to detect LLM errors in the message stream
  • Telegram: show "LLM error: <reason>" and mark the task failed
  • A2A sync: return a JSON-RPC error instead of a success response with empty text
  • Worker: post an error event instead of a complete event with an empty result
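The helper's job can be sketched roughly as below. This is a minimal sketch, not the shipped code: the message and stop-reason type shapes here are assumptions, and pi-agent-core's real types may differ.

```typescript
// Hypothetical minimal shapes; pi-agent-core's actual types may differ.
// The key behavior: an LLM failure never throws -- it arrives as an
// assistant message with stopReason "error" and the cause in errorMessage.
type AssistantMessage = {
  role: "assistant";
  stopReason: "end_turn" | "max_tokens" | "error";
  errorMessage?: string;
  text?: string;
};

type TurnMessage = AssistantMessage | { role: "user" | "tool"; text: string };

// Walk the turn's messages and surface the first LLM error, if any.
function extractErrorFromTurn(messages: TurnMessage[]): string | undefined {
  for (const msg of messages) {
    if (msg.role === "assistant" && msg.stopReason === "error") {
      // Fall back to a generic label when the provider gave no detail.
      return msg.errorMessage ?? "unknown LLM error";
    }
  }
  return undefined; // no error in this turn
}
```

Each caller (Telegram handler, A2A sync, worker) checks this before falling through to its old empty-result path.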

Test plan

  • 72/72 tests passing (6 new tests for extractErrorFromTurn())
  • Tests pass in a CI-like environment (no ~/max/data dir)
  • Verify in production: trigger a rate limit or a bad API key and confirm Telegram shows the error
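For the A2A sync change above, the failure path now returns a JSON-RPC error object rather than a success with empty text. A minimal sketch, assuming a standard JSON-RPC 2.0 envelope; the function name and the error code chosen here are illustrative, not the repo's actual values.

```typescript
// Hypothetical sketch of the A2A sync failure path. JSON-RPC 2.0 responses
// carry either "result" or "error", never both.
type JsonRpcErrorResponse = {
  jsonrpc: "2.0";
  id: string | number | null;
  error: { code: number; message: string };
};

// Wrap an LLM failure as a JSON-RPC error instead of an empty success.
function llmErrorResponse(
  id: string | number | null,
  reason: string,
): JsonRpcErrorResponse {
  return {
    jsonrpc: "2.0",
    id,
    // -32000 opens the JSON-RPC implementation-defined server-error range.
    error: { code: -32000, message: `LLM error: ${reason}` },
  };
}
```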

Closes #32

🤖 Generated with Claude Code

The pi-agent-core framework never throws on LLM errors (rate limits,
timeouts, 500s). It creates AssistantMessage objects with
stopReason: "error" and an errorMessage field. The code was ignoring
these fields, causing users to see "(no response)" in Telegram.

- Add extractErrorFromTurn() helper in response.ts
- Telegram handler: check for LLM error before "(no response)" fallback,
  show "LLM error: <reason>" and mark task failed
- A2A sync handler: return JSON-RPC error instead of empty success
- Worker: post error event instead of complete with empty result
- Update test mocks for new extractErrorFromTurn export

Closes #32

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@arniesaha arniesaha merged commit db22dc1 into main Apr 13, 2026
1 check passed
@arniesaha arniesaha deleted the fix/issue-32-surface-llm-errors branch April 13, 2026 16:31


Development

Successfully merging this pull request may close these issues.

bug: upstream LLM errors surface as '(no response)' instead of error message
