feat: add AWS Bedrock provider support #18
Conversation
- Add AWS SDK v2 dependencies (bedrockruntime, config)
- Implement Bedrock provider with Converse API
- Support AWS credential chain authentication
- Add region configuration (AWS_REGION env var or config.toml)
- Implement request/response conversion for Bedrock format
- Add tool use support via Bedrock Converse API
- Map AWS errors to Axe error categories
- Update documentation (README, AGENTS.md, CONTRIBUTING.md)
- Add comprehensive tests for Bedrock provider
- Support model format: bedrock/model-id

All tests pass. Ready for use with Claude, Llama, Mistral, and other models available through AWS Bedrock.
📝 Walkthrough

Adds an AWS Bedrock provider and region resolution, updates CLI credential/error handling, expands documentation (architecture, contributing, README), introduces region-aware config resolution and the Bedrock provider implementation/tests, and adds AWS SDK v2 dependencies; includes corresponding test and registry updates.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 12
🧹 Nitpick comments (5)
.agents/summary/codebase_info.md (1)
40-63: Add a language specifier to the directory tree.

Static analysis flagged the missing language specifier. For directory trees, `text` or leaving it blank both work, but being explicit is cleaner.

📝 Suggested fix

````diff
-```
+```text
 axe/
 ├── cmd/    # CLI commands and entry points
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.agents/summary/codebase_info.md around lines 40 - 63, Update the fenced directory-tree block in .agents/summary/codebase_info.md to include a language specifier by changing the opening fence from ``` to ```text so the tree (starting with "axe/") is explicitly marked as text; ensure the rest of the block (the tree contents and closing ```) remains unchanged.

internal/provider/bedrock.go (2)
43-46: AWS config loading uses `context.Background()`.

The constructor uses `context.Background()` when loading AWS config, which means that if AWS credential resolution hangs (e.g., waiting for IMDS on non-EC2 instances), it cannot be cancelled. Consider accepting a context parameter or using a timeout.

💡 One approach
```go
// Option 1: add a timeout
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
cfg, err := config.LoadDefaultConfig(ctx, config.WithRegion(b.region))

// Option 2: accept a context as a parameter (breaking change)
func NewBedrock(ctx context.Context, region string, opts ...BedrockOption) (*Bedrock, error)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@internal/provider/bedrock.go` around lines 43 - 46, Replace the use of context.Background() when calling config.LoadDefaultConfig to avoid un-cancellable AWS credential resolution; either change the constructor NewBedrock to accept a context.Context parameter and pass that ctx into config.LoadDefaultConfig(ctx, config.WithRegion(b.region)), or (non-breaking) create a bounded context via context.WithTimeout inside the NewBedrock function (e.g., a 10s timeout), defer cancel(), and pass that ctx to config.LoadDefaultConfig to ensure credential resolution can be cancelled; update any callers if you choose the context-accepting option and keep the symbol config.LoadDefaultConfig as the call site to modify.
192-199: Silently ignoring the `UnmarshalSmithyDocument` error.

If deserialization fails, the error is swallowed and tool call arguments will be empty. This could mask bugs or unexpected input formats. At minimum, consider logging the error for debugging.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@internal/provider/bedrock.go` around lines 192 - 199, The code currently swallows errors from v.Value.Input.UnmarshalSmithyDocument leaving args silently empty; change the unwrap logic to handle the error instead of ignoring it by checking if err != nil and logging or returning it (e.g., use the provider's logger) before proceeding — replace the current if err := v.Value.Input.UnmarshalSmithyDocument(&inputMap); err == nil { ... } with an explicit if err := v.Value.Input.UnmarshalSmithyDocument(&inputMap); err != nil { logger.Errorf("UnmarshalSmithyDocument failed for v.Value.Input: %v", err) /* or return err */ } else { for k, val := range inputMap { args[k] = fmt.Sprintf("%v", val) } } so the error is surfaced and you still populate args when decoding succeeds.

CONTRIBUTING.md (2)
192-198: Add a language specifier to the code block.

The static analysis flagged this fenced code block (and similar ones at lines 209, 249) as missing a language specifier. Adding a language hint helps syntax highlighting and linting tools.
📝 Suggested fix
````diff
-```
+```text
 type(scope): brief description
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@CONTRIBUTING.md` around lines 192 - 198, The fenced code block showing the commit message template "type(scope): brief description" is missing a language specifier; update that and other similar blocks (the ones around the same template at lines referenced in the review) by changing the opening backticks to include a language hint such as "text" (e.g., ```text) so linters and syntax highlighters recognize it; locate the blocks by searching for the exact string "type(scope): brief description" in CONTRIBUTING.md and add the language specifier to each opening triple-backtick fence.
92-94: Resolution order documentation may be inconsistent.

Line 93 states: "Flags override TOML overrides environment variables override defaults."
However, the coding guidelines specify: "command-line flags override TOML configuration, which overrides defaults" — with no mention of environment variables in that chain.
Worth clarifying the canonical resolution order to avoid confusion for contributors. The actual implementation in `internal/config/config.go` shows that env vars override the config file for API keys/base URLs, so the documented order might need adjustment.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@CONTRIBUTING.md` around lines 92 - 94, Update the "Resolution Order" section to match the actual precedence implemented in your configuration logic: review internal/config/config.go where environment variables override values from the config file (e.g., API keys/base URLs) and change the sentence to state the canonical order used across the codebase (for example, "command-line flags override environment variables which override TOML configuration which override defaults"); if you prefer the opposite precedence, instead modify the config resolution logic in the code to match the documented order and keep the docs consistent.
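Whichever order the maintainers settle on, centralizing it in one helper keeps docs and code from drifting. A minimal sketch, assuming the flags > env > TOML > defaults order suggested in the prompt; `resolve` is a hypothetical helper, not the actual internal/config/config.go API:

```go
package main

import "fmt"

// resolve returns the first non-empty value in precedence order:
// command-line flag, then environment variable, then TOML config,
// then the built-in default. Hypothetical helper for illustration.
func resolve(flag, env, toml, def string) string {
	for _, v := range []string{flag, env, toml} {
		if v != "" {
			return v
		}
	}
	return def
}

func main() {
	// No flag set, so the env var beats the TOML value.
	fmt.Println(resolve("", "us-west-2", "eu-central-1", "us-east-1")) // us-west-2
}
```

Routing every setting through one such function makes the documented order trivially true by construction.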
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.agents/summary/codebase_info.md:
- Line 67: The "Multi-provider LLM support" provider list is inconsistent: it
currently omits AWS Bedrock while the earlier providers section includes it;
update the providers line that reads "Multi-provider LLM support (Anthropic,
OpenAI, Ollama, OpenCode)" to include AWS Bedrock so it matches the earlier list
(e.g., "Anthropic, OpenAI, Ollama, OpenCode, AWS Bedrock"), ensuring the two
sections use the same provider set.
In `@AGENTS.md`:
- Line 42: Update the comment that documents the provider directory so it
includes Bedrock; locate the comment string "provider/ # LLM provider
implementations (Anthropic, OpenAI, Ollama, OpenCode)" in AGENTS.md and add
"Bedrock" to the list so the comment reads something like "...(Anthropic,
OpenAI, Ollama, OpenCode, Bedrock)".
In `@cmd/gc_test.go`:
- Around line 428-430: The test assertion is correct (mapProviderError returns
exit code 3 for provider 500) but the failure message is wrong; update the error
string in the t.Errorf call that references exitErr.Code to say "expected exit
code 3, got %d" (referring to exitErr.Code) so the message matches the assertion
and avoids confusion when the test fails; verify mapProviderError/provider 500
logic remains unchanged.
In `@cmd/gc.go`:
- Around line 122-130: When provName == "bedrock" and you resolve a real region
(variable region) then assign apiKey = region and also clear any configured
Bedrock base URL so it doesn't get forwarded as the third New(...) argument;
specifically set the baseURL variable (or equivalent that holds
providers.bedrock.base_url / AXE_BEDROCK_BASE_URL) to an empty string after
setting apiKey, so registry.New(...) receives the real region and not a stray
base_url.
In `@cmd/run_integration_test.go`:
- Around line 586-588: The test failure messages assert against exitErr.Code but
print the wrong expected value; update the t.Errorf calls that currently say
"expected exit code 2" to state the correct expected code "3" so they match the
asserted check (exitErr.Code != 3) and the mapProviderError semantics; search
for t.Errorf usages near the checks of exitErr.Code (e.g., the blocks using
exitErr.Code and mapProviderError for timeout/auth/rate-limit/server failures)
and change the error message text accordingly in each occurrence.
In `@cmd/run.go`:
- Around line 216-224: The Bedrock branch currently sets apiKey=region but
leaves any configured baseURL to be passed into provider.New, which registry.New
treats as a secondary region source and can overwrite the resolved region;
update the Bedrock handling (the block using provName == "bedrock" that sets
apiKey and reads region) to explicitly clear or unset the baseURL before
creating the provider/client (or before calling
provider.New/provider.NewProvider), ensuring only the resolved region is
forwarded and baseURL is not forwarded for Bedrock.
In `@internal/provider/bedrock.go`:
- Around line 174-177: The code currently checks only that usage is non-nil
before dereferencing usage.InputTokens and usage.OutputTokens; update the block
(around the usages of resp.InputTokens and resp.OutputTokens) to also check that
usage.InputTokens and usage.OutputTokens are non-nil before
converting/dereferencing them (e.g., only assign resp.InputTokens =
int(*usage.InputTokens) if usage.InputTokens != nil, and likewise for
usage.OutputTokens) so you avoid a nil pointer dereference.
- Around line 269-282: Replace the custom contains and findSubstr
implementations with the standard library: remove the functions contains and
findSubstr from internal/provider/bedrock.go, add an import of "strings" if not
already present, and update all call sites that use contains(errMsg, "...") (or
contains(s, substr) generally) to use strings.Contains(errMsg, "...") instead;
ensure no references to findSubstr remain.
In `@internal/provider/registry.go`:
- Around line 50-60: In the "bedrock" case the code incorrectly treats baseURL
as a fallback/override for region: remove the fallback and the opts override so
region is taken solely from apiKey as resolved by cmd/run.go; specifically, in
the case "bedrock" branch stop using baseURL to set region or append
WithBedrockRegion(baseURL), keep region := apiKey (and leave NewBedrock(region,
opts...) without adding baseURL) so NewBedrock and WithBedrockRegion are only
used with a proper region value.
In `@README.md`:
- Line 239: The README's env var name and the implementation differ: update
internal/tool/web_search.go (the code that reads the API key, currently using
"TAVILY_API_KEY") to use "AXE_WEB_SEARCH_API_KEY" so it matches the
documentation; adjust any error/message text that references the old name
accordingly, and optionally accept the legacy "TAVILY_API_KEY" as a fallback
(e.g., check AXE_WEB_SEARCH_API_KEY first then TAVILY_API_KEY) so existing
setups don't break.
- Line 32: Update the README sentence that currently claims "all LLM calls use
the standard library" to reflect the new AWS SDK v2 dependency for Bedrock by
replacing that phrase with a softer statement like "small dependency surface" or
"minimal runtime dependencies" and optionally note that some providers (e.g.,
AWS Bedrock via AWS SDK v2) are added; locate the line containing the exact
phrase "all LLM calls use the standard library" and change the wording to avoid
implying zero external SDKs.
- Line 30: The README overstates the sandboxing: update the sentence listing
built-in tools so it clarifies that file operations (read, write, edit, list)
are path-sandboxed to the agent's working directory, while run_command can
access arbitrary filesystem paths and url_fetch / web_search are network tools
and not scoped to the workdir; edit the line containing the tool list (the one
that currently names "Built-in tools — file operations..., run_command,
url_fetch, web_search") to reflect these distinctions and avoid implying a
universal workdir sandbox.
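The nil-pointer fix for token counts flagged above can be sketched in isolation. `tokenUsage` here is a hypothetical stand-in mirroring the shape of the SDK's usage struct (pointer-typed counters), not the real bedrockruntime type:

```go
package main

import "fmt"

// tokenUsage mirrors the shape of the AWS SDK's usage struct, where
// counts are *int32 pointers; this local type is a stand-in for the
// real bedrockruntime types.
type tokenUsage struct {
	InputTokens  *int32
	OutputTokens *int32
}

// safeCounts dereferences each counter only when it is non-nil,
// returning 0 otherwise, so a partially populated usage block
// cannot cause a nil pointer dereference.
func safeCounts(u *tokenUsage) (in, out int) {
	if u == nil {
		return 0, 0
	}
	if u.InputTokens != nil {
		in = int(*u.InputTokens)
	}
	if u.OutputTokens != nil {
		out = int(*u.OutputTokens)
	}
	return in, out
}

func main() {
	n := int32(42)
	in, out := safeCounts(&tokenUsage{InputTokens: &n})
	fmt.Println(in, out) // 42 0
}
```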
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 7575301a-7b12-4d6b-b9c2-bfb45b791a1b
⛔ Files ignored due to path filters (1)
`go.sum` is excluded by `!**/*.sum`
📒 Files selected for processing (16)
.agents/summary/codebase_info.md, AGENTS.md, CONTRIBUTING.md, README.md, cmd/gc.go, cmd/gc_test.go, cmd/run.go, cmd/run_integration_test.go, cmd/run_test.go, cmd/smoke_test.go, go.mod, internal/config/config.go, internal/provider/bedrock.go, internal/provider/bedrock_test.go, internal/provider/registry.go, internal/provider/registry_test.go
```diff
 if exitErr.Code != 3 {
-	t.Errorf("expected exit code 3, got %d", exitErr.Code)
+	t.Errorf("expected exit code 2, got %d", exitErr.Code)
 }
```
The failure message drifted from the assertion.
This path still exercises a provider 500, and mapProviderError keeps that at exit code 3. The check is still correct; only the message changed, so a failing run would send the reader on a small goose chase.
💡 Suggested fix
```diff
 if exitErr.Code != 3 {
-	t.Errorf("expected exit code 2, got %d", exitErr.Code)
+	t.Errorf("expected exit code 3, got %d", exitErr.Code)
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@cmd/gc_test.go` around lines 428 - 430, The test assertion is correct
(mapProviderError returns exit code 3 for provider 500) but the failure message
is wrong; update the error string in the t.Errorf call that references
exitErr.Code to say "expected exit code 3, got %d" (referring to exitErr.Code)
so the message matches the assertion and avoids confusion when the test fails;
verify mapProviderError/provider 500 logic remains unchanged.
README.md (outdated)
| `AXE_ANTHROPIC_BASE_URL` | No | Override Anthropic API endpoint |
| `AXE_OPENAI_BASE_URL` | No | Override OpenAI API endpoint |
| `AXE_OPENCODE_BASE_URL` | No | Override OpenCode API endpoint |
| `AXE_WEB_SEARCH_API_KEY` | If using web_search | Web search API key |
The documented web search API key name doesn’t match the implementation.
README.md tells users to set AXE_WEB_SEARCH_API_KEY, but internal/tool/web_search.go:43-89 currently reads TAVILY_API_KEY and fails if that variable is missing. Right now the docs would send users on a fun but unproductive adventure.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@README.md` at line 239, The README's env var name and the implementation
differ: update internal/tool/web_search.go (the code that reads the API key,
currently using "TAVILY_API_KEY") to use "AXE_WEB_SEARCH_API_KEY" so it matches
the documentation; adjust any error/message text that references the old name
accordingly, and optionally accept the legacy "TAVILY_API_KEY" as a fallback
(e.g., check AXE_WEB_SEARCH_API_KEY first then TAVILY_API_KEY) so existing
setups don't break.
- Replace custom contains() with strings.Contains from stdlib
- Fix nil pointer dereference on token counts (check InputTokens/OutputTokens)
- Simplify bedrock registry logic - remove confusing baseURL fallback
- Clear baseURL for bedrock in run.go and gc.go to prevent region override
- Fix documentation inconsistencies (add Bedrock to all provider lists)
- Clarify sandboxing only applies to file operations, not all tools
- Update minimal dependencies claim (AWS SDK v2 for Bedrock)
- Fix test error messages to match actual exit codes (3 not 2)

Note: The web search API key mismatch (TAVILY_API_KEY vs AXE_WEB_SEARCH_API_KEY) is a pre-existing issue in the codebase, not introduced by this PR.
The actual implementation uses TAVILY_API_KEY (not AXE_WEB_SEARCH_API_KEY). Updated README.md and AGENTS.md to reflect the correct environment variable.
Actionable comments posted: 13
🧹 Nitpick comments (4)
.kiro/settings/lsp.json (1)
1-198: Consider keeping `.kiro/` config out of product PRs unless the repo explicitly supports it.

This reads like IDE furniture sneaking into a Bedrock change. If shared Kiro settings are intentional, I'd document that and treat them as supported tooling; otherwise this will just create noisy, editor-driven churn in future PRs.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.kiro/settings/lsp.json around lines 1 - 198, The .kiro/settings/lsp.json appears to be an editor/IDE config accidentally included; either remove it from the PR and add .kiro/ to .gitignore (or add this specific file to .git/info/exclude) to prevent future noise, or if shared Kiro settings are intentional, keep the file but add a short note to the repo docs (CONTRIBUTING.md or README) explaining that .kiro/ is a supported, shared configuration and why; locate the .kiro/settings/lsp.json entry in the diff and apply one of those two fixes so the repo policy is explicit and consistent.

.agents/summary/data_models.md (1)
289-308: Add a language specifier to the dry-run output block.

The dry-run output example is missing a language identifier. Plain `text` would work here.

📝 Suggested fix

````diff
-```
+```text
 Agent: agent-name
 Model: provider/model-name
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.agents/summary/data_models.md around lines 289 - 308, The fenced dry-run example block starting with "Agent: agent-name" needs a language specifier; update the opening fence to use ```text so the snippet renders as plain text (keep the existing content like "Agent: agent-name", "Model: provider/model-name", "System Prompt", "Tools", "Sub-agents", "MCP Servers", and ensure the closing ``` remains unchanged).

AGENTS.md (1)
33-54: Add a language specifier to the directory structure block.

Similar to the other doc file, this code block could use a `text` language identifier to keep the markdown linters from grumbling.

📝 Suggested fix

````diff
-```
+```text
 axe/
 ├── cmd/    # CLI commands (run, agents, config, gc, version)
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@AGENTS.md` around lines 33 - 54, The fenced directory-structure code block that begins with "axe/" (the tree showing entries like "├── cmd/" and "└── main.go") needs a language specifier to satisfy markdown linters; change the opening backticks from ``` to ```text so the block reads as a plain-text code fence (e.g., replace the code block starting with ``` and the "axe/" tree with ```text ... ```).

.agents/summary/codebase_info.md (1)
40-63: Add a language specifier to the fenced code block.

The directory structure code block is missing a language identifier. While it's just a plain text tree, adding a language hint helps with consistent rendering and keeps the linters happy.

📝 Suggested fix

````diff
-```
+```text
 axe/
 ├── cmd/    # CLI commands and entry points
````

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.agents/summary/codebase_info.md around lines 40 - 63, Update the fenced code block that starts with the "axe/" directory tree so the opening triple backticks include a language specifier (e.g., change ``` to ```text) to ensure consistent rendering and satisfy linters; locate the block containing the "axe/" tree in the file and modify only the opening fence to add the language hint.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.agents/summary/components.md:
- Around line 60-67: Update the internal/provider section in components.md to
reflect the actual Provider API and files: list the new bedrock.go alongside
provider.go, anthropic.go, openai.go, ollama.go, opencode.go, registry.go, and
change the documented Send signature to match the codebase: Send(ctx
context.Context, req *Request) (*Response, error). Mention the pointer types for
Request and Response so the architecture index accurately matches
internal/provider/provider.go and the new internal/provider/bedrock.go.
In @.agents/summary/data_models.md:
- Around line 206-214: The documentation's ProviderConfig struct is missing the
Region field added for Bedrock support; update the ProviderConfig definition in
this file to include a Region string field so it matches the implementation (see
ProviderConfig struct in internal/config/config.go), and ensure any sample
configs or examples in this doc reflect the new Region property.
In @.agents/summary/dependencies.md:
- Around line 211-214: The "Only 3 direct dependencies (excluding stdlib
extensions)" statement under the "Minimal Dependencies" heading is now incorrect
because AWS SDK v2 modules were added; update that bullet to reflect the current
direct dependency count or remove the hard-coded number and replace it with a
dynamic/qualitative statement (e.g., "Minimal direct dependencies; avoid
unnecessary transitive deps" or "X direct dependencies as of <date>"). Locate
the "Minimal Dependencies" section and the bullet text "Only 3 direct
dependencies (excluding stdlib extensions)" and change it to either the accurate
count including AWS SDK v2 modules or a non-numeric claim, and optionally add a
brief note about how AWS SDK modules are counted to avoid future drift.
- Around line 3-35: Update the dependencies documentation to include the newly
added AWS SDK v2 modules: add entries for github.com/aws/aws-sdk-go-v2/config
and github.com/aws/aws-sdk-go-v2/service/bedrockruntime, describing their
purpose, where they are used (reference internal/provider/bedrock.go and
specifically the Bedrock provider implementation and credential resolution), and
why they are required for Bedrock support; mirror the style/format of the
existing entries (Purpose, Used In, Why This Library) and insert them after the
golang.org/x/net section in .agents/summary/dependencies.md.
- Around line 156-160: The "Web Search API" docs claim the default is Brave with
AXE_WEB_SEARCH_API_KEY, but the code/README uses Tavily; update the "Web Search
API" section to match the implementation by setting Default to Tavily, replacing
the endpoint with Tavily's search endpoint, and changing the Authentication env
var from AXE_WEB_SEARCH_API_KEY to TAVILY_API_KEY (and keep "Used For:
web_search" as-is), or alternatively change the implementation to Brave if you
prefer Brave as the true default—ensure the docs' Default, Endpoint, and
Authentication entries match the actual implementation referenced in the README
and code.
In @.agents/summary/index.md:
- Around line 103-116: Update the dependencies.md section (the "###
dependencies.md" block) to include the AWS SDK v2 as a direct dependency and
note its use for Bedrock support; add a brief line describing why it’s required
(e.g., Bedrock-compatible AWS SDK v2 for API calls and credential handling)
alongside the existing items like cobra, toml, go-sdk, and golang.org/x/net so
the summary accurately reflects the current direct dependencies.
In @.agents/summary/interfaces.md:
- Around line 7-49: Update the interface and types in this doc to match the
actual provider package: change Provider.Send to Send(ctx context.Context, req
*Request) (*Response, error), make Request/Response pointer usage consistent,
and rename the tool-related fields to match the implementation (use
Tool.Parameters, ToolCall.Arguments, and ToolResult.CallID) so the documented
structs (Request, Message, Tool, ToolCall, ToolResult) reflect the real symbols
used by the Bedrock implementation.
- Around line 319-333: Update the cheat-sheet env examples to match the current
runtime names and Bedrock usage: add AWS_REGION to the top-level env list,
replace the obsolete AXE_WEB_SEARCH_API_KEY with the current env var name used
by the codebase (e.g., AXE_WEB_SEARCH_KEY) wherever the API key is shown, and
change any model examples that omit Bedrock to include a bedrock model id (e.g.,
use the pattern bedrock/model-id for AXE_BEDROCK_MODEL or similar model vars);
make the same corrections in the other identical block further down.
In @.agents/summary/review_notes.md:
- Around line 17-20: Update the Provider Interface section text that currently
reads "All four providers (Anthropic, OpenAI, Ollama, OpenCode)" to include
Bedrock and adjust the count (e.g., "All five providers (Anthropic, OpenAI,
Ollama, OpenCode, Bedrock)"). Locate the string in
.agents/summary/review_notes.md and modify the sentence so the provider list and
numeric count reflect the new Bedrock provider consistently.
- Around line 67-70: In the "Provider Integration" section update the provider
count text: locate the list item containing the exact phrase "All 4 providers
documented" and change it to "All 5 providers documented" (section heading
"Provider Integration"); ensure any nearby summary lines or counts that
reference "4 providers" are similarly updated to reflect Bedrock being added.
In @.kiro/settings/lsp.json:
- Around line 183-186: The diagnostics configuration uses an unrecognized key
"enableExperimental"; update the "diagnostics" object so that the experimental
flag is nested under "experimental" (i.e., replace
diagnostics.enableExperimental with diagnostics.experimental.enable) so
rust-analyzer will pick up the experimental diagnostics setting.
In `@internal/provider/bedrock.go`:
- Around line 31-46: The constructor NewBedrock currently validates the input
region before applying BedrockOption functions (e.g., WithBedrockRegion), which
lets an empty constructor arg block valid options and also lets options erase a
valid region without detection; change the logic in NewBedrock to apply all opts
(iterating opt(b)) first and then validate b.region (the struct field) for empty
string, and update the error message to be actionable (e.g., "region is
required; set via NewBedrock or WithBedrockRegion") so callers know how to fix
it.
- Around line 219-271: mapBedrockError currently relies on string matching
against err.Error(), which misclassifies wrapped/structured AWS errors; update
it to unwrap and inspect error types and interfaces instead (use errors.As /
errors.Is) to detect AWS SDK v2 structured errors (e.g., smithy.APIError or the
AWS service error interface) and extract the error code to map to
ErrCategoryAuth, ErrCategoryRateLimit, ErrCategoryBadRequest, ErrCategoryServer,
and ErrCategoryTimeout; also use errors.Is to detect context.DeadlineExceeded
and context.Canceled, ensure the returned ProviderError preserves the original
error in Err, and keep a sensible fallback that returns the original error
message if no structured match is found.
---
Nitpick comments:
In @.agents/summary/codebase_info.md:
- Around line 40-63: Update the fenced code block that starts with the "axe/"
directory tree so the opening triple backticks include a language specifier
(e.g., change ``` to ```text) to ensure consistent rendering and satisfy
linters; locate the block containing the "axe/" tree in the file and modify only
the opening fence to add the language hint.
In @.agents/summary/data_models.md:
- Around line 289-308: The fenced dry-run example block starting with "Agent:
agent-name" needs a language specifier; update the opening fence to use ```text
so the snippet renders as plain text (keep the existing content like "Agent:
agent-name", "Model: provider/model-name", "System Prompt", "Tools",
"Sub-agents", "MCP Servers", and ensure the closing ``` remains unchanged).
In @.kiro/settings/lsp.json:
- Around line 1-198: The .kiro/settings/lsp.json appears to be an editor/IDE
config accidentally included; either remove it from the PR and add .kiro/ to
.gitignore (or add this specific file to .git/info/exclude) to prevent future
noise, or if shared Kiro settings are intentional, keep the file but add a short
note to the repo docs (CONTRIBUTING.md or README) explaining that .kiro/ is a
supported, shared configuration and why; locate the .kiro/settings/lsp.json
entry in the diff and apply one of those two fixes so the repo policy is
explicit and consistent.
In `@AGENTS.md`:
- Around line 33-54: The fenced directory-structure code block that begins with
"axe/" (the tree showing entries like "├── cmd/" and "└── main.go") needs a
language specifier to satisfy markdown linters; change the opening backticks
from ``` to ```text so the block reads as a plain-text code fence (e.g., replace
the code block starting with ``` and the "axe/" tree with ```text ... ```).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 623ecf07-684a-4834-a419-3a1b96fbe707
📒 Files selected for processing (17)
- .agents/summary/architecture.md
- .agents/summary/codebase_info.md
- .agents/summary/components.md
- .agents/summary/data_models.md
- .agents/summary/dependencies.md
- .agents/summary/index.md
- .agents/summary/interfaces.md
- .agents/summary/review_notes.md
- .agents/summary/workflows.md
- .kiro/settings/lsp.json
- AGENTS.md
- README.md
- cmd/gc.go
- cmd/gc_test.go
- cmd/run.go
- internal/provider/bedrock.go
- internal/provider/registry.go
✅ Files skipped from review due to trivial changes (2)
- .agents/summary/workflows.md
- .agents/summary/architecture.md
🚧 Files skipped from review as they are similar to previous changes (3)
- cmd/run.go
- internal/provider/registry.go
- cmd/gc_test.go
.agents/summary/components.md
Outdated

````text
### internal/provider
**Files:** provider.go, anthropic.go, openai.go, ollama.go, opencode.go, registry.go

**Provider Interface:**
```go
type Provider interface {
	Send(ctx context.Context, req Request) (Response, error)
}
```
````
Provider summary is missing the new file and the real Send signature.
internal/provider/provider.go uses Send(ctx context.Context, req *Request) (*Response, error), and this PR adds internal/provider/bedrock.go, but this section still shows value types and leaves the new provider file out of the package inventory. For an architecture index, that’s a pretty misleading treasure map.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/components.md around lines 60 - 67, Update the
internal/provider section in components.md to reflect the actual Provider API
and files: list the new bedrock.go alongside provider.go, anthropic.go,
openai.go, ollama.go, opencode.go, registry.go, and change the documented Send
signature to match the codebase: Send(ctx context.Context, req *Request)
(*Response, error). Mention the pointer types for Request and Response so the
architecture index accurately matches internal/provider/provider.go and the new
internal/provider/bedrock.go.
.agents/summary/data_models.md
Outdated

```go
type Config struct {
	Providers map[string]ProviderConfig
}

type ProviderConfig struct {
	APIKey  string
	BaseURL string
}
```
ProviderConfig is missing the Region field.
The ProviderConfig struct documented here is missing the Region field that was added for Bedrock support. The actual implementation in internal/config/config.go includes Region string in the struct.
📝 Suggested fix

```diff
 type ProviderConfig struct {
 	APIKey  string
 	BaseURL string
+	Region  string
 }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/data_models.md around lines 206 - 214, The documentation's
ProviderConfig struct is missing the Region field added for Bedrock support;
update the ProviderConfig definition in this file to include a Region string
field so it matches the implementation (see ProviderConfig struct in
internal/config/config.go), and ensure any sample configs or examples in this
doc reflect the new Region property.
.agents/summary/dependencies.md
Outdated
| ## Direct Dependencies | ||
|
|
||
| ### github.com/BurntSushi/toml (v1.6.0) | ||
| **Purpose:** TOML configuration file parsing | ||
| **Used In:** | ||
| - `internal/agent/agent.go` - Parse agent TOML files | ||
| - `internal/config/config.go` - Parse global config.toml | ||
|
|
||
| **Why This Library:** Mature, well-tested TOML parser with good error messages | ||
|
|
||
| ### github.com/spf13/cobra (v1.10.2) | ||
| **Purpose:** CLI framework | ||
| **Used In:** | ||
| - `cmd/*.go` - All CLI commands and subcommands | ||
| - Command structure, flag parsing, help generation | ||
|
|
||
| **Why This Library:** Industry standard for Go CLI applications, excellent documentation | ||
|
|
||
| ### github.com/modelcontextprotocol/go-sdk (v1.4.0) | ||
| **Purpose:** Model Context Protocol client implementation | ||
| **Used In:** | ||
| - `internal/mcpclient/mcpclient.go` - MCP server communication | ||
| - Tool discovery and invocation | ||
|
|
||
| **Why This Library:** Official MCP SDK for Go | ||
|
|
||
| ### golang.org/x/net (v0.51.0) | ||
| **Purpose:** HTTP/2 support and network utilities | ||
| **Used In:** | ||
| - Provider HTTP clients | ||
| - MCP HTTP transport | ||
|
|
||
| **Why This Library:** Official Go extended networking library |
AWS SDK v2 dependencies are missing from this documentation.
The PR adds AWS SDK v2 dependencies (github.com/aws/aws-sdk-go-v2/service/bedrockruntime and github.com/aws/aws-sdk-go-v2/config) for Bedrock support, but they're not documented here. This should be updated to include them as direct dependencies.
📝 Suggested addition after golang.org/x/net section

```text
### github.com/aws/aws-sdk-go-v2/config
**Purpose:** AWS SDK configuration and credential loading
**Used In:**
- `internal/provider/bedrock.go` - AWS credential chain resolution

**Why This Library:** Official AWS SDK for Go v2, required for Bedrock integration

### github.com/aws/aws-sdk-go-v2/service/bedrockruntime
**Purpose:** AWS Bedrock Converse API client
**Used In:**
- `internal/provider/bedrock.go` - Bedrock provider implementation

**Why This Library:** Official AWS Bedrock client library
```

🤖 Prompt for AI Agents
**Why This Library:** Official AWS Bedrock client library🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/dependencies.md around lines 3 - 35, Update the dependencies
documentation to include the newly added AWS SDK v2 modules: add entries for
github.com/aws/aws-sdk-go-v2/config and
github.com/aws/aws-sdk-go-v2/service/bedrockruntime, describing their purpose,
where they are used (reference internal/provider/bedrock.go and specifically the
Bedrock provider implementation and credential resolution), and why they are
required for Bedrock support; mirror the style/format of the existing entries
(Purpose, Used In, Why This Library) and insert them after the golang.org/x/net
section in .agents/summary/dependencies.md.
.agents/summary/dependencies.md
Outdated

```text
**Web Search API**
- Default: Brave Search API
- Endpoint: `https://api.search.brave.com/res/v1/web/search`
- Authentication: `AXE_WEB_SEARCH_API_KEY`
- Used For: `web_search` tool
```
Web search API documentation doesn't match implementation.
The docs say "Default: Brave Search API" with a Brave endpoint, but the implementation uses Tavily (as noted in the README with TAVILY_API_KEY). This is likely a pre-existing documentation issue, but worth flagging for consistency.
📝 Suggested fix

```diff
 **Web Search API**
-- Default: Brave Search API
-- Endpoint: `https://api.search.brave.com/res/v1/web/search`
-- Authentication: `AXE_WEB_SEARCH_API_KEY`
+- Default: Tavily Search API
+- Endpoint: Configurable via `AXE_WEB_SEARCH_BASE_URL`
+- Authentication: `TAVILY_API_KEY`
 - Used For: `web_search` tool
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/dependencies.md around lines 156 - 160, The "Web Search API"
docs claim the default is Brave with AXE_WEB_SEARCH_API_KEY, but the code/README
uses Tavily; update the "Web Search API" section to match the implementation by
setting Default to Tavily, replacing the endpoint with Tavily's search endpoint,
and changing the Authentication env var from AXE_WEB_SEARCH_API_KEY to
TAVILY_API_KEY (and keep "Used For: web_search" as-is), or alternatively change
the implementation to Brave if you prefer Brave as the true default—ensure the
docs' Default, Endpoint, and Authentication entries match the actual
implementation referenced in the README and code.
.agents/summary/dependencies.md
Outdated
| ### Minimal Dependencies | ||
| - Only 3 direct dependencies (excluding stdlib extensions) | ||
| - Avoid unnecessary transitive dependencies | ||
| - Prefer standard library when possible |
Dependency count claim is outdated.
The "Only 3 direct dependencies" claim needs updating now that AWS SDK v2 packages have been added. The exact count depends on how you count AWS SDK modules, but it's definitely more than 3.
📝 Suggested fix

```diff
 ### Minimal Dependencies
-- Only 3 direct dependencies (excluding stdlib extensions)
+- Core dependencies: cobra, toml, go-sdk (plus golang.org/x/net)
+- AWS SDK v2 for Bedrock provider support
 - Avoid unnecessary transitive dependencies
 - Prefer standard library when possible
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/dependencies.md around lines 211 - 214, The "Only 3 direct
dependencies (excluding stdlib extensions)" statement under the "Minimal
Dependencies" heading is now incorrect because AWS SDK v2 modules were added;
update that bullet to reflect the current direct dependency count or remove the
hard-coded number and replace it with a dynamic/qualitative statement (e.g.,
"Minimal direct dependencies; avoid unnecessary transitive deps" or "X direct
dependencies as of <date>"). Locate the "Minimal Dependencies" section and the
bullet text "Only 3 direct dependencies (excluding stdlib extensions)" and
change it to either the accurate count including AWS SDK v2 modules or a
non-numeric claim, and optionally add a brief note about how AWS SDK modules are
counted to avoid future drift.
.agents/summary/review_notes.md
Outdated
| **Provider Interface:** | ||
| - Request/Response structures consistent | ||
| - Error categories match across provider implementations | ||
| - All four providers (Anthropic, OpenAI, Ollama, OpenCode) documented |
Provider count is outdated — Bedrock wants in on the party!
The provider interface section mentions "All four providers (Anthropic, OpenAI, Ollama, OpenCode)" but the PR adds Bedrock as the 5th provider. This documentation should reflect the new reality.
📝 Suggested fix

```diff
 **Provider Interface:**
 - Request/Response structures consistent
 - Error categories match across provider implementations
-- All four providers (Anthropic, OpenAI, Ollama, OpenCode) documented
+- All five providers (Anthropic, OpenAI, Ollama, OpenCode, Bedrock) documented
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/review_notes.md around lines 17 - 20, Update the Provider
Interface section text that currently reads "All four providers (Anthropic,
OpenAI, Ollama, OpenCode)" to include Bedrock and adjust the count (e.g., "All
five providers (Anthropic, OpenAI, Ollama, OpenCode, Bedrock)"). Locate the
string in .agents/summary/review_notes.md and modify the sentence so the
provider list and numeric count reflect the new Bedrock provider consistently.
.agents/summary/review_notes.md
Outdated
| 4. **Provider Integration** | ||
| - All 4 providers documented | ||
| - Request/response formats specified | ||
| - Error handling categorized |
Same story here — update the provider count.
Line 68 says "All 4 providers documented" which should be updated to 5 now that Bedrock has joined the crew.
📝 Suggested fix

```diff
 4. **Provider Integration**
-   - All 4 providers documented
+   - All 5 providers documented
    - Request/response formats specified
    - Error handling categorized
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.agents/summary/review_notes.md around lines 67 - 70, In the "Provider
Integration" section update the provider count text: locate the list item
containing the exact phrase "All 4 providers documented" and change it to "All 5
providers documented" (section heading "Provider Integration"); ensure any
nearby summary lines or counts that reference "4 providers" are similarly
updated to reflect Bedrock being added.
.kiro/settings/lsp.json
Outdated

```json
"diagnostics": {
  "enable": true,
  "enableExperimental": true
},
```
🧩 Analysis chain
🌐 Web query:
For rust-analyzer LSP settings / initializationOptions, is the correct diagnostics setting `diagnostics.enableExperimental` or nested `diagnostics.experimental.enable`?
💡 Result:
Use the nested setting: `diagnostics.experimental.enable` (VS Code form: `rust-analyzer.diagnostics.experimental.enable`). The camelCase `diagnostics.enableExperimental` is not a documented rust-analyzer config key. [1][2]
Example (LSP initializationOptions / settings object without the `rust-analyzer.` prefix):

```json
{
  "diagnostics": {
    "experimental": {
      "enable": true
    }
  }
}
```

[1] (rust-analyzer.github.io)
[2] (rust-analyzer.github.io)
Citations:
- 1: https://rust-analyzer.github.io/book/configuration
- 2: https://rust-analyzer.github.io/book/diagnostics.html
Rust Analyzer won't pick up the experimental diagnostics setting—the key shape is wrong.
The enableExperimental key isn't recognized by rust-analyzer. You'll need to nest it under experimental.enable instead. Without this fix, experimental diagnostics won't activate even though the config looks right.
Fix

```diff
 "diagnostics": {
   "enable": true,
-  "enableExperimental": true
+  "experimental": {
+    "enable": true
+  }
 },
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.kiro/settings/lsp.json around lines 183 - 186, The diagnostics
configuration uses an unrecognized key "enableExperimental"; update the
"diagnostics" object so that the experimental flag is nested under
"experimental" (i.e., replace diagnostics.enableExperimental with
diagnostics.experimental.enable) so rust-analyzer will pick up the experimental
diagnostics setting.
- Validate region after applying options (fixes WithBedrockRegion("") footgun)
- Use structured AWS SDK error types instead of string matching
- Add better error message for missing region
- Use errors.Is/As for proper error classification
These are local development artifacts and should not be in the PR.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@internal/provider/bedrock.go`:
- Around line 130-139: ToolCall.Arguments is currently a map[string]string so
the code that builds input (the map copied into document.NewLazyDocument in the
Bedrock provider and stored as types.ToolUseBlock via ToolUseId/Name/Input)
loses original types on the round-trip because the Smithy decode later converts
values with fmt.Sprintf; fix by preserving typed values: change the Bedrock
encoder to construct a map[string]interface{} with properly typed values (e.g.,
parse each string in ToolCall.Arguments as JSON when it represents non-string
types, or add a separate typed field and populate it) and pass that typed map
into document.NewLazyDocument (or else serialize the original typed structure as
JSON and store it as a JSON-typed document so the Smithy unmarshal can restore
original types); update the corresponding decode path (the Smithy/Document
unmarshal logic used when reading ToolUseBlock/Input) to use json.Unmarshal into
map[string]interface{} instead of fmt.Sprintf conversions so
numbers/booleans/arrays/objects are preserved.
- Around line 268-275: The net.Error branch currently classifies all network
errors as ErrCategoryServer; update the branch that assigns netErr (net.Error)
to first check netErr.Timeout() and, if true, return a ProviderError with the
timeout category (e.g., ErrCategoryTimeout) and a "timeout" message (including
the original err), otherwise fall back to returning the ErrCategoryServer
ProviderError as before; modify the code around the netErr handling in the
function that constructs ProviderError to implement this timeout check.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 69e36774-c850-45f9-8dc8-e876207ee67d
📒 Files selected for processing (1)
internal/provider/bedrock.go
```go
input := make(map[string]interface{})
for k, v := range tc.Arguments {
	input[k] = v
}

content = append(content, &types.ContentBlockMemberToolUse{
	Value: types.ToolUseBlock{
		ToolUseId: &tc.ID,
		Name:      &tc.Name,
		Input:     document.NewLazyDocument(input),
```
🧩 Analysis chain
🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail
echo '--- ToolCall definition ---'
ast-grep --pattern $'type ToolCall struct { $$$ }'
echo '--- Bedrock tool argument conversions ---'
rg -n -C3 'NewLazyDocument\(input\)|UnmarshalSmithyDocument|fmt\.Sprintf\("%v", val\)' internal/provider
echo '--- Tests touching Bedrock tool args ---'
fd 'bedrock_test.go$' internal/provider -x sed -n '1,260p' {}
```

Repository: jrswab/axe
Length of output: 2169
Bedrock tool arguments won't round-trip losslessly.
ToolCall.Arguments is locked to map[string]string, so when lines 130-139 wrap a map created from it, and lines 199-205 unmarshal the Smithy document back out, the decoding step converts everything with fmt.Sprintf("%v", val). This means:
- Numeric arguments (`123`) become strings (`"123"`)
- Booleans (`true`) become strings (`"true"`)
- Nested structures become Go debug text (`map[foo:bar]`, `[1 2 3]`)

When a follow-up Converse call uses these args, the model gets stringified gibberish instead of the original types. The test file confirms there's no coverage for this scenario—only basic initialization tests.
Either switch to a typed representation (parse JSON back properly) or document the limitation. Right now tool-use in multi-turn conversations is a footgun.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@internal/provider/bedrock.go` around lines 130 - 139, ToolCall.Arguments is
currently a map[string]string so the code that builds input (the map copied into
document.NewLazyDocument in the Bedrock provider and stored as
types.ToolUseBlock via ToolUseId/Name/Input) loses original types on the
round-trip because the Smithy decode later converts values with fmt.Sprintf; fix
by preserving typed values: change the Bedrock encoder to construct a
map[string]interface{} with properly typed values (e.g., parse each string in
ToolCall.Arguments as JSON when it represents non-string types, or add a
separate typed field and populate it) and pass that typed map into
document.NewLazyDocument (or else serialize the original typed structure as JSON
and store it as a JSON-typed document so the Smithy unmarshal can restore
original types); update the corresponding decode path (the Smithy/Document
unmarshal logic used when reading ToolUseBlock/Input) to use json.Unmarshal into
map[string]interface{} instead of fmt.Sprintf conversions so
numbers/booleans/arrays/objects are preserved.
Check `netErr.Timeout()` to distinguish network-level timeouts (DNS, TCP) from generic network errors. This ensures timeout-specific handling downstream gets the correct error category.

Note: Tool argument type preservation (`map[string]string`) is consistent with all other providers (Anthropic, OpenAI, Ollama, OpenCode). This is an architectural limitation of the Provider interface, not specific to Bedrock.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@internal/provider/bedrock.go`:
- Around line 311-322: The code treats req.Temperature and req.MaxTokens as
"unset" when equal to 0 by using >0 checks, so explicit zero values
(temperature=0.0 or maxTokens=0) are lost; update the request type to use
pointers (e.g., *float64 for Temperature and *int for MaxTokens) or add explicit
"is set" booleans, then change the logic in the block that builds
types.InferenceConfiguration (inferenceConfig) to test for non-nil (or the
is-set flags) instead of >0 and assign the casted values (float32 for
Temperature, int32 for MaxTokens) to inferenceConfig.Temperature and
inferenceConfig.MaxTokens; finally always set input.InferenceConfig when either
field is explicitly provided.
- Around line 206-212: The code appends ToolCall using direct dereferences of
ToolUseBlock's pointer fields (ToolUseId and Name) which can nil-deref; update
the loop that builds toolCalls (the code creating ToolCall with ID:
*v.Value.ToolUseId, Name: *v.Value.Name) to defensively check v.Value.ToolUseId
and v.Value.Name for nil before dereferencing — if either is nil, either skip
that tool entry or substitute a safe default (e.g., empty string) and log a
warning; ensure you reference the same symbols (ToolUseBlock, ToolCall,
toolCalls) when implementing the checks so no panics occur.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 0850c02a-7061-4128-a0f7-3781b21bd507
📒 Files selected for processing (1)
internal/provider/bedrock.go
```go
inferenceConfig := &types.InferenceConfiguration{}
if req.Temperature > 0 {
	temp := float32(req.Temperature)
	inferenceConfig.Temperature = &temp
}
if req.MaxTokens > 0 {
	maxTokens := int32(req.MaxTokens)
	inferenceConfig.MaxTokens = &maxTokens
}
if req.Temperature > 0 || req.MaxTokens > 0 {
	input.InferenceConfig = inferenceConfig
}
```
Temperature of exactly 0.0 cannot be explicitly requested.
The check req.Temperature > 0 treats 0.0 as "unset", but temperature=0 is actually meaningful in LLM contexts (it enables deterministic/greedy decoding). If a user explicitly wants temperature: 0 in their config, it won't be sent to Bedrock.
This is a minor edge case—most users won't hit it—but worth knowing about. A common pattern is to use a pointer (*float64) or a separate "is set" flag to distinguish "not provided" from "explicitly zero."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@internal/provider/bedrock.go` around lines 311 - 322, The code treats
req.Temperature and req.MaxTokens as "unset" when equal to 0 by using >0 checks,
so explicit zero values (temperature=0.0 or maxTokens=0) are lost; update the
request type to use pointers (e.g., *float64 for Temperature and *int for
MaxTokens) or add explicit "is set" booleans, then change the logic in the block
that builds types.InferenceConfiguration (inferenceConfig) to test for non-nil
(or the is-set flags) instead of >0 and assign the casted values (float32 for
Temperature, int32 for MaxTokens) to inferenceConfig.Temperature and
inferenceConfig.MaxTokens; finally always set input.InferenceConfig when either
field is explicitly provided.
- Change temperature check from `> 0` to `!= 0` (consistent with other providers)
- Add nil checks for ToolUseId and Name to prevent panic on malformed responses
Thanks for keeping on top of this @cosmosified
All tests pass. Ready for use with Claude, Llama, Mistral, and other models available through AWS Bedrock.
Summary
This PR adds full AWS Bedrock provider support to Axe, including AWS SDK v2 integration, Converse API request/response and tool-use support, region-based credential resolution, error mapping, and documentation. It also updates provider initialization/error handling, tests, and README/agent docs. All Bedrock provider tests pass.
Changelog
Added
Changed
Fixed
Removed