
feat: include OpenCode in AI provider auto-discovery#93

Open
n00ki wants to merge 3 commits into erictli:main from n00ki:feat/opencode-auto-discovery

Conversation

@n00ki commented Mar 6, 2026

  • add opencode to AI provider auto-discovery order and checker map in src/services/ai.ts
  • add OpenCode command mapping in src/components/command-palette/CommandPalette.tsx
  • add OpenCodeIcon in src/components/icons/index.tsx for command palette display

Dependency / Stacking

This PR is stacked on #89 and #90 (see the author's comment below).

Summary by CodeRabbit

Release Notes

  • New Features
    • Added support for the OpenCode AI provider.
    • Command palette now dynamically discovers and displays only the AI providers available on your system.

@coderabbitai bot commented Mar 6, 2026

📝 Walkthrough

The changes introduce support for a new OpenCode AI provider by implementing dynamic AI provider discovery. The CommandPalette component fetches available providers at runtime and generates AI commands dynamically, replacing hardcoded provider definitions. A new OpenCodeIcon component and AI service runtime checks support the expanded provider ecosystem.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| **AI Service Enhancement**<br>`src/services/ai.ts` | Added OpenCode provider to the `AiProvider` type, introduced the `AI_PROVIDER_ORDER` constant, and added `checkOpenCodeCli()` and `getAvailableAiProviders()` functions for runtime provider capability detection. |
| **Command Palette Dynamic Integration**<br>`src/components/command-palette/CommandPalette.tsx` | Implemented an AI provider discovery effect with state management; replaced static AI commands with dynamically generated `aiCommands` based on available providers; added an `availableAiProviders` prop to the component signature. |
| **Icon Support**<br>`src/components/icons/index.tsx` | Added an `OpenCodeIcon` component with SVG rendering to support the new OpenCode provider command display. |
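The `ai.ts` pieces listed above can be sketched as follows. This is a minimal, self-contained illustration, not the PR's verbatim code: the type, constant, and function names come from the summary, while the `Promise.all` wiring and the stub checkers are assumptions (in the real app each checker calls the corresponding Tauri command via `invoke()`).

```typescript
// Sketch of the discovery flow summarized above. Stub checkers stand in
// for the real Tauri-backed checks (checkClaude, checkCodex,
// checkOpenCodeCli, checkOllama) and just return canned results here.
type AiProvider = "claude" | "codex" | "opencode" | "ollama";

// Fixed check/display order for providers.
const AI_PROVIDER_ORDER: AiProvider[] = ["claude", "codex", "opencode", "ollama"];

const providerCheckers: Record<AiProvider, () => Promise<boolean>> = {
  claude: async () => true,
  codex: async () => false,
  opencode: async () => true,
  ollama: async () => true,
};

// Run all checks in parallel; a checker that returns false or throws
// marks its provider as unavailable.
async function getAvailableAiProviders(): Promise<AiProvider[]> {
  const results = await Promise.all(
    AI_PROVIDER_ORDER.map(async (provider) => {
      try {
        const installed = await providerCheckers[provider]();
        return installed ? provider : null;
      } catch {
        return null;
      }
    }),
  );
  return results.filter((p): p is AiProvider => p !== null);
}
```

With the stub results above, discovery resolves to the providers whose checkers returned true, in `AI_PROVIDER_ORDER` order.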

Sequence Diagram

sequenceDiagram
    actor User
    participant CommandPalette
    participant AI Service
    participant CLI/System
    
    User->>CommandPalette: Opens palette (open=true)
    activate CommandPalette
    CommandPalette->>CommandPalette: Effect triggered (open or currentNote change)
    CommandPalette->>AI Service: getAvailableAiProviders()
    activate AI Service
    AI Service->>CLI/System: checkClaude()
    AI Service->>CLI/System: checkCodex()
    AI Service->>CLI/System: checkOpenCodeCli()
    AI Service->>CLI/System: checkOllama()
    CLI/System-->>AI Service: availability results
    AI Service-->>CommandPalette: [available providers]
    deactivate AI Service
    CommandPalette->>CommandPalette: Generate aiCommands from providers
    CommandPalette->>CommandPalette: Merge into baseCommands
    CommandPalette->>User: Display dynamic AI commands
    deactivate CommandPalette

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs

Suggested reviewers

  • erictli

Poem

🐰 A new provider hops into view,
OpenCode joins the AI crew,
Dynamic discovery takes the lead,
Icons shine, commands succeed!
From static chains, now freedom's true. ✨

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |

✅ Passed checks (2 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped — CodeRabbit's high-level summary is enabled. |
| Title check | ✅ Passed | The title accurately summarizes the main change: adding OpenCode to AI provider auto-discovery, which is reflected throughout the changeset across three files. |

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


@coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (2)
src/services/ai.ts (1)

54-67: Consider logging provider check failures for debugging.

The silent catch block at lines 60-61 prevents discovery failures from surfacing. While this is appropriate for user experience (no error toasts needed), logging the error would aid debugging when a provider unexpectedly fails to be discovered.

🔧 Optional: Add error logging
       try {
         const installed = await providerCheckers[provider]();
         return installed ? provider : null;
-      } catch {
+      } catch (error) {
+        console.warn(`AI provider check failed for ${provider}:`, error);
         return null;
       }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/services/ai.ts` around lines 54 - 67, The provider discovery in
getAvailableAiProviders silences all errors in the try/catch, making failures
hard to debug; update the catch block inside the AI_PROVIDER_ORDER map (where
providerCheckers[provider]() is called) to log the caught error (including the
provider identifier) to the application logger or console.error before returning
null so discovery still fails silently to the user but errors are recorded for
debugging.
src/components/command-palette/CommandPalette.tsx (1)

161-204: Consider a data-driven approach for AI command definitions.

The if/else chain works correctly but could be simplified with a lookup object, making it easier to add new providers in the future.

♻️ Optional: Data-driven provider command mapping
const AI_COMMAND_CONFIG: Record<AiProvider, { id: string; label: string; icon: ReactNode }> = {
  claude: {
    id: "ai-edit-claude",
    label: "Edit with Claude Code",
    icon: <ClaudeIcon className="w-4.5 h-4.5 fill-text-muted" />,
  },
  codex: {
    id: "ai-edit-codex",
    label: "Edit with OpenAI Codex",
    icon: <CodexIcon className="w-4.5 h-4.5 fill-text-muted" />,
  },
  opencode: {
    id: "ai-edit-opencode",
    label: "Edit with OpenCode",
    icon: <OpenCodeIcon className="w-4.5 h-4.5 fill-text-muted" />,
  },
  ollama: {
    id: "ai-edit-ollama",
    label: "Edit with Ollama",
    icon: <OllamaIcon className="w-4.5 h-4.5 fill-text-muted" />,
  },
};

// Then in useMemo:
const aiCommands: Command[] = onOpenAiModal
  ? availableAiProviders.map((provider) => ({
      ...AI_COMMAND_CONFIG[provider],
      action: () => {
        onOpenAiModal(provider);
        onClose();
      },
    }))
  : [];
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/components/command-palette/CommandPalette.tsx` around lines 161 - 204,
Replace the long provider if/else chain that builds aiCommands with a
data-driven mapping: define an AI_COMMAND_CONFIG lookup (keyed by AiProvider
values like "claude", "codex", "opencode", "ollama") that contains id, label and
icon, then map availableAiProviders to Command objects by spreading
AI_COMMAND_CONFIG[provider] and adding the action that calls
onOpenAiModal(provider) and onClose(); update the aiCommands construction (the
useMemo that creates aiCommands) to use this lookup so adding new providers only
requires updating AI_COMMAND_CONFIG.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: d0929d86-82af-4e21-ab3d-eb51f69ab5a2

📥 Commits

Reviewing files that changed from the base of the PR and between 3331c1e and 3ba98b1.

📒 Files selected for processing (3)
  • src/components/command-palette/CommandPalette.tsx
  • src/components/icons/index.tsx
  • src/services/ai.ts

Comment on lines +39 to +41
export async function checkOpenCodeCli(): Promise<boolean> {
  return invoke("ai_check_opencode_cli");
}

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Verify ai_check_opencode_cli command registration in Rust backend

echo "=== Searching for ai_check_opencode_cli function definition ==="
rg -n "fn ai_check_opencode_cli" --type rust

echo ""
echo "=== Searching for ai_check_opencode_cli in codebase ==="
rg -n "ai_check_opencode_cli" --type rust

echo ""
echo "=== Listing all ai_check_* commands in lib.rs ==="
rg -n "ai_check_" src-tauri/src/lib.rs | head -30

echo ""
echo "=== Checking invoke_handler registration block ==="
sed -n '2900,2970p' src-tauri/src/lib.rs 2>/dev/null || echo "File or range not found, searching for invoke_handler instead:"
rg -A 80 "fn invoke_handler" src-tauri/src/lib.rs | head -100

Repository: erictli/scratch

Length of output: 3059


Backend command ai_check_opencode_cli is not implemented.

The Tauri command ai_check_opencode_cli is invoked at lines 39-41, but this command is not defined in the Rust backend. Only ai_check_claude_cli, ai_check_codex_cli, and ai_check_ollama_cli are implemented and registered in src-tauri/src/lib.rs.

This violates the coding guideline: "Define all backend operations as Tauri commands in src-tauri/src/lib.rs and call them from frontend via invoke()". The function will fail at runtime until the backend implementation is added as part of PR #90.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/services/ai.ts` around lines 39 - 41, The frontend function
checkOpenCodeCli invokes the Tauri command "ai_check_opencode_cli" which doesn't
exist in the backend; add a corresponding Tauri command handler named
ai_check_opencode_cli to the backend command registry (same style as the
existing ai_check_claude_cli, ai_check_codex_cli, and ai_check_ollama_cli
handlers) and implement its logic to check the OpenCode CLI, then register that
command so invoke("ai_check_opencode_cli") succeeds at runtime.

@n00ki (Author) commented Mar 6, 2026

This PR is intentionally stacked and depends on #89 and #90.

ai_check_opencode_cli is added in #90 (backend src-tauri/src/lib.rs) and is intentionally not duplicated here to keep this PR focused on auto-discovery only.

Provider checks are wrapped in getAvailableAiProviders(), so if the OpenCode check command is unavailable, discovery returns null for that provider and OpenCode is simply not shown in the command palette.
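The fallback the author describes can be illustrated with a small sketch (hypothetical stand-ins, not the app's actual code: in the real app the opencode checker calls `invoke("ai_check_opencode_cli")`, and its rejection when the backend command is unregistered is what the catch absorbs):

```typescript
// Illustration of the fallback described above: if a checker rejects
// (as invoke("ai_check_opencode_cli") would when the backend command is
// not registered), discovery swallows the error and omits that provider.
type AiProvider = "claude" | "opencode";

const checkers: Record<AiProvider, () => Promise<boolean>> = {
  claude: async () => true,
  // Simulates the unregistered Tauri command: the promise rejects.
  opencode: async () => {
    throw new Error("command ai_check_opencode_cli not found");
  },
};

async function discover(): Promise<AiProvider[]> {
  const results = await Promise.all(
    (Object.keys(checkers) as AiProvider[]).map(async (p) => {
      try {
        return (await checkers[p]()) ? p : null;
      } catch {
        return null; // provider is simply not shown in the palette
      }
    }),
  );
  return results.filter((p): p is AiProvider => p !== null);
}
```

Under this sketch, discovery still resolves with the working providers even when one checker's backend command is missing, which matches the behavior the author relies on for stacking.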
