Conversation
mapJsonToReport() was reading the event-level model_probability (which corresponds to the first outcome) instead of looking up the per-market probability from outcome_probabilities_json. This caused wildly incorrect edge calculations for elections, sports, and entertainment markets. Now checks outcome_probabilities_json for the specific market ticker before falling back to the event-level value.
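The per-market lookup with event-level fallback can be sketched as below. The JSON entry shape and the helper name `findOutcomeProb` are assumptions for illustration, not the actual code; case-insensitive ticker matching is taken from the later commit that tests it.

```typescript
// Hypothetical shape of one entry in outcome_probabilities_json;
// field names are assumptions, not the real Octagon schema.
interface OutcomeProb {
  ticker: string;
  model_probability: number;
}

// Prefer the entry matching the market ticker (case-insensitively);
// fall back to the event-level probability when no match is found,
// the JSON is missing, or it fails to parse.
function findOutcomeProb(
  outcomeProbsJson: string | null,
  marketTicker: string,
  eventLevelProb: number,
): number {
  if (!outcomeProbsJson) return eventLevelProb;
  let outcomes: unknown;
  try {
    outcomes = JSON.parse(outcomeProbsJson);
  } catch {
    return eventLevelProb;
  }
  if (!Array.isArray(outcomes)) return eventLevelProb;
  const wanted = marketTicker.toLowerCase();
  for (const o of outcomes as OutcomeProb[]) {
    if (
      typeof o?.ticker === "string" &&
      o.ticker.toLowerCase() === wanted &&
      Number.isFinite(o.model_probability)
    ) {
      return o.model_probability;
    }
  }
  return eventLevelProb;
}
```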
…t probs hasExplicitModelProb now considers probabilities extracted from outcome_probabilities_json, not just event-level fields. Also adds marketProb assertion to the fallback test.
Calls GET /v1/prediction-markets/events at startup to bulk-populate octagon_reports and edge_history with the latest snapshots for all analyzed events. Runs in background with 1h cooldown to avoid hammering the API on rapid CLI invocations.
Introduce TypeScript interfaces for backtest options, snapshots, resolved/unresolved markets, and results. Add statistical metrics module with Brier skill score, bootstrap confidence intervals, hit rate, and flat-bet P&L computation.
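A minimal sketch of the Brier machinery named above, using the standard definitions (mean squared error of probabilities; skill relative to the market's Brier score). The actual metrics module may differ in detail.

```typescript
// Brier score: mean squared error between forecast probabilities (0-1)
// and binary outcomes (0 or 1). Lower is better.
function brierScore(probs: number[], outcomes: number[]): number {
  if (probs.length === 0 || probs.length !== outcomes.length) {
    throw new Error("probs and outcomes must be equal-length and non-empty");
  }
  let sum = 0;
  for (let i = 0; i < probs.length; i++) {
    sum += (probs[i] - outcomes[i]) ** 2;
  }
  return sum / probs.length;
}

// Skill > 0 means the model's Brier score beats the market's baseline.
function brierSkill(
  modelProbs: number[],
  marketProbs: number[],
  outcomes: number[],
): number {
  const market = brierScore(marketProbs, outcomes);
  if (market === 0) return 0; // market was perfect; nothing to beat
  return 1 - brierScore(modelProbs, outcomes) / market;
}
```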
- Create src/backtest/fetcher.ts with fetchEventHistory, fetchAndCacheHistory, and selectSnapshot functions for backtest data retrieval and caching
- Add octagon_history table to DB schema with event_ticker/history_id unique index
- Add has_history column migration to octagon_reports table
- `bun start backtest` runs both resolved and unresolved views
- `--resolved` / `--unresolved` for individual views
- `--category`, `--from`, `--to`, `--min-edge`, `--min-hours-before-close` filters
- `--export results.csv` for per-market detail export
- Parallel Kalshi API calls (10 concurrent) for fast discovery
- Prefetch marks events with has_history for efficient resolved discovery
…call The events endpoint now returns has_history per event, so we read it directly during prefetch instead of making a separate filtered request. Gracefully defaults to false when the field is not present yet.
…pshot flag

- Filter out multi-market events (brackets, multi-outcome) from both resolved and unresolved views since event-level model_prob does not apply to individual market tickers within the event
- Store mutually_exclusive and series_category from events API in octagon_reports
- Add category breakdown to resolved scorecard output
- Pass actual --min-edge value to renderer header (was hardcoded to 5pp)
- Add --snapshot last flag to skip lead-time filter for resolved markets
- Show snapshot label (last-24h / last) in resolved output
- CSV export verified working
…or handling

- Remove logger.warn from getDb() catch and per-page logger.info from events API
- Only run prefetchOctagonEvents for the runtime DB path, not test/memory DBs
- Validate events API JSON response shape before pagination (data array, has_more boolean, next_cursor presence when has_more=true)
- Handle NaN from parseInt in shouldPrefetch cooldown check
- Only swallow UNIQUE constraint errors in edge insert catch, rethrow others
- Remove unused logger import from octagon-prefetch
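The NaN guard in the cooldown check can be sketched as follows. The constant, parameter names, and the "treat a corrupt value as never ran" policy are illustrative assumptions.

```typescript
// The last-run timestamp comes from a DB metadata row stored as text,
// so parseInt can yield NaN on a corrupt or empty value.
const PREFETCH_COOLDOWN_MS = 60 * 60 * 1000; // 1h

function shouldPrefetch(storedEpoch: string | null, nowMs: number): boolean {
  if (storedEpoch === null) return true; // never ran
  const last = parseInt(storedEpoch, 10);
  if (Number.isNaN(last)) return true; // corrupt value: treat as never ran
  return nowMs - last >= PREFETCH_COOLDOWN_MS;
}
```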
…vation

fetcher.ts:
- Remove logger import and cache log call
- Introduce narrow HistorySnapshot type instead of faking OctagonEventEntry
- Skip cache when time window (capturedFrom/capturedTo) is provided
- Only write to cache for full-history requests

metrics.ts:
- Validate bootstrapCI parameters (iterations > 0, alpha in (0,1))
- Clamp percentile indices to valid range
- Exclude neutral edges (edge_pp === 0) from edge signals
- Bootstrap skill CI now resamples both octagon and market brier scores instead of fixing the market denominator
- ROI uses totalDeployed (edgeCount * stakePerEdge) as denominator

octagon-cache.ts:
- Preserve has_history, mutually_exclusive, series_category columns on INSERT OR REPLACE by reading existing values before write
- Normalize date-only --to values to end-of-day (23:59:59.999Z) so markets closing later that day are included in the range
- Add CSV cell escaping for values with commas, quotes, or newlines
- Reject --resolved and --unresolved used together with a clear error
- Validate captured_at and close_time timestamps before computing TTL/epochs, skip events with invalid dates
- Replace as-any casts in fetchEventMarkets with proper type narrowing
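The first two fixes can be sketched like this: RFC 4180-style cell escaping, and widening a date-only `--to` bound to the end of that UTC day. Function names are illustrative.

```typescript
// RFC 4180: wrap a cell in quotes if it contains a comma, quote, or
// newline, and double any embedded quotes.
function escapeCsvCell(value: string): string {
  if (/[",\n\r]/.test(value)) {
    return `"${value.replace(/"/g, '""')}"`;
  }
  return value;
}

// A bare YYYY-MM-DD --to bound means "through that whole day", not
// midnight, so markets closing later that day stay in range.
function normalizeToBound(to: string): string {
  if (/^\d{4}-\d{2}-\d{2}$/.test(to)) return `${to}T23:59:59.999Z`;
  return to;
}
```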
…tories

Add resolved, unresolved, snapshotLast defaults to makeParsedArgs/makeArgs helpers in e2e tests, gateway handler, and commands/index inline objects.
…heck

- Remove logger import and all logger.info calls from discovery.ts
- Remove dead meSet/meSet2 code (markets.length > 1 catches everything)
- Make HistorySnapshot nullable fields match DB schema (name, series_category, confidence_score, edge_pp, close_time)
- Replace read-before-write in insertReport with atomic INSERT ON CONFLICT that preserves has_history, mutually_exclusive, series_category
- Tighten edge insert catch to only match "UNIQUE constraint failed"
- Add Number.isFinite/isInteger validation to bootstrapCI parameters
- Remove redundant days>=30 branch in time formatter
- Centralize default ParsedArgs via defaultArgs() helper in commands/index.ts
…r display

- Include mutually_exclusive in discovery queries and skip those events even when only one market is visible on Kalshi
- Pass null instead of empty string for nullable DB columns in history cache
- Show minutes (Xm) or < 1m for sub-hour time-until values
- Remove all logger imports and calls from backtest command handler
- Remove unused historyCount variable from prefetch transaction
SELECT DISTINCT on (event_ticker, series_category, mutually_exclusive) could return duplicate event_tickers if metadata changed between rows. Use GROUP BY event_ticker with MAX() aggregates instead.
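The deduplication can be illustrated in plain TypeScript over sample rows. The row shape mirrors the columns named above, and MAX() semantics are approximated per column (NULLs ignored, text compared lexically); this is a sketch of the behavior, not the actual SQL.

```typescript
interface EventRow {
  event_ticker: string;
  series_category: string | null;
  mutually_exclusive: number; // SQLite boolean: 0 or 1
}

// GROUP BY event_ticker with MAX() aggregates: exactly one output row
// per ticker, with conflicting metadata collapsed deterministically.
function dedupeEvents(rows: EventRow[]): EventRow[] {
  const byTicker = new Map<string, EventRow>();
  for (const row of rows) {
    const prev = byTicker.get(row.event_ticker);
    if (!prev) {
      byTicker.set(row.event_ticker, { ...row });
      continue;
    }
    // MAX() semantics: the greater value wins per column, independently.
    prev.mutually_exclusive = Math.max(prev.mutually_exclusive, row.mutually_exclusive);
    if (
      row.series_category !== null &&
      (prev.series_category === null || row.series_category > prev.series_category)
    ) {
      prev.series_category = row.series_category;
    }
  }
  return [...byTicker.values()];
}
```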
parsePrice used || which treats a valid 0 price as falsy; now uses Number.isFinite check. Extracted buildEventQuery helper to de-duplicate the two identical SQL assembly blocks for settled/open discovery.
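The bug and its fix, side by side; the fallback parameter is illustrative.

```typescript
// Buggy version for contrast: a valid 0-cent price is falsy, so it
// incorrectly falls through to the fallback.
function parsePriceBuggy(raw: unknown, fallback: number): number {
  return Number(raw) || fallback;
}

// Fixed version: only genuinely non-numeric input uses the fallback.
function parsePrice(raw: unknown, fallback: number): number {
  const n = Number(raw);
  return Number.isFinite(n) ? n : fallback;
}
```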
Added detailed help topic for backtest command with all flags/examples, and added backtest to both CLI and slash command overview sections.
- Add /backtest case to handleSlashCommand with flag parsing for --resolved, --unresolved, --category, --from, --to, --min-edge, --min-hours-before-close, --snapshot last
- Round edge_pp to 1 decimal place in both resolved and unresolved paths to eliminate floating point noise in JSON output
- Schema migration adds confidence_score REAL column
- Prefetch persists confidence_score from events API response
- Backtest unresolved view reads confidence_score from DB instead of hardcoding 0, so the Conf column now shows real values
…outcome events

The events and history APIs now return outcome_probabilities with per-market model/market probabilities. This enables backtesting ALL events, not just single-market ones.

- Add outcome_probabilities to OctagonEventEntry interface
- Schema migration: outcome_probabilities_json on octagon_reports and octagon_history
- Prefetch stores outcome_probabilities_json from events API
- History fetcher stores/reads outcome_probabilities_json in cache
- Backtest resolved/unresolved paths look up per-market probability from outcome_probabilities, falling back to event-level when not available
- Remove mutually_exclusive and markets.length > 1 filters from discovery

Results: 1437 markets evaluated (up from 5), Skill +9.7% [CI: +1.2%, +17.7%], hit rate 66.9%, ROI +15.9%.
- backtest.ts: Narrow try/catch to only wrap fetchAndCacheHistory, not snapshot selection and market mapping. Remove capturedTo arg so full history is fetched once and cached for repeat runs.
- octagon-prefetch.ts: Write metadata (has_history, mutually_exclusive, series_category, confidence_score, outcome_probabilities_json) per report_id atomically in persistEvent instead of event-wide UPDATE that overwrote historical rows.
- discovery.ts: Remove unused mutually_exclusive column from queries since per-market probs from outcome_probabilities handle all events.
- schema.ts: Add outcome_probabilities_json to CREATE TABLE for octagon_history so fresh installs match migrations.
- octagon-events-api.ts: Remove unused pageNum variable.
Added backtest command to commands table, backtest-specific flags to flags table, and a dedicated Backtesting section with usage examples and sample output.
Updated all references across package.json, package-lock.json, README, GUIDE, AGENTS.md, help text, agent prompts, setup wizard, intro screen, and WhatsApp gateway docs.
The /backtest slash command now returns "Running backtest on ..." as an immediate message, then runs the actual backtest as an asyncFollowUp while the spinner stays active. The result is appended when complete. Added asyncFollowUp to CommandResult interface for any slash command that needs to show immediate feedback before a long-running operation.
When the Octagon history API returns 402/403 (paid subscription required), show a clear message instead of silently failing. The unresolved edge scanner (free events API) still runs. The resolved scorecard section displays the upgrade notice with a link to app.octagonai.co. Added SubscriptionRequiredError class and subscription_notice field on BacktestResult for both human and JSON output.
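A sketch of the gating logic: the class name matches the commit, but the constructor shape, message text, and the status-check helper are assumptions.

```typescript
// Raised when the Octagon history API responds 402/403, meaning the
// feature is behind a paid subscription.
class SubscriptionRequiredError extends Error {
  constructor(public readonly status: number) {
    super(
      `Octagon history API requires a paid subscription (HTTP ${status}). ` +
        `See app.octagonai.co to upgrade.`,
    );
    this.name = "SubscriptionRequiredError";
  }
}

// Callers catch this error type and render the upgrade notice instead
// of failing the whole backtest; other statuses pass through untouched.
function throwIfSubscriptionGated(status: number): void {
  if (status === 402 || status === 403) {
    throw new SubscriptionRequiredError(status);
  }
}
```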
Replaces the separate resolved/unresolved flows with a single unified approach: look back N days, compare model then vs market now.

- --days <n> flag replaces --from/--to/--min-hours-before-close/--snapshot
- Both resolved and unresolved markets get full metrics (Brier, Skill, Hit Rate, P&L) using the same scoring logic
- Resolved: outcome is Kalshi settlement (0 or 100)
- Unresolved: outcome is current Kalshi trading price (M2M P&L)
- Unified ScoredSignal type with model_prob, market_then, market_now
- Single scorecard at top, detail tables for resolved/unresolved below
- New P&L formula: C - P for YES, P - C for NO (simpler, works for both)
- Updated help text, README, TUI slash command args
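The unified P&L formula above is small enough to state directly; the function name is illustrative. Prices are in cents (0-100): for resolved markets C is the settlement (0 or 100), for unresolved markets C is the current trading price, which yields mark-to-market P&L from the same formula.

```typescript
// Per-contract flat-bet P&L in cents: C - P for YES, P - C for NO,
// where P is the entry price and C is the settlement or current price.
function flatBetPnl(
  side: "YES" | "NO",
  entryPrice: number,
  currentPrice: number,
): number {
  return side === "YES" ? currentPrice - entryPrice : entryPrice - currentPrice;
}
```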
…saction

- types.ts: Clarify minEdge comment to note fractional 0-1 scale converted to pp by caller
- octagon-prefetch.ts: Wrap insertReport + metadata UPDATE in db.transaction so readers cannot see a partial row between the two operations
…gs, add test

- package.json: Add kalshi-deep-trading-bot bin alias for backward compat
- fetcher.ts: Validate history API response shape (data array, has_more boolean, next_cursor) matching events API validation pattern
- parse-args.ts: Remove --from/--to flags (replaced by --days, were silently ignored)
- octagon-client.test.ts: Add test for outcome_probabilities_json as array (not string) with case-insensitive ticker matching
New subcommand: search edge scans all cached Octagon events for markets with model edge above a threshold. Reads directly from SQLite — zero API calls, instant response.

    bun start search edge                      # top 20, ≥5pp edge
    bun start search edge --min-edge 30        # ≥30pp edge
    bun start search edge --limit 50           # top 50
    bun start search edge --category crypto    # filter by category

Scans 14,400+ markets across 725 events in <1 second. Added --limit flag to parse-args. Updated help, dispatch, TUI, README.
Added tryFromPrefetch() to OctagonClient that builds an OctagonReport from the local SQLite cache (octagon_reports with variant_used='events-api') including per-market probabilities from outcome_probabilities_json. Wired into 4 code paths:

- Scan loop (edge-computer.ts): skips cache API call when prefetch is fresh, only calls Octagon when shouldRefresh() triggers a refresh
- Browse hydration (browse.ts): extractAllOutcomeProbs() reads from prefetch DB first, falls back to API only if not found
- Browse report (browse.ts): same optimization via extractAllOutcomeProbs
- Analyze command: tries prefetch before calling Octagon API

Expected impact: ~90% reduction in individual cache API calls for scan loops (60-150 calls per cycle -> only refreshes), plus savings on browse and analyze paths.
…deduplicate edge query

- README.md: Add 'text' language hint to fenced code block (MD040)
- backtest.ts: Remove subscriptionNotice guard from unresolved path so the free events API scan always runs even when history API is gated
- search-edge.ts: Deduplicate query by event_ticker using GROUP BY + JOIN on max(fetched_at) to avoid inflated market counts
The ⏺ prefix in AnswerBox now animates with braille block frames (⣾⣽⣻⢿⡿⣟⣯⣷) at 80ms during async follow-up operations like /backtest. Reverts to static ⏺ when the operation completes.

- AnswerBox: added startSpinner(tui) and stopSpinner() methods
- ChatLog: finalizeAnswer now returns the AnswerBoxComponent
- cli.ts: starts spinner on progress message, stops on completion
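The frame selection can be sketched as a pure function of elapsed time, assuming the frames cycle at a fixed 80ms interval; the actual component presumably drives this from a timer.

```typescript
// The braille frames and interval from the commit above; frameAt picks
// which frame to show after a given elapsed time.
const FRAMES = ["⣾", "⣽", "⣻", "⢿", "⡿", "⣟", "⣯", "⣷"];
const FRAME_INTERVAL_MS = 80;

function frameAt(elapsedMs: number): string {
  return FRAMES[Math.floor(elapsedMs / FRAME_INTERVAL_MS) % FRAMES.length];
}
```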
Note: Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings.
Walkthrough

Repo rebrand to “Kalshi Trading Bot CLI”; adds backtesting and edge-scanning subsystems, Octagon events prefetch/history caching, DB schema changes, and CLI slash-command/autocomplete support for the new commands.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant CLI
    participant DB as SQLite_DB
    participant OctagonAPI as Octagon_API
    participant Metrics
    participant Renderer
    User->>CLI: /backtest --days 30 --resolved
    CLI->>DB: discoverSettledMarkets(opts)
    DB-->>CLI: event tickers
    CLI->>OctagonAPI: fetchAndCacheHistory(event_ticker) or DB: read octagon_history
    alt Cache exists
        DB-->>CLI: Return snapshots
    else
        OctagonAPI-->>CLI: Return snapshots
        CLI->>DB: INSERT snapshots into octagon_history
    end
    CLI->>CLI: selectSnapshotByDate() -> build ScoredSignals
    CLI->>Metrics: computeMetrics(signals, minEdge)
    Metrics-->>CLI: BacktestResult
    CLI->>Renderer: formatBacktestHuman(result)
    Renderer-->>CLI: Human-readable report
    CLI-->>User: Display backtest report
    opt exportPath provided
        CLI->>Renderer: exportCSV(result, path)
        Renderer-->>User: CSV written
    end
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
🚥 Pre-merge checks | ✅ 1 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)
Actionable comments posted: 10
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (2)
src/commands/help.ts (1)
126-167: ⚠️ Potential issue | 🟡 Minor: Add `search edge` to the overview sections.

The detailed `search` topic documents the new subcommand, but the CLI and slash overviews still omit it, so `help` hides one of the new entry points.

As per coding guidelines: "`src/commands/help.ts`: When a command's flags or signature changes, update detailed help topic and overview section in `src/commands/help.ts`."

Also applies to: 170-210
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/commands/help.ts` around lines 126 - 167, The help overview string returned by the help command omits the new "search edge" subcommand; update the help text returned (the main help string in src/commands/help.ts) to include "search edge" in the Quick start and Discovery/overview sections (and mirror it in the second overview block later in the file), so both top-level CLI and detailed topic overviews list the new subcommand; locate the returned multi-line string (the function/constant that returns the big help message) and insert entries like "kalshi search edge Search markets by edge" and the matching discovery line where other search variants are listed.

README.md (1)
24-69: ⚠️ Potential issue | 🟡 Minor: Add a language to the Example Session fence.
The Example Session block is still an untyped fenced block, so markdownlint will keep failing with MD040.
`text` or `console` would be enough here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@README.md` around lines 24 - 69, The fenced Example Session code block under the "## Example Session" heading lacks a language identifier causing markdownlint MD040; update the triple-backtick fence that begins with "$ bun start" to include a language like "text" or "console" (e.g., change ``` to ```text) so the block is typed and MD040 is resolved.
🧹 Nitpick comments (2)
src/scan/octagon-prefetch.ts (1)
71-134: Edge insertion is outside the report transaction; an acceptable tradeoff.

The report is committed in the transaction (lines 71-106), then `insertEdge` runs separately (lines 110-134). If `insertEdge` fails with a non-UNIQUE error, the report persists but the edge record doesn't. This creates a minor inconsistency, but:
- The UNIQUE constraint failure (most common case) is correctly swallowed
- Edge history is supplementary analytics data, not critical path
- The report will be skipped on retry due to the freshness check at line 65
If perfect consistency is needed later, consider wrapping both operations in a single transaction.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/scan/octagon-prefetch.ts` around lines 71 - 134, The report insertion (insertReport) is committed inside db.transaction but insertEdge runs outside it, leaving a potential inconsistency if insertEdge fails; modify the code so insertEdge is executed inside the same db.transaction callback that calls insertReport (use the existing db.transaction wrapper around insertReport and include the insertEdge call there), preserving the UNIQUE-error swallowing logic for insertEdge and using the same capturedAt/reportId/modelProb/marketProb variables so both report and edge persist atomically.

src/commands/backtest.ts (1)
110-168: Load open-market event data once per `event_ticker`.

This branch re-runs the same `octagon_reports` lookup, history probe, history row lookup, and JSON parse for every market in the event. Grouping `openMarkets` by `event_ticker` like the resolved path would cut the SQLite work to once per event and remove a lot of duplicated fallback logic.

As per coding guidelines, "Keep files concise; extract helpers rather than duplicating code."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/commands/backtest.ts` around lines 110 - 168, The loop over openMarkets duplicates the same DB work and JSON parsing per market; refactor by grouping openMarkets by event_ticker, then for each event_ticker run the octagon_reports query and the history COUNT + history row lookup (queries against octagon_reports and octagon_history using db.query) once, parse outcome_probabilities_json once, and compute event-level model/mkt snapshots (modelProb, marketThen, confidenceScore) and per-event outcomes; then iterate the markets for that event and call findOutcomeProb for each m.ticker to derive per-market values. Extract this logic into a helper (e.g., buildEventSnapshot or getEventProbabilities) that returns the parsed outcomes and fallback model/market values so the main loop only iterates markets and avoids repeated DB queries and JSON.parse.
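The grouping this comment suggests can be sketched with a generic helper; the name and item shape are illustrative, not the actual backtest.ts code.

```typescript
// Group items by a string key, preserving insertion order per bucket.
// Used here to batch markets by event_ticker so per-event DB work
// (report lookup, history probe, JSON parse) runs once per event.
function groupBy<T>(items: T[], key: (item: T) => string): Map<string, T[]> {
  const groups = new Map<string, T[]>();
  for (const item of items) {
    const k = key(item);
    const bucket = groups.get(k);
    if (bucket) bucket.push(item);
    else groups.set(k, [item]);
  }
  return groups;
}
```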
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@CLAUDE.md`:
- Line 13: Update the sentence on CLAUDE.md that currently mandates adding
`README.md — commands table, flags table, and examples` (line mentioning README)
so it aligns with repo policy: change it from a hard requirement to a
conditional instruction (e.g., only add or update README.md when documentation
changes are explicitly requested) and/or add a short note referencing the "Do
not create README or documentation files unless explicitly asked" guidance;
ensure the change edits the exact text that mentions `README.md — commands
table, flags table, and examples` so contributors understand README updates are
conditional.
In `@package.json`:
- Around line 7-10: The package advertises a `kalshi` CLI but the package.json
"bin" only exposes "kalshi-trading-bot-cli" and "kalshi-deep-trading-bot", so
global installs won't provide the advertised executable; either add a "kalshi":
"./src/index.tsx" entry to the "bin" object in package.json (or rename an
existing bin key to "kalshi") or change the help text in the CLI entrypoint
(index.tsx) to use the actual binaries; update package.json's "bin" to include
the exact executable name your help output references or align the help text in
the CLI code (index.tsx) to the current bin names.
In `@src/backtest/discovery.ts`:
- Around line 31-44: fetchEventMarkets currently swallows all errors from
callKalshiApi and returns an empty array, making outages/rate-limits
indistinguishable from "no markets"; update fetchEventMarkets to surface
failures instead of hiding them by removing the broad try/catch or rethrowing
the caught error with context so callers can detect partial discovery (e.g.,
catch error from callKalshiApi inside fetchEventMarkets and throw a new Error
including eventTicker and the original error), keep the function signature and
return type (Promise<KalshiMarket[]>) but ensure any Kalshi API failure bubbles
up rather than returning [] so callers can handle or log the outage
appropriately.
In `@src/commands/backtest.ts`:
- Around line 39-42: The call to discoverSettledMarkets ignores the lookback
window so resolved runs still load all historical markets; update the call in
backtest (the block using args.unresolved and discoverSettledMarkets) to pass
the lookback/days parameter (e.g., include days: args.days or
lookbackDate/lookbackWindow) in the options object so discoverSettledMarkets can
limit markets by the requested window; ensure any name you pass matches the
parameter expected by discoverSettledMarkets (or adjust that function signature)
so markets with has_history = 1 are filtered by the lookback date before
scoring.
In `@src/commands/search-edge.ts`:
- Around line 32-44: The current inner subquery (variable inner) only picks
MAX(fetched_at) per event_ticker which can return multiple sibling rows with the
same timestamp; change the selection to be deterministic by tie-breaking on
report_id or by using a windowed rank to keep exactly one row per event_ticker.
For example, either include MAX(report_id) alongside MAX(fetched_at) in the
aggregation and join on both fetched_at and report_id, or replace the inner
aggregation with a query that uses ROW_NUMBER() OVER (PARTITION BY event_ticker
ORDER BY fetched_at DESC, report_id DESC) and then join/filter where row_number
= 1 so the final query (query variable) returns exactly one latest row per
octagon_reports.event_ticker.
In `@src/components/chat-log.ts`:
- Around line 256-265: The finalizeAnswer method creates a new
AnswerBoxComponent when this.activeAnswer is falsy but never assigns it to
this.activeAnswer, so later async follow-ups won’t reuse the same box; update
finalizeAnswer to set this.activeAnswer = box immediately after creating and
addChild(box) so the reuse path (this.activeAnswer.setText(...) / clearing
this.activeAnswer) will execute correctly for subsequent updates; keep the
existing behavior of setText and clearing activeAnswer in the else path.
In `@src/components/working-indicator.ts`:
- Around line 64-65: The code bypasses typing by using (this.loader as
any).frames; instead introduce a proper typed augmentation for the spinner and
remove the any cast: declare an interface (e.g., SpinnerWithFrames { frames:
string[] }) and update the WorkingIndicator's loader property type to Loader &
SpinnerWithFrames (or change its declaration to that type), then assign the
frames array directly to this.loader.frames; this keeps strong typing while
allowing frame customization without relying on an any cast or undocumented
runtime assumptions.
In `@src/controllers/browse.ts`:
- Around line 887-915: The prefetch branch in getDb()/browse handler is taking
the first octagon_reports row with outcome_probabilities_json without honoring
TTL — update the logic to only short-circuit when the prefetched snapshot is
unexpired by checking octagon_reports.expires_at (either add "AND expires_at >
CURRENT_TIMESTAMP" to the SQL or read expires_at from the row and verify it's >
Date.now() before using outcome_probabilities_json); keep using the same outcome
parsing and probs.set flow but only return probs when the snapshot is still
valid for use.
In `@src/db/index.ts`:
- Around line 30-34: The background prefetch currently swallows all errors
(catch(() => {})) for prefetchOctagonEvents called when dbPath ===
DEFAULT_DB_PATH which hides failures; change the catch to capture the thrown
error and surface it — e.g., log using your existing logger
(processLogger.error) and store the error in a retrievable place such as an
exported variable (e.g., octagonPrefetchError), a diagnostics API, or attach it
to the db instance (e.g., _db.prefetchError) so callers or diagnostics can
observe the failure; ensure you still run prefetchOctagonEvents(db).catch(err =>
{ processLogger.error('Octagon prefetch failed', err); /* store err in
octagonPrefetchError or _db.prefetchError or report to diagnostics */ }).
In `@src/db/schema.ts`:
- Around line 204-223: The has_history flag in octagon_reports can become stale
because octagon_history rows are inserted in fetcher.ts (fetcher functions
around lines 114-177) but only octagon-prefetch.ts (writer in its 52-106 region)
updates octagon_reports.has_history; to fix, ensure the flag is synchronized
whenever history rows are written by either (A) updating the insert path in
fetcher.ts to also UPDATE octagon_reports SET has_history=1 for the
corresponding report_id after inserting into octagon_history, or (B) add a
DB-level AFTER INSERT trigger on octagon_history that sets
octagon_reports.has_history=1, and ensure any deletions of history rows also
clear the flag or recompute it from octagon_history; pick one approach and apply
it to the functions that insert/delete history rows so the has_history column
always reflects actual octagon_history content.
---
Outside diff comments:
In `@README.md`:
- Around line 24-69: The fenced Example Session code block under the "## Example
Session" heading lacks a language identifier causing markdownlint MD040; update
the triple-backtick fence that begins with "$ bun start" to include a language
like "text" or "console" (e.g., change ``` to ```text) so the block is typed and
MD040 is resolved.
In `@src/commands/help.ts`:
- Around line 126-167: The help overview string returned by the help command
omits the new "search edge" subcommand; update the help text returned (the main
help string in src/commands/help.ts) to include "search edge" in the Quick start
and Discovery/overview sections (and mirror it in the second overview block
later in the file), so both top-level CLI and detailed topic overviews list the
new subcommand; locate the returned multi-line string (the function/constant
that returns the big help message) and insert entries like "kalshi search edge
Search markets by edge" and the matching discovery line where other search
variants are listed.
---
Nitpick comments:
In `@src/commands/backtest.ts`:
- Around line 110-168: The loop over openMarkets duplicates the same DB work and
JSON parsing per market; refactor by grouping openMarkets by event_ticker, then
for each event_ticker run the octagon_reports query and the history COUNT +
history row lookup (queries against octagon_reports and octagon_history using
db.query) once, parse outcome_probabilities_json once, and compute event-level
model/mkt snapshots (modelProb, marketThen, confidenceScore) and per-event
outcomes; then iterate the markets for that event and call findOutcomeProb for
each m.ticker to derive per-market values. Extract this logic into a helper
(e.g., buildEventSnapshot or getEventProbabilities) that returns the parsed
outcomes and fallback model/market values so the main loop only iterates markets
and avoids repeated DB queries and JSON.parse.
In `@src/scan/octagon-prefetch.ts`:
- Around line 71-134: The report insertion (insertReport) is committed inside
db.transaction but insertEdge runs outside it, leaving a potential inconsistency
if insertEdge fails; modify the code so insertEdge is executed inside the same
db.transaction callback that calls insertReport (use the existing db.transaction
wrapper around insertReport and include the insertEdge call there), preserving
the UNIQUE-error swallowing logic for insertEdge and using the same
capturedAt/reportId/modelProb/marketProb variables so both report and edge
persist atomically.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: f542d509-8422-49b7-97ec-3f39453bfb61
⛔ Files ignored due to path filters (1)
`package-lock.json` is excluded by `!**/package-lock.json`
📒 Files selected for processing (36)
- `AGENTS.md`
- `CLAUDE.md`
- `GUIDE.md`
- `README.md`
- `package.json`
- `src/__tests__/e2e.test.ts`
- `src/agent/prompts.ts`
- `src/backtest/discovery.ts`
- `src/backtest/fetcher.ts`
- `src/backtest/metrics.ts`
- `src/backtest/renderer.ts`
- `src/backtest/types.ts`
- `src/cli.ts`
- `src/commands/analyze.ts`
- `src/commands/backtest.ts`
- `src/commands/dispatch.ts`
- `src/commands/help.ts`
- `src/commands/index.ts`
- `src/commands/parse-args.ts`
- `src/commands/search-edge.ts`
- `src/components/answer-box.ts`
- `src/components/chat-log.ts`
- `src/components/intro.ts`
- `src/components/working-indicator.ts`
- `src/controllers/browse.ts`
- `src/db/index.ts`
- `src/db/octagon-cache.ts`
- `src/db/schema.ts`
- `src/gateway/channels/whatsapp/README.md`
- `src/gateway/commands/handler.ts`
- `src/scan/__tests__/octagon-client.test.ts`
- `src/scan/edge-computer.ts`
- `src/scan/octagon-client.ts`
- `src/scan/octagon-events-api.ts`
- `src/scan/octagon-prefetch.ts`
- `src/setup/wizard.ts`
- `src/commands/dispatch.ts` — CLI dispatch block
- `src/cli.ts` — autocomplete `slashCommands` array
- `src/components/intro.ts` — welcome screen command list
- `README.md` — commands table, flags table, and examples
Clarify workflow precedence for README changes.
Line 13 mandates README updates, but repo guidance says docs should not be created/updated unless explicitly requested. Please resolve this conflict to avoid inconsistent contributor behavior.
Based on learnings "Do not create README or documentation files unless explicitly asked".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@CLAUDE.md` at line 13, Update the sentence on CLAUDE.md that currently
mandates adding `README.md — commands table, flags table, and examples` (line
mentioning README) so it aligns with repo policy: change it from a hard
requirement to a conditional instruction (e.g., only add or update README.md
when documentation changes are explicitly requested) and/or add a short note
referencing the "Do not create README or documentation files unless explicitly
asked" guidance; ensure the change edits the exact text that mentions `README.md
— commands table, flags table, and examples` so contributors understand README
updates are conditional.
    "bin": {
      "kalshi-trading-bot-cli": "./src/index.tsx",
      "kalshi-deep-trading-bot": "./src/index.tsx"
    },
Expose the binary name your help output advertises.
The CLI help still shows commands as kalshi ..., but this package only installs kalshi-trading-bot-cli and kalshi-deep-trading-bot. A global install won't have the executable the built-in help tells users to run.
One possible fix
"bin": {
+ "kalshi": "./src/index.tsx",
"kalshi-trading-bot-cli": "./src/index.tsx",
"kalshi-deep-trading-bot": "./src/index.tsx"
},📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```json
"bin": {
  "kalshi": "./src/index.tsx",
  "kalshi-trading-bot-cli": "./src/index.tsx",
  "kalshi-deep-trading-bot": "./src/index.tsx"
},
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@package.json` around lines 7 - 10, The package advertises a `kalshi` CLI but
the package.json "bin" only exposes "kalshi-trading-bot-cli" and
"kalshi-deep-trading-bot", so global installs won't provide the advertised
executable; either add a "kalshi": "./src/index.tsx" entry to the "bin" object
in package.json (or rename an existing bin key to "kalshi") or change the help
text in the CLI entrypoint (index.tsx) to use the actual binaries; update
package.json's "bin" to include the exact executable name your help output
references or align the help text in the CLI code (index.tsx) to the current bin
names.
```ts
/** Fetch event markets from Kalshi, returning empty array on error. */
async function fetchEventMarkets(eventTicker: string): Promise<KalshiMarket[]> {
  try {
    const response = await callKalshiApi('GET', `/events/${eventTicker}`, {
      params: { with_nested_markets: true },
    });
    if (!response || typeof response !== 'object') return [];
    const obj = response as Record<string, unknown>;
    const event = (obj.event ?? obj) as Record<string, unknown>;
    const markets = event.markets;
    return Array.isArray(markets) ? markets as KalshiMarket[] : [];
  } catch {
    return [];
  }
```
Don’t hide Kalshi fetch failures as empty events.
Lines 42-43 make an outage or rate-limit indistinguishable from “this event has no nested markets”. That silently drops data from both discovery paths, so downstream callers can return a successful but incomplete backtest instead of surfacing that discovery was partial.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/backtest/discovery.ts` around lines 31 - 44, fetchEventMarkets currently
swallows all errors from callKalshiApi and returns an empty array, making
outages/rate-limits indistinguishable from "no markets"; update
fetchEventMarkets to surface failures instead of hiding them by removing the
broad try/catch or rethrowing the caught error with context so callers can
detect partial discovery (e.g., catch error from callKalshiApi inside
fetchEventMarkets and throw a new Error including eventTicker and the original
error), keep the function signature and return type (Promise<KalshiMarket[]>)
but ensure any Kalshi API failure bubbles up rather than returning [] so callers
can handle or log the outage appropriately.
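The rethrow-with-context pattern the prompt describes can be sketched in isolation. This is a hypothetical sketch, not the project's code: `EventFetchError` and `fetchWithContext` are made-up names, and in `fetchEventMarkets` the wrapped call would be the real `callKalshiApi` request.

```typescript
// Hypothetical error type carrying the event context so callers can
// distinguish "no markets" from an outage or rate limit.
class EventFetchError extends Error {
  constructor(
    public readonly eventTicker: string,
    cause: unknown,
  ) {
    super(`Failed to fetch markets for event ${eventTicker}: ${String(cause)}`);
    this.name = 'EventFetchError';
  }
}

// Run an async call and rethrow any failure wrapped with the event ticker,
// instead of swallowing it and returning an empty result.
async function fetchWithContext<T>(
  eventTicker: string,
  call: () => Promise<T>,
): Promise<T> {
  try {
    return await call();
  } catch (err) {
    throw new EventFetchError(eventTicker, err);
  }
}
```

Callers can then catch `EventFetchError` specifically and decide whether to abort discovery or report the backtest as partial.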
```ts
let inner = `SELECT event_ticker, MAX(fetched_at) as max_fetched
  FROM octagon_reports WHERE variant_used = 'events-api' AND outcome_probabilities_json IS NOT NULL`;
const params: Record<string, string> = {};
if (category) {
  inner += ' AND LOWER(series_category) LIKE $cat';
  params.$cat = `%${category.toLowerCase()}%`;
}
inner += ' GROUP BY event_ticker';

const query = `SELECT r.event_ticker, r.series_category, r.confidence_score, r.outcome_probabilities_json
  FROM octagon_reports r
  INNER JOIN (${inner}) latest ON r.event_ticker = latest.event_ticker AND r.fetched_at = latest.max_fetched
  WHERE r.variant_used = 'events-api' AND r.outcome_probabilities_json IS NOT NULL`;
```
Make the “latest row per event” selection unique.
Lines 32-44 only key latest by event_ticker + MAX(fetched_at). octagon_reports is still report-level (report_id, ticker), so sibling rows that share the same latest timestamp all match the join and the same outcome_probabilities_json gets scanned multiple times. That inflates total_scanned and can duplicate ranked markets.
Use a deterministic tie-breaker (report_id) or rank rows and keep exactly one row per event_ticker.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/commands/search-edge.ts` around lines 32 - 44, The current inner subquery
(variable inner) only picks MAX(fetched_at) per event_ticker which can return
multiple sibling rows with the same timestamp; change the selection to be
deterministic by tie-breaking on report_id or by using a windowed rank to keep
exactly one row per event_ticker. For example, either include MAX(report_id)
alongside MAX(fetched_at) in the aggregation and join on both fetched_at and
report_id, or replace the inner aggregation with a query that uses ROW_NUMBER()
OVER (PARTITION BY event_ticker ORDER BY fetched_at DESC, report_id DESC) and
then join/filter where row_number = 1 so the final query (query variable)
returns exactly one latest row per octagon_reports.event_ticker.
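The same selection rule can be expressed in plain TypeScript to make the intended semantics concrete. The `ReportRow` shape and `latestPerEvent` helper below are hypothetical, but the ordering mirrors the suggested `ROW_NUMBER() OVER (PARTITION BY event_ticker ORDER BY fetched_at DESC, report_id DESC)`:

```typescript
interface ReportRow {
  report_id: number;
  event_ticker: string;
  fetched_at: string; // ISO-8601 timestamp, so string comparison matches time order
}

// Keep exactly one row per event_ticker: the latest fetched_at,
// tie-broken deterministically by the highest report_id.
function latestPerEvent(rows: ReportRow[]): Map<string, ReportRow> {
  const latest = new Map<string, ReportRow>();
  for (const row of rows) {
    const cur = latest.get(row.event_ticker);
    if (
      !cur ||
      row.fetched_at > cur.fetched_at ||
      (row.fetched_at === cur.fetched_at && row.report_id > cur.report_id)
    ) {
      latest.set(row.event_ticker, row);
    }
  }
  return latest;
}
```

With a tie-breaker on the primary key, two reports sharing the same `fetched_at` can never both survive, which is exactly what the SQL join currently fails to guarantee.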
```diff
 finalizeAnswer(text: string): AnswerBoxComponent {
   if (!this.activeAnswer) {
-    this.addChild(new AnswerBoxComponent(text));
-    return;
+    const box = new AnswerBoxComponent(text);
+    this.addChild(box);
+    return box;
   }
   this.activeAnswer.setText(text);
   const box = this.activeAnswer;
   this.activeAnswer = null;
   return box;
```
activeAnswer reuse path is not being activated
On Line 257, a new box is returned without setting this.activeAnswer, so the update path on Line 262-Line 265 won’t run in normal flow. In the async follow-up flow (see src/cli.ts Line 507-Line 521), this can produce extra answer boxes instead of updating the existing one.
🛠️ Proposed fix

```diff
diff --git a/src/components/chat-log.ts b/src/components/chat-log.ts
@@
   addQuery(query: string) {
+    this.activeAnswer = null;
     this.addChild(new UserQueryComponent(query));
   }
@@
   finalizeAnswer(text: string): AnswerBoxComponent {
     if (!this.activeAnswer) {
       const box = new AnswerBoxComponent(text);
       this.addChild(box);
+      this.activeAnswer = box;
       return box;
     }
     this.activeAnswer.setText(text);
     const box = this.activeAnswer;
     this.activeAnswer = null;
     return box;
   }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/chat-log.ts` around lines 256 - 265, The finalizeAnswer method
creates a new AnswerBoxComponent when this.activeAnswer is falsy but never
assigns it to this.activeAnswer, so later async follow-ups won’t reuse the same
box; update finalizeAnswer to set this.activeAnswer = box immediately after
creating and addChild(box) so the reuse path (this.activeAnswer.setText(...) /
clearing this.activeAnswer) will execute correctly for subsequent updates; keep
the existing behavior of setText and clearing activeAnswer in the else path.
```ts
// Use larger braille block characters for a more visible spinner (ora's "dots2" style)
(this.loader as any).frames = ['⣾', '⣽', '⣻', '⢿', '⡿', '⣟', '⣯', '⣷'];
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
set -euo pipefail
# 1) Confirm installed/declared version
rg -n '"@mariozechner/pi-tui"' package.json bun.lock* || true
# 2) If dependency sources are present, verify Loader typing/API surface for "frames"
if [ -d node_modules/@mariozechner/pi-tui ]; then
  rg -n "class Loader|interface Loader|frames" node_modules/@mariozechner/pi-tui -g "*.d.ts" -g "*.ts"
else
  echo "node_modules not present in sandbox; verify Loader 'frames' support in upstream docs for the declared version."
fi
```

Repository: OctagonAI/kalshi-trading-bot-cli
Length of output: 628

🏁 Script executed:

```bash
# Read the working-indicator.ts file to understand Loader context
head -100 src/components/working-indicator.ts
```

Repository: OctagonAI/kalshi-trading-bot-cli
Length of output: 2334
🌐 Web query:
@mariozechner/pi-tui Loader frames property API documentation
💡 Result:
The @mariozechner/pi-tui Loader component is an animated loading spinner for terminal UIs. Official documentation from the npm package page and GitHub README provides the primary API details. ## Constructor new Loader( tui, // TUI instance for render updates (s) => chalk.cyan(s), // spinner color function (s) => chalk.gray(s), // message color function "Loading..." // initial message (default: "Loading...") ); ## Key Methods - loader.start - Begin animation - loader.setMessage("Still loading...") - Update message text - loader.stop - Stop animation ## CancellableLoader Extension Extends Loader with Escape key abort and AbortSignal: new CancellableLoader( tui, (s) => chalk.cyan(s), (s) => chalk.gray(s), "Working..." ); loader.onAbort = => done(null); // Escape press callback doAsyncWork(loader.signal).then(done); // AbortSignal Properties: signal: AbortSignal, aborted: boolean, onAbort?: => void No explicit "frames" property is documented in official sources. The spinner likely uses a built-in frame sequence cycled internally during start. Source code would confirm customization options, located in the badlogic/pi-mono repo at packages/tui (not directly surfaced in search results).
Citations:
- 1: https://www.npmjs.com/package/@mariozechner/pi-tui
- 2: https://github.com/badlogic/pi-mono
- 3: https://github.com/badlogic/pi-mono/blob/main/packages/tui/README.md
Remove the any cast for spinner frame customization.
Line 65 bypasses strict typing and assumes a frames API on Loader that is not documented in the official API. The Loader only exposes start(), setMessage(), and stop() methods; direct property assignment via any creates a brittle dependency on internal implementation details that may change across versions without warning.
Suggested typed pattern

```diff
-(this.loader as any).frames = ['⣾', '⣽', '⣻', '⢿', '⡿', '⣟', '⣯', '⣷'];
+const loaderWithFrames = this.loader as Loader & { frames?: string[] };
+if ('frames' in loaderWithFrames) {
+  loaderWithFrames.frames = ['⣾', '⣽', '⣻', '⢿', '⡿', '⣟', '⣯', '⣷'];
+}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/components/working-indicator.ts` around lines 64 - 65, The code bypasses
typing by using (this.loader as any).frames; instead introduce a proper typed
augmentation for the spinner and remove the any cast: declare an interface
(e.g., SpinnerWithFrames { frames: string[] }) and update the WorkingIndicator's
loader property type to Loader & SpinnerWithFrames (or change its declaration to
that type), then assign the frames array directly to this.loader.frames; this
keeps strong typing while allowing frame customization without relying on an any
cast or undocumented runtime assumptions.
```ts
// Fire-and-forget: prefetch Octagon events in background (only for the real runtime DB)
if (dbPath === DEFAULT_DB_PATH) {
  const db = _db;
  prefetchOctagonEvents(db).catch(() => {});
}
```
Avoid fully silent failure in background prefetch.
Line 33 drops every prefetch error (catch(() => {})). This makes cache bootstrap failures invisible and hard to diagnose. Please keep a retrievable error state (or surface it through an existing diagnostics path) instead of swallowing it entirely.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/db/index.ts` around lines 30 - 34, The background prefetch currently
swallows all errors (catch(() => {})) for prefetchOctagonEvents called when
dbPath === DEFAULT_DB_PATH which hides failures; change the catch to capture the
thrown error and surface it — e.g., log using your existing logger
(processLogger.error) and store the error in a retrievable place such as an
exported variable (e.g., octagonPrefetchError), a diagnostics API, or attach it
to the db instance (e.g., _db.prefetchError) so callers or diagnostics can
observe the failure; ensure you still run prefetchOctagonEvents(db).catch(err =>
{ processLogger.error('Octagon prefetch failed', err); /* store err in
octagonPrefetchError or _db.prefetchError or report to diagnostics */ }).
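A minimal sketch of the retrievable-error-state idea, assuming a module-level slot is acceptable; `runBackgroundPrefetch` and `getPrefetchError` are hypothetical names standing in for the real prefetch wiring:

```typescript
// Hypothetical sketch: background failures are stored instead of swallowed,
// so a diagnostics path can surface the last prefetch error later.
let prefetchError: Error | null = null;

function runBackgroundPrefetch(task: () => Promise<void>): void {
  prefetchError = null;
  task().catch((err) => {
    prefetchError = err instanceof Error ? err : new Error(String(err));
    // A real implementation might also log here, e.g. via an existing logger.
  });
}

function getPrefetchError(): Error | null {
  return prefetchError;
}
```

The call site stays fire-and-forget, but the failure is no longer invisible: a `doctor`-style command or startup banner can check `getPrefetchError()`.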
```ts
if (!reportCols.some((c) => c.name === 'has_history')) {
  db.exec(`ALTER TABLE octagon_reports ADD COLUMN has_history INTEGER DEFAULT 0`);
}
if (!reportCols.some((c) => c.name === 'mutually_exclusive')) {
  db.exec(`ALTER TABLE octagon_reports ADD COLUMN mutually_exclusive INTEGER DEFAULT 0`);
}
if (!reportCols.some((c) => c.name === 'series_category')) {
  db.exec(`ALTER TABLE octagon_reports ADD COLUMN series_category TEXT`);
}
if (!reportCols.some((c) => c.name === 'confidence_score')) {
  db.exec(`ALTER TABLE octagon_reports ADD COLUMN confidence_score REAL`);
}
if (!reportCols.some((c) => c.name === 'outcome_probabilities_json')) {
  db.exec(`ALTER TABLE octagon_reports ADD COLUMN outcome_probabilities_json TEXT`);
}

const historyCols = db.query(`PRAGMA table_info(octagon_history)`).all() as Array<{ name: string }>;
if (!historyCols.some((c) => c.name === 'outcome_probabilities_json')) {
  db.exec(`ALTER TABLE octagon_history ADD COLUMN outcome_probabilities_json TEXT`);
}
```
has_history can drift away from the actual cache.
src/backtest/fetcher.ts Lines 114-177 inserts rows into octagon_history, but the only provided writer for octagon_reports.has_history is src/scan/octagon-prefetch.ts Lines 52-106. That means history can be cached while this flag stays 0, so discovery paths that rely on has_history will miss backtestable events. Either derive this state from octagon_history, or backfill/synchronize the flag whenever history rows are written.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/db/schema.ts` around lines 204 - 223, The has_history flag in
octagon_reports can become stale because octagon_history rows are inserted in
fetcher.ts (fetcher functions around lines 114-177) but only octagon-prefetch.ts
(writer in its 52-106 region) updates octagon_reports.has_history; to fix,
ensure the flag is synchronized whenever history rows are written by either (A)
updating the insert path in fetcher.ts to also UPDATE octagon_reports SET
has_history=1 for the corresponding report_id after inserting into
octagon_history, or (B) add a DB-level AFTER INSERT trigger on octagon_history
that sets octagon_reports.has_history=1, and ensure any deletions of history
rows also clear the flag or recompute it from octagon_history; pick one approach
and apply it to the functions that insert/delete history rows so the has_history
column always reflects actual octagon_history content.
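Option (B) from the prompt, a DB-level trigger pair, might look like the sketch below. The trigger names are invented and the SQL is untested against the real schema, so treat it as a starting point rather than a drop-in migration:

```typescript
// Hypothetical trigger DDL keeping octagon_reports.has_history in sync with
// octagon_history: set the flag on insert, recompute it on delete. Table and
// column names are taken from the migrations above; verify before applying.
const syncHasHistoryTriggers = `
CREATE TRIGGER IF NOT EXISTS octagon_history_after_insert
AFTER INSERT ON octagon_history
BEGIN
  UPDATE octagon_reports SET has_history = 1
  WHERE event_ticker = NEW.event_ticker;
END;

CREATE TRIGGER IF NOT EXISTS octagon_history_after_delete
AFTER DELETE ON octagon_history
BEGIN
  UPDATE octagon_reports SET has_history =
    EXISTS (SELECT 1 FROM octagon_history WHERE event_ticker = OLD.event_ticker)
  WHERE event_ticker = OLD.event_ticker;
END;
`;
```

In SQLite, `EXISTS` evaluates to 0 or 1 in expression context, so the delete trigger recomputes the flag from remaining history rows rather than blindly clearing it.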
- /search: shows 'edge' as first option with examples (--min-edge 30, --category crypto, --limit 50) when typing after 'edge'
- /backtest: shows all flags (--days, --resolved, --unresolved, --category, --min-edge, --export) with default values
- /help: added 'backtest' to help topic completions
Actionable comments posted: 2
♻️ Duplicate comments (1)
src/commands/search-edge.ts (1)
32-44: ⚠️ Potential issue | 🟠 Major

Make latest-row selection deterministic per `event_ticker`.

Line 32-Line 44 still allows duplicate matches when multiple reports share the same `fetched_at`, which can inflate `total_scanned` and duplicate ranked markets.

Suggested fix (ROW_NUMBER tie-break)

```diff
-  let inner = `SELECT event_ticker, MAX(fetched_at) as max_fetched
-    FROM octagon_reports WHERE variant_used = 'events-api' AND outcome_probabilities_json IS NOT NULL`;
+  let inner = `SELECT event_ticker, fetched_at, report_id,
+      ROW_NUMBER() OVER (
+        PARTITION BY event_ticker
+        ORDER BY fetched_at DESC, report_id DESC
+      ) AS rn
+    FROM octagon_reports
+    WHERE variant_used = 'events-api' AND outcome_probabilities_json IS NOT NULL`;
@@
-  inner += ' GROUP BY event_ticker';
+  // keep only one deterministic latest row per event_ticker
@@
-  const query = `SELECT r.event_ticker, r.series_category, r.confidence_score, r.outcome_probabilities_json
+  const query = `SELECT r.event_ticker, r.series_category, r.confidence_score, r.outcome_probabilities_json
     FROM octagon_reports r
-    INNER JOIN (${inner}) latest ON r.event_ticker = latest.event_ticker AND r.fetched_at = latest.max_fetched
+    INNER JOIN (${inner}) latest
+      ON r.event_ticker = latest.event_ticker
+      AND r.fetched_at = latest.fetched_at
+      AND r.report_id = latest.report_id
     WHERE r.variant_used = 'events-api' AND r.outcome_probabilities_json IS NOT NULL`;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/commands/search-edge.ts` around lines 32 - 44, The current inner query building (variable "inner") selects MAX(fetched_at) per event_ticker which can still return multiple rows when multiple reports share the same fetched_at; update the query to deterministically pick a single latest row per event_ticker using a window function (e.g., ROW_NUMBER() OVER (PARTITION BY event_ticker ORDER BY fetched_at DESC, <tie-breaker column> ASC) and filter for row_number = 1) so joins on octagon_reports pick exactly one latest report; locate the query construction around variables "inner" and "query" and replace the GROUP BY/MAX approach with a subquery that ranks rows per event_ticker and selects only the top-ranked row (use a stable tie-breaker column such as id or created_at).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/commands/search-edge.ts`:
- Around line 57-67: The loop over outcomes may encounter nulls/primitives and
non-string tickers, causing property access and later padEnd calls to throw;
update the parsing/iteration in the outcomes handling (the variable outcomes and
the for (const o of outcomes) loop) to first check that each o is a non-null
object (e.g., typeof o === 'object' && o !== null), that o.market_ticker is a
string, and that o.model_probability and o.market_probability are numbers before
accessing their properties or using padEnd; apply the same runtime type-guard
pattern to the other similar block referenced around the model/market
probability checks so only validated items are processed.
- Around line 29-30: Normalize the incoming opts.limit to a positive integer
before using it: replace the raw assignment at const limit = opts?.limit ?? 20;
with logic that parses and clamps the value (e.g., use Number(opts?.limit),
guard NaN, apply Math.floor, and ensure a minimum of 1 with a default of 20) so
slice(0, limit) behaves as a true "take N" rather than misbehaving for
zeros/negatives/non-integers; apply the same normalization to the other usage
referenced around the second occurrence (the other limit usage at ~line 92) so
both places use the validated positive-integer limit variable.
---
Duplicate comments:
In `@src/commands/search-edge.ts`:
- Around line 32-44: The current inner query building (variable "inner") selects
MAX(fetched_at) per event_ticker which can still return multiple rows when
multiple reports share the same fetched_at; update the query to
deterministically pick a single latest row per event_ticker using a window
function (e.g., ROW_NUMBER() OVER (PARTITION BY event_ticker ORDER BY fetched_at
DESC, <tie-breaker column> ASC) and filter for row_number = 1) so joins on
octagon_reports pick exactly one latest report; locate the query construction
around variables "inner" and "query" and replace the GROUP BY/MAX approach with
a subquery that ranks rows per event_ticker and selects only the top-ranked row
(use a stable tie-breaker column such as id or created_at).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Run ID: 60ec2a7f-daf7-47dd-a24d-73747f1cc971
📒 Files selected for processing (1)
src/commands/search-edge.ts
```ts
const limit = opts?.limit ?? 20;
const category = opts?.category;
```
Normalize limit to a positive integer before slicing.
A negative/zero/non-integer limit leads to confusing slice(0, limit) behavior (e.g., dropping tail rows instead of limiting).
Suggested fix

```diff
-  const limit = opts?.limit ?? 20;
+  const rawLimit = opts?.limit ?? 20;
+  const limit = Number.isFinite(rawLimit) ? Math.max(1, Math.trunc(rawLimit)) : 20;
```

Also applies to: 92-92
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/commands/search-edge.ts` around lines 29 - 30, Normalize the incoming
opts.limit to a positive integer before using it: replace the raw assignment at
const limit = opts?.limit ?? 20; with logic that parses and clamps the value
(e.g., use Number(opts?.limit), guard NaN, apply Math.floor, and ensure a
minimum of 1 with a default of 20) so slice(0, limit) behaves as a true "take N"
rather than misbehaving for zeros/negatives/non-integers; apply the same
normalization to the other usage referenced around the second occurrence (the
other limit usage at ~line 92) so both places use the validated positive-integer
limit variable.
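The normalization both occurrences need can live in one hypothetical helper (a sketch following the defaults described above: coerce to a number, floor, enforce a minimum of 1, fall back to 20):

```typescript
// Hypothetical helper: turn any raw limit input into a safe positive integer
// so slice(0, limit) behaves as a true "take N".
function normalizeLimit(raw: unknown, fallback = 20): number {
  const n = Number(raw);
  if (!Number.isFinite(n)) return fallback;
  return Math.max(1, Math.floor(n));
}
```

Both call sites would then use `normalizeLimit(opts?.limit)` instead of the raw value.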
…bability heuristics

- Added close_time column to octagon_reports schema
- Prefetch stores close_time from events API and backfills existing rows
- Edge scanner filters WHERE close_time > now to exclude resolved events
- Removed the janky <1% / >99% probability filter — replaced by proper close_time check. Only keeps market_probability > 0 for illiquid filter.
- 259 resolved events now properly excluded from scan results
♻️ Duplicate comments (3)
src/commands/search-edge.ts (3)
59-70: ⚠️ Potential issue | 🟠 Major

Guard parsed outcome element shape before property access.

Line 68 assumes every parsed element is an object with numeric/string fields; `null` or primitives can throw here and later break rendering (e.g., `padEnd`).

Suggested fix (parse as unknown + runtime guards)

```diff
-    let outcomes: Array<{ market_ticker: string; model_probability: number; market_probability: number }>;
+    let outcomes: unknown;
@@
-    if (!Array.isArray(outcomes)) continue;
+    if (!Array.isArray(outcomes)) continue;
@@
-    for (const o of outcomes) {
-      if (typeof o.model_probability !== 'number' || typeof o.market_probability !== 'number') continue;
-      if (!o.market_ticker) continue;
+    for (const o of outcomes) {
+      if (!o || typeof o !== 'object') continue;
+      const rec = o as {
+        market_ticker?: unknown;
+        model_probability?: unknown;
+        market_probability?: unknown;
+      };
+      if (typeof rec.model_probability !== 'number' || typeof rec.market_probability !== 'number') continue;
+      if (typeof rec.market_ticker !== 'string' || rec.market_ticker.length === 0) continue;
@@
-      const edgePp = Math.round((o.model_probability - o.market_probability) * 10) / 10;
+      const edgePp = Math.round((rec.model_probability - rec.market_probability) * 10) / 10;
@@
-      market_ticker: o.market_ticker,
+      market_ticker: rec.market_ticker,
@@
-      model_prob: o.model_probability,
-      market_prob: o.market_probability,
+      model_prob: rec.model_probability,
+      market_prob: rec.market_probability,
```

As per coding guidelines: "Prefer strict typing and avoid `any` type".

Also applies to: 77-77
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/commands/search-edge.ts` around lines 59 - 70, The parsed JSON from row.outcome_probabilities_json should be treated as unknown and each element narrowed before accessing fields: change handling around outcomes and the for (const o of outcomes) loop to first verify Array.isArray(outcomes), then for each o assert typeof o === 'object' && o !== null and that 'model_probability' and 'market_probability' and 'market_ticker' exist (use the in operator) before reading their values or using methods like padEnd; also validate the numeric/string types (typeof ... === 'number' / 'string') and skip elements that fail these guards so you avoid runtime exceptions when elements are null/primitives.
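The narrowing steps in the prompt can be collected into a reusable type guard. `Outcome` and `isOutcome` below are hypothetical names for illustration; the field names match the JSON shape discussed above:

```typescript
interface Outcome {
  market_ticker: string;
  model_probability: number;
  market_probability: number;
}

// Accept only non-null objects with a string ticker and numeric probabilities;
// everything else (null, primitives, malformed objects) is filtered out.
function isOutcome(o: unknown): o is Outcome {
  if (typeof o !== 'object' || o === null) return false;
  const rec = o as Record<string, unknown>;
  return (
    typeof rec.market_ticker === 'string' &&
    typeof rec.model_probability === 'number' &&
    typeof rec.market_probability === 'number'
  );
}
```

Parsing then becomes `JSON.parse(...) as unknown`, followed by `Array.isArray` and `.filter(isOutcome)`, after which the elements are safely typed.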
33-46: ⚠️ Potential issue | 🟠 Major

Fix non-deterministic "latest row per event" selection.

At Line 45, joining only on `event_ticker + max_fetched` can return sibling rows when multiple reports share the same `fetched_at`, which duplicates markets and inflates `total_scanned`.

Suggested fix (deterministic latest row via window rank)

```diff
-  let inner = `SELECT event_ticker, MAX(fetched_at) as max_fetched
-    FROM octagon_reports WHERE variant_used = 'events-api' AND outcome_probabilities_json IS NOT NULL
-    AND (close_time IS NULL OR close_time > $now)`;
+  let inner = `SELECT
+      event_ticker,
+      series_category,
+      confidence_score,
+      outcome_probabilities_json,
+      ROW_NUMBER() OVER (
+        PARTITION BY event_ticker
+        ORDER BY fetched_at DESC, report_id DESC
+      ) AS rn
+    FROM octagon_reports
+    WHERE variant_used = 'events-api'
+      AND outcome_probabilities_json IS NOT NULL
+      AND (close_time IS NULL OR close_time > $now)`;
@@
-  inner += ' GROUP BY event_ticker';
-
-  const query = `SELECT r.event_ticker, r.series_category, r.confidence_score, r.outcome_probabilities_json
-    FROM octagon_reports r
-    INNER JOIN (${inner}) latest ON r.event_ticker = latest.event_ticker AND r.fetched_at = latest.max_fetched
-    WHERE r.variant_used = 'events-api' AND r.outcome_probabilities_json IS NOT NULL`;
+  const query = `SELECT event_ticker, series_category, confidence_score, outcome_probabilities_json
+    FROM (${inner}) latest
+    WHERE latest.rn = 1`;
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/commands/search-edge.ts` around lines 33 - 46, The current inner/join logic using inner, latest, max_fetched and joining on event_ticker + fetched_at can return multiple sibling rows when fetched_at ties; replace the inner/join approach with a deterministic latest-row subquery using a window function (e.g., ROW_NUMBER() OVER (PARTITION BY event_ticker ORDER BY fetched_at DESC, <tie_breaker_column> DESC) ) to pick row_number = 1 from octagon_reports (still filtering variant_used='events-api' and outcome_probabilities_json IS NOT NULL and optional category) and then join that single-row-per-event result to the outer query (or use it directly) so event_ticker, series_category, confidence_score, outcome_probabilities_json are uniquely selected; ensure you add a stable tie-breaker column (primary key like id or report_id) in the ORDER BY to guarantee determinism.
29-30: ⚠️ Potential issue | 🟡 Minor

Normalize `limit` before slicing.

Line 93 currently uses a raw `limit`; zero/negative/non-integer values cause confusing truncation behavior instead of "take N".

Suggested fix

```diff
-  const limit = opts?.limit ?? 20;
+  const rawLimit = Number(opts?.limit ?? 20);
+  const limit = Number.isFinite(rawLimit) ? Math.max(1, Math.trunc(rawLimit)) : 20;
```

Also applies to: 93-93
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/commands/search-edge.ts` around lines 29 - 30, Normalize the limit value before using it to slice results: coerce opts?.limit (the limit variable) to a safe integer (e.g., Number -> Math.floor or parseInt), clamp it to a minimum of 0 and a sensible maximum (e.g., results.length or a configured max) and use that normalized value (e.g., normalizedLimit) for the slice operation instead of the raw limit so zero/negative/non-integer inputs behave correctly.
🧹 Nitpick comments (1)
src/scan/octagon-prefetch.ts (1)
116-140: Consider moving `insertEdge` inside the transaction for atomicity.

The `insertEdge` call is outside the transaction block (lines 76-112). If it fails with a non-UNIQUE error after the report is committed, the database will have a report without its corresponding edge history record. If edge history is essential for data integrity, consider wrapping both operations in a single transaction.

♻️ Proposed refactor to include edge insertion in transaction

```diff
   // Insert report and set metadata in a single transaction
   db.transaction(() => {
     insertReport(db, {
       // ... existing code ...
     });
     db.prepare(
       // ... existing UPDATE ...
     ).run({ /* ... */ });
-  })();
-  // Also persist to edge_history
-  const edge = modelProb - marketProb;
-  try {
+    // Also persist to edge_history
+    const edge = modelProb - marketProb;
     insertEdge(db, {
       ticker: event.event_ticker,
       event_ticker: event.event_ticker,
       timestamp: capturedAt,
       model_prob: modelProb,
       market_prob: marketProb,
       edge,
       octagon_report_id: reportId,
       drivers_json: null,
       sources_json: null,
       catalysts_json: null,
       cache_hit: 1,
       cache_miss: 0,
       confidence: classifyConfidence(Math.abs(edge)),
     });
-  } catch (err) {
-    // Only swallow UNIQUE constraint violations (duplicate ticker+timestamp)
-    const msg = err instanceof Error ? err.message : String(err);
-    if (/UNIQUE constraint failed/i.test(msg)) {
-      // Expected — edge already exists for this ticker+timestamp
-    } else {
-      throw err;
-    }
-  }
+  })();
```

Note: If you keep the try-catch for UNIQUE constraints, you can still handle it inside the transaction—the transaction will only roll back on unhandled exceptions.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/scan/octagon-prefetch.ts` around lines 116 - 140, Move the insertEdge call into the same database transaction that creates/commits the octagon report so both the report and its edge history are atomic: locate the transaction block around the report creation in src/scan/octagon-prefetch.ts and perform insertEdge(...) inside that transaction rather than after it; keep the UNIQUE-constraint handling (the current try/catch around insertEdge) but rethrow non-UNIQUE errors so the transaction will roll back on unexpected failures and prevent a committed report without its edge record.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Duplicate comments:
In `@src/commands/search-edge.ts`:
- Around line 59-70: The parsed JSON from row.outcome_probabilities_json should
be treated as unknown and each element narrowed before accessing fields: change
handling around outcomes and the for (const o of outcomes) loop to first verify
Array.isArray(outcomes), then for each o assert typeof o === 'object' && o !==
null and that 'model_probability' and 'market_probability' and 'market_ticker'
exist (use the in operator) before reading their values or using methods like
padEnd; also validate the numeric/string types (typeof ... === 'number' /
'string') and skip elements that fail these guards so you avoid runtime
exceptions when elements are null/primitives.
- Around line 33-46: The current inner/join logic using inner, latest,
max_fetched and joining on event_ticker + fetched_at can return multiple sibling
rows when fetched_at ties; replace the inner/join approach with a deterministic
latest-row subquery using a window function (e.g., ROW_NUMBER() OVER (PARTITION
BY event_ticker ORDER BY fetched_at DESC, <tie_breaker_column> DESC) ) to pick
row_number = 1 from octagon_reports (still filtering variant_used='events-api'
and outcome_probabilities_json IS NOT NULL and optional category) and then join
that single-row-per-event result to the outer query (or use it directly) so
event_ticker, series_category, confidence_score, outcome_probabilities_json are
uniquely selected; ensure you add a stable tie-breaker column (primary key like
id or report_id) in the ORDER BY to guarantee determinism.
- Around line 29-30: Normalize the limit value before using it to slice results:
coerce opts?.limit (the limit variable) to a safe integer (e.g., Number ->
Math.floor or parseInt), clamp it to a minimum of 0 and a sensible maximum
(e.g., results.length or a configured max) and use that normalized value (e.g.,
normalizedLimit) for the slice operation instead of the raw limit so
zero/negative/non-integer inputs behave correctly.
---
Nitpick comments:
In `@src/scan/octagon-prefetch.ts`:
- Around lines 116-140: Move the insertEdge call into the same database transaction that creates and commits the octagon report, so the report and its edge history are atomic. Locate the transaction block around report creation in src/scan/octagon-prefetch.ts and perform insertEdge(...) inside that transaction rather than after it. Keep the UNIQUE-constraint handling (the current try/catch around insertEdge), but rethrow non-UNIQUE errors so the transaction rolls back on unexpected failures, preventing a committed report without its edge record.
📒 Files selected for processing (3)
- src/commands/search-edge.ts
- src/db/schema.ts
- src/scan/octagon-prefetch.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- src/db/schema.ts
…Probs
Both now exclude events where close_time is in the past, preventing stale data for closed or expired markets from being served. Backtest intentionally keeps closed markets for historical performance measurement.
…esting
- fetchEventHistory always sends exclude_empty_model=true to filter snapshots with null model_probability
- Pass the --days value to the history API so the server excludes markets that closed before the lookback window
- discoverSettledMarkets filters events by close_time within the lookback period to avoid fetching history for old resolved events
- Removed legacy from/to date filtering from discovery
- Always send exclude_empty_model=true to the history API to filter snapshots with null model_probability
- Keep days filtering at the discovery level only (not in the history fetch) so the SQLite cache is reusable across different --days values
- discoverSettledMarkets filters events by close_time within the lookback window
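The cache-reuse rationale above implies the history fetch builds its query without a days parameter. A hedged sketch of what that request construction might look like (the endpoint path is an assumption patterned on the `/v1/prediction-markets/events` route mentioned earlier; it is not confirmed by the commits):

```typescript
// Build the history request URL: exclude_empty_model is always set, and days
// is deliberately omitted so one cached response serves every --days value.
// Lookback filtering happens at the discovery level instead.
function buildHistoryQuery(eventTicker: string): string {
  const params = new URLSearchParams({ exclude_empty_model: "true" });
  return `/v1/prediction-markets/events/${encodeURIComponent(eventTicker)}/history?${params}`;
}
```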
…omplete analysis)
Problem 1: Unresolved markets without a snapshot from the lookback period were falling back to the current price as market_then, making brier(current_price, current_price) ≈ 0 and artificially deflating Brier(Market). These markets are now skipped instead.
Problem 2: Prefetch skipped all events with model_probability=0, even those with valid per-market data in outcome_probabilities. It now skips only when model=0 AND no outcome_probabilities exist, recovering 64 events and 3,536 markets.
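The Problem 2 fix reduces to a small predicate; the field names follow the commit message, and the function name is hypothetical:

```typescript
interface EventRow {
  model_probability: number;
  outcome_probabilities: unknown[] | null;
}

// Skip an event only when the event-level model prob is 0 AND there is no
// per-market outcome data to fall back on. Previously model_probability === 0
// alone triggered the skip, dropping events with usable per-market probs.
function shouldSkipEvent(e: EventRow): boolean {
  const hasOutcomes = Array.isArray(e.outcome_probabilities) && e.outcome_probabilities.length > 0;
  return e.model_probability === 0 && !hasOutcomes;
}
```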