BlockStory: Institutional-Grade Intelligence Gateway

BlockStory Logo

A unified intelligence gateway bridging multichain historical data with real-time Nansen AI market insights. Powered by the Corbits Proxy, x402 pay-per-request, and Chainlink CRE for transparent, verifiable, and institutional-grade onchain analytics across Base, Ethereum, and Solana.


🎥 Video Demo

BlockStory Demo Video

Click the image above to watch the full demo on YouTube.


πŸ† Hackathon Submission

Submitted to the Chainlink Convergence 2026 Hackathon.


📖 Overview

The Problem: Subscription Fatigue & Data Opacity

Traditional blockchain analytics and historical data providers operate on rigid, expensive monthly subscription models. This creates a barrier for developers, institutional researchers, and individual traders who only need specific data points at irregular intervals. Furthermore, data fetched via traditional RPCs lacks a built-in "proof of source" or "proof of cost" layer, making it difficult to verify the integrity of historical records in a multi-stakeholder environment.

The Solution: BlockStory

Solution Diagram

**BlockStory** disrupts the status quo by introducing a **Pay-Per-Request Intelligence Layer**. By combining AI-driven natural language processing with decentralized attestation frameworks, BlockStory provides:

1. **Dynamic Cost Control:** Users only pay for what they use via **x402 micropayments**.
2. **Verifiable History:** Every query and result is hashed and attested onchain via **Chainlink CRE**.
3. **Unified Intelligence:** Access both raw RPC historical data and high-level **Nansen AI Market Insights** through a single conversational interface.
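The pay-per-request model above can be sketched as a simple HTTP 402 gate. This is a hypothetical illustration, not the real x402 SDK API: the names (`handlePaidQuery`, the challenge fields) are invented here, and in the actual flow the facilitator verifies the USDC micropayment before the request is served.

```typescript
// Hypothetical sketch of x402-style pay-per-request gating. A request without
// proof of payment receives an HTTP 402 challenge describing the price; a
// request carrying a payment header is served. Names are illustrative only.

interface PaymentChallenge {
  status: 402;
  price: string;        // human-readable price, e.g. "$0.01"
  payTo: string;        // receiver address (cf. X402_RECEIVER_ADDRESS)
  facilitator: string;  // facilitator URL used to verify/settle the payment
}

interface ServedResponse<T> {
  status: 200;
  body: T;
}

type GatewayResponse<T> = PaymentChallenge | ServedResponse<T>;

function handlePaidQuery<T>(
  paymentHeader: string | undefined,
  serve: () => T,
): GatewayResponse<T> {
  if (!paymentHeader) {
    // No payment attached: challenge the client with price + receiver details.
    return {
      status: 402,
      price: "$0.01",
      payTo: "0xRECEIVER",
      facilitator: "https://x402.org/facilitator",
    };
  }
  // In the real flow the facilitator would verify the micropayment here
  // before serving; this sketch accepts any non-empty header.
  return { status: 200, body: serve() };
}
```

The key design point is that pricing metadata travels in the 402 response itself, so clients can discover the cost per query without any subscription handshake.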

🔗 Chainlink Integration: How We Use It

We leverage Chainlink CRE (Chainlink Runtime Environment) as our core verifiable orchestration layer. This ensures that every piece of data served to the user is not just "displayed," but cryptographically linked to a source through an onchain attestation.

πŸ— Where is Chainlink in the Code?

| Functional Part | Key File | Description |
| --- | --- | --- |
| CRE Workflow | `cre/history-fetcher/workflow.yaml` | The core Chainlink CRE workflow definition for fetching, hashing, and reporting data. |
| Attestation Orchestrator | `server/src/index.ts#L655` | The server-side trigger that executes the CRE local simulation for every paid history query. |
| Onchain Persistence | `contracts/QueryRegistry.sol` | The Solidity registry that receives reports from the CRE forwarder and persists attestations. |
| Client Verification | `frontend/src/lib/api.ts` | Frontend logic that queries attestation state from both the server and the blockchain. |

📜 Deployed Contracts (Base Sepolia)


🏗 Architecture & Flow

🌊 System Workflow

sequenceDiagram
    participant U as User
    participant F as Frontend
    participant AI as Gemini AI
    participant S as Server
    participant X as x402
    participant B as Chain/Nansen
    participant C as Chainlink CRE

    U->>F: NL Query ("Fetch block 38581727")
    F->>AI: Resolve Intent (Plan)
    AI-->>F: {queryType: "block"}
    F->>S: POST /query + Payment
    S->>X: Verify USDC Micropayment
    S->>B: Fetch Historical Data
    S->>C: Submit Query/Result Hashes
    C->>B: Commit Attestation (QueryRegistry)
    S-->>F: Response + Verification Receipt
    F-->>U: Visual Signal Card (Verified)

🧠 Duplicate Request Cost Control

POST /chat includes message-level dedupe caching to avoid repeat x402 spend for identical prompts:

  • Cache key: Normalized user message hash.
  • TTL: 300 seconds.
  • Concurrent identical requests are coalesced into a single paid query.
  • Response includes cache.hit metadata.
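The caching behavior above can be sketched as a small TTL map whose values are promises, so that concurrent identical requests share one in-flight paid query. This is a minimal sketch, not the server's actual implementation: the class name, normalization rule, and `{ hit, result }` shape are illustrative.

```typescript
// Minimal sketch of the dedupe cache described above: identical normalized
// messages within the TTL share one paid query. Illustrative names only.
import { createHash } from "node:crypto";

interface CacheEntry<T> {
  value: Promise<T>;
  expiresAt: number;
}

class DedupeCache<T> {
  private entries = new Map<string, CacheEntry<T>>();
  constructor(private ttlSeconds = 300) {}

  private key(message: string): string {
    // Cache key: hash of the normalized (trimmed, lowercased) user message.
    const normalized = message.trim().toLowerCase().replace(/\s+/g, " ");
    return createHash("sha256").update(normalized).digest("hex");
  }

  // Returns cache-hit metadata alongside the (possibly shared) result promise.
  get(message: string, paidQuery: () => Promise<T>): { hit: boolean; result: Promise<T> } {
    const k = this.key(message);
    const now = Date.now();
    const existing = this.entries.get(k);
    if (existing && existing.expiresAt > now) {
      return { hit: true, result: existing.value };
    }
    // Store the promise immediately, so concurrent identical requests coalesce
    // onto one in-flight query instead of each triggering an x402 payment.
    // (A production version would also evict entries whose promise rejected.)
    const value = paidQuery();
    this.entries.set(k, { value, expiresAt: now + this.ttlSeconds * 1000 });
    return { hit: false, result: value };
  }
}
```

Storing the promise (rather than the resolved value) is what makes coalescing work: the second request arrives before the first resolves, finds the entry, and awaits the same promise.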

🚀 Setup & Installation Guide

Prerequisites

  • Node.js 20+
  • npm or Bun
  • Foundry (forge, cast)
  • CRE CLI (cre) - for running local simulations

📦 Monorepo Root Setup (Recommended)

BlockStory is a monorepo. You can install all dependencies and run both the server and frontend concurrently from the root directory:

# 1. Install all dependencies across monorepo
npm run install:all

# 2. Configure Environment (Server)
cp server/.env.example server/.env
# --- Server Configuration ---
PORT=3000                        # Port the local gateway server runs on
GEMINI_API_KEY=your_key_here      # Google Gemini AI API key for the Query Planner
GEMINI_MODEL=gemini-2.0-flash      # Gemini model version (e.g. gemini-2.0-flash)

# --- x402 Micropayments ---
X402_RECEIVER_ADDRESS=            # Address that receives USDC query payments
X402_FACILITATOR_URL=https://x402.org/facilitator # URL for the x402 payment facilitator
HISTORY_QUERY_PRICE=$0.01         # Human-readable price per historical query
AGENT_WALLET_PRIVATE_KEY=         # Private key for the server-side "Agent Mode" payer

# --- Blockchain RPCs (Base Sepolia) ---
BASE_SEPOLIA_RPC=                 # RPC URL for Base Sepolia (e.g. via Alchemy/Infura)
BASE_SEPOLIA_USDC_ADDRESS=0x036CbD53842c5426634e7929541eC2318f3dCF7e # USDC contract on Base Sepolia
BASE_SEPOLIA_LINK_ADDRESS=0xE4aB69C613f73dA9511813Bc0772412803960074 # LINK contract on Base Sepolia
QUERY_REGISTRY_ADDRESS=           # Deployed BlockStory QueryRegistry contract address

# --- Nansen via Corbits (Mainnet Logic) ---
# Note: Nansen queries via Corbits require real USDC on Base Mainnet or local simulation configuration
NANSEN_CORBITS_BASE_URL=https://nansen.api.corbits.dev # Corbits proxy for Nansen data
NANSEN_EVM_PRIVATE_KEY=           # Private key for Nansen query payments (Base Mainnet)
NANSEN_REGISTRY_ADDRESS=          # Optional: Contract to track Nansen queries
BASE_MAINNET_RPC=https://mainnet.base.org # RPC for verifying mainnet state

# --- Chainlink CRE Configuration ---
CRE_HTTP_TRIGGER_URL=             # Deployed CRE HTTP trigger (leave empty if using local simulation)
CRE_ETH_PRIVATE_KEY=              # Private key for the CRE signer/forwarder (used in simulation)

# --- Performance & Deduplication ---
DEDUPE_CACHE_ENABLED=true         # Enable dedupe cache to avoid re-spending on identical recent queries
DEDUPE_CACHE_TTL_SECONDS=300      # How long to cache query results (seconds)
MAX_LOG_BLOCK_RANGE=2000          # Max block range for eth_getLogs queries
MAX_LOG_RESULTS=200               # Max results returned for eth_getLogs
# Fill in your GEMINI_API_KEY, RPC_URLS, and PRIVATE_KEYS in server/.env

# 3. Launch both Server & Frontend
npm run dev

🧱 Individual Component Setup (Manual)

1️⃣ Server Setup

cd server
npm install
cp .env.example .env
# Fill in your GEMINI_API_KEY, RPC_URLS, and PRIVATE_KEYS
npm run dev:tsx

2️⃣ 🛠 CRE Local Simulation Setup

From cre/history-fetcher:

bun install  # Installs CRE SDK and runs cre-setup

Ensure you have an absolute path to your simulate-payload.json. Run the simulation to verify the attestation logic:

cre -T local-simulation workflow simulate ./history-fetcher \
  --broadcast \
  --non-interactive \
  --trigger-index 0 \
  --http-payload @/ABSOLUTE_PATH/TO/blockstory/cre/history-fetcher/simulate-payload.json

Expected Results:

  • Workflow compilation succeeds.
  • Logs show payload received.
  • Logs show a successful write to QueryRegistry.
  • A transaction hash string is returned.

3️⃣ Frontend Setup

cd frontend
npm install
npm run dev

Open http://localhost:3001 to access the BlockStory dashboard.


🧪 Testing & Verification

📡 End-to-End API Test

With the server running, you can test the AI query engine manually:

curl -sS -X POST http://localhost:3000/chat \
  -H "Content-Type: application/json" \
  -d '{"message":"Fetch Base Sepolia block 38581727"}'

Verify Response Includes:

  • queryId
  • attestation.queryHash
  • attestation.resultHash
  • attestation.cre.status
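The `queryHash` / `resultHash` pair can in principle be recomputed client-side to check that a response matches its attestation. The sketch below is an assumption about how such hashing might work (SHA-256 over a canonicalized JSON form); the gateway's actual scheme (e.g. keccak256, or a different canonicalization) may differ and should be matched against the server code.

```typescript
import { createHash } from "node:crypto";

// Hypothetical reconstruction of attestation hashes. Canonicalizing (sorting
// object keys recursively) ensures logically identical payloads hash equally,
// regardless of key insertion order.
function canonicalize(value: unknown): string {
  if (value === null || typeof value !== "object") return JSON.stringify(value);
  if (Array.isArray(value)) return "[" + value.map(canonicalize).join(",") + "]";
  const obj = value as Record<string, unknown>;
  const body = Object.keys(obj)
    .sort()
    .map((k) => JSON.stringify(k) + ":" + canonicalize(obj[k]))
    .join(",");
  return "{" + body + "}";
}

function attestationHash(payload: unknown): string {
  // NOTE: sha256 is an assumption here; the real server may use keccak256.
  return "0x" + createHash("sha256").update(canonicalize(payload)).digest("hex");
}

// Example: hash a query descriptor and a (truncated) result.
const queryHash = attestationHash({ chain: "base-sepolia", type: "block", number: 38581727 });
const resultHash = attestationHash({ blockNumber: 38581727, txCount: 42 });
```

If the locally recomputed hashes match the `attestation.queryHash` and `attestation.resultHash` fields in the response, the payload has not been altered between the gateway and the client.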

🔗 Onchain Verification

Verify an attestation manually using cast:

cast call [QUERY_REGISTRY_ADDRESS] \
  "getRecord(bytes32)((bytes32,address,bytes32,bytes32,uint256,string,bool))" \
  [QUERY_ID] \
  --rpc-url https://sepolia.base.org
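The tuple returned by `getRecord(bytes32)` — `(bytes32, address, bytes32, bytes32, uint256, string, bool)` — can be mapped to a typed object on the client. The field names below are inferred from context (the contract's actual ABI is authoritative); this is an illustrative sketch, not the project's decoder.

```typescript
// Illustrative typed view of the getRecord(bytes32) return tuple. Field names
// are guesses from context, not taken from QueryRegistry.sol's ABI.
interface QueryRecord {
  queryId: string;     // bytes32
  requester: string;   // address
  queryHash: string;   // bytes32
  resultHash: string;  // bytes32
  timestamp: bigint;   // uint256 (block timestamp of the attestation)
  source: string;      // string  (e.g. a data-source label)
  attested: boolean;   // bool    (whether the record was committed via CRE)
}

type RecordTuple = [string, string, string, string, bigint, string, boolean];

function toQueryRecord(t: RecordTuple): QueryRecord {
  const [queryId, requester, queryHash, resultHash, timestamp, source, attested] = t;
  return { queryId, requester, queryHash, resultHash, timestamp, source, attested };
}
```

A client can then compare `queryHash` / `resultHash` from the onchain record against the values returned by the `/chat` endpoint to confirm the attestation matches the served data.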

📺 Dashboard History Endpoints

The server persists local dashboard-ready data:

  • GET /dashboard/history - Chat requests + cache metadata.
  • GET /dashboard/payments - x402 payment records + spend summary.
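A spend summary like the one served by `/dashboard/payments` can be derived from per-query payment records. The record shape and field names below are illustrative assumptions, not the server's actual schema; the point is that cache hits are counted but excluded from spend.

```typescript
// Hypothetical shape of the records behind GET /dashboard/payments and the
// spend summary derived from them. Field names are illustrative only.
interface PaymentRecord {
  queryId: string;
  amountUsd: number;   // e.g. 0.01 per history query (HISTORY_QUERY_PRICE)
  cacheHit: boolean;   // cache hits incur no new x402 spend
  timestamp: number;
}

function spendSummary(records: PaymentRecord[]) {
  const paid = records.filter((r) => !r.cacheHit);
  const hits = records.filter((r) => r.cacheHit);
  return {
    totalQueries: records.length,
    paidQueries: paid.length,
    totalSpendUsd: paid.reduce((sum, r) => sum + r.amountUsd, 0),
    savedByCacheUsd: hits.reduce((sum, r) => sum + r.amountUsd, 0),
  };
}
```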

🖥 Frontend Proof Panel

Our UI (implemented in ChatInterface.tsx and HistoryDashboard.tsx) provides real-time verification status:

  • Displays attestation.cre.status.
  • Shows raw queryHash and resultHash.
  • Shows a "Verified on Base" badge linked to /attestation/:queryId.
  • Live balances for receiver/agent/CRE signer from /dashboard/balances.

🎖 Hackathon Submission Notes

  • Submission Workflow: Local simulation is used for this submission as per organizer guidance for the CRE track.
  • Verification Proof:
    • Simulation TX output (Success logs).
    • /chat and /history API status checks.
    • Onchain getRecord(queryId) verification success.
  • Project Scope: Demonstrating a complete vertical integration from AI query intent to onchain-attested data delivery.

🛠 Optional: Live CRE Deployment

If CRE cloud deployment access is available, transition from local simulation to a live trigger:

cre -T staging-settings workflow deploy ./history-fetcher
cre -T staging-settings workflow activate ./history-fetcher

Then update your CRE_HTTP_TRIGGER_URL in the .env file.


βš–οΈ License

Distributed under the MIT License. See LICENSE for more information.


Built by Elio Jordan Lopes for Chainlink Convergence 2026. Institutional intelligence, democratized.
