icejs

A TypeScript port of ICE (Interactive Composition Explorer) — a library for building, debugging, and inspecting language model programs using composable recipes and agents.

Requirements

  • Node.js >= 18
  • An OpenAI API key (only required for machine / machine-cached modes) — set as OPENAI_API_KEY

Installation

npm install

Quick start

Run any example without an API key using the fake mode, which returns deterministic pseudo-random responses:

RECIPE_MODE=fake npx tsx examples/hello.ts

To use a real OpenAI model, set RECIPE_MODE=machine and provide your key:

RECIPE_MODE=machine OPENAI_API_KEY=sk-... npx tsx examples/hello.ts

Examples

hello

A minimal recipe that asks a single question and prints the answer.

# fake (no API key needed)
RECIPE_MODE=fake npx tsx examples/hello.ts

# or use the npm script
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npm run example:hello

chain_of_thought

Asks the model to reason step-by-step before producing a concise final answer. Demonstrates the trace() helper for recording intermediate steps.

RECIPE_MODE=fake npx tsx examples/chain_of_thought.ts

# npm shortcut
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npm run example:cot

qa

Classifies a yes/no/uncertain question using agent.classify(), then generates a brief explanation.

RECIPE_MODE=fake npx tsx examples/qa.ts

# npm shortcut
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npm run example:qa

subquestions

Decomposes a complex question into 2–4 simpler sub-questions, answers them in parallel, then synthesises a final answer.

RECIPE_MODE=fake npx tsx examples/subquestions.ts
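The decompose → answer-in-parallel → synthesise pattern used by this example can be sketched as follows. Note this is an illustrative sketch with a stand-in `ask` function, not the actual icejs agent API:

```typescript
// Stand-in for an agent call (e.g. agent.complete) — hypothetical, for illustration.
type Ask = (prompt: string) => Promise<string>;

async function answerByDecomposition(question: string, ask: Ask): Promise<string> {
  // 1. Decompose into sub-questions (fixed split here; the real recipe asks the LM).
  const subs = [
    `What facts are needed to answer: ${question}`,
    `What follows from those facts?`,
  ];
  // 2. Answer all sub-questions in parallel.
  const answers = await Promise.all(subs.map((s) => ask(s)));
  // 3. Synthesise a final answer from the sub-answers.
  return ask(`Given: ${answers.join("; ")}\nAnswer the question: ${question}`);
}

// Usage with a trivial echo agent:
const echo: Ask = async (p) => `answer(${p.length} chars)`;
answerByDecomposition("Why is the sky blue?", echo).then((a) => console.log(a));
```

Running the sub-questions through `Promise.all` is what makes the middle step parallel rather than sequential.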

Modes

Set RECIPE_MODE (environment variable) or call recipe.setMode() in code.

Mode              Description
machine           LM answers automatically (requires OPENAI_API_KEY)
machine-cached    Same as machine, but caches results to disk
fake / test       Deterministic seeded responses, no API key needed
human             Prompts a human interactively via stdin
augmented         Human answers, with LM suggestions shown
augmented-cached  Same as augmented, with disk cache
approval          LM answers, but a human must approve each call
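Resolving the mode from the environment might look like the sketch below. The `Mode` union mirrors the table above; the fallback to `"fake"` is an assumption for illustration, not necessarily icejs's actual default:

```typescript
type Mode =
  | "machine" | "machine-cached" | "fake" | "test"
  | "human" | "augmented" | "augmented-cached" | "approval";

const MODES: ReadonlySet<string> = new Set([
  "machine", "machine-cached", "fake", "test",
  "human", "augmented", "augmented-cached", "approval",
]);

// Pick a valid mode from the environment, falling back to "fake" (assumed default).
function resolveMode(env: Record<string, string | undefined>): Mode {
  const raw = env.RECIPE_MODE;
  if (raw && MODES.has(raw)) return raw as Mode;
  return "fake";
}

console.log(resolveMode(process.env));
```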

Writing a recipe

import { recipe, trace } from "icejs";

// Wrap steps you want traced
const summarise = trace(async function summarise(text: string) {
  const agent = recipe.agent();
  return agent.complete({
    prompt: `Summarise the following:\n\n${text}`,
    maxTokens: 200,
  });
});

async function run(text: string) {
  const summary = await summarise(text);
  console.log(summary);
  return summary;
}

// recipe.main() reads RECIPE_MODE from env and enables tracing
await recipe.main(run, [process.argv[2]]);

Run it:

RECIPE_MODE=fake npx tsx my_recipe.ts "Some long text..."

Agent API

All agents implement the same interface:

// Text completion
const text = await agent.complete({ prompt, stop?, maxTokens? });

// Classification — returns a probability distribution over choices
const [probs, explanation] = await agent.classify({ prompt, choices });

// Relevance score in [0, 1]
const score = await agent.relevance({ context, question });

// Next-token probability distribution
const dist = await agent.predict({ context });
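Since classify() returns a probability distribution, picking the single most likely choice takes a small argmax helper. This helper is illustrative, not part of the icejs API:

```typescript
// Return the key with the highest probability in a distribution.
function argmax(probs: Record<string, number>): string {
  let best = "";
  let bestP = -Infinity;
  for (const [choice, p] of Object.entries(probs)) {
    if (p > bestP) {
      best = choice;
      bestP = p;
    }
  }
  return best;
}

// e.g. with a distribution like the one classify() might return:
console.log(argmax({ yes: 0.7, no: 0.2, uncertain: 0.1 })); // "yes"
```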

You can also instantiate agents directly:

import { FakeAgent, CachedAgent, AIAgent } from "icejs";

const agent = new FakeAgent(/* seed = */ 42);
const cached = new CachedAgent(new AIAgent());
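FakeAgent's deterministic responses presumably come from a seeded PRNG (src/agents/fake.ts is described as "seeded PRNG, no API needed"). A minimal mulberry32-style sketch of that idea, not icejs's actual code:

```typescript
// Seeded PRNG: the same seed always yields the same sequence in [0, 1).
function seededRandom(seed: number): () => number {
  let s = seed >>> 0;
  return () => {
    s = (s + 0x6d2b79f5) >>> 0;
    let t = s;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Deterministic "fake" answers: identical across runs for a given seed.
const rng = seededRandom(42);
const fakeAnswer = (prompt: string) =>
  `fake-${Math.floor(rng() * 1000)} (${prompt.length} chars)`;
console.log(fakeAnswer("hello"));
```

Determinism is what lets the fake mode exercise a whole recipe, including caching and tracing, without any API calls.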

Tracing

When recipe.main() runs, a trace is written to ~/.icejs/traces/<id>/. Each call wrapped with trace() is recorded as a JSONL event tree including arguments, return values, and timing.

import { trace, addFields } from "icejs";

const myStep = trace(async function myStep(input: string) {
  addFields({ inputLength: input.length }); // attach custom metadata
  // ...
});
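The project structure notes that trace.ts is AsyncLocalStorage-based. The core trick, recording each wrapped call under whichever traced call is currently running, can be sketched like this (a minimal illustration, not icejs's actual implementation):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

interface Span { name: string; children: Span[] }

const als = new AsyncLocalStorage<Span>();
const root: Span = { name: "root", children: [] };

// Wrap an async function so each call is recorded under its caller's span.
function traced<A extends unknown[], R>(
  name: string,
  fn: (...args: A) => Promise<R>,
): (...args: A) => Promise<R> {
  return (...args) => {
    const parent = als.getStore() ?? root;
    const span: Span = { name, children: [] };
    parent.children.push(span);            // attach this call to its caller
    return als.run(span, () => fn(...args)); // calls inside fn see `span` as parent
  };
}

// Nested calls build a tree: root → outer → inner.
const inner = traced("inner", async () => 1);
const outer = traced("outer", async () => (await inner()) + 1);
await outer();
console.log(JSON.stringify(root));
```

Because AsyncLocalStorage propagates across `await` boundaries, the parent span is found without threading any context argument through the recipe code.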

Trace viewer

After running any recipe, a URL like http://localhost:3000/traces/<id> is printed to stderr. To open it, start the viewer server in a separate terminal:

npm run trace-viewer
# or: npx tsx src/server.ts

Then open the printed URL in your browser. The viewer shows a collapsible call tree with inline short args/results, full argument and result panels on click, timing, and custom fields. All past traces are listed at http://localhost:3000/traces.

Building

npm run build      # compile to dist/
npm run dev        # watch mode

Project structure

src/
  mode.ts          – Mode type union
  trace.ts         – AsyncLocalStorage-based tracing; trace() helper; Trace class
  agent.ts         – agentPolicy() factory (mode → Agent)
  recipe.ts        – Recipe base class + recipe helper object
  utils.ts         – makeId, longestCommonPrefix, cosineSimilarity, etc.
  index.ts         – Public exports
  agents/
    base.ts        – Abstract Agent class
    openai.ts      – AIAgent (generateText), EmbeddingAgent (embed) via Vercel AI SDK
    human.ts       – HumanAgent (stdin/stdout)
    fake.ts        – FakeAgent (seeded PRNG, no API needed)
    cached.ts      – CachedAgent (disk cache wrapper)
    index.ts       – Re-exports all agents
examples/
  hello.ts
  chain_of_thought.ts
  qa.ts
  subquestions.ts

License

MIT
