# icejs

A TypeScript port of ICE (Interactive Composition Explorer) — a library for building, debugging, and inspecting language model programs using composable recipes and agents.
## Requirements

- Node.js >= 18
- An OpenAI API key (only required for `machine`/`machine-cached` modes) — set as `OPENAI_API_KEY`
## Install

```sh
npm install
```

## Quick start

Run any example without an API key using the fake mode, which returns deterministic pseudo-random responses:

```sh
RECIPE_MODE=fake npx tsx examples/hello.ts
```

To use a real OpenAI model, set `RECIPE_MODE=machine` and provide your key:

```sh
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npx tsx examples/hello.ts
```

## Examples

### hello.ts

A minimal recipe that asks a single question and prints the answer.
```sh
# fake (no API key needed)
RECIPE_MODE=fake npx tsx examples/hello.ts

# or use the npm script
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npm run example:hello
```

### chain_of_thought.ts

Asks the model to reason step-by-step before producing a concise final answer. Demonstrates the `trace()` helper for recording intermediate steps.
```sh
RECIPE_MODE=fake npx tsx examples/chain_of_thought.ts

# npm shortcut
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npm run example:cot
```

### qa.ts

Classifies a yes/no/uncertain question using `agent.classify()`, then generates a brief explanation.
```sh
RECIPE_MODE=fake npx tsx examples/qa.ts

# npm shortcut
RECIPE_MODE=machine OPENAI_API_KEY=sk-... npm run example:qa
```

### subquestions.ts

Decomposes a complex question into 2–4 simpler sub-questions, answers them in parallel, then synthesises a final answer.
```sh
RECIPE_MODE=fake npx tsx examples/subquestions.ts
```

## Modes

Set the `RECIPE_MODE` environment variable or call `recipe.setMode()` in code.
| Mode | Description |
|---|---|
| `machine` | LM answers automatically (requires `OPENAI_API_KEY`) |
| `machine-cached` | Same as `machine`, but caches results to disk |
| `fake` / `test` | Deterministic seeded responses, no API key needed |
| `human` | Prompts a human interactively via stdin |
| `augmented` | Human answers, with LM suggestions shown |
| `augmented-cached` | Same as `augmented`, with disk cache |
| `approval` | LM answers, but a human must approve each call |
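The `fake`/`test` modes above return deterministic seeded responses. A standalone sketch of how that kind of determinism can work — hash the prompt with a seed and drive a small PRNG — is shown below; this is illustrative only, not the actual `FakeAgent` internals:

```typescript
// FNV-1a hash of a string, mixed with a numeric seed.
function hashPrompt(prompt: string, seed: number): number {
  let h = 2166136261 ^ seed;
  for (let i = 0; i < prompt.length; i++) {
    h ^= prompt.charCodeAt(i);
    h = Math.imul(h, 16777619);
  }
  return h >>> 0;
}

// mulberry32: a tiny deterministic PRNG seeded with a 32-bit state.
function mulberry32(state: number): () => number {
  return () => {
    state = (state + 0x6d2b79f5) | 0;
    let t = Math.imul(state ^ (state >>> 15), 1 | state);
    t = (t + Math.imul(t ^ (t >>> 7), 61 | t)) ^ t;
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

const WORDS = ["alpha", "bravo", "charlie", "delta", "echo", "foxtrot"];

// Same prompt + seed always yields the same pseudo-random "completion".
export function fakeComplete(prompt: string, seed = 42, words = 5): string {
  const rand = mulberry32(hashPrompt(prompt, seed));
  return Array.from({ length: words }, () => WORDS[Math.floor(rand() * WORDS.length)]).join(" ");
}
```

Because the output depends only on the prompt and the seed, recipes run in fake mode are reproducible across machines and CI runs.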
## Writing a recipe

```ts
import { recipe, trace } from "icejs";

// Wrap steps you want traced
const summarise = trace(async function summarise(text: string) {
  const agent = recipe.agent();
  return agent.complete({
    prompt: `Summarise the following:\n\n${text}`,
    maxTokens: 200,
  });
});

async function run(text: string) {
  const summary = await summarise(text);
  console.log(summary);
  return summary;
}

// recipe.main() reads RECIPE_MODE from env and enables tracing
await recipe.main(run, [process.argv[2]]);
```

Run it:
```sh
RECIPE_MODE=fake npx tsx my_recipe.ts "Some long text..."
```

## Agents

All agents implement the same interface:
```ts
// Text completion
const text = await agent.complete({ prompt, stop?, maxTokens? });

// Classification — returns a probability distribution over choices
const [probs, explanation] = await agent.classify({ prompt, choices });

// Relevance score in [0, 1]
const score = await agent.relevance({ context, question });

// Next-token probability distribution
const dist = await agent.predict({ context });
```

You can also instantiate agents directly:
```ts
import { FakeAgent, CachedAgent, AIAgent } from "icejs";

const agent = new FakeAgent(/* seed = */ 42);
const cached = new CachedAgent(new AIAgent());
```

## Tracing

When `recipe.main()` runs, a trace is written to `~/.icejs/traces/<id>/`. Each call wrapped with `trace()` is recorded as a JSONL event tree including arguments, return values, and timing.
```ts
import { trace, addFields } from "icejs";

const myStep = trace(async function myStep(input: string) {
  addFields({ inputLength: input.length }); // attach custom metadata
  // ...
});
```

## Trace viewer

After running any recipe, a URL like `http://localhost:3000/traces/<id>` is printed to stderr. Start the viewer server in a separate terminal to open it:
```sh
npm run trace-viewer
# or: npx tsx src/server.ts
```

Then open the printed URL in your browser. The viewer shows a collapsible call tree with inline short args/results, full argument and result panels on click, timing, and custom fields. All past traces are listed at `http://localhost:3000/traces`.
## Development

```sh
npm run build   # compile to dist/
npm run dev     # watch mode
```

## Project structure

```
src/
  mode.ts    – Mode type union
  trace.ts   – AsyncLocalStorage-based tracing; trace() helper; Trace class
  agent.ts   – agentPolicy() factory (mode → Agent)
  recipe.ts  – Recipe base class + recipe helper object
  utils.ts   – makeId, longestCommonPrefix, cosineSimilarity, etc.
  index.ts   – Public exports
  agents/
    base.ts    – Abstract Agent class
    openai.ts  – AIAgent (generateText), EmbeddingAgent (embed) via Vercel AI SDK
    human.ts   – HumanAgent (stdin/stdout)
    fake.ts    – FakeAgent (seeded PRNG, no API needed)
    cached.ts  – CachedAgent (disk cache wrapper)
    index.ts   – Re-exports all agents
examples/
  hello.ts
  chain_of_thought.ts
  qa.ts
  subquestions.ts
```
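The disk-cache wrapper pattern behind `CachedAgent` (see `agents/cached.ts` above) can be sketched as a memoizing decorator. This is an assumption-laden illustration: an in-memory `Map` stands in for the disk cache, and the `SimpleAgent` shape and `MemoCachedAgent` name are hypothetical, not the library's real types:

```typescript
interface CompleteRequest {
  prompt: string;
  maxTokens?: number;
}

interface SimpleAgent {
  complete(req: CompleteRequest): Promise<string>;
}

export class MemoCachedAgent implements SimpleAgent {
  private cache = new Map<string, string>();

  constructor(private inner: SimpleAgent) {}

  async complete(req: CompleteRequest): Promise<string> {
    // Stable cache key derived from the request fields.
    const key = JSON.stringify([req.prompt, req.maxTokens ?? null]);
    const hit = this.cache.get(key);
    if (hit !== undefined) return hit; // cache hit: skip the inner agent
    const result = await this.inner.complete(req);
    this.cache.set(key, result);
    return result;
  }
}
```

Wrapping an expensive agent this way means repeated identical calls (common when re-running a recipe during debugging) only pay for the first request.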
## License

MIT