
v1.0.0 — Production Release


@stackbilt-admin stackbilt-admin released this 01 Apr 14:12
· 24 commits to main since this release

First stable release. Production-tested in the AEGIS cognitive kernel since AEGIS v1.72.0.

Highlights

  • Zero runtime dependencies — supply chain security by design
  • 5 providers: OpenAI, Anthropic, Cloudflare Workers AI, Cerebras, Groq
  • LLMProviders.fromEnv() — one-line multi-provider setup
  • Graduated circuit breakers — automatic failover with half-open probe recovery
  • CreditLedger — per-provider budget tracking with threshold alerts + burn rate projection
  • npm provenance — every version cryptographically linked to its source commit
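The graduated circuit-breaker behavior above can be sketched as a small state machine: closed until a failure threshold is hit, open during a cooldown, then half-open so a single probe request decides whether to close again or re-open. This is a minimal self-contained illustration with hypothetical names and parameters, not the package's actual implementation:

```typescript
type CircuitState = 'closed' | 'open' | 'half-open';

// Minimal graduated circuit breaker: closed → open after N consecutive
// failures, then half-open after a cooldown, where one probe request
// decides whether to close again or re-open.
class CircuitBreaker {
  private state: CircuitState = 'closed';
  private failures = 0;
  private openedAt = 0;

  constructor(
    private readonly failureThreshold = 3,
    private readonly cooldownMs = 30_000,
    private readonly now: () => number = Date.now, // injectable clock for testing
  ) {}

  /** Current state, promoting open → half-open once the cooldown elapses. */
  getState(): CircuitState {
    if (this.state === 'open' && this.now() - this.openedAt >= this.cooldownMs) {
      this.state = 'half-open'; // allow exactly one probe request through
    }
    return this.state;
  }

  canRequest(): boolean {
    return this.getState() !== 'open';
  }

  recordSuccess(): void {
    this.failures = 0;
    this.state = 'closed';
  }

  recordFailure(): void {
    if (this.getState() === 'half-open') {
      // Probe failed: go straight back to open and restart the cooldown.
      this.state = 'open';
      this.openedAt = this.now();
      return;
    }
    this.failures += 1;
    if (this.failures >= this.failureThreshold) {
      this.state = 'open';
      this.openedAt = this.now();
    }
  }
}
```

In a multi-provider setup, a breaker per provider lets a failover router skip providers whose breaker is open and automatically resume them once a half-open probe succeeds.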

Install

npm install @stackbilt/llm-providers

Quick Start

import { LLMProviders } from '@stackbilt/llm-providers';

const llm = LLMProviders.fromEnv(process.env);
const response = await llm.generateResponse({
  messages: [{ role: 'user', content: 'Hello!' }],
});
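The CreditLedger highlight (threshold alerts plus burn-rate projection) can be illustrated with a self-contained sketch. All names and method signatures here are hypothetical assumptions for illustration, not the package's actual API:

```typescript
// Tiny per-provider budget ledger: fires an alert once spend crosses a
// threshold fraction, and projects hours until the budget is exhausted
// assuming a constant (linear) burn rate.
class CreditLedger {
  private spentUsd = 0;

  constructor(
    private readonly budgetUsd: number,
    private readonly alertFraction = 0.8, // alert at 80% of budget by default
    private readonly startMs = Date.now(),
  ) {}

  record(costUsd: number): void {
    this.spentUsd += costUsd;
  }

  /** True once spend reaches the alert threshold. */
  shouldAlert(): boolean {
    return this.spentUsd >= this.budgetUsd * this.alertFraction;
  }

  /** Projected hours until budget exhaustion at the current burn rate. */
  hoursToExhaustion(nowMs = Date.now()): number {
    const elapsedHours = (nowMs - this.startMs) / 3_600_000;
    if (elapsedHours <= 0 || this.spentUsd <= 0) return Infinity;
    const burnPerHour = this.spentUsd / elapsedHours;
    return Math.max(0, (this.budgetUsd - this.spentUsd) / burnPerHour);
  }
}
```

With one ledger per provider, a router can prefer providers that are under their alert threshold and surface burn-rate projections before a budget is exhausted.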

See README for full documentation.
See SECURITY.md for supply chain security policy.