# v1.0.0 — Production Release
First stable release. Production-tested in the AEGIS cognitive kernel since v1.72.0.
## Highlights
- Zero runtime dependencies — supply chain security by design
- 5 providers: OpenAI, Anthropic, Cloudflare Workers AI, Cerebras, Groq
- `LLMProviders.fromEnv()` — one-line multi-provider setup
- Graduated circuit breakers — automatic failover with half-open probe recovery
- CreditLedger — per-provider budget tracking with threshold alerts + burn rate projection
- npm provenance — every version cryptographically linked to its source commit
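The graduated circuit breaker with half-open probe recovery can be sketched roughly as follows. This is an illustrative state machine, not the package's actual API; the class, method names, and thresholds here are hypothetical:

```typescript
// Hypothetical sketch of a graduated circuit breaker: after `maxFailures`
// consecutive failures the breaker opens and requests are rejected; once
// `cooldownMs` elapses it moves to half-open and admits a single probe,
// closing again only if that probe succeeds.
type BreakerState = 'closed' | 'open' | 'half-open';

class CircuitBreaker {
  private state: BreakerState = 'closed';
  private failures = 0;
  private openedAt = 0;

  constructor(private maxFailures = 3, private cooldownMs = 30_000) {}

  canRequest(now: number): boolean {
    if (this.state === 'open' && now - this.openedAt >= this.cooldownMs) {
      this.state = 'half-open'; // cooldown elapsed: allow one probe request
    }
    return this.state !== 'open';
  }

  recordSuccess(): void {
    this.state = 'closed'; // probe (or normal call) succeeded: fully recover
    this.failures = 0;
  }

  recordFailure(now: number): void {
    this.failures += 1;
    if (this.state === 'half-open' || this.failures >= this.maxFailures) {
      this.state = 'open'; // trip: caller fails over to the next provider
      this.openedAt = now;
    }
  }
}
```

A failed half-open probe re-opens the breaker immediately, so a still-degraded provider costs at most one request per cooldown window while traffic continues to fail over.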
## Install

```bash
npm install @stackbilt/llm-providers
```

## Quick Start
```ts
import { LLMProviders } from '@stackbilt/llm-providers';

const llm = LLMProviders.fromEnv(process.env);
const response = await llm.generateResponse({
  messages: [{ role: 'user', content: 'Hello!' }],
});
```

See README for full documentation.
See SECURITY.md for supply chain security policy.
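The burn rate projection mentioned in the highlights amounts to simple linear extrapolation of spend over time. A minimal sketch, with a hypothetical helper name rather than the CreditLedger API:

```typescript
// Hypothetical sketch of burn rate projection: given spend so far and the
// elapsed window, linearly extrapolate when the budget will be exhausted.
function projectExhaustionMs(
  spentUsd: number,
  budgetUsd: number,
  elapsedMs: number,
): number {
  const burnRate = spentUsd / elapsedMs; // USD per millisecond
  if (burnRate <= 0) return Infinity;   // no spend yet: budget never runs out
  return (budgetUsd - spentUsd) / burnRate; // ms until remaining budget is gone
}
```

A threshold alert would then fire when the projected exhaustion time drops below some configured horizon.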