Type-safe LLM orchestration for Go.
Define synapses with typed outputs, fire them with sessions, and get structured responses with built-in reliability.
Synapses wrap LLM interactions with compile-time type safety.
```go
type Contact struct {
    Name  string `json:"name"`
    Email string `json:"email"`
    Phone string `json:"phone"`
}

func (c Contact) Validate() error {
    if c.Email == "" {
        return errors.New("email required")
    }
    return nil
}

// Define a synapse — typed extraction from unstructured text
extractor, _ := zyn.Extract[Contact]("contact information", provider)

// Fire it — get structured data back
session := zyn.NewSession()
contact, _ := extractor.Fire(ctx, session, "Reach John at john@acme.com or 555-1234")
// contact.Name  → "John"
// contact.Email → "john@acme.com"
// contact.Phone → "555-1234"
```

Sessions carry conversation context. Synapses stay focused on their task.
```go
// Chain synapses — each sees the full conversation history
classifier, _ := zyn.Classification("urgency", []string{"low", "medium", "high"}, provider)
urgency, _ := classifier.Fire(ctx, session, contact.Name + "'s request")

responder, _ := zyn.Transform("write customer response", provider)
response, _ := responder.Fire(ctx, session, fmt.Sprintf("Urgency: %s", urgency))
```

Type-safe at the edges. Conversational in between.
```sh
go get github.com/zoobz-io/zyn
```

Requires Go 1.24+.
```go
package main

import (
    "context"
    "fmt"
    "os"
    "time"

    "github.com/zoobz-io/zyn"
    "github.com/zoobz-io/zyn/openai"
)

func main() {
    ctx := context.Background()

    // Create provider
    provider := openai.New(openai.Config{
        APIKey: os.Getenv("OPENAI_API_KEY"),
    })

    // Create synapse with reliability
    classifier, _ := zyn.Classification(
        "email category",
        []string{"spam", "urgent", "newsletter", "personal"},
        provider,
        zyn.WithRetry(3),
        zyn.WithTimeout(10*time.Second),
    )

    // Fire with session context
    session := zyn.NewSession()
    category, _ := classifier.Fire(ctx, session, "URGENT: Your account will be suspended!")
    fmt.Println("Category:", category) // "urgent"
}
```

| Feature | Description | Docs |
|---|---|---|
| 8 Synapse Types | Binary, Classification, Ranking, Sentiment, Extract, Transform, Analyze, Convert | Synapses |
| Sessions | Conversation context across synapse calls | Sessions |
| Structured Prompts | Type-driven prompt generation prevents divergence | Concepts |
| Reliability Patterns | Retry, timeout, circuit breaker, rate limiting | Reliability |
| Observability | Typed signals via capitan for all LLM operations | Observability |
| Testing Utilities | Mock provider for deterministic tests | Testing |
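
Only Classification, Extract, and Transform are shown in this README. By analogy with the Classification constructor above, the other synapse types presumably follow the same shape. A hypothetical sketch for Binary (the exact `zyn.Binary` signature and its `bool` result are assumptions, not confirmed here):

```go
// Hypothetical sketch: assumes zyn.Binary mirrors zyn.Classification
// and that its Fire returns (bool, error).
detector, _ := zyn.Binary("is this message spam?", provider)

session := zyn.NewSession()
isSpam, _ := detector.Fire(ctx, session, "You have WON a free cruise! Click now!")
fmt.Println("Spam:", isSpam)
```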
- Type-safe — Generics enforce output types at compile time
- Structured — LLM responses parse directly into your structs
- Conversational — Sessions maintain context across synapse calls
- Reliable — retry, timeout, circuit-breaker, and rate-limiting patterns from pipz built in
- Observable — capitan signals for every LLM call
- Testable — Mock provider for deterministic unit tests
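
A deterministic test with the mock provider might look like the following sketch. The `zyn.NewMockProvider` constructor and its canned-response option are assumptions for illustration; only the `Classification`/`Fire` shapes come from the examples above.

```go
// Hypothetical sketch: assumes a mock provider that returns a canned
// response, so the test never calls a real LLM.
func TestClassifier(t *testing.T) {
    provider := zyn.NewMockProvider(zyn.WithMockResponse("urgent")) // assumed API

    classifier, err := zyn.Classification(
        "email category",
        []string{"spam", "urgent", "newsletter", "personal"},
        provider,
    )
    if err != nil {
        t.Fatal(err)
    }

    category, err := classifier.Fire(context.Background(), zyn.NewSession(), "URGENT: pay now")
    if err != nil {
        t.Fatal(err)
    }
    if category != "urgent" {
        t.Errorf("got %q, want %q", category, "urgent")
    }
}
```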
Zyn enables a pattern: define synapses, compose with sessions, observe with signals.
Your synapses define typed LLM interactions. Sessions chain them into workflows with shared context. Reliability patterns wrap the whole thing. Capitan signals make it observable.
```go
// Define synapses for each step
extractor, _ := zyn.Extract[Customer]("customer details", provider)
classifier, _ := zyn.Classification("urgency", []string{"low", "high"}, provider)
responder, _ := zyn.Transform("write response", provider)

// Compose via session — each step sees previous context
session := zyn.NewSession()
customer, _ := extractor.Fire(ctx, session, ticket)
urgency, _ := classifier.Fire(ctx, session, customer.Issue)
response, _ := responder.Fire(ctx, session, urgency)

// Observe via capitan — no instrumentation needed
capitan.Hook(zyn.RequestCompleted, logRequest)
capitan.Hook(zyn.ProviderCallCompleted, trackTokens)
```

Three synapses, one session, full observability.
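
The snippets in this README discard errors for brevity. In production each `Fire` call should be checked; this sketch assumes only the `(value, error)` return shown above:

```go
customer, err := extractor.Fire(ctx, session, ticket)
if err != nil {
    // Retries and timeouts configured on the synapse have already
    // been applied; an error here means they were exhausted.
    return fmt.Errorf("extract customer: %w", err)
}
```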
- Overview — Design philosophy
- Quickstart — Build your first synapse
- Core Concepts — Synapses, sessions, providers
- Architecture — How zyn works under the hood
- Installation — Installing and configuring
- Providers — LLM provider configuration
- Sessions — Managing conversation context
- Reliability — Retry, timeout, circuit breaker
- Observability — Monitoring with capitan
- Testing — Testing strategies
- Best Practices — Production guidelines
- Classification Workflows — Real-world classification
- Extraction Pipelines — Structured data extraction
- Multi-Turn Conversations — Complex workflows
- Error Handling — Robust error management
- Cheatsheet — Quick reference
- Synapses — All synapse types
- Options — Configuration options
- Session — Session API
See `CONTRIBUTING.md` for guidelines. Run `make help` for available commands.
MIT License — see `LICENSE` for details.