
zyn


Type-safe LLM orchestration for Go.

Define synapses with typed outputs, fire them with sessions, and get structured responses with built-in reliability.

Composable Thinking Synapses

Synapses wrap LLM interactions with compile-time type safety.

type Contact struct {
    Name  string `json:"name"`
    Email string `json:"email"`
    Phone string `json:"phone"`
}

func (c Contact) Validate() error {
    if c.Email == "" {
        return errors.New("email required")
    }
    return nil
}

// Define a synapse — typed extraction from unstructured text
extractor, _ := zyn.Extract[Contact]("contact information", provider)

// Fire it — get structured data back
session := zyn.NewSession()
contact, _ := extractor.Fire(ctx, session, "Reach John at john@acme.com or 555-1234")
// contact.Name  → "John"
// contact.Email → "john@acme.com"
// contact.Phone → "555-1234"

Sessions carry conversation context. Synapses stay focused on their task.

// Chain synapses — each sees the full conversation history
classifier, _ := zyn.Classification("urgency", []string{"low", "medium", "high"}, provider)
urgency, _ := classifier.Fire(ctx, session, contact.Name + "'s request")

responder, _ := zyn.Transform("write customer response", provider)
response, _ := responder.Fire(ctx, session, fmt.Sprintf("Urgency: %s", urgency))

Type-safe at the edges. Conversational in between.
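The typed-output idea above can be sketched in plain Go. The following is an illustrative model of how a generic synapse might parse a raw model response into its type parameter and run the optional Validate hook; it is not zyn's actual internals, and `parseTyped` and `Validator` are names invented for this sketch.

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// Validator mirrors the optional Validate method shown above.
type Validator interface{ Validate() error }

// parseTyped unmarshals a raw model response into T and, if T
// implements Validator, runs its Validate check. Illustrative only.
func parseTyped[T any](raw string) (T, error) {
	var out T
	if err := json.Unmarshal([]byte(raw), &out); err != nil {
		return out, err
	}
	if v, ok := any(out).(Validator); ok {
		if err := v.Validate(); err != nil {
			return out, err
		}
	}
	return out, nil
}

type Contact struct {
	Name  string `json:"name"`
	Email string `json:"email"`
	Phone string `json:"phone"`
}

func (c Contact) Validate() error {
	if c.Email == "" {
		return errors.New("email required")
	}
	return nil
}

func main() {
	c, err := parseTyped[Contact](`{"name":"John","email":"john@acme.com","phone":"555-1234"}`)
	fmt.Println(c.Name, err) // John <nil>
}
```

Because T is fixed at the call site, a malformed or invalid response surfaces as an error rather than as a loosely typed map.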

Install

go get github.com/zoobz-io/zyn

Requires Go 1.24+.

Quick Start

package main

import (
    "context"
    "fmt"
    "os"
    "time"

    "github.com/zoobz-io/zyn"
    "github.com/zoobz-io/zyn/openai"
)

func main() {
    ctx := context.Background()

    // Create provider
    provider := openai.New(openai.Config{
        APIKey: os.Getenv("OPENAI_API_KEY"),
    })

    // Create synapse with reliability
    classifier, _ := zyn.Classification(
        "email category",
        []string{"spam", "urgent", "newsletter", "personal"},
        provider,
        zyn.WithRetry(3),
        zyn.WithTimeout(10*time.Second),
    )

    // Fire with session context
    session := zyn.NewSession()
    category, _ := classifier.Fire(ctx, session, "URGENT: Your account will be suspended!")

    fmt.Println("Category:", category) // "urgent"
}

Capabilities

| Feature | Description | Docs |
| --- | --- | --- |
| 8 Synapse Types | Binary, Classification, Ranking, Sentiment, Extract, Transform, Analyze, Convert | Synapses |
| Sessions | Conversation context across synapse calls | Sessions |
| Structured Prompts | Type-driven prompt generation prevents divergence | Concepts |
| Reliability Patterns | Retry, timeout, circuit breaker, rate limiting | Reliability |
| Observability | Typed signals via capitan for all LLM operations | Observability |
| Testing Utilities | Mock provider for deterministic tests | Testing |
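The testing row is the easiest to picture. The sketch below models the idea with a hand-rolled fake provider behind a small interface, so a synapse-like function can be tested deterministically with no network calls. The `Provider` interface, `mockProvider`, and `classify` here are stand-ins invented for this sketch; zyn's real mock provider API may differ.

```go
package main

import "fmt"

// Provider is an illustrative stand-in for a zyn-style provider
// interface; the real interface may differ.
type Provider interface {
	Complete(prompt string) (string, error)
}

// mockProvider returns canned responses keyed by prompt, making
// tests deterministic and offline.
type mockProvider struct {
	canned map[string]string
}

func (m mockProvider) Complete(prompt string) (string, error) {
	if resp, ok := m.canned[prompt]; ok {
		return resp, nil
	}
	return "", fmt.Errorf("no canned response for %q", prompt)
}

// classify forwards text to the provider; a real synapse would
// build a structured prompt first.
func classify(p Provider, text string) (string, error) {
	return p.Complete(text)
}

func main() {
	p := mockProvider{canned: map[string]string{
		"URGENT: pay now": "urgent",
	}}
	category, err := classify(p, "URGENT: pay now")
	fmt.Println(category, err) // urgent <nil>
}
```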

Why zyn?

  • Type-safe — Generics enforce output types at compile time
  • Structured — LLM responses parse directly into your structs
  • Conversational — Sessions maintain context across synapse calls
  • Reliable — pipz patterns built in
  • Observable — capitan signals for every LLM call
  • Testable — Mock provider for deterministic unit tests

Composable LLM Patterns

Zyn encourages one repeatable pattern: define synapses, compose them with sessions, observe them with signals.

Your synapses define typed LLM interactions. Sessions chain them into workflows with shared context. Reliability patterns wrap the whole thing. Capitan signals make it observable.

// Define synapses for each step
extractor, _ := zyn.Extract[Customer]("customer details", provider)
classifier, _ := zyn.Classification("urgency", []string{"low", "high"}, provider)
responder, _ := zyn.Transform("write response", provider)

// Compose via session — each step sees previous context
session := zyn.NewSession()
customer, _ := extractor.Fire(ctx, session, ticket)
urgency, _ := classifier.Fire(ctx, session, customer.Issue)
response, _ := responder.Fire(ctx, session, urgency)

// Observe via capitan — no instrumentation needed
capitan.Hook(zyn.RequestCompleted, logRequest)
capitan.Hook(zyn.ProviderCallCompleted, trackTokens)

Three synapses, one session, full observability.
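The Hook calls above follow a plain observer pattern: register callbacks per signal, then emit. The sketch below models that shape in self-contained Go; the Signal type, Hook, and Emit here are illustrative stand-ins, not capitan's real API.

```go
package main

import "fmt"

// Signal names a lifecycle event, mirroring identifiers like
// zyn.RequestCompleted in spirit only.
type Signal string

const RequestCompleted Signal = "request.completed"

// hooks maps each signal to its registered callbacks.
var hooks = map[Signal][]func(payload string){}

// Hook registers a callback for a signal.
func Hook(s Signal, fn func(payload string)) {
	hooks[s] = append(hooks[s], fn)
}

// Emit fires every callback registered for the signal.
func Emit(s Signal, payload string) {
	for _, fn := range hooks[s] {
		fn(payload)
	}
}

func main() {
	Hook(RequestCompleted, func(p string) {
		fmt.Println("logged:", p)
	})
	Emit(RequestCompleted, "classification ok")
}
```

Because registration is decoupled from emission, logging and token tracking can be attached without touching the synapse code itself.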

Documentation

Learn

Guides

Cookbook

Reference

Contributing

See CONTRIBUTING.md for guidelines. Run make help for available commands.

License

MIT License — see LICENSE for details.