# DSPy-Go


## What is DSPy-Go?

DSPy-Go is a native Go implementation of the DSPy framework, bringing systematic prompt engineering and automated reasoning capabilities to Go applications. Build reliable LLM applications through composable modules and workflows.

Full Documentation | API Reference | Examples

## Key Features

| Feature | Description |
|---|---|
| Modular Architecture | Compose simple, reusable components into complex applications |
| Multiple LLM Providers | Anthropic, OpenAI, Google Gemini, Ollama, LlamaCPP, and more |
| Advanced Modules | Predict, ChainOfThought, ReAct, RLM, Refine, Parallel |
| Intelligent Agents | ReAct patterns, ACE framework for self-improving agents |
| A2A Protocol | Multi-agent orchestration with hierarchical composition |
| Smart Tool Management | Bayesian selection, chaining, composition, MCP integration |
| Quality Optimizers | GEPA, MIPRO, SIMBA, BootstrapFewShot, COPRO |
| Structured Output | JSON structured output and XML adapters with security controls |

## Installation

```shell
go get github.com/XiaoConstantine/dspy-go
```

## Quick Start

### CLI (Zero Code)

```shell
cd cmd/dspy-cli && go build -o dspy-cli
export GEMINI_API_KEY="your-api-key"

./dspy-cli list                           # See all optimizers
./dspy-cli try mipro --dataset gsm8k      # Test optimizer instantly
./dspy-cli view session.jsonl --stats     # View RLM session logs
```

CLI Documentation

### Programming

```go
package main

import (
	"context"
	"fmt"

	"github.com/XiaoConstantine/dspy-go/pkg/core"
	"github.com/XiaoConstantine/dspy-go/pkg/llms"
	"github.com/XiaoConstantine/dspy-go/pkg/modules"
)

func main() {
	// Configure the default LLM
	llm, err := llms.NewGeminiLLM("", core.ModelGoogleGeminiPro)
	if err != nil {
		panic(err)
	}
	core.SetDefaultLLM(llm)

	// Create a signature and module
	signature := core.NewSignature(
		[]core.InputField{{Field: core.NewField("question")}},
		[]core.OutputField{{Field: core.NewField("answer")}},
	)
	cot := modules.NewChainOfThought(signature)

	// Execute
	result, err := cot.Process(context.Background(), map[string]interface{}{
		"question": "What is the capital of France?",
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(result["answer"])
}
```

Prefer helper functions such as core.SetDefaultLLM, core.SetTeacherLLM, and core.GetConcurrencyLevel over mutating core.GlobalConfig directly.

Modules resolve their model at execution time in this order: module-local via SetLLM, request-local via core.WithRuntime, then the package default configured with core.SetDefaultLLM.

## Core Concepts

### Signatures

Define input/output contracts for modules:

```go
signature := core.NewSignature(
    []core.InputField{{Field: core.NewField("question", core.WithDescription("Question to answer"))}},
    []core.OutputField{{Field: core.NewField("answer", core.WithDescription("Detailed answer"))}},
).WithInstruction("Answer accurately and concisely.")
```

### Modules

| Module | Description |
|---|---|
| Predict | Direct prediction |
| ChainOfThought | Step-by-step reasoning |
| ReAct | Reasoning + tool use |
| RLM | Large context exploration via REPL |
| Refine | Quality improvement through iteration |
| Parallel | Concurrent batch processing |

### Structured Output

```go
// JSON structured output
cot := modules.NewChainOfThought(signature).WithStructuredOutput()

// XML adapter (alternative)
interceptors.ApplyXMLInterceptors(predict, interceptors.DefaultXMLConfig())
```

Core Concepts Guide

## Documentation

| Guide | Description |
|---|---|
| Getting Started | Installation and first program |
| Core Concepts | Signatures, Modules, Programs |
| Building Agents | ReAct, ACE framework, memory |
| A2A Protocol | Multi-agent orchestration |
| RLM Module | Large context exploration |
| XML Adapters | Structured output parsing |
| Tool Management | Smart registry, chaining, MCP |
| Optimizers | GEPA, MIPRO, SIMBA, Bootstrap |

## Examples

### Agent Frameworks

### Modules

### Optimization

- rlm_oolong_gepa - Optimize an adaptive RLM agent, save the optimized program, restore it, and replay it

### Tools

### Optimizers

- mipro - TPE-based optimization
- simba - Introspective learning
- gepa - Evolutionary optimization

## LLM Providers

```go
// Anthropic Claude
llm, _ := llms.NewAnthropicLLM("api-key", core.ModelAnthropicSonnet)

// Google Gemini
llm, _ := llms.NewGeminiLLM("api-key", core.ModelGoogleGeminiPro)

// OpenAI
llm, _ := llms.NewOpenAI(core.ModelOpenAIGPT4, "api-key")

// Ollama (local)
llm, _ := llms.NewOllamaLLM(core.ModelOllamaLlama3_8B)

// OpenAI-compatible (LiteLLM, LocalAI, etc.)
llm, _ := llms.NewOpenAILLM(core.ModelOpenAIGPT4,
    llms.WithAPIKey("api-key"),
    llms.WithOpenAIBaseURL("http://localhost:4000"))
```

Providers Reference

## Community

## License

DSPy-Go is released under the MIT License. See the LICENSE file for details.
