The ultra-thin kernel of the Amplifier modular AI agent system.
Amplifier Core provides the mechanisms for building modular AI agent systems. Following the Linux kernel model, it's a tiny, stable center (~2,600 lines) that rarely changes, with all policies and features implemented as replaceable modules at the edges.
Core responsibilities:
- Module discovery and loading
- Lifecycle coordination
- Hook system and events
- Session management
- Stable contracts and APIs
┌─────────────────────────────────────────────────────────────┐
│                   KERNEL (amplifier-core)                   │
│  • Module loading           • Event system                  │
│  • Session lifecycle        • Coordinator                   │
│  • Minimal dependencies     • Stable contracts              │
└──────────────────┬──────────────────────────────────────────┘
                   │ protocols (Tool, Provider, etc.)
                   ▼
┌─────────────────────────────────────────────────────────────┐
│               MODULES (Userspace - Swappable)               │
│ • Providers: LLM backends (Anthropic, OpenAI, Azure, Ollama)│
│ • Tools: Capabilities (filesystem, bash, web, search)       │
│ • Orchestrators: Execution loops (basic, streaming, events) │
│ • Contexts: Memory management (simple, persistent)          │
│ • Hooks: Observability (logging, redaction, approval)       │
└─────────────────────────────────────────────────────────────┘
The kernel provides capabilities without decisions:
| Kernel Provides (Mechanism) | Modules Decide (Policy) |
|---|---|
| Module loading | Which modules to load |
| Event emission | What to log, where |
| Session lifecycle | Orchestration strategy |
| Hook registration | Security policies |
Litmus test: "Could two teams want different behavior?" If yes, it's policy: implement it as a module, not in the kernel.
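To make the split concrete, here is a hedged sketch: two teams run the same kernel but express different policy purely through configuration. The plan structure follows the mount plan format used elsewhere in this document; the specific module names are illustrative, drawn from the module categories above.

```python
# Two hypothetical mount plans: the kernel mechanism is identical,
# only the policy (which modules run) differs. Module names are
# illustrative, taken from the module categories listed above.
team_a_plan = {
    "session": {"orchestrator": "loop-basic", "context": "context-simple"},
    "providers": [{"module": "provider-anthropic"}],
    "tools": [{"module": "tool-filesystem"}],
}

team_b_plan = {
    "session": {"orchestrator": "loop-streaming", "context": "context-persistent"},
    "providers": [{"module": "provider-openai"}],
    "tools": [{"module": "tool-bash"}, {"module": "tool-web"}],
}

# The kernel loads whatever each plan names; it never hard-codes the choice.
differing_policy = {k for k in team_a_plan if team_a_plan[k] != team_b_plan[k]}
```

Every top-level key here is policy; the kernel only supplies the loading mechanism.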
- Backward compatible: Existing modules continue working across kernel updates
- Minimal dependencies: Only pydantic, tomli, pyyaml, typing-extensions
- Single maintainer scope: Can be understood by one person
- Additive evolution: Changes extend, don't break
For complete Amplifier installation and usage:
→ https://github.com/microsoft/amplifier
- AmplifierSession - Execution context with mounted modules and conversation state. Lifespan: initialize() → execute() → cleanup().
- Mount Plan - Configuration dictionary specifying which modules to load and how each is configured. Apps and profiles compile to Mount Plans.
- Coordinator - Infrastructure context providing session_id, config access, hooks, and mount points; injected into every module.
All modules use Python Protocol (structural typing, no inheritance required):
- Provider - LLM backends (name, complete(), parse_tool_calls(), get_info(), list_models())
- Tool - Agent capabilities (name, description, execute())
- Orchestrator - Execution loops (execute())
- ContextManager - Memory (add_message(), get_messages(), compact())
- Hook - Observability (async function with event, data → HookResult)
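The Hook protocol, per the list above, is just an async function taking an event and its data and returning a HookResult. Since HookResult's actual fields are not shown in this document, the dataclass below is a stand-in for illustration, not the real API:

```python
import asyncio
from dataclasses import dataclass
from typing import Any, Optional

# Stand-in for amplifier_core.models.HookResult: its real fields are
# not shown in this document, so the ones below are assumptions.
@dataclass
class HookResult:
    action: str = "continue"      # assumed: proceed vs. block the event
    data: Optional[dict] = None   # assumed: optionally rewritten event data

# A Hook is just an async function: (event, data) -> HookResult.
async def logging_hook(event: str, data: dict[str, Any]) -> HookResult:
    print(f"[hook] {event}: keys={sorted(data)}")
    return HookResult()

result = asyncio.run(logging_hook("tool:pre", {"tool": "bash", "input": {}}))
```

Because hooks are plain functions rather than classes, observability policies like redaction or approval can be composed without any inheritance from kernel types.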
from amplifier_core import AmplifierSession
# Define mount plan (modules must be installed or discoverable)
config = {
    "session": {
        "orchestrator": "loop-basic",
        "context": "context-simple"
    },
    "providers": [
        {"module": "provider-anthropic"}
    ],
    "tools": [
        {"module": "tool-filesystem"},
        {"module": "tool-bash"}
    ]
}

# Create and use session
async with AmplifierSession(config) as session:
    response = await session.execute("List files in current directory")

Modules implement protocols via structural typing (duck typing):
from amplifier_core.interfaces import Tool
from amplifier_core.models import ToolResult

class MyTool:
    """Implements Tool protocol without inheritance."""

    @property
    def name(self) -> str:
        return "my_tool"

    @property
    def description(self) -> str:
        return "Does something useful"

    async def execute(self, input: dict) -> ToolResult:
        """Execute tool with input dict."""
        return ToolResult(
            output=f"Processed: {input.get('param')}",
            error=None
        )

# Mount function (entry point)
async def mount(coordinator, config):
    tool = MyTool()
    await coordinator.mount("tools", tool, name="my_tool")

    async def cleanup():
        pass  # Cleanup resources

    return cleanup

Entry point (pyproject.toml):
[project.entry-points."amplifier.modules"]
my-tool = "amplifier_module_my_tool:mount"

For complete module development guide:
→ https://github.com/microsoft/amplifier
Module Contracts (Entry Point for Developers):
- Contracts Index - Start here for module development
- Provider Contract - LLM backend protocol
- Tool Contract - Agent capability protocol
- Hook Contract - Observability protocol
- Orchestrator Contract - Execution loop protocol
- Context Contract - Memory manager protocol
Specifications (Detailed Design):
- Mount Plan Specification - Configuration format
- Provider Specification - LLM provider details
- Contribution Channels - Module contribution protocol
Detailed Guides:
- Hooks API - Complete hook system reference
- Session Forking - Child sessions for delegation
- Module Source Protocol - Custom module loading
Philosophy:
- Design Philosophy - Kernel principles and patterns
cd amplifier-core
uv run pytest
uv run pytest --cov

Note
This project is not currently accepting external contributions, but we're actively working toward opening this up. We value community input and look forward to collaborating in the future. For now, feel free to fork and experiment!
Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit Contributor License Agreements.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos are subject to those third-party's policies.