OpenBee is an around-the-clock digital worker platform, dedicated to making AI Agents your 24/7 always-on assistants.
| 🤖 AI Workers | 💬 Multi-Platform Support | ⏰ Scheduled Tasks |
|---|---|---|
| Each Worker is an AI Agent capable of multi-step task planning and independent execution | Native support for Lark, DingTalk, WeCom, WeChat, Telegram, and Linear — receive tasks and reply where the work starts | Cron-based scheduling for automatic, hands-free triggering |
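As an illustration of the scheduled-task idea, a Worker could be triggered by a standard cron expression. The field names below are purely hypothetical, not OpenBee's documented schema:

```yaml
# Hypothetical sketch of a scheduled Worker task.
# Field names are illustrative only — not OpenBee's documented schema.
workers:
  - name: daily-report
    schedule: "0 9 * * 1-5"   # cron: 09:00 on weekdays
    task: "Summarize yesterday's Linear issues and post the digest to Lark"
```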
OpenBee supports multiple AI engines as the underlying execution backend:
| Engine | Description |
|---|---|
| Claude Code | Anthropic's official agentic coding tool; default and recommended engine |
| Codex | OpenAI's Codex agent, supported via the plugin engine |
| Pi | Pi agent, supported via the plugin engine |
| Kimi | Moonshot AI's Kimi agent, supported via the plugin engine |
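To make the engine table concrete, a config fragment selecting the execution backend might look like the sketch below. The key names are assumptions for illustration; only the engine names themselves come from the table above:

```yaml
# Hypothetical engine selection — key names are illustrative.
agent:
  engine: claude-code   # default and recommended; codex / pi / kimi via the plugin engine
  executable: /usr/local/bin/claude   # the "Agent executable path" asked for by the config wizard
```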
📦 npm (Recommended)
```bash
npm install -g @theopenbee/cli
```
The platform-specific binary is downloaded automatically. Supports Linux / macOS / Windows (amd64 & arm64).
🔧 One-click Script
```bash
curl -fsSL https://raw.githubusercontent.com/theopenbee/openbee/main/install.sh | bash
```
🍺 brew (macOS) / scoop (Windows)
macOS (Homebrew):
```bash
brew install theopenbee/tap/openbee
```
Windows (Scoop):
```bash
scoop bucket add theopenbee https://github.com/theopenbee/scoop-bucket
scoop install theopenbee/openbee
```
📥 Manual Binary Download
Visit GitHub Releases, download the archive for your platform, extract it, and place the openbee executable in your PATH.
```bash
openbee config
```
The wizard will guide you through:
- Agent executable path
- Platform(s) to enable (Lark / DingTalk / WeCom / WeChat / Telegram / Linear) and their credentials
- Advanced options (can be skipped to use defaults)
The config file is written to config.yaml in the current directory by default. Use -o to specify a custom path:
```bash
openbee config -o /path/to/config.yaml
```
Then start the server:
```bash
openbee server -d
```
- Open the Web Console (default http://localhost:8080) to manage Workers and view task status
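To give a sense of what the wizard produces, here is a hedged sketch of a `config.yaml`. Every key name and nesting level is an assumption inferred from the wizard's prompts, not the actual schema; credential values are placeholders:

```yaml
# Illustrative config.yaml sketch — key names are assumptions, not the real schema.
agent:
  executable: /usr/local/bin/claude   # Agent executable path (wizard step 1)
platforms:                            # enable only the platforms you configured
  lark:
    app_id: "cli_xxx"                 # placeholder credential
    app_secret: "xxx"
  telegram:
    bot_token: "xxx"
server:
  port: 8080                          # Web Console default
```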
- Send messages directly in any configured platform (Lark / DingTalk / WeCom / WeChat / Telegram / Linear) to interact with OpenBee
```mermaid
graph TD
    A["💬 Communication Layer\nLark / DingTalk / WeCom / WeChat / Telegram / Linear"] --> B["🧠 Scheduling Layer\nAI Agent"]
    B --> C["🤖 Execution Layer\nAI Agents"]
    C -. "Reply Results" .-> A
    B -. "Reply Results" .-> A
```
OpenBee consists of three core layers:
1. Communication Layer: Lark, DingTalk, WeCom, WeChat, Telegram, and Linear. Users send messages from chat platforms, or create and comment on Linear issues, and receive replies in the same conversation or issue thread.
2. Scheduling Layer (AI Agent): handles task scheduling — it receives messages from the Communication Layer, understands user intent, and dispatches tasks to the Execution Layer. It can also reply with results directly to the Communication Layer.
3. Execution Layer: each Worker is an independent AI Agent equipped with tool invocation (CLI) and multi-step task planning. Workers execute assigned tasks autonomously and reply with results directly to the Communication Layer — just like real workers.
- 🐛 Bug reports / Feature requests → GitHub Issues
- 🤝 Contributing → Please read the Contributing Guide. You must agree to the Contributor License Agreement (CLA) before submitting.