Build, deploy and manage AI agents with Huawei Cloud capabilities.
Huawei Cloud AgentArts SDK is a comprehensive toolkit for developing, deploying, and managing AI agents. It provides seamless integration with Huawei Cloud services while supporting mainstream agent frameworks.
- Framework Agnostic - Compatible with LangChain, LangGraph, AutoGen, CrewAI, Google ADK, and any custom agent framework
- One-Click Deployment - Deploy agents to Huawei Cloud with a single command
- Built-in Tools - Code interpreter sandbox, memory management, MCP gateway support
- Cloud Integration - Seamless integration with Huawei Cloud authentication, monitoring, and logging
- CLI Toolkit - Complete command-line tools for project initialization, local development, and cloud deployment
```
agentarts-sdk-python/
├── src/agentarts/
│   ├── sdk/                      # Core SDK modules
│   │   ├── runtime/              # HTTP server runtime (AgentArtsRuntimeApp)
│   │   ├── memory/               # Conversation memory management
│   │   ├── tools/                # Built-in tools (Code Interpreter)
│   │   ├── mcpgateway/           # MCP Gateway client
│   │   ├── identity/             # Authentication & authorization
│   │   ├── integration/          # Framework adapters (LangGraph, etc.)
│   │   └── service/              # HTTP clients for cloud services
│   └── toolkit/                  # CLI toolkit
│       ├── cli/                  # Command-line interface
│       ├── operations/           # CLI operation handlers
│       └── utils/templates/      # Project templates
├── docs/                         # Documentation
│   └── cn/                       # Chinese documentation
│       ├── sdk_user_guide/       # SDK usage guides
│       └── toolkit_user_guide/   # CLI usage guides
└── tests/                        # Test suites
```
The SDK provides `AgentArtsRuntimeApp` to wrap your agent logic as a standard HTTP server, exposing:

- `POST /invocations` - Main agent invocation endpoint
- `GET /ping` - Health check endpoint
- `WS /ws` - WebSocket endpoint for streaming
```python
# agent.py
import os
from typing import Dict, Any, TypedDict, Annotated
from operator import add

from agentarts.sdk import AgentArtsRuntimeApp, RequestContext

app = AgentArtsRuntimeApp()


class State(TypedDict):
    messages: Annotated[list, add]
    query: str
    response: str


class LangGraphAgent:
    def __init__(self):
        self.model_name = os.environ.get("OPENAI_MODEL_NAME", "gpt-4o-mini")
        self._graph = None

    def _build_graph(self):
        from langgraph.graph import StateGraph, END
        from langchain_openai import ChatOpenAI
        from langchain_core.messages import HumanMessage, AIMessage

        llm = ChatOpenAI(
            model=self.model_name,
            api_key=os.environ.get("OPENAI_API_KEY"),
            base_url=os.environ.get("OPENAI_BASE_URL"),
        )

        async def process_node(state: State) -> Dict[str, Any]:
            query = state.get("query", "")
            messages = state.get("messages", []) or [HumanMessage(content=query)]
            response = await llm.ainvoke(messages)
            return {
                "messages": [AIMessage(content=response.content)],
                "response": response.content,
            }

        workflow = StateGraph(State)
        workflow.add_node("process", process_node)
        workflow.set_entry_point("process")
        workflow.add_edge("process", END)
        return workflow.compile()

    async def run(self, query: str) -> Dict[str, Any]:
        # Lazily build and cache the compiled graph on first use
        if self._graph is None:
            self._graph = self._build_graph()
        result = await self._graph.ainvoke({"messages": [], "query": query, "response": ""})
        return {"response": result.get("response", "")}


_agent = LangGraphAgent()


@app.entrypoint
async def handler(payload: Dict[str, Any], context: RequestContext = None) -> Dict[str, Any]:
    query = payload.get("message", "")
    return await _agent.run(query)


if __name__ == "__main__":
    app.run()
```

- Focus on Agent Logic - You only need to implement the agent logic; the SDK handles the HTTP server, request parsing, and response formatting
- Framework Agnostic - Works with any agent framework (LangChain, LangGraph, AutoGen, CrewAI, or custom implementations)
- Simple Decorator - Use `@app.entrypoint` to mark your handler function
- Context Support - The optional `RequestContext` parameter provides session info and request metadata
- Configurable Model - The model name can be configured via an environment variable (e.g., `OPENAI_MODEL_NAME`)
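The request/response contract above is plain JSON dictionaries. As a minimal, SDK-free sketch of what an entrypoint handler receives and returns (the echo logic is hypothetical; only the `message`/`response` keys follow the example above):

```python
from typing import Any, Dict


def handle(payload: Dict[str, Any]) -> Dict[str, Any]:
    """Mirror of the handler contract: {"message": ...} in, {"response": ...} out."""
    query = payload.get("message", "")
    # Hypothetical echo logic standing in for a real agent call
    return {"response": f"Echo: {query}"}
```

Whatever framework sits behind the handler, keeping this thin dictionary boundary makes the agent easy to unit-test without a running server.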
- Python 3.10 or higher
- pip or uv package manager
It is recommended to install the SDK in a virtual environment to avoid dependency conflicts.
Windows:

```powershell
# Create virtual environment
python -m venv venv

# Activate virtual environment
.\venv\Scripts\Activate.ps1

# Or using Command Prompt
.\venv\Scripts\activate.bat
```

Linux/macOS:

```bash
# Create virtual environment
python -m venv venv

# Activate virtual environment
source venv/bin/activate
```

Windows:

```powershell
pip install agentarts-sdk
```

Linux/macOS:

```bash
pip install agentarts-sdk
```

```bash
# With LangChain support
pip install agentarts-sdk[langchain]

# With LangGraph support
pip install agentarts-sdk[langgraph]

# With all optional dependencies
pip install agentarts-sdk[all]
```

Windows:

```powershell
git clone https://github.com/huaweicloud/agentarts-sdk-python.git
cd agentarts-sdk-python

# Create and activate virtual environment
python -m venv venv
.\venv\Scripts\Activate.ps1

# Install in development mode
pip install -e ".[dev]"
```

Linux/macOS:

```bash
git clone https://github.com/huaweicloud/agentarts-sdk-python.git
cd agentarts-sdk-python

# Create and activate virtual environment
python -m venv venv
source venv/bin/activate

# Install in development mode
pip install -e ".[dev]"
```

Set environment variables for Huawei Cloud authentication:

Windows (PowerShell):

```powershell
$env:HUAWEICLOUD_SDK_AK = "your-access-key"
$env:HUAWEICLOUD_SDK_SK = "your-secret-key"
```

Windows (Command Prompt):

```cmd
set HUAWEICLOUD_SDK_AK=your-access-key
set HUAWEICLOUD_SDK_SK=your-secret-key
```

Linux/macOS:

```bash
export HUAWEICLOUD_SDK_AK="your-access-key"
export HUAWEICLOUD_SDK_SK="your-secret-key"
```

Note: Get your AK/SK from Huawei Cloud Console -> My Credentials -> Access Keys.
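A preflight check can fail fast when these variables are missing instead of surfacing an authentication error later. A small sketch (the helper and variable list are ours, not part of the SDK):

```python
import os
from typing import List, Mapping

# Credential variables expected by the SDK, per the export commands above
REQUIRED_VARS = ["HUAWEICLOUD_SDK_AK", "HUAWEICLOUD_SDK_SK"]


def missing_credentials(env: Mapping[str, str] = os.environ) -> List[str]:
    """Return the names of required credential variables that are unset or empty."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

For example, `missing_credentials()` returns an empty list when both variables are exported, so a deploy script can abort early otherwise.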
For complete environment variable configuration, see Environment Variables Guide.
```bash
# Create a new agent project with LangGraph template
agentarts init -n my_agent -t langgraph

# Available templates: basic, langchain, langgraph, google-adk
```

This creates:

```
my_agent/
├── agent.py                 # Agent implementation
├── requirements.txt         # Python dependencies
├── .agentarts_config.yaml   # Project configuration
└── Dockerfile               # Docker build file
```
Edit `.agentarts_config.yaml` to set environment variables:

```yaml
runtime:
  environment_variables:
    - key: OPENAI_API_KEY
      value: "your-openai-api-key"
    - key: OPENAI_MODEL_NAME
      value: "gpt-4o-mini"  # Optional: gpt-4o, gpt-4-turbo, etc.
    - key: OPENAI_BASE_URL
      value: ""  # Optional: custom API endpoint
```

```bash
# Start local development server
agentarts dev

# Server runs at http://127.0.0.1:8080
# Endpoints:
#   POST /invocations - Invoke agent
#   GET /ping - Health check
```

```bash
# Configure region
agentarts config set region cn-southwest-2

# Deploy to cloud
agentarts deploy

# Check deployment status
agentarts status

# Invoke deployed agent
agentarts invoke '{"message": "Hello, AgentArts!"}'

# Destroy deployment
agentarts destroy
```

| Command | Description |
|---|---|
| `agentarts init` | Initialize a new agent project |
| `agentarts dev` | Start local development server |
| `agentarts config` | Configure SDK settings (alias: `configure`) |
| `agentarts deploy` | Deploy agent to Huawei Cloud (alias: `launch`) |
| `agentarts invoke` | Invoke deployed agent |
| `agentarts status` | Check deployment status |
| `agentarts destroy` | Remove deployed agent |
| `agentarts mcp-gateway` | Manage MCP gateways |
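Besides the CLI, the runtime speaks plain HTTP, so the local dev server can be called from any client. A stdlib-only sketch of building the `POST /invocations` request (the base URL matches the local dev default shown above; authentication headers for deployed agents are out of scope here):

```python
import json
import urllib.request


def build_invocation_request(base_url: str, message: str) -> urllib.request.Request:
    """Build a POST /invocations request carrying the {"message": ...} payload."""
    body = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/invocations",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Usage (requires a running server, e.g. `agentarts dev`):
# with urllib.request.urlopen(build_invocation_request("http://127.0.0.1:8080", "Hello")) as resp:
#     print(json.loads(resp.read()))
```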
- Minimum: Python 3.10
- Recommended: Python 3.10 or 3.11
When using optional framework dependencies, ensure the following minimum versions:
| Framework | Minimum Version | Install Command |
|---|---|---|
| LangGraph | 1.0.0 | `pip install agentarts-sdk[langgraph]` |
| LangChain | 0.1.0 | `pip install agentarts-sdk[langchain]` |
| langchain-core | 0.1.0 | Included with langgraph/langchain |
Note: LangGraph 1.0+ introduces a new Checkpoint format with required fields (`step`, `pending_sends`, `parents`). The SDK's integration module is compatible with LangGraph 1.0 and above.
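To confirm which framework versions are actually installed in an environment, the standard library's `importlib.metadata` is enough (the helper name here is ours, not part of the SDK):

```python
from importlib import metadata
from typing import Optional


def package_version(name: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if not installed."""
    try:
        return metadata.version(name)
    except metadata.PackageNotFoundError:
        return None


# Example: inspect the installed LangGraph version against the table above
# (use packaging.version for real comparisons rather than string comparison)
langgraph_version = package_version("langgraph")
```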
Docker is required for:
- Building and deploying agents with `agentarts deploy` (alias: `launch`)
Install Docker: refer to the official Docker installation guide for your platform.
Refer to Huawei Cloud AgentArts Documentation for resource quotas and limits.
- SDK User Guides - Memory, Code Interpreter, MCP Gateway
- CLI User Guides - init, config, deploy, invoke, status, destroy
- Contributing Guide - Development setup and guidelines
- Architecture - System architecture overview
```bash
# Install development dependencies
pip install -e ".[dev]"

# Run tests
pytest

# Code formatting
black . && isort .

# Type checking
mypy agentarts

# Linting
ruff check .
```

This project is licensed under the Apache 2.0 License - see the LICENSE file for details.
Contributions are welcome! Please see CONTRIBUTING.md for details.
- Issues: GitHub Issues
- Documentation: https://docs.huaweicloud.com/agentarts
- Email: agentarts@huawei.com