Get intelligent responses from 6 different AI models simultaneously! Compare perspectives from DeepSeek, OpenAI GPT, Google Gemini, Meta Llama, Alibaba Qwen, and Moonshot Kimi K2 - all in one unified interface.
Features • Demo • Installation • Usage • Architecture • Contributing
AIVerse is a multi-agent AI platform that leverages the power of 6 different state-of-the-art language models to provide diverse, comprehensive answers to your questions. Because all agents run in parallel, you get multiple AI perspectives in seconds.
- 🔄 Multiple Perspectives: Compare responses from 6 different AI models
- ⚡ Parallel Processing: All agents run simultaneously using ThreadPoolExecutor
- 🛠️ Tool-Enabled Agents: Each agent has access to weather, currency conversion, web search, and calculator tools
- 💻 Dual Interface: Choose between beautiful Streamlit web UI or CLI
- 📥 Export Responses: Download any agent's response in Markdown format
- 🎨 Modern UI: Clean, gradient-styled interface with real-time progress tracking
Each AI agent has access to these tools:
- 🌤️ Weather Information: Get current weather for any location
- 💱 Currency Conversion: USD to INR conversion
- 🔍 Web Search: Real-time information retrieval via Google Serper API
- 🧮 Calculator: Mathematical calculations
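As an illustrative sketch (not the project's actual `tools.py` implementation), a calculator tool can be exposed to an agent as a plain function; here a safe AST-based evaluator stands in for `eval`:

```python
import ast
import operator

# Supported binary operators for safe arithmetic evaluation
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
}

def calculator(expression: str) -> float:
    """Hypothetical tool body: safely evaluate a basic arithmetic expression."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and isinstance(node.op, ast.USub):
            return -_eval(node.operand)
        raise ValueError(f"Unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval"))

print(calculator("100 * 83.5"))  # 8350.0
```

A function like this can then be registered with the agent framework alongside the weather, currency, and search tools.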
```bash
# Web UI
streamlit run app.py

# CLI
python main.py
```

```
Enter your query: What's the weather in New York and convert 100 USD to INR?

DeepSeek Agent Response:
The current temperature in New York is 5°C with partly cloudy conditions.
100 USD is equal to 8,350 INR.

OpenAI Agent Response:
New York weather: 5°C, partly cloudy conditions.
Currency: $100 = ₹8,350

[... responses from other agents ...]
```
| Category | Technology |
|---|---|
| Language | Python 3.11.9 |
| Web Framework | Streamlit |
| AI Framework | LangChain |
| AI Providers | Ollama, Groq, Google Gemini |
| Concurrency | ThreadPoolExecutor |
| APIs | WeatherAPI, CurrencyAPI, Google Serper API |
| Environment | python-dotenv |
- Python 3.11.9 or higher
- pip package manager
- Active internet connection
- API keys (see Configuration section)
```bash
git clone https://github.com/yourusername/aiverse.git
cd aiverse
```

```bash
# Windows
python -m venv venv
venv\Scripts\activate

# macOS/Linux
python3 -m venv venv
source venv/bin/activate
```

```bash
pip install -r requirements.txt
```

Download and install Ollama from ollama.ai, then pull the required models:

```bash
ollama pull deepseek-v3.1:671b-cloud
ollama pull gpt-oss:120b-cloud
```

Create a `.env` file in the root directory:
```env
# Weather API
WEATHER_API_KEY=your_weatherapi_key_here

# Currency API
CURRENCY_API_KEY=your_currencyapi_key_here

# Google Serper API (for web search)
SERPER_API_KEY=your_serper_api_key_here

# Groq API (for Llama, Qwen, Kimi models)
GROQ_API_KEY=your_groq_api_key_here

# Google AI API (for Gemini)
GOOGLE_API_KEY=your_google_api_key_here
```

| API | Get Key From | Free Tier |
|---|---|---|
| WeatherAPI | weatherapi.com | ✅ Yes |
| CurrencyAPI | currencyapi.com | ✅ Yes |
| Google Serper | serper.dev | ✅ Limited |
| Groq | console.groq.com | ✅ Yes |
| Google AI | ai.google.dev | ✅ Yes |
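Since python-dotenv is listed in the tech stack, these keys are presumably loaded once at startup via `load_dotenv()`. For illustration only, the parsing that a `.env` loader performs amounts to roughly the following (a minimal sketch; python-dotenv handles quoting, exports, and other edge cases):

```python
import os
import pathlib
import tempfile

def load_env(path: str = ".env") -> dict:
    """Minimal illustrative .env parser: KEY=VALUE lines, '#' comments."""
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue  # skip blanks, comments, malformed lines
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
            # Mirror into the process environment without clobbering
            os.environ.setdefault(key.strip(), value.strip())
    return values

# Demo on a throwaway file (placeholder values, not real keys)
sample = "# Weather API\nWEATHER_API_KEY=abc123\n\nGROQ_API_KEY=xyz789\n"
tmp = pathlib.Path(tempfile.mkdtemp()) / ".env"
tmp.write_text(sample)
keys = load_env(str(tmp))
print(keys["WEATHER_API_KEY"])  # abc123
```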
```bash
streamlit run app.py
```

- Open your browser at http://localhost:8501
- Enter your question in the input field
- Click SUBMIT
- Watch as all 6 agents process your query in parallel
- View responses in individual tabs
- Download any response using the download buttons
```bash
python main.py
```

All 6 agents run in parallel and display results as they complete.
```bash
python sequential_main.py
```

Agents run one after another (slower but more predictable).
```
AiVERSE_2.0/
│
├── app.py                # Streamlit web application
├── main.py               # CLI with parallel execution
├── sequential_main.py    # CLI with sequential execution
├── agents.py             # Agent configuration and initialization
├── tools.py              # LangChain tool definitions
├── google_client.py      # Google Gemini client wrapper
│
├── .env                  # Environment variables (create this)
├── .gitignore            # Git ignore file
├── requirements.txt      # Python dependencies
├── README.md             # This file
│
└── __pycache__/          # Python cache files
```
```
User Query
    ↓
ThreadPoolExecutor (6 parallel threads)
    ├── DeepSeek Agent (Ollama) ──→ Tools (Weather, Currency, Search, Calculator)
    ├── OpenAI Agent (Ollama) ────→ Tools (Weather, Currency, Search, Calculator)
    ├── Qwen Agent (Groq) ────────→ Tools (Weather, Currency, Search, Calculator)
    ├── Llama Agent (Groq) ───────→ Tools (Weather, Currency, Search, Calculator)
    ├── Kimi Agent (Groq) ────────→ Tools (Weather, Currency, Search, Calculator)
    └── Gemini Agent (Google) ────→ Direct API call
    ↓
Aggregate Responses
    ↓
Display in UI/CLI
```
Each LangChain agent follows this pattern:
```python
create_agent(
    model="provider:model_name",
    system_prompt="You are a helpful assistant...",
    tools=[weather, currency, web_search, calculator]
)
```

Using `concurrent.futures.ThreadPoolExecutor`:
- All 6 agents execute simultaneously
- Responses collected as they complete
- Progress tracked in real-time
- Error handling per agent (failures don't block others)
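The fan-out described above can be sketched as follows. Agent names are taken from this README; `run_agent` is a hypothetical stand-in for the real agent invocation (one agent deliberately raises to show per-agent error isolation):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

AGENTS = ["DeepSeek", "OpenAI", "Qwen", "Llama", "Kimi", "Gemini"]

def run_agent(name: str, query: str) -> str:
    """Hypothetical stand-in for invoking one LangChain agent."""
    if name == "Kimi":
        raise TimeoutError("simulated provider timeout")
    return f"{name}: answer to {query!r}"

def fan_out(query: str) -> dict:
    results = {}
    # One worker per agent, so all six run simultaneously
    with ThreadPoolExecutor(max_workers=len(AGENTS)) as pool:
        futures = {pool.submit(run_agent, a, query): a for a in AGENTS}
        for fut in as_completed(futures):  # collect responses as they finish
            name = futures[fut]
            try:
                results[name] = fut.result()
            except Exception as exc:  # one failure doesn't block the others
                results[name] = f"[error] {exc}"
    return results

responses = fan_out("weather in New York?")
print(len(responses))  # 6
```

Collecting with `as_completed` is what lets the UI show each agent's answer the moment it arrives, rather than waiting for the slowest model.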
- Models: DeepSeek V3.1, OpenAI GPT-OSS
- Setup: Install Ollama desktop app
- Endpoint: http://localhost:11434
- Models: Llama 3.3, Qwen 3, Kimi K2
- Speed: Ultra-fast inference
- Free Tier: Generous limits
- Model: Gemma 3 27B IT
- Direct API: Google AI Studio
- Provider: WeatherAPI.com
- Endpoint: api.weatherapi.com/v1/current.json
- Provider: CurrencyAPI.com
- Endpoint: Real-time USD to INR rates
- Provider: Google Serper API
- Feature: Real-time search results
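For illustration, a weather lookup against the endpoint above boils down to a GET request with the API key and location as query parameters (`key` and `q` follow WeatherAPI's `current.json` conventions; this sketch only builds the URL and makes no network call):

```python
from urllib.parse import urlencode

def weather_url(api_key: str, location: str) -> str:
    """Build the current-weather request URL for WeatherAPI's
    current.json endpoint ('key' and 'q' query parameters)."""
    base = "https://api.weatherapi.com/v1/current.json"
    return f"{base}?{urlencode({'key': api_key, 'q': location})}"

print(weather_url("demo_key", "New York"))
```

The currency and search tools follow the same pattern: build an authenticated request to the provider's REST endpoint and extract the relevant fields from the JSON response.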
```bash
pip install -r requirements.txt
```

```bash
# Check Ollama is running
ollama list

# Pull models if missing
ollama pull deepseek-v3.1:671b-cloud
ollama pull gpt-oss:120b-cloud
```

- Verify all keys in the `.env` file
- Check API key validity on provider websites
- Ensure no trailing spaces in keys

```bash
# Specify port manually
streamlit run app.py --server.port 8501
```

```bash
pip install --upgrade langchain langchain-community
```

- Response comparison and summary view
- Token usage and cost tracking
- Response caching for repeated queries
- User authentication and history
- More AI models (Anthropic Claude, Mistral, etc.)
- Custom tool creation interface
- API endpoint for programmatic access
- Docker containerization
- Response quality voting system
- Export all responses to PDF
Contributions are welcome! Here's how you can help:
- Fork the repository
- Create a feature branch: `git checkout -b feature/AmazingFeature`
- Commit your changes: `git commit -m 'Add some AmazingFeature'`
- Push to the branch: `git push origin feature/AmazingFeature`
- Open a Pull Request