This workflow is a data integration and automation pipeline built in n8n. It mainly connects to external services to fetch, store, process, and report information. Every HTTP Request node can be pointed at any external data source, making the flow adaptable to different APIs and data feeds.
- Triggers from Telegram or a schedule
- Collects data from multiple external HTTP endpoints
- Transforms data with JavaScript steps
- Analyzes and summarizes data using AI models (OpenAI GPT and OpenRouter)
- Reports progress via Telegram messages
- Persists results across Google Sheets (append/update)
- Uses staged messaging and batching to manage flow and updates
- Built with error handling (failing nodes continue on error instead of stopping the flow)
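The JavaScript transform steps mentioned above run in n8n Code nodes. As a hedged sketch (the actual field names in the workflow are not shown, so `title`, `value`, and `fetchedAt` below are illustrative assumptions), a typical per-item normalization looks like this; inside n8n the items would come from `$input.all()`, but the same logic runs standalone in Node.js:

```javascript
// Sketch of an n8n Code-node transform, written as a standalone function.
// In n8n, `items` comes from $input.all(); the field names here
// (title, value, fetchedAt) are illustrative, not taken from the workflow.
function transformItems(items) {
  return items.map((item) => ({
    json: {
      title: String(item.json.title || '').trim(), // normalize text fields
      value: Number(item.json.value) || 0,         // coerce numerics safely
      fetchedAt: new Date().toISOString(),         // stamp each record
    },
  }));
}

// Standalone usage with mock items shaped like n8n's { json: ... } wrapper:
const mock = [{ json: { title: '  Widget ', value: '42' } }];
console.log(transformItems(mock)[0].json);
```

In a real Code node, the function body would end with `return transformItems($input.all());` so downstream nodes receive the cleaned items.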
- Triggers: Telegram Trigger, Schedule Trigger
- Data flow: Merge, If, Edit Fields / Set, multiple HTTP Request nodes, Wait, Split In Batches, Code (JavaScript)
- AI path: AI Agent with OpenAI Chat Model and OpenRouter Chat Model attached as language models
- Output: Telegram messages, Google Sheets (Sheet1–Sheet9)
- Documentation aids: Sticky Notes
- Error handling: onError: continueErrorOutput
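The onError setting above lives on individual nodes in the exported workflow JSON. A minimal, hypothetical HTTP Request node entry (the name and URL are placeholders, not values from this workflow) would look like:

```json
{
  "name": "HTTP Request8",
  "type": "n8n-nodes-base.httpRequest",
  "onError": "continueErrorOutput",
  "parameters": {
    "url": "https://example.com/api/data"
  }
}
```

With continueErrorOutput, a failed request routes its item to the node's error output rather than halting the execution, which is what keeps the parallel fetch branches resilient.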
- Telegram Trigger or Schedule Trigger → Merge
- Branch via If → parallel HTTP Request nodes (8–15), each paired with Edit Fields and Telegram status messages, then Merge2
- Optional Wait, then per-item processing with Code (JavaScript) nodes
- Looping / batching with Loop Over Items
- AI path: AI Agent + LangChain models → final messages
- Data storage: Google Sheets (Sheet1–Sheet9)
- Final aggregation: Merge3 → final AI-driven message and summary via Telegram
- Each HTTP Request can target any external API; endpoints are configurable
- HTTP Requests continue on error to keep the flow resilient
- Outputs appear as Telegram messages during flow and as Google Sheets rows
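The Split In Batches / Wait pattern in the flow above can be sketched in plain Node.js. The batch size and delay below are illustrative assumptions, since the workflow's actual node settings are not shown:

```javascript
// Sketch of the Split In Batches + Wait pattern: process items in chunks,
// pausing between chunks. Batch size and delay are illustrative assumptions.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function processInBatches(items, batchSize, delayMs, handler) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Run the current batch in parallel, preserving item order.
    results.push(...(await Promise.all(batch.map(handler))));
    // Equivalent of the Wait node between batches (skipped after the last one).
    if (i + batchSize < items.length) await sleep(delayMs);
  }
  return results;
}
```

In the real workflow this control flow is handled by the Loop Over Items (Split In Batches) and Wait nodes; the sketch only mirrors the pattern, e.g. `processInBatches(items, 5, 1000, sendToApi)`.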
- Import the workflow JSON into n8n
- Configure all credentials:
  - Telegram Bot credentials
  - Google Sheets OAuth2
  - LangChain/OpenAI model (e.g., a GPT chat model)
  - OpenRouter credentials
- Fill in documentId and sheetName for Sheets 1–9
- Enable triggers (Telegram and/or Schedule)
- Run a test to verify Telegram updates and Google Sheets writes
- Securely store credentials; avoid exposing tokens
- Monitor OpenAI/OpenRouter usage and costs
- Validate endpoints and data formats periodically