TrendAI is a news platform that uses semantic search and user behavior analysis to deliver relevant content. The app provides three main features: recommendations based on reading history, semantic search across articles and videos, and an AI chatbot that answers questions about the news.
The backend runs on FastAPI and uses two databases:
- Milvus: Vector database for semantic search and content embeddings
- Supabase: PostgreSQL database for user interactions and metadata
Articles and videos are embedded using the BAAI/bge-m3 multilingual model, creating 1024-dimensional vectors that capture semantic meaning in French and Arabic.
The system builds a user profile by analyzing their reading behavior stored in Supabase (views, bookmarks, interaction scores). For each article they've engaged with, the app fetches the corresponding embedding vector from Milvus. These vectors are averaged with higher weights for bookmarked content, creating a "user preference vector".
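The weighted averaging described above can be sketched as follows (the function name, the `bookmark_weight` value, and the argument layout are illustrative, not the app's actual code):

```python
import numpy as np

def build_preference_vector(embeddings, bookmarked, bookmark_weight=2.0):
    """Weighted average of article embeddings; bookmarked items count more.

    embeddings: list of 1024-dim vectors fetched from Milvus
    bookmarked: parallel list of booleans derived from Supabase interactions
    """
    weights = np.array([bookmark_weight if b else 1.0 for b in bookmarked])
    vecs = np.asarray(embeddings, dtype=np.float32)
    profile = (weights[:, None] * vecs).sum(axis=0) / weights.sum()
    # Normalize so the result is directly usable for cosine-similarity search
    return profile / np.linalg.norm(profile)
```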
When requesting recommendations, the system searches Milvus using this vector to find semantically similar articles. It filters out content the user has already seen and returns the top matches. If no user history exists, it falls back to keyword extraction from their most recent interactions.
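A minimal sketch of that recommendation query, assuming a pymilvus `MilvusClient`, a collection named `articles`, and an integer `article_id` field (all names are assumptions for illustration):

```python
def seen_filter(seen_ids):
    """Build a Milvus boolean expression excluding already-seen articles."""
    if not seen_ids:
        return ""
    return "article_id not in [{}]".format(", ".join(str(i) for i in seen_ids))

def recommend(client, profile_vector, seen_ids, top_k=10):
    """client is assumed to be a pymilvus MilvusClient instance."""
    # Vector search against the articles collection, skipping anything
    # the user has already viewed or bookmarked.
    return client.search(
        collection_name="articles",            # assumed collection name
        data=[profile_vector],
        filter=seen_filter(seen_ids),
        limit=top_k,
        output_fields=["article_id", "title"],  # assumed field names
    )
```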
Search queries are converted to embeddings and compared against the article database using cosine similarity. The system classifies search intent (video content, match results, schedules, general news) and applies filters accordingly. For example, queries mentioning "video" automatically filter to video content, and year-specific queries narrow results to that timeframe.
Results are ranked by relevance score and limited to the most similar content.
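The intent classification step could look roughly like this keyword-based sketch (the trigger words and returned filter keys are illustrative; the real classifier may differ):

```python
import re

def classify_intent(query: str) -> dict:
    """Toy keyword-based intent/filter extraction for search queries."""
    q = query.lower()
    filters = {}
    # Queries mentioning "video" (or French "vidéo") restrict to video content
    if "video" in q or "vidéo" in q:
        filters["content_type"] = "video"
    # Year-specific queries narrow results to that timeframe
    year = re.search(r"\b(19|20)\d{2}\b", q)
    if year:
        filters["year"] = int(year.group())
    if any(w in q for w in ("score", "résultat", "match")):
        filters["intent"] = "match_results"
    else:
        filters["intent"] = "general_news"
    return filters
```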
The chatbot combines semantic search with Llama 3.3 70B. When a user asks a question:
- The question is embedded and used to search Milvus for relevant articles
- Results with similarity above 0.5 are selected
- Article titles and descriptions are combined into a context prompt
- The LLM generates a response based on this context
- The response is returned along with source articles
The chat keeps the last 4 turns of conversation history to maintain context across multiple messages.
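The retrieval-and-prompt steps above can be sketched as follows (the prompt wording and helper names are illustrative; only the 0.5 threshold and the 4-turn history window come from the description):

```python
SIM_THRESHOLD = 0.5   # similarity cutoff from the pipeline above
MAX_TURNS = 4         # turns of conversation history kept

def build_prompt(question, hits, history):
    """Assemble the LLM context prompt from search hits and recent history.

    hits: list of (similarity, title, description) tuples from Milvus
    history: list of (user_msg, assistant_msg) tuples, oldest first
    """
    # Keep only sufficiently similar articles as sources
    sources = [(t, d) for score, t, d in hits if score > SIM_THRESHOLD]
    context = "\n".join(f"- {t}: {d}" for t, d in sources)
    # Truncate history to the last MAX_TURNS turns
    convo = "\n".join(f"User: {u}\nAssistant: {a}" for u, a in history[-MAX_TURNS:])
    prompt = (
        "Answer the question using only the articles below.\n"
        f"Articles:\n{context}\n\n"
        f"Conversation so far:\n{convo}\n\n"
        f"User: {question}"
    )
    return prompt, sources
```

The returned `sources` list is what lets the API attach source articles to each chatbot response.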
Prerequisites:
- Python 3.10+
- Flutter SDK
- Dart SDK
Backend setup:
- Clone the backend repository:
```bash
git clone https://github.com/TrendAiLab/TrendAi-FastApi.git
cd trendai-fastapi
```
- Create a virtual environment:
```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
```
- Install dependencies:
```bash
pip install -r requirements.txt
```
- Configure environment variables:
```bash
cp .env.example .env
# Edit .env with your credentials
```
Required environment variables:
- MILVUS_URI: Your Milvus cloud instance URI
- MILVUS_TOKEN: Your Milvus API token
- SUPABASE_URL: Your Supabase project URL
- SUPABASE_ANON_KEY: Your Supabase anonymous key
- GROQ_API_KEY: Your Groq API key
- Run the application:
```bash
python main.py
```
The API will be available at http://localhost:8088
Frontend setup:
- Clone the frontend repository:
```bash
git clone https://github.com/TrendAiLab/TrendAi-flutter.git
cd trendai-flutter
```
- Install dependencies:
```bash
flutter pub get
```
- Configure the API endpoint in your Flutter app to point to the backend API.
- Run the application:
```bash
flutter run
```
API endpoints:
- `GET /health` - Health check
- `GET /api/v1/articles` - Get articles
- `POST /api/v1/search` - Semantic search
- `GET /api/v1/recommendations/{user_id}` - Personalized recommendations
- `POST /api/v1/chat` - Chat with AI assistant
Backend:
- FastAPI for the API layer
- SentenceTransformers for embeddings
- Milvus for vector search
- Supabase for relational data
Frontend:
- Flutter for cross-platform mobile development
- Dart programming language