End-to-end MLOps system for forecasting daily and monthly meal expenses using historical meal and cost data.
This project implements a complete MLOps pipeline:
- Data preprocessing & feature engineering
- Model training & experiment logging (MLflow)
- Model versioning via Model Registry
- FastAPI-based model serving
- Automatic model updates by pulling the Production version
- Optional containerized deployment (Docker/Render/Railway)
┌────────────────────────┐
│  train_with_mlflow.py  │
│   (Training Script)    │
└───────────┬────────────┘
            │  Logs Metrics + Artifacts
            ▼
🧠 MLflow @ DagsHub
- Experiments Dashboard
- Metrics / Parameters
- Model Versions
- Model Registry (Production / Staging)
            │
            ▼
┌────────────────────────┐
│    FastAPI Service     │
│ Loads PRODUCTION Model │
└───────────┬────────────┘
            │
            ▼
✅ Returned Predictions (API)
meal-forecast-mlops/
│
├── app/
│   ├── main.py               # FastAPI application (serves model)
│   ├── schemas.py            # Response models
│   └── plotting.py           # (Optional) Forecast graph utilities
│
├── data/
│   └── expenses.csv          # Example dataset
│
├── mlops/
│   └── train_with_mlflow.py  # Train + log + register model
│
├── infra/mlflow/
│   ├── Dockerfile            # Deploy the API
│   └── render-start.sh       # Server start script
│
├── requirements.txt
├── .env.example
└── README.md
git clone https://github.com/onlynayan/meal-forecast-mlops
cd meal-forecast-mlops
python -m venv .venv
.venv\Scripts\activate       # Windows (cmd/PowerShell)
# or
source .venv/bin/activate    # macOS/Linux
pip install -r requirements.txt
Create a `.env` file based on `.env.example`:
MLFLOW_TRACKING_URI=https://dagshub.com/<username>/<repo>.mlflow
MLFLOW_TRACKING_USERNAME=<your-username>
MLFLOW_TRACKING_PASSWORD=<mlflow-token>
MODEL_NAME=meal_forecast_model
DATA_CSV=./data/expenses.csv
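MLflow reads `MLFLOW_TRACKING_URI`, `MLFLOW_TRACKING_USERNAME`, and `MLFLOW_TRACKING_PASSWORD` directly from the environment, so a quick sanity check before training turns a cryptic 401 from DagsHub into a clear error. A minimal sketch (`missing_env_vars` is a hypothetical helper, not part of this repo):

```python
import os

# The variables listed in .env.example above.
REQUIRED_VARS = [
    "MLFLOW_TRACKING_URI",
    "MLFLOW_TRACKING_USERNAME",
    "MLFLOW_TRACKING_PASSWORD",
    "MODEL_NAME",
    "DATA_CSV",
]

def missing_env_vars(environ=os.environ):
    """Return the names of required settings that are unset or empty."""
    return [name for name in REQUIRED_VARS if not environ.get(name)]
```

Calling `missing_env_vars()` at the top of the training script and failing fast on a non-empty result keeps configuration errors out of the MLflow logs.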
python mlops/train_with_mlflow.py
→ Visit the MLflow UI to view the run's metrics and artifacts
→ Promote the model to Production in the Model Registry
https://dagshub.com/<username>/<repo>.mlflow
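`mlops/train_with_mlflow.py` is not reproduced here; the sketch below shows the standard MLflow train → log → register flow it likely follows. The `date`/`cost` column names, the `date_features` helper, and the `LinearRegression` choice are all assumptions for illustration, not the repo's actual code:

```python
import os
from datetime import date

def date_features(d: date) -> dict:
    """Simple calendar features; the real script may engineer more."""
    return {"year": d.year, "month": d.month, "day": d.day, "dayofweek": d.weekday()}

def train_and_register(data_csv: str, model_name: str):
    # Heavy imports are local so the helper above stays importable on its own.
    import mlflow
    import mlflow.sklearn
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # MLflow picks up MLFLOW_TRACKING_URI / USERNAME / PASSWORD from the
    # environment, so no explicit set_tracking_uri() call is needed here.
    df = pd.read_csv(data_csv, parse_dates=["date"])  # column names assumed
    X = pd.DataFrame([date_features(d.date()) for d in df["date"]])
    y = df["cost"]

    with mlflow.start_run():
        model = LinearRegression().fit(X, y)
        mae = float((model.predict(X) - y).abs().mean())
        mlflow.log_param("model_type", "LinearRegression")
        mlflow.log_metric("mae_train", mae)
        # registered_model_name both logs the artifact and creates/updates
        # the corresponding entry in the Model Registry.
        mlflow.sklearn.log_model(model, "model", registered_model_name=model_name)
```

Passing `registered_model_name` is what makes each training run appear as a new version in the registry, ready to be promoted to Production.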
uvicorn app.main:app --reload
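`app/main.py` loads whatever registered version currently holds the Production stage. A minimal sketch of that pattern, assuming `mlflow.pyfunc` and FastAPI (`create_app` and the `/health` handler here are illustrative, not the repo's actual code):

```python
import os

def production_model_uri(name: str) -> str:
    """models:/<name>/Production resolves to whichever registered
    version currently holds the Production stage."""
    return f"models:/{name}/Production"

def create_app():
    # Local imports keep this sketch readable without FastAPI/MLflow installed.
    import mlflow.pyfunc
    from fastapi import FastAPI

    app = FastAPI()
    model = mlflow.pyfunc.load_model(
        production_model_uri(os.environ.get("MODEL_NAME", "meal_forecast_model"))
    )

    @app.get("/health")
    def health():
        return {"status": "ok", "model_loaded": model is not None}

    return app
```

Because the service only knows the registry URI, not a specific version number, swapping models is purely a registry operation.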
| Method | Endpoint | Description |
|---|---|---|
| GET | `/` | API running status |
| GET | `/health` | Check model & dataset load |
| GET | `/predict/month?year=2025&month=2` | Forecast the full month |
| GET | `/plots/forecast?year=2025&month=2` | Download the forecast graph |
Example request:
GET http://127.0.0.1:8000/predict/month?year=2025&month=2
docker build -t meal-forecast-api -f infra/mlflow/Dockerfile .
docker run -p 8000:8000 meal-forecast-api
When you promote a new model version to Production, the FastAPI service automatically starts serving it, with no redeploy needed.
This is real MLOps.
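The "no redeploy" behavior works because the `models:/<name>/Production` URI is resolved at load time, so reloading the model picks up a newly promoted version. The repo's exact refresh mechanism isn't shown here; one common pattern is a TTL cache around `mlflow.pyfunc.load_model` (`ModelCache` below is a hypothetical helper):

```python
import time

class ModelCache:
    """Reload the Production model at most every `ttl` seconds, so a newly
    promoted version is picked up without restarting the API."""

    def __init__(self, model_uri: str, loader, ttl: float = 300.0):
        self.model_uri = model_uri
        self.loader = loader  # e.g. mlflow.pyfunc.load_model
        self.ttl = ttl
        self._model = None
        self._loaded_at = 0.0

    def get(self, now=None):
        """Return the cached model, reloading it once the TTL has expired."""
        now = time.monotonic() if now is None else now
        if self._model is None or now - self._loaded_at >= self.ttl:
            self._model = self.loader(self.model_uri)
            self._loaded_at = now
        return self._model
```

Each endpoint would call `cache.get()` instead of holding a module-level model, trading a small staleness window (`ttl`) for zero-downtime model swaps.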
Nayan Das
GitHub: https://github.com/onlynayan