A simple and clean Streamlit-based UI for interacting with locally running Ollama LLM models.
This app makes it easy to send prompts, view responses, and experiment with your installed models through a friendly web interface.
- Minimal and intuitive Streamlit UI
- Interact with local Ollama models
- Dynamic text generation output
- Easy to extend and customize
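Under the hood, a Streamlit front end for Ollama typically talks to the local Ollama server over its HTTP API (`POST /api/generate` on the default port 11434). The README does not show `app.py` itself, so the following is only a minimal sketch of that interaction, assuming the default endpoint and a hypothetical model name; the actual app may differ:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumption: default install, no custom port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload Ollama's /api/generate endpoint expects."""
    # stream=False asks Ollama for a single complete response object
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the generated text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns one JSON object whose
        # "response" field holds the full generated text
        return json.loads(resp.read())["response"]
```

In the Streamlit app this would be wired to widgets such as `st.text_input` for the prompt and `st.write` for the model's reply.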
Before running the application, ensure you have:
- Ollama installed and running locally (https://ollama.com/download)
- Python 3.10+
```bash
git clone https://github.com/Aliyan-12/ollama_streamlit_app.git
cd ollama_streamlit_app

python -m venv venv

# Windows
venv\Scripts\activate
# macOS / Linux
source venv/bin/activate

pip install -r requirements.txt
streamlit run app.py
```