# 🧠 Streamlit App for Ollama LLMs

A simple, clean Streamlit-based UI for interacting with locally running Ollama LLM models.
The app makes it easy to send prompts, view responses, and experiment with your installed models through a friendly web interface.


## 🚀 Features

- Minimal and intuitive Streamlit UI
- Interact with local Ollama models
- Dynamic text generation output
- Easy to extend and customize

## 📦 Prerequisites

Before running the application, ensure you have:

1. Ollama installed and running locally 👉 https://ollama.com/download
2. Python 3.10+
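To confirm the Ollama server is actually up before launching the app, a quick connectivity check like the one below can help. This is a sketch, not part of the repository; it assumes Ollama's default port, 11434.

```python
import socket

def ollama_running(host: str = "localhost", port: int = 11434) -> bool:
    """Return True if something is listening on Ollama's default port."""
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("Ollama reachable:", ollama_running())
```

If this prints `False`, start the server with `ollama serve` (or launch the Ollama desktop app) before running the Streamlit app.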

## 🧠 Installation

Clone the repository:

```bash
git clone https://github.com/Aliyan-12/ollama_streamlit_app.git
cd ollama_streamlit_app
```

Create a virtual environment:

```bash
python -m venv venv
```

Activate it on Windows:

```bash
venv\Scripts\activate
```

or on macOS / Linux:

```bash
source venv/bin/activate
```

Install dependencies:

```bash
pip install -r requirements.txt
```

Run the application:

```bash
streamlit run app.py
```
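For orientation, an app like this typically forwards the user's prompt to Ollama's local HTTP API and streams the response back into the page. The sketch below is not the repository's actual `app.py`; it assumes the default endpoint `http://localhost:11434` and uses only the standard library.

```python
import json
import urllib.request

# Ollama's default generate endpoint (assumed; adjust if you changed the port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str, stream: bool = True) -> dict:
    """Assemble the request body expected by Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def stream_response(model: str, prompt: str):
    """Yield text chunks as Ollama streams them back, one JSON line at a time."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        for line in resp:
            if line.strip():
                yield json.loads(line).get("response", "")
```

In the Streamlit UI, a generator like `stream_response` can be passed straight to `st.write_stream(...)` so tokens appear as they are generated, which is what gives the dynamic text output listed in the features.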
