Local LLM Inference and UI - utils repo

This repository contains utilities and examples for local LLM (Large Language Model) inference using various interfaces and frameworks. It uses the Ollama API for LLM interaction and provides multiple front ends: a CLI, a Gradio UI, and a Streamlit UI. Langchain-based inference examples are also included.
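
Under the hood, every interface boils down to a call against the local Ollama server. As a rough illustration of that pattern (a sketch, not the repository's own utility code), a one-shot request to the Ollama REST API might look like this, assuming Ollama is running on its default port and a model such as llama3 has already been pulled:

```python
import requests

# Hypothetical one-shot call to a local Ollama server (default port 11434).
# Assumes the "llama3" model has already been pulled (`ollama pull llama3`).
OLLAMA_URL = "http://localhost:11434/api/generate"

def generate(prompt: str, model: str = "llama3") -> str:
    """Return the full (non-streamed) completion for a single prompt."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    resp = requests.post(OLLAMA_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]

print(generate("Summarize what local LLM inference means in one sentence."))
```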

Table of Contents

  • Ollama API Inference

    • CLI
    • Gradio-UI
    • Streamlit-UI
  • Langchain Inference

    • Initial testing
    • CLI
    • Gradio-UI
    • Streamlit-UI
    • Open-WebUI
  • Extras

    • HuggingFace API
    • Anthropic API

Installation

  1. Clone the repository.

  2. Install dependencies: create and activate a virtual environment, then install the required packages with pip.

  3. Run the setup by following the instructions provided in the repository files.

  4. Explore the samples provided in starter.ipynb. A quick sanity check for the local Ollama server is sketched below.
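
After installation, it can help to confirm that the local Ollama server is reachable and see which models have been pulled. This is only a sketch (not part of the repository) and assumes Ollama is running on its default port:

```python
import requests

# Post-install sanity check: list the models available on the local
# Ollama server. Assumes Ollama is installed and running on port 11434.
try:
    resp = requests.get("http://localhost:11434/api/tags", timeout=5)
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is running. Available models:", models or "none pulled yet")
except requests.ConnectionError:
    print("Could not reach Ollama on localhost:11434 - is the server running?")
```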

Usage

  • Follow the code and instructions in the sections of the starter.ipynb file to run the examples.

  • The modular functions in locallm can be used directly as utilities, and the files in the ui directory can be run directly for UI-based inference (see the sketch after this list).

  • Refer to the YouTube Demo: coming soon
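
For reference, a UI script wired to Ollama typically follows a pattern like the minimal Gradio sketch below. This is an illustration of the approach rather than the repository's actual ui code, and it assumes a running Ollama server with a pulled llama3 model:

```python
import gradio as gr
import requests

def respond(message, history):
    """Send the latest user message to the local Ollama server and return its reply."""
    payload = {"model": "llama3", "prompt": message, "stream": False}
    resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]

# gr.ChatInterface handles the chat history and input box.
demo = gr.ChatInterface(fn=respond, title="Local LLM chat (Ollama)")
demo.launch()
```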

Tech Stack

  • Ollama API: An API for running large language models locally.
  • Langchain: A framework for building applications with LLMs.
  • Gradio: A Python library for creating web UIs for machine learning models.
  • Streamlit: A framework for building interactive data applications.
  • HuggingFace API: Provides access to a wide variety of transformer-based models.
  • Anthropic API: Access to models like Claude from Anthropic.
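
To show how these pieces fit together, here is a minimal Langchain-plus-Ollama sketch. It assumes the langchain-community integration package is installed; the repository's own Langchain examples live in starter.ipynb and may differ:

```python
from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate

# Wrap the local Ollama model behind Langchain's LLM interface.
llm = Ollama(model="llama3")

# A tiny prompt-template -> model chain using the LCEL pipe syntax.
prompt = PromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | llm

print(chain.invoke({"question": "What are the benefits of running LLMs locally?"}))
```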

Contributing

Contributions are welcome! If you'd like to contribute, please fork the repository and submit a pull request.

  1. Fork the repository.
  2. Create a new branch for your changes.
  3. Make your changes and test them.
  4. Commit your changes.
  5. Open a pull request with a detailed description of what you have done.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgements

  • Thanks to the authors of Ollama and Langchain for their amazing libraries.
  • This project uses several open-source libraries that make LLM interaction and UI management seamless.

Note: This is a work-in-progress repository. Features and documentation will be updated regularly.
