
Pantry Agent

A local-first, fully offline AI pantry manager that runs on your Mac from the command line.

No cloud. No subscriptions. No data leaving your machine.

Built with Python, SQLite, and a local LLM via Ollama.


How it works

You manage your pantry through direct CLI commands or by talking to an AI agent in plain English. The agent decides which action to take and calls structured tools to read or modify your pantry; it never writes to the database directly.

You: "what's expiring this week?"
Agent → calls list_expiring(days=7) → reads SQLite → replies in plain English

Everything runs locally. Your pantry data lives in data/pantry.db on your machine.
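The flow above boils down to a tool registry the agent dispatches into. A minimal sketch (function names and return values are illustrative, not the repo's actual API):

```python
# Hypothetical stand-in for a tool in src/tools/pantry_tools.py.
def list_expiring(days: int = 7) -> str:
    # The real tool would query data/pantry.db through the db layer.
    return f"items expiring within {days} days"

# The agent layer maps tool names to functions; the model only names a
# tool, and this layer performs the actual call.
TOOLS = {"list_expiring": list_expiring}

def dispatch(tool_name: str, **kwargs) -> str:
    if tool_name not in TOOLS:
        return f"unknown tool: {tool_name}"
    return TOOLS[tool_name](**kwargs)

print(dispatch("list_expiring", days=7))
```

This indirection is what keeps the model away from SQLite: it can only request tools that the registry exposes.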


Requirements

  • macOS (Apple Silicon or Intel)
  • Homebrew
  • Internet connection only for the initial model download (~4 GB, one time)

Setup

1. Install system dependencies

brew install python@3.11 ollama

2. Start the Ollama service

brew services start ollama

3. Pull a language model

ollama pull mistral

mistral (7B, ~4.4 GB) is the default. Any Ollama-compatible model works. For a lighter option try phi3 (~2.3 GB).

4. Clone and install

git clone https://github.com/your-username/local-grocery-agent.git
cd local-grocery-agent

python3.11 -m venv .venv
source .venv/bin/activate

pip install -e .

5. Verify

pantry --help

Commands

pantry add

Add a new item or update an existing one.

pantry add <name> <quantity> [--expiry YYYY-MM-DD]
pantry add eggs "12"
pantry add milk "2L"
pantry add butter "250g" --expiry 2026-04-01
  • name and quantity are required.
  • --expiry / -e is optional. Use ISO date format: YYYY-MM-DD.
  • Adding an item that already exists updates it (upsert).
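The upsert behavior could be implemented with SQLite's `ON CONFLICT` clause. A sketch under assumed table and column names (the repo's actual schema in db/database.py may differ):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (name TEXT PRIMARY KEY, quantity TEXT, expiry TEXT)"
)

def add_item(name, quantity, expiry=None):
    # INSERT ... ON CONFLICT gives "add or update" in a single statement:
    # a new name inserts a row, an existing name updates it in place.
    conn.execute(
        "INSERT INTO items (name, quantity, expiry) VALUES (?, ?, ?) "
        "ON CONFLICT(name) DO UPDATE SET quantity = excluded.quantity, "
        "expiry = COALESCE(excluded.expiry, items.expiry)",
        (name, quantity, expiry),
    )

add_item("eggs", "12")
add_item("eggs", "6")   # same name: updates the existing row, no duplicate
```

The `COALESCE` keeps an existing expiry date when the update omits one, which matches `--expiry` being optional.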

pantry list

Display all items currently in your pantry.

pantry list
         Pantry
┏━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━━━━━┓
┃ Item     ┃ Quantity ┃ Expires    ┃
┡━━━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━━━━━┩
│ butter   │ 250g     │ 2026-04-01 │
│ eggs     │ 12       │ —          │
│ milk     │ 2L       │ —          │
└──────────┴──────────┴────────────┘

pantry expiring

Show items expiring within N days (default: 7).

pantry expiring
pantry expiring --days 14
pantry expiring -d 3

Items are sorted by expiry date. Only items with an expiry date set are shown.
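Both behaviors (sort by expiry, skip items without one) fall out of a single SQL query. A hedged sketch with an assumed schema, exploiting the fact that ISO dates sort lexicographically:

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (name TEXT PRIMARY KEY, quantity TEXT, expiry TEXT)"
)

soon = (date.today() + timedelta(days=3)).isoformat()
later = (date.today() + timedelta(days=30)).isoformat()
conn.executemany("INSERT INTO items VALUES (?, ?, ?)", [
    ("butter", "250g", soon),   # inside the 7-day window
    ("jam", "1 jar", later),    # outside it
    ("eggs", "12", None),       # no expiry: never listed
])

def list_expiring(days=7):
    cutoff = (date.today() + timedelta(days=days)).isoformat()
    return conn.execute(
        "SELECT name, expiry FROM items "
        "WHERE expiry IS NOT NULL AND expiry <= ? "
        "ORDER BY expiry",
        (cutoff,),
    ).fetchall()
```

Storing dates as `YYYY-MM-DD` text means plain string comparison and `ORDER BY` do the right thing without any date parsing in SQL.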


pantry remove

Remove an item from the pantry by name.

pantry remove <name>
pantry remove butter

Name matching is case-insensitive (Milk and milk are the same item).
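Case-insensitive matching can be had for free from SQLite by declaring the name column with `COLLATE NOCASE`; the schema below is an assumption for illustration, not necessarily the repo's:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# COLLATE NOCASE makes every comparison on this column case-insensitive,
# so "Milk" and "milk" resolve to the same row.
conn.execute(
    "CREATE TABLE items (name TEXT COLLATE NOCASE PRIMARY KEY, quantity TEXT)"
)
conn.execute("INSERT INTO items VALUES ('milk', '2L')")

def delete_item(name):
    cur = conn.execute("DELETE FROM items WHERE name = ?", (name,))
    return cur.rowcount  # number of rows removed

removed = delete_item("Milk")  # matches the lowercase 'milk' row
```

Putting the collation on the column (rather than lower-casing in Python) also makes the upsert key case-insensitive, so `add Milk` updates `milk` instead of creating a duplicate.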


pantry suggest

Ask the agent to suggest recipes based on what's currently in your pantry. Requires Ollama to be running.

pantry suggest

pantry chat

Start a conversational session with the AI agent.

pantry chat
pantry chat --model phi3
pantry chat -m llama3

The agent has two modes internally:

Fast path — explicit commands are matched by pattern and executed directly against the tool functions without an LLM round-trip. Instant response, no model needed.

  • list / show all → calls list_items() directly
  • expiring / expiring 3 / expiring in 14 days → calls list_expiring(N) directly
  • remove <name> / delete <name> → calls delete_item(name) directly
  • add eggs 12 / add milk 2L 2026-04-01 → calls add_item(...) directly
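The fast path amounts to a few regex checks before anything is sent to the model. A sketch (the patterns are illustrative, not the repo's exact ones):

```python
import re

def fast_path(text: str):
    """Return (tool_name, kwargs) for explicit commands, or None to fall
    through to the LLM path. Patterns here are illustrative."""
    text = text.strip().lower()
    if text in ("list", "show all"):
        return ("list_items", {})
    m = re.fullmatch(r"expiring(?:\s+in)?\s*(\d+)?(?:\s*days?)?", text)
    if m:
        return ("list_expiring", {"days": int(m.group(1) or 7)})
    m = re.fullmatch(r"(?:remove|delete)\s+(.+)", text)
    if m:
        return ("delete_item", {"name": m.group(1)})
    return None  # natural language: hand off to the model
```

Returning `None` is the handoff signal: anything the patterns don't recognize costs one LLM round-trip, everything else is answered instantly.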

LLM path — natural language and anything not matched by the fast path goes to Ollama. The model decides which tool to call (or answers in plain text).

  • add tomato sauce 2 cans → LLM parses the multi-word name and quantity
  • what's about to expire? → LLM calls list_expiring and summarizes
  • suggest something I can cook → LLM reads the pantry and generates recipes
  • any freeform question → LLM answers conversationally

Example session:

You: list
Agent: Pantry items:
  - eggs: 12 (exp: 2026-04-01)
  - milk: 2L

You: add tomato sauce 2 cans expiring June 2026
Agent: I've added 2 cans of tomato sauce with an expiry date of June 1, 2026.

You: suggest something I can cook
Agent: With eggs, milk, and tomato sauce you could make shakshuka or pasta...

Type exit or press Ctrl-C to quit.


Project structure

local-grocery-agent/
│
├── src/
│   ├── agent/
│   │   └── agent.py        # Agent loop — parses LLM output, dispatches tools
│   │
│   ├── cli/
│   │   └── main.py         # CLI commands (Typer + Rich)
│   │
│   ├── db/
│   │   └── database.py     # SQLite layer — schema, CRUD, all raw queries
│   │
│   ├── llm/
│   │   └── ollama_client.py  # HTTP wrapper for the Ollama API
│   │
│   └── tools/
│       └── pantry_tools.py   # Tool functions + registry + JSON schemas
│
├── data/
│   └── pantry.db           # SQLite database (created on first run, gitignored)
│
├── tests/
│   └── test_database.py    # DB layer smoke tests
│
├── pyproject.toml          # Project metadata and dependencies
├── LICENSE
└── README.md

Layer responsibilities

  • CLI (cli/main.py): parses flags, formats tables, runs the chat loop
  • Agent (agent/agent.py): sends messages to the LLM, detects tool calls, dispatches them
  • Tools (tools/pantry_tools.py): implements each tool, formats results for the agent
  • Database (db/database.py): all SQL; no logic above raw CRUD
  • LLM (llm/ollama_client.py): HTTP to Ollama, nothing else

The LLM never touches the database directly. It calls tools via a JSON protocol; the agent layer executes them.
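One common shape for such a protocol is to have the model emit a single JSON object naming a tool, with plain text falling back to a conversational reply. A hedged sketch of how the agent layer might parse and execute it (the actual wire format in agent.py may differ):

```python
import json

# Hypothetical tool stand-in for this sketch.
def add_item(name, quantity, expiry=None):
    return f"added {quantity} {name}"

TOOLS = {"add_item": add_item}

def handle_model_output(raw: str):
    """Execute a JSON tool call if the model emitted one; otherwise
    return the output unchanged as a plain-text answer."""
    try:
        msg = json.loads(raw)
    except json.JSONDecodeError:
        return raw  # not JSON: plain conversational reply
    if not isinstance(msg, dict):
        return raw
    fn = TOOLS.get(msg.get("tool"))
    if fn is None:
        return raw  # unknown tool name: treat as text
    return fn(**msg.get("args", {}))

handle_model_output(
    '{"tool": "add_item", "args": {"name": "tomato sauce", "quantity": "2 cans"}}'
)
```

The key property is that only a whitelisted registry is reachable: a hallucinated tool name or malformed JSON degrades to text instead of touching the database.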


Changing the model

Any model available in Ollama works. Pull it first, then pass it to chat:

ollama pull phi3
pantry chat --model phi3

To change the default, edit DEFAULT_MODEL in src/llm/ollama_client.py.


Running tests

source .venv/bin/activate
pip install pytest
pytest tests/

Contributing

Contributions are welcome.

  1. Fork the repository
  2. Create a feature branch: git checkout -b feat/your-feature
  3. Commit your changes
  4. Open a pull request

Please keep pull requests focused. One feature or fix per PR.


Roadmap

  • pantry suggest — dedicated recipe suggestion command
  • Bulk import from a text or CSV file
  • Voice input via Whisper.cpp
  • iOS companion app reading the same SQLite database

License

MIT — see LICENSE.

Free to use, modify, and distribute. Attribution appreciated but not required.
