jmouchawar/auto-blogger

AI News Blog Workflow

A scheduled Python workflow that fetches AI news from RSS feeds, summarizes it with a local Ollama model, and publishes the result as a post to your Google Blogger blog.

Requirements

  • Python 3.10+
  • Ollama installed and running locally (with at least one model pulled, e.g. ollama pull llama3.2)
  • A Blogger blog and Google Cloud project with the Blogger API enabled

Setup

1. Clone or create the project and install dependencies

cd ai-news-blog-workflow
python3 -m venv .venv
source .venv/bin/activate   # or .venv\Scripts\activate on Windows
pip install -r requirements.txt

If you don’t have a config.yaml yet, copy the example and edit it:

cp config.yaml.example config.yaml

2. Configure RSS feeds and options

Edit config.yaml:

  • feeds: List of RSS/Atom feed URLs. Use AI/tech-focused feeds (e.g. TechCrunch AI, VentureBeat, The Verge AI) or any other sources, such as a popular-HN-blogs OPML file (use the xmlUrl values as feed URLs).
  • ollama: Set model to an Ollama model you have pulled (e.g. llama3.2, gemma3:latest). Optionally set host if Ollama runs elsewhere. Use timeout_seconds (e.g. 300) to avoid hanging on slow responses. Use prompt_persona and prompt_requirements to tune the summarizer’s tone and output format.
  • blogger: Set blog_id (see below). Paths to credentials.json and token.json default to the project directory.
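
If your sources come as an OPML file, the feed URLs can be extracted with the standard library. This helper is not part of the repo; it is a minimal sketch for filling in the feeds list:

```python
import xml.etree.ElementTree as ET

def feed_urls_from_opml(opml_text: str) -> list[str]:
    """Extract the xmlUrl attribute from every <outline> element in an OPML document."""
    root = ET.fromstring(opml_text)
    return [o.get("xmlUrl") for o in root.iter("outline") if o.get("xmlUrl")]
```

Paste the returned URLs under feeds: in config.yaml.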

3. Google Cloud and Blogger API

  1. Go to Google Cloud Console and create or select a project.
  2. Enable the Blogger API: APIs & Services → Enable APIs and Services → search for “Blogger API” → Enable.
  3. Create OAuth 2.0 credentials:
    • APIs & Services → Credentials → Create Credentials → OAuth client ID.
    • If prompted, configure the OAuth consent screen (e.g. External, add your email as a test user).
    • Application type: Desktop app.
    • Download the JSON and save it as credentials.json in this project directory. Do not commit this file.

4. Get your Blogger Blog ID

  • In Blogger, open your blog and go to Settings → Other. Under “Blog tools” you may see the Blog ID.
  • Or: open your blog’s URL in the browser and view the page source; search for blogID or similar.
  • Or: use the Blogger API “list by user” to list your blogs and their IDs.

Put this value in config.yaml under blogger.blog_id (or set the BLOGGER_BLOG_ID environment variable).

5. One-time OAuth (create token.json)

Run the auth helper once. It will open a browser for you to sign in and grant access; then it saves a refresh token to token.json:

python auth_blogger.py

Keep token.json and credentials.json out of version control (they are listed in .gitignore).

6. Run the workflow

python run.py

This will:

  1. Fetch recent items from the configured RSS feeds (last 48 hours, deduplicated).
  2. Call Ollama to generate a single blog post (title + HTML body) summarizing those items.
  3. Create a new post on your Blogger blog and publish it (or save as draft if blogger.publish_as_draft is true in config.yaml).
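
The fetch step's windowing and deduplication can be sketched as follows (a simplified stand-in for feeds.py, not the repo's exact code): keep only items published inside the lookback window, first occurrence per URL wins.

```python
from datetime import datetime, timedelta, timezone

def select_recent(items, hours=48, now=None):
    """Filter feed items to the last `hours`, dropping duplicate URLs."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=hours)
    seen, out = set(), []
    for item in items:  # item: {"url": str, "published": datetime, ...}
        if item["published"] < cutoff or item["url"] in seen:
            continue
        seen.add(item["url"])
        out.append(item)
    return out
```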

Progress is logged to stdout (timestamp, step, and key info). When run from cron, output is appended to logs/ai-news-blog.log.
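
Splitting the model's reply into title and HTML body might look like this. It is a sketch assuming the prompt asks for a leading TITLE: line (as hinted by the prompt_requirements setting); summarize.py may differ:

```python
def split_title_and_body(reply: str, fallback: str = "AI News Roundup"):
    """Use the first TITLE: line as the post title; the rest is the HTML body."""
    lines = reply.strip().splitlines()
    if lines and lines[0].upper().startswith("TITLE:"):
        title = lines[0].split(":", 1)[1].strip()
        return title or fallback, "\n".join(lines[1:]).strip()
    return fallback, reply.strip()  # no TITLE line: fall back to a default title
```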

Scheduling with cron

To run the workflow daily (e.g. at 9:00 AM), use cron. Ensure the cron job runs in an environment where Ollama is available (same machine or set ollama.host / OLLAMA_HOST if Ollama is on another host).

A ready-to-paste cron line for this project is in crontab.example (paths already set for this repo). Output is written to logs/ai-news-blog.log.

crontab -e

Paste the line from crontab.example, then save and exit. Confirm with crontab -l. On macOS, cron runs only when the machine is awake; ensure Ollama is running at the scheduled time.
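
For orientation, a daily 9:00 AM entry might look like the following (the paths here are placeholders; prefer the exact line shipped in crontab.example):

```
0 9 * * * cd /path/to/ai-news-blog-workflow && .venv/bin/python run.py >> logs/ai-news-blog.log 2>&1
```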

Configuration reference

  • feeds: List of RSS feed URLs.
  • ollama.model: Ollama model name (e.g. llama3.2).
  • ollama.host: Optional. Ollama server URL (default: http://localhost:11434).
  • ollama.max_tokens: Max tokens for the summary (default: 4096).
  • ollama.timeout_seconds: Optional. Request timeout in seconds (e.g. 300).
  • ollama.prompt_persona: Optional. Multiline string: AI persona/tone injected at the start of the prompt.
  • ollama.prompt_requirements: Optional. Multiline string: format and output requirements (title, HTML, TITLE line).
  • blogger.blog_id: Your Blogger blog ID.
  • blogger.credentials_path: Path to OAuth client JSON (default: credentials.json).
  • blogger.token_path: Path to stored refresh token (default: token.json).
  • blogger.publish_as_draft: If true, posts are created as drafts.
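
Put together, a config.yaml using these settings might look like this (all values are illustrative, not the defaults from config.yaml.example):

```yaml
feeds:
  - https://techcrunch.com/category/artificial-intelligence/feed/
  - https://www.theverge.com/rss/index.xml

ollama:
  model: llama3.2
  timeout_seconds: 300
  prompt_persona: |
    You are a concise tech news editor.
  prompt_requirements: |
    Start with a TITLE: line, then the post body as HTML.

blogger:
  blog_id: "1234567890123456789"
  publish_as_draft: false
```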

Environment variables:

  • BLOGGER_BLOG_ID overrides blogger.blog_id.
  • CONFIG_PATH overrides the config file path (default: config.yaml).
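
The override order for the blog ID can be expressed as a one-liner (a sketch; run.py's actual config loading may differ):

```python
import os

def resolve_blog_id(config: dict) -> str:
    """The BLOGGER_BLOG_ID environment variable wins over the config file value."""
    return os.environ.get("BLOGGER_BLOG_ID") or config.get("blogger", {}).get("blog_id", "")
```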

Project layout

  • run.py: Entrypoint that loads config, fetches feeds, summarizes, and publishes.
  • feeds.py: RSS/Atom fetching, parsing, and dedupe by URL.
  • summarize.py: Builds the prompt, calls Ollama, returns title + HTML body.
  • blogger_client.py: OAuth and publishing posts to Blogger.
  • auth_blogger.py: One-time OAuth flow to create token.json.
  • config.yaml: Feeds, Ollama, and Blogger settings (copy from config.yaml.example).
  • crontab.example: Example cron line for scheduled runs.
  • logs/: Cron output goes to logs/ai-news-blog.log.

Troubleshooting

  • “No valid token”: Run python auth_blogger.py again and complete the browser flow.
  • “Credentials file not found”: Ensure credentials.json from Google Cloud (Desktop app) is in the project directory or at the path set in config.
  • Ollama connection errors: Ensure Ollama is running (ollama serve or the Ollama app) and that the model in config is pulled (ollama pull <model>).
  • Ollama timeout or very long runs: Increase ollama.timeout_seconds (e.g. 600) or lower ollama.max_tokens (e.g. 2048). Slow or large models can exceed the default 300s timeout.
  • Blog ID: Double-check blog_id in config or BLOGGER_BLOG_ID; wrong ID will cause API errors.
  • Feed fetch warnings: Failed feeds are logged; the run continues with the rest. Remove or replace broken feed URLs if needed.

License

Use and modify as you like.
