Conversation

@KumarADITHYA123 commented Jan 26, 2026

I have completed issue #195: the Docker setup for the GitHub MCP Server.

As requested, I have focused only on containerizing the MCP service without modifying the existing backend infrastructure configuration.

Implementation Details

Standalone Docker Compose (docker-compose.yml)
I created a root-level docker-compose.yml to orchestrate the github-mcp service (sketch below).
Why: this keeps the lightweight MCP environment decoupled from the heavier backend infrastructure (Weaviate/RabbitMQ), allowing for faster local testing.
Networking: maps container port 8001 to host port 5001.
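
For orientation, a minimal sketch of what such a compose file could look like, assuming the service name `github-mcp`, a build context under `backend/github_mcp_server`, and the env vars discussed later in this thread; the merged file may differ in details:

```yaml
# Sketch only; service name, paths, and env list are assumptions from the PR description.
services:
  github-mcp:
    build:
      context: ./backend/github_mcp_server   # assumed build context
    ports:
      - "5001:8001"                          # host 5001 -> container 8001
    environment:
      - GITHUB_TOKEN=${GITHUB_TOKEN}
      - GITHUB_ORG=${GITHUB_ORG}
      - SUPABASE_URL=${SUPABASE_URL}
      - SUPABASE_KEY=${SUPABASE_KEY}
    volumes:
      - ./backend:/app/backend               # the walkthrough notes the backend is mounted
```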

Build Optimization (Crucial Fix)
I modified the GitHub MCP Dockerfile to enforce a CPU-only installation for PyTorch.
Context: the default PyTorch installation was attempting to pull ~2.5 GB of Nvidia CUDA binaries, causing hash mismatches and build failures.
Result: the build is now stable and significantly smaller, which is appropriate for this service.
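
A common way to enforce this in a Dockerfile is to point pip at PyTorch's CPU wheel index; the line below is a sketch of that pattern, not necessarily the exact line used in this PR:

```dockerfile
# Sketch: install CPU-only torch from PyTorch's CPU wheel index
# (the exact version and pinning in the PR's Dockerfile may differ).
RUN pip install --no-cache-dir torch --index-url https://download.pytorch.org/whl/cpu
```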

Dependency Fix
I added pydantic-settings to requirements.txt.
Reason: the application failed to start with an ImportError because settings.py relies on BaseSettings, which was missing from the MCP environment.
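
Background: in Pydantic v2, `BaseSettings` moved out of the core `pydantic` package into the separate `pydantic-settings` distribution, which is why the import fails when only `pydantic` is installed. The fix amounts to one requirements line (left unpinned here; any version pin would be an assumption):

```text
# requirements.txt
pydantic-settings
```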

✅ Verification
I have verified the setup on a fresh environment:

Build: docker-compose up --build completed successfully.

Health check:

curl http://localhost:5001/health

Output: {"status": "healthy", "service": "github-mcp"}

Summary by CodeRabbit

  • Documentation

    • Added a Docker Compose setup guide with environment configuration, startup instructions, service URL/port mapping, health-check verification, quick-start notes, and troubleshooting tips for local backend development.
  • Chores

    • Added containerization and runtime configuration for the backend, included required Python dependencies and an updated environment example (new GITHUB_ORG variable).



coderabbitai bot commented Jan 26, 2026

📝 Walkthrough

Adds containerization and local-development docs for the GitHub MCP backend: Dockerfile, docker-compose, Python dependency list, README Docker Compose section, and an env.example variable addition. No API or exported/interface changes.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| Documentation & Env<br>`README.md`, `env.example` | New "Docker Compose Setup" content in README with env config, startup/health-check steps, troubleshooting notes; env.example adds GITHUB_ORG. |
| Containerization<br>`docker-compose.yml`, `backend/github_mcp_server/Dockerfile` | Adds docker-compose.yml defining the github-mcp service (host 5001 → container 8001), mounts backend, passes env vars; Dockerfile based on python:3.10-slim, installs dependencies, exposes 8001, adds HEALTHCHECK and startup command. |
| Backend Dependencies<br>`backend/github_mcp_server/requirements.txt` | New requirements including fastapi, uvicorn, langchain ecosystem, torch (CPU), sentence-transformers, weaviate-client, supabase, sqlalchemy, and related packages. |

Sequence Diagram(s)

(Skipped — changes are infrastructure/docs and do not introduce a new multi-component runtime control flow warranting a sequence diagram.)

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Possibly related PRs

Suggested labels

documentation

Suggested reviewers

  • chandansgowda
  • smokeyScraper

Poem

🐇 I hopped into Docker, whiskers bright,

Ports and healthchecks snug at night.
Python packs bundled, volumes in tow,
Compose hums softly, ready to go.
A rabbit-approved deploy — hop, let's show!

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The title 'feat: Dockerize GitHub MCP server and optimize build' directly addresses the main changes: adding Docker support (Dockerfile, docker-compose.yml) and optimizing the build (CPU-only PyTorch, reduced image size). |
| Docstring Coverage | ✅ Passed | No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check. |




@coderabbitai bot left a comment

Actionable comments posted: 4

🤖 Fix all issues with AI agents
In `@backend/github_mcp_server/Dockerfile`:
- Around lines 17-19: The HEALTHCHECK uses curl, but the base image python:3.10-slim does not include it. Install curl before the HEALTHCHECK: run apt-get update and apt-get install -y --no-install-recommends curl (and clean up the apt lists) after the base image is set and before the HEALTHCHECK line, so that CMD curl -f http://localhost:8001/health || exit 1 can succeed (a sketch follows this list).

In `@backend/github_mcp_server/requirements.txt`:
- Line 13: requirements.txt currently lists a bare "torch" which can pull
CUDA-enabled wheels from PyPI; update requirements.txt to explicitly pin the
CPU-only torch and add the PyTorch CPU index (e.g., add an --extra-index-url
pointing to https://download.pytorch.org/whl/cpu and specify torch==2.3.0+cpu)
so pip install -r requirements.txt installs CPU-only wheels, or alternatively
remove the separate torch install from the Dockerfile and ensure
requirements.txt is the single source of truth for torch installation.

In `@README.md`:
- Line 22: Update the README line that reads "**ReAct Reasoning Pattern** - Think → Act → Observe workflow for intelligent decision making" to hyphenate "decision making" so it becomes "decision-making"; locate that exact string in the README and replace "decision making" with "decision-making".
- Around line 182-187: Update the README's environment configuration section to
document the SUPABASE_URL and SUPABASE_KEY variables (the SUPABASE_* vars seen
being passed into the container) and state whether they are required at runtime
or optional; add example entries for SUPABASE_URL and SUPABASE_KEY to the `.env`
example and a short note that docker-compose injects these into the container if
present, or explicitly mark them optional if the app can start without them.

Excerpt from `backend/github_mcp_server/requirements.txt` that the following comment is attached to:

```text
ddgs
sqlalchemy
langchain-community
torch
```

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Scripts executed:

```sh
find . -name "Dockerfile" -o -name "dockerfile" | head -20
git ls-files | grep -i docker
cat backend/github_mcp_server/requirements.txt
cat -n backend/github_mcp_server/Dockerfile
```

🏁 Web search issued: What is the correct syntax for specifying CPU-only PyTorch in requirements.txt and how should pip index URLs be configured?


🌐 Web query:

CPU-only PyTorch installation pip requirements.txt syntax torch index url

💡 Result:

Use pip’s requirements.txt “global options” lines (--index-url, --extra-index-url) to point torch installs at PyTorch’s CPU wheel index. Pip supports these options in requirements files. [1]

Option A (recommended): keep PyPI as default, add PyTorch CPU as extra index

```text
--extra-index-url https://download.pytorch.org/whl/cpu
torch==2.3.0+cpu
torchvision==0.18.1+cpu
torchaudio==2.3.0+cpu
```

This matches the common CPU-only install approach (--extra-index-url …/cpu). [2][3]

Option B: make PyTorch CPU the primary index, fall back to PyPI for everything else

```text
--index-url https://download.pytorch.org/whl/cpu
--extra-index-url https://pypi.org/simple
torch==2.2.1
# other deps...
```

(Useful if you want pip to prefer the CPU-only torch builds.) [3][1]

Option C: pin an explicit wheel URL (works even if index settings are awkward)

In requirements files you can also use direct references (e.g., torch @ <wheel-url>). [4][1]

Sources: pip requirements file options [1]; PyTorch “Get Started” CPU pip command pattern [2]; working requirements.txt examples for CPU wheels [3]; PyTorch forum note about using @ direct references [4].


Ensure CPU-only PyTorch is properly configured in requirements.txt, not just in the Dockerfile.

The Dockerfile does install CPU-only torch on line 8, but requirements.txt lists bare torch on line 13 with no version or index specification. When pip install -r requirements.txt runs (line 9), pip will install torch from the default PyPI index, which includes CUDA wheels by default. While the Dockerfile order makes this work in Docker context, the configuration is fragile and causes issues if requirements.txt is used elsewhere (local development, alternative CI).

Move torch specification into requirements.txt with proper CPU index configuration using one of these approaches:

```text
--extra-index-url https://download.pytorch.org/whl/cpu
torch==2.3.0+cpu
```

Or update the Dockerfile to remove the separate torch install and rely on requirements.txt with CPU index settings.

🤖 Prompt for AI Agents
In `@backend/github_mcp_server/requirements.txt` at line 13, requirements.txt
currently lists a bare "torch" which can pull CUDA-enabled wheels from PyPI;
update requirements.txt to explicitly pin the CPU-only torch and add the PyTorch
CPU index (e.g., add an --extra-index-url pointing to
https://download.pytorch.org/whl/cpu and specify torch==2.3.0+cpu) so pip
install -r requirements.txt installs CPU-only wheels, or alternatively remove
the separate torch install from the Dockerfile and ensure requirements.txt is
the single source of truth for torch installation.


The next comment is attached to this README excerpt (around line 22):

### 🧠 LangGraph Agent-Based Intelligence

- **ReAct Reasoning Pattern** - Think → Act → Observe workflow for intelligent decision making

⚠️ Potential issue | 🟡 Minor

Hyphenate “decision-making.”

Minor grammar fix per style/clarity.

✍️ Proposed fix

```diff
-- **ReAct Reasoning Pattern** - Think → Act → Observe workflow for intelligent decision making
+- **ReAct Reasoning Pattern** - Think → Act → Observe workflow for intelligent decision-making
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change:

```markdown
- **ReAct Reasoning Pattern** - Think → Act → Observe workflow for intelligent decision-making
```
🧰 Tools
🪛 LanguageTool

[grammar] ~22-~22: Use a hyphen to join words.
Context: ...bserve workflow for intelligent decision making - Conversational Memory - Per...

(QB_NEW_EN_HYPHEN)

🤖 Prompt for AI Agents
In `@README.md` at line 22, update the README line that reads "**ReAct Reasoning
Pattern** - Think → Act → Observe workflow for intelligent decision making" to
hyphenate "decision making" so it becomes "decision-making"; locate that exact
string in the README and replace "decision making" with "decision-making".

Comment on lines 182 to 187
1. **Configure Environment**: Ensure your `.env` file in the root directory has the required variables:

```bash
GITHUB_TOKEN=your_token
GITHUB_ORG=your_org
```

⚠️ Potential issue | 🟡 Minor

Document the SUPABASE_* vars if they're required for startup.

docker-compose.yml passes SUPABASE_URL and SUPABASE_KEY into the container. If they’re required at runtime, the setup instructions should include them (or explicitly mark them optional).

📄 Suggested doc addition

```diff
 GITHUB_TOKEN=your_token
 GITHUB_ORG=your_org
+SUPABASE_URL=your_supabase_url
+SUPABASE_KEY=your_supabase_anon_key
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change: extend the env example under the "Configure Environment" step with the SUPABASE entries shown in the diff above.
🤖 Prompt for AI Agents
In `@README.md` around lines 182 - 187, Update the README's environment
configuration section to document the SUPABASE_URL and SUPABASE_KEY variables
(the SUPABASE_* vars seen being passed into the container) and state whether
they are required at runtime or optional; add example entries for SUPABASE_URL
and SUPABASE_KEY to the `.env` example and a short note that docker-compose
injects these into the container if present, or explicitly mark them optional if
the app can start without them.

@KumarADITHYA123 (Author) commented Jan 26, 2026

closes #195
Hi @smokeyScraper and @chandansgowda - PR #250

  • GitHub MCP server containerization completed. Ready for maintainer review. All CodeRabbit issues resolved, CI checks passing, health check verified (200 OK).
