Chat Assistant is a full-stack workspace with:
- a Vite + React dashboard client
- a NestJS + Express authentication server for Google SSO
- a NestJS worker that syncs Rocket.Chat subscriptions and messages through the main server
- a Rocket.Chat OpenAI bot that replies in direct messages
- `client/`: frontend client
- `server/`: NestJS backend for Google OAuth and session auth
- `user-data-worker/`: NestJS worker for Rocket.Chat data syncing
- `rocket-chat-openai-bot/`: Rocket.Chat bot service
- `docker-compose.rocket.yml`: local Rocket.Chat container setup
- `docker-compose.app.yml`: local app stack for MongoDB, Ollama, server, worker, bot, and client
- Node.js 18+
- npm
- A Google OAuth client
- A Rocket.Chat server
- An OpenAI API key, or local Ollama fallback through the app compose stack
- MongoDB running locally with replica set mode enabled when using the local Rocket.Chat container
From the repo root:
```
npm install
```
Available scripts:
- `npm run dev` - run the Vite client
- `npm run dev:client` - run the Vite client
- `npm run dev:server` - run the Nest auth server
- `npm run dev:bot` - run the Rocket.Chat bot
- `npm run dev:worker` - run the Rocket.Chat data worker
- `npm run dev:all` - run client, server, worker, and bot together
- `npm run build` - build the Vite client
- `npm run build:server` - build the Nest server
- `npm run build:worker` - build the Nest worker
- `npm run compose:rocket` - start the local Rocket.Chat container from `docker-compose.rocket.yml`
- `npm run compose:app` - start the app stack from `docker-compose.app.yml`
`npm run dev:all` starts the local client, server, worker, and bot development processes.
The client runs on http://localhost:8080.
Frontend env is optional in local development because Vite proxies /auth to the Nest server.
See .env.example.
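The proxy rule the client relies on might look like the following sketch; the actual `client/vite.config.ts` may differ.

```typescript
// Hypothetical sketch of the Vite dev-server proxy rule. Requests to /auth
// from http://localhost:8080 are forwarded to the Nest auth server on 3001.
// The exact options in client/vite.config.ts may differ.
const devProxy = {
  "/auth": {
    target: "http://localhost:3001", // Nest auth server
    changeOrigin: true,
  },
};
```

This is why no `VITE_*` API URL is strictly required for local development.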
Run only the client:
```
npm run dev:client
```
The backend is a NestJS app using Express sessions and Google OAuth.
It runs on http://localhost:3001 by default, stores users in MongoDB, encrypts Rocket.Chat credentials before saving them, and exposes:
- `GET /auth/google`
- `GET /auth/google/callback`
- `GET /auth/session`
- `POST /auth/logout`
- `POST /users/me/rocket-integration`
- `POST /users/internal/rocket-sync/subscriptions`
- `POST /users/internal/rocket-sync/messages`
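Once the server is running, the session endpoint is a convenient smoke test. This sketch only builds the request; the endpoint path comes from the list above, and the response shape is not documented here.

```typescript
// Hypothetical helper: build a request against the auth server's session
// endpoint. The base URL mirrors the default PORT below; credentials must be
// included so the session cookie is sent.
function sessionRequest(baseUrl: string) {
  return {
    url: `${baseUrl}/auth/session`,
    init: { method: "GET", credentials: "include" as const },
  };
}

const req = sessionRequest("http://localhost:3001");
```

Passing `req.url` and `req.init` to `fetch` from the browser (or a logged-in client) should return the current session.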
Setup:
```
cd server
npm install
cp .env.example .env
```
Required values in `server/.env`:
```
PORT=3001
CLIENT_URL=http://localhost:8080
USER_DATA_WORKER_URL=http://localhost:3002
MONGODB_URI=mongodb://127.0.0.1:27017/chat-assistant
RC_URL=http://localhost:3000
SESSION_SECRET=replace-this-session-secret
INTERNAL_API_KEY=replace-this-with-a-long-random-internal-key
ROCKET_CREDENTIALS_ENCRYPTION_KEY=replace-this-with-a-long-random-secret
ACCESS_TOKEN_SECRET=replace-this-with-a-long-random-access-secret
REFRESH_TOKEN_SECRET=replace-this-with-a-long-random-refresh-secret
REFRESH_TOKEN_HASH_SECRET=replace-this-with-a-long-random-refresh-hash-secret
GOOGLE_CLIENT_ID=your-google-client-id.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=your-google-client-secret
GOOGLE_CALLBACK_URL=http://localhost:3001/auth/google/callback
SESSION_COOKIE_NAME=chat_assistant_session
SESSION_COOKIE_SECURE=false
```
Google OAuth configuration:
- Authorized JavaScript origin: `http://localhost:3001`
- Authorized redirect URI: `http://localhost:3001/auth/google/callback`
After Google login, the client sends the user to Rocket Integration, where they provide their Rocket.Chat user token and user id. Those values are encrypted and stored on the user document in MongoDB.
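The encryption-at-rest step could look like the following sketch. The real server's cipher choice, key derivation, and stored payload format are not documented here and may differ; AES-256-GCM, `scrypt`, and the `iv.tag.ciphertext` layout below are assumptions.

```typescript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from "node:crypto";

// Hypothetical sketch of encrypting a Rocket.Chat token at rest using
// ROCKET_CREDENTIALS_ENCRYPTION_KEY. Cipher, KDF, salt, and payload layout
// are all assumptions, not the server's actual implementation.
function encryptToken(token: string, secret: string): string {
  const key = scryptSync(secret, "rc-credentials", 32); // derive a 256-bit key
  const iv = randomBytes(12); // GCM-standard 96-bit nonce
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ciphertext = Buffer.concat([cipher.update(token, "utf8"), cipher.final()]);
  // store nonce, auth tag, and ciphertext together as hex segments
  return [iv, cipher.getAuthTag(), ciphertext].map((b) => b.toString("hex")).join(".");
}

function decryptToken(payload: string, secret: string): string {
  const [iv, tag, data] = payload.split(".").map((h) => Buffer.from(h, "hex"));
  const key = scryptSync(secret, "rc-credentials", 32);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // GCM rejects tampered ciphertext at final()
  return Buffer.concat([decipher.update(data), decipher.final()]).toString("utf8");
}
```

The authenticated mode matters here: a tampered ciphertext fails decryption instead of silently yielding garbage credentials.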
The worker and bot request the integrated Rocket.Chat users from the main server through internal endpoints protected by INTERNAL_API_KEY. If Rocket.Chat returns 401, the app disconnects that integration and clears the stored Rocket.Chat auth fields.
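An internal call from the worker or bot might be shaped like this sketch. The header name used to carry `INTERNAL_API_KEY` is an assumption; check the server's guard for the exact header it expects.

```typescript
// Hypothetical shape of an internal sync request. The endpoint path is from
// the server's documented route list; the "x-internal-api-key" header name
// is an assumption about how INTERNAL_API_KEY is transmitted.
function internalSyncRequest(serverUrl: string, internalApiKey: string, body: unknown) {
  return {
    url: `${serverUrl}/users/internal/rocket-sync/subscriptions`,
    method: "POST" as const,
    headers: {
      "content-type": "application/json",
      "x-internal-api-key": internalApiKey, // assumed header name
    },
    body: JSON.stringify(body),
  };
}
```

Both `server/.env` and `user-data-worker/.env` must carry the same `INTERNAL_API_KEY` value for these calls to succeed.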
Run only the server:
```
npm run dev:server
```
This repo has two compose files:
- `docker-compose.rocket.yml` runs a local Rocket.Chat container against MongoDB on your host machine.
- `docker-compose.app.yml` runs the app services: MongoDB, Ollama, server, worker, bot, and client.
Start local Rocket.Chat:
```
npm run compose:rocket
```
Start the app stack:
```
npm run compose:app
```
When both compose files are started from this repo, the app stack defaults to `RC_URL=http://rocketchat:3000` for container-to-container access. For a different Rocket.Chat server, set `RC_URL` in the root `.env` before starting the app stack.
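For example, to point the app stack at an external Rocket.Chat instance (hostname illustrative), set this in the root `.env` before running `npm run compose:app`:

```
RC_URL=https://rocket.example.com
```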
The root docker-compose.rocket.yml runs Rocket.Chat on http://localhost:3000 and connects it to MongoDB running on your host machine.
Setup:
```
cp .env.example .env
```
Root `.env`:
```
MONGO_HOST=host.docker.internal
```
`MONGO_HOST` must be a hostname or IP address that the Rocket.Chat container can use to reach the MongoDB host. `0.0.0.0` is only valid as MongoDB's bind address; do not use it as `MONGO_HOST`.
For Docker Desktop, host.docker.internal is the portable default. If the local MongoDB replica set was initialized with a specific host IP, use that same IP instead:
```
MONGO_HOST=192.168.56.1
```
MongoDB must run with replica set mode enabled. In `mongod.cfg`:
```yaml
net:
  port: 27017
  bindIp: 0.0.0.0
replication:
  replSetName: rs0
```
After restarting MongoDB, initialize the replica set once with the same host that Rocket.Chat will use:
```
rs.initiate({
  _id: 'rs0',
  members: [{ _id: 0, host: '192.168.56.1:27017' }],
});
```
Start Rocket.Chat:
```
npm run compose:rocket
```
The worker lives in `user-data-worker/`. It polls Rocket.Chat using the integrated users returned by the main server, then persists subscriptions and messages back through internal main-server endpoints.
Setup:
```
cd user-data-worker
npm install
cp .env.example .env
```
Required values in `user-data-worker/.env`:
```
MAIN_SERVER_URL=http://localhost:3001
INTERNAL_API_KEY=replace-this-with-the-same-internal-key-used-by-server
RC_URL=http://localhost:3000
FULL_SUBSCRIPTIONS_SYNC_INTERVAL_MS=3600000
INCREMENTAL_SUBSCRIPTIONS_SYNC_INTERVAL_MS=900000
RC_REQUEST_INTERVAL_MS=500
RC_RETRY_BACKOFF_MS=5000
MAIN_SERVER_BATCH_SIZE=25
OPENAI_API_KEY=your-openai-api-key
SUMMARY_MODEL=gpt-4.1-mini
EMBEDDING_MODEL=text-embedding-3-small
SUMMARY_SOURCE_MESSAGE_LIMIT=100
```
The worker reconciles Rocket.Chat subscriptions with the app database. When a subscription disappears from Rocket.Chat, the worker removes that subscription and its saved summarization data from the app database. While a sync is pending or running, the preferences page warns the user that the displayed data may be stale.
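The `RC_REQUEST_INTERVAL_MS` and `RC_RETRY_BACKOFF_MS` values above pace the worker's Rocket.Chat polling. A hypothetical sketch of one pacing policy follows; the actual worker may cap delays or use exponential rather than linear backoff.

```typescript
// Hypothetical pacing sketch built from RC_REQUEST_INTERVAL_MS (500) and
// RC_RETRY_BACKOFF_MS (5000). Linear backoff is an assumption about the
// worker's policy, not its documented behavior.
function nextDelayMs(
  consecutiveFailures: number,
  requestIntervalMs = 500,
  retryBackoffMs = 5000,
): number {
  // normal pacing between Rocket.Chat requests; back off after failures
  return consecutiveFailures === 0
    ? requestIntervalMs
    : retryBackoffMs * consecutiveFailures;
}
```

Keeping `RC_REQUEST_INTERVAL_MS` well above zero avoids tripping Rocket.Chat's API rate limits when many integrated users are synced.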
Run only the worker:
```
npm run dev:worker
```
The bot lives in `rocket-chat-openai-bot/`. It does not use a single static Rocket.Chat token from `.env`; it asks the main server for every user with a saved Rocket.Chat integration, then starts a runner for each integrated user.
It ignores existing messages when it first syncs a room, so it only replies to new ones after startup.
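That startup rule reduces to a one-line predicate. This is a sketch; the bot's actual per-room cutoff bookkeeping may differ.

```typescript
// Hypothetical: a message gets a reply only if it arrived after the runner
// recorded its start time for that room.
function shouldReply(messageTimestampMs: number, runnerStartedAtMs: number): boolean {
  return messageTimestampMs > runnerStartedAtMs;
}
```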
Setup:
```
cd rocket-chat-openai-bot
npm install
cp .env.example .env
```
Key bot env values:
- `RC_URL`
- `MAIN_SERVER_URL`
- `INTERNAL_API_KEY`
- `OPENAI_API_KEY`, or Ollama fallback settings
- `OPENAI_MODEL`
- `OLLAMA_BASE_URL`
- `OLLAMA_MODEL`
Run only the bot:
```
npm run dev:bot
```
For full bot configuration, see `rocket-chat-openai-bot/README.md`.
For the local Docker path, start Rocket.Chat first and then the app stack:
```
npm run compose:rocket
npm run compose:app
```
Start everything together from the repo root:
```
npm run dev:all
```
Or run services separately:
```
npm run dev:server
npm run dev:worker
npm run dev:client
npm run dev:bot
```
- Vite
- React
- TypeScript
- NestJS
- Express
- Google OAuth
- MongoDB
- Rocket.Chat
- OpenAI API
- Tailwind CSS
- Radix UI