Discord bot for user monitoring and management with Ollama-based, LLM-agnostic hate speech detection.
- [OPTIONAL] Run the Ollama client on the Docker host machine.
- Assign the `NODE_ENV` environment variable, e.g. `export NODE_ENV="development"` (accepted values: `development`, `production`, `local`). Then create the corresponding env file containing the following variables: `NODE_ENV`, `DISCORD_OAUTH2_TOKEN`, `DISCORD_CLIENT_ID`, `CASSIE_HOST`, `CASSIE_PORT`, `CASSIE_KEYSPACE`, `OLLAMA_API_HOST`, `OLLAMA_API_PORT`, `OLLAMA_API_MODEL`, `OLLAMA_API_MODEL_SESSION_ID`, `COMPOSE_PROJECT_NAME` (optional), and `CLIENT_LOCALE` (optional).
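A filled-in env file might look like the following sketch. All values are illustrative placeholders (the Cassandra and Ollama ports shown are the usual defaults, `9042` and `11434`, but your deployment may differ):

```
NODE_ENV=development
DISCORD_OAUTH2_TOKEN=<your-bot-token>
DISCORD_CLIENT_ID=<your-client-id>
CASSIE_HOST=localhost
CASSIE_PORT=9042
CASSIE_KEYSPACE=<your-keyspace>
OLLAMA_API_HOST=localhost
OLLAMA_API_PORT=11434
OLLAMA_API_MODEL=gemma3:1b
OLLAMA_API_MODEL_SESSION_ID=<session-id>
COMPOSE_PROJECT_NAME= # optional
CLIENT_LOCALE= # optional
```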
  1.1 The env file loaded (`ENV_FILE`) depends on the `NODE_ENV` value:

  | NODE_ENV | ENV_FILE |
  | --- | --- |
  | production | .env |
  | development | .env.dev |
  | test | .env.dev |
  | "any" | .env.local |

  1.2 Assign the `ANALYSIS_MODEL` environment variable to choose the model pulled when the Ollama API service container initializes. If unset, it defaults to `gemma3:1b`.

- Run `docker compose up` at the root directory.

  2.1 Optionally assign the `ANALYZE_MESSAGE_SUCCESS_RATIO` environment variable to a decimal number between 0 and 1 to adjust the required analysis success ratio (default is `0.95`, i.e. 95%). IMPORTANT: this ratio determines whether `docker compose up` succeeds, and the outcome depends directly on both the ratio and the model in use.
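The env-file selection (1.1) and the success-ratio gate (2.1) described above can be sketched in TypeScript. `envFileFor` and `passesSuccessRatio` are hypothetical helpers written for illustration, not the project's actual API:

```typescript
// 1.1: which env file is loaded for a given NODE_ENV value.
// (Hypothetical helper; the project's real wiring may differ.)
function envFileFor(nodeEnv: string | undefined): string {
  switch (nodeEnv) {
    case "production":
      return ".env";
    case "development":
    case "test":
      return ".env.dev";
    default:
      return ".env.local"; // "any" other value
  }
}

// 2.1: the startup check passes only when the fraction of successfully
// analyzed messages meets the ANALYZE_MESSAGE_SUCCESS_RATIO threshold
// (default 0.95). Hypothetical helper for illustration.
function passesSuccessRatio(results: boolean[], ratio = 0.95): boolean {
  if (results.length === 0) return true; // nothing analyzed, nothing failed
  const successes = results.filter(Boolean).length;
  return successes / results.length >= ratio;
}
```

This makes the 2.1 warning concrete: with a weaker model that succeeds on only 3 of 4 test analyses, the gate passes at a ratio of `0.75` but fails at the default `0.95`, so `docker compose up` would fail.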