# Configuration Overview
Chat UI is configured through environment variables. Default values live in `.env`; override them in `.env.local` or via your environment.
## Required Configuration
Chat UI connects to any OpenAI-compatible API endpoint:
```ini
OPENAI_BASE_URL=/static-proxy?url=https%3A%2F%2Frouter.huggingface.co%2Fv1
OPENAI_API_KEY=hf_************************
```

Models are automatically discovered from `${OPENAI_BASE_URL}/models`. No manual model configuration is required.
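To make the discovery step concrete, the sketch below parses an OpenAI-style `GET /models` list response (`{ "object": "list", "data": [{ "id": ... }] }`) and extracts the model IDs. The `listModelIds` helper and the sample response are illustrative, not part of Chat UI.

```typescript
// Sketch: extract model IDs from an OpenAI-compatible /models response.
// The response shape follows the OpenAI "list" convention.

interface ModelEntry {
  id: string;
  owned_by?: string;
}

interface ModelListResponse {
  object: string;
  data: ModelEntry[];
}

// Illustrative helper: collect the IDs of every discovered model.
function listModelIds(body: ModelListResponse): string[] {
  return body.data.map((m) => m.id);
}

// Fetching the live list would look roughly like:
// const res = await fetch(`${process.env.OPENAI_BASE_URL}/models`, {
//   headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
// });
// const ids = listModelIds(await res.json());

// Sample response body, as an endpoint might return it:
const sample: ModelListResponse = {
  object: "list",
  data: [
    { id: "meta-llama/Llama-3.3-70B-Instruct" },
    { id: "openai/whisper-large-v3-turbo" },
  ],
};
console.log(listModelIds(sample)); // prints both IDs from the sample
```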
## Database
```ini
MONGODB_URL=mongodb://localhost:27017
MONGODB_DB_NAME=chat-ui
```

For development, `MONGODB_URL` is optional: Chat UI falls back to an embedded MongoDB that persists to `./db`.
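The fallback behavior can be pictured as a small resolver over the environment. The `resolveDbConfig` helper and its `DbConfig` shape below are illustrative assumptions, not Chat UI internals; `url: null` stands for "use the embedded database".

```typescript
// Sketch: resolve database settings from the environment, falling back to
// the embedded MongoDB (persisted to ./db) when MONGODB_URL is unset.
// resolveDbConfig and DbConfig are illustrative names, not Chat UI code.

interface DbConfig {
  url: string | null; // null → use the embedded MongoDB in ./db
  dbName: string;
}

function resolveDbConfig(env: Record<string, string | undefined>): DbConfig {
  return {
    url: env.MONGODB_URL ?? null,
    dbName: env.MONGODB_DB_NAME ?? "chat-ui",
  };
}

console.log(resolveDbConfig({})); // no MONGODB_URL → embedded fallback
console.log(resolveDbConfig({ MONGODB_URL: "mongodb://localhost:27017" }));
```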
## Model Overrides
To customize model behavior, use the `MODELS` environment variable (JSON5 format):
```ini
MODELS=`[
  {
    "id": "meta-llama/Llama-3.3-70B-Instruct",
    "name": "Llama 3.3 70B",
    "multimodal": false,
    "supportsTools": true
  }
]`
```

Override properties:

- `id` - Model identifier (must match an ID from the `/models` endpoint)
- `name` - Display name in the UI
- `multimodal` - Enable image uploads
- `supportsTools` - Enable MCP tool calling for models that don't advertise tool support
- `parameters` - Override default parameters (`temperature`, `max_tokens`, etc.)
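Conceptually, each override object is matched to a discovered model by `id` and its properties are layered on top. The `applyOverrides` helper below is a sketch of that shallow-merge idea; Chat UI's actual merge logic may differ.

```typescript
// Sketch: shallow-merge MODELS overrides onto discovered models, keyed by id.
// applyOverrides is an illustrative helper, not Chat UI's implementation.

interface ModelConfig {
  id: string;
  name?: string;
  multimodal?: boolean;
  supportsTools?: boolean;
  parameters?: Record<string, unknown>;
}

function applyOverrides(
  discovered: ModelConfig[],
  overrides: ModelConfig[]
): ModelConfig[] {
  const byId = new Map(overrides.map((o): [string, ModelConfig] => [o.id, o]));
  // Discovered models keep their defaults unless an override with the same id exists.
  return discovered.map((m) => ({ ...m, ...(byId.get(m.id) ?? {}) }));
}

const merged = applyOverrides(
  [{ id: "meta-llama/Llama-3.3-70B-Instruct" }],
  [{ id: "meta-llama/Llama-3.3-70B-Instruct", name: "Llama 3.3 70B", supportsTools: true }]
);
console.log(merged[0].name); // "Llama 3.3 70B"
```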
## Task Model
Set a specific model for internal tasks (title generation, etc.):
```ini
TASK_MODEL=meta-llama/Llama-3.1-8B-Instruct
```

If not set, the current conversation model is used.
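The fallback rule is a one-liner; `pickTaskModel` below is an illustrative name for it, not a Chat UI function.

```typescript
// Sketch: choose the model for internal tasks (title generation, etc.),
// falling back to the current conversation's model when TASK_MODEL is unset.
function pickTaskModel(
  taskModel: string | undefined,
  conversationModel: string
): string {
  return taskModel ?? conversationModel;
}

console.log(pickTaskModel(undefined, "meta-llama/Llama-3.3-70B-Instruct"));
console.log(pickTaskModel("meta-llama/Llama-3.1-8B-Instruct", "meta-llama/Llama-3.3-70B-Instruct"));
```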
## Voice Transcription
Enable voice input with Whisper:
```ini
TRANSCRIPTION_MODEL=openai/whisper-large-v3-turbo
TRANSCRIPTION_BASE_URL=/static-proxy?url=https%3A%2F%2Frouter.huggingface.co%2Fhf-inference%2Fmodels
```

## Feature Flags
```ini
LLM_SUMMARIZATION=true   # Enable automatic conversation title generation
ENABLE_DATA_EXPORT=true  # Allow users to export their data
ALLOW_IFRAME=false       # Disallow embedding in iframes (set to true to allow)
```

## User Authentication
Use OpenID Connect for authentication:
```ini
OPENID_CLIENT_ID=your_client_id
OPENID_CLIENT_SECRET=your_client_secret
OPENID_SCOPES="openid profile"
```

See the OpenID configuration for details.
## Environment Variable Reference
See the `.env` file for the complete list of available options.