LLM calls
- OPENAI_API_KEY: OpenAI API key
- ANTHROPIC_API_KEY: Anthropic API key
- GOOGLE_API_KEY: Google API key
- LLM_CACHE_PATH: Path to the LLM cache
You don’t have to specify API keys for all providers, only the ones that are used. See here for details on adding new providers and customizing Docent’s LLM API calls.
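For example, a minimal .env might set only the providers you actually call (placeholder values; substitute your own keys and cache location):

```
OPENAI_API_KEY=sk-your-key-here        # only needed if you use OpenAI models
ANTHROPIC_API_KEY=sk-ant-your-key-here # only needed if you use Anthropic models
LLM_CACHE_PATH=.llm_cache              # illustrative path for the LLM cache
```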
Postgres
We have provided reasonable defaults in .env.template, but you’re welcome to customize these as needed.
- DOCENT_PG_USER: Postgres username
- DOCENT_PG_PASSWORD: Postgres password
- DOCENT_PG_HOST: Postgres host
- DOCENT_PG_PORT: Postgres port
- DOCENT_PG_DATABASE: Postgres database (not postgres)
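As a sketch, a local setup might look like the following (illustrative values only; the actual defaults live in .env.template):

```
DOCENT_PG_USER=docent
DOCENT_PG_PASSWORD=change-me
DOCENT_PG_HOST=localhost
DOCENT_PG_PORT=5432
DOCENT_PG_DATABASE=docent   # a dedicated database, not the default "postgres"
```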
Redis
We have provided reasonable defaults in .env.template, but you’re welcome to customize these as needed.
- DOCENT_REDIS_HOST: Redis host
- DOCENT_REDIS_PORT: Redis port
- DOCENT_REDIS_USER: Redis username (optional)
- DOCENT_REDIS_PASSWORD: Redis password (optional)
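For instance, a local Redis instance without authentication could be configured like this (illustrative values only):

```
DOCENT_REDIS_HOST=localhost
DOCENT_REDIS_PORT=6379
# DOCENT_REDIS_USER and DOCENT_REDIS_PASSWORD can be left unset if your
# Redis instance does not require authentication
```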
CORS
- DOCENT_CORS_ORIGINS: CSV list of allowed frontend origins (optional)
  - Leave empty/unset for development (defaults to localhost:*)
  - Example for multiple domains: DOCENT_CORS_ORIGINS=https://app.yourdomain.com,https://admin.yourdomain.com
Optional variables for deployed environments
- DEPLOYMENT_ID: ID of the deployment (unset for local)
- SENTRY_DSN: Sentry DSN
- POSTHOG_API_KEY: PostHog API key
- POSTHOG_API_HOST: PostHog API host (defaults to https://us.i.posthog.com)
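A sketch for a deployed environment (placeholder values; the DSN and API key come from your own Sentry and PostHog projects):

```
DEPLOYMENT_ID=prod-1                                       # identifier for this deployment
SENTRY_DSN=https://examplePublicKey@o0.ingest.sentry.io/0  # placeholder Sentry DSN
POSTHOG_API_KEY=phc_your_project_key                       # placeholder PostHog project key
# POSTHOG_API_HOST can be omitted to use the default https://us.i.posthog.com
```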

