Tell server configuration lives in a single TOML file. This page covers the server-side settings — API server, authentication, query backend, logging, and integrations. For pipeline configuration (sources, sinks, routing), see Pipeline Configuration.

Minimal server config

A Tell server with local auth and ClickHouse query backend:
[api_server]
port = 3000

[auth]
provider = "local"
jwt_secret = "your-secret-key-at-least-32-characters-long"

[query]
clickhouse_url = "http://localhost:8123"
clickhouse_database = "tell"

API server

[api_server]
enabled = true
host = "::"
port = 3000
audit_logging = false
control_db = "~/.tell/control.db"
| Field | Default | Notes |
| --- | --- | --- |
| port | 3000 | HTTP port |
| host | "::" | Bind address (dual-stack IPv4+IPv6) |
| audit_logging | false | Log all HTTP requests |
| control_db | ~/.tell/control.db | SQLite database for workspaces, boards, users |
| tls_cert_path | | TLS certificate PEM path |
| tls_key_path | | TLS private key PEM path |
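
To serve the API over HTTPS, set both TLS fields to PEM-encoded files. A minimal sketch, with placeholder paths rather than real defaults:
[api_server]
port = 3000
tls_cert_path = "/etc/tell/tls/server.pem"  # placeholder path to the certificate
tls_key_path = "/etc/tell/tls/server-key.pem"  # placeholder path to the private key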

Authentication

Tell supports local auth (built-in) and WorkOS (SSO).

Local auth

[auth]
provider = "local"
jwt_secret = "your-secret-key-at-least-32-characters-long"
jwt_expires_in = "24h"

[auth.local]
db_path = "data/users.db"
allow_registration = false
  • jwt_secret — required, must be at least 32 characters
  • allow_registration — set to true to let users self-register after initial setup
  • jwt_expires_in — how long tokens last (default: 24 hours)

WorkOS (SSO)

[auth]
provider = "workos"

[auth.workos]
api_key = "sk_live_..."
client_id = "client_..."
redirect_uri = "https://your-domain.com/auth/callback"
Both api_key and client_id are required when using WorkOS.

Query backend

Tell auto-detects the query backend. If a ClickHouse sink is configured, it uses ClickHouse. Otherwise, it falls back to the local Polars engine.

ClickHouse

[query]
clickhouse_url = "http://localhost:8123"
clickhouse_database = "tell"
clickhouse_username = "default"
clickhouse_password = ""
Or reference an existing sink by name:
[query]
clickhouse_sink = "clickhouse"

Local (Polars)

[query]
backend = "polars"
data_dir = "./data"
Reads Arrow IPC files from disk. Good for development and small deployments.

Logging

[log]
level = "info"
format = "console"
output = "stdout"
| Field | Default | Options |
| --- | --- | --- |
| level | "info" | trace, debug, info, warn, error |
| format | "console" | "console" (human-readable), "json" (structured) |
| output | "stdout" | "stdout", "stderr", or a file path |

MCP server

[mcp]
http_enabled = true
stdio_enabled = true
HTTP transport runs on the API server at /api/v1/mcp. Stdio transport is available via tell mcp. See MCP for details.

LLM integration

[llm]
enabled = true
provider = "anthropic"
model = "claude-sonnet-4-5-20250929"
| Field | Default | Notes |
| --- | --- | --- |
| enabled | false | Enable AI assistant features |
| provider | "anthropic" | "anthropic" or "openai-compatible" |
| model | "claude-sonnet-4-5-20250929" | Model ID |
| api_key | | API key (env variable takes priority) |
| base_url | | Custom URL for OpenAI-compatible providers |
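
For an OpenAI-compatible provider, point base_url at the provider's endpoint. A sketch in which the URL and model are illustrative, not defaults:
[llm]
enabled = true
provider = "openai-compatible"
base_url = "http://localhost:11434/v1"  # example endpoint for a local OpenAI-compatible server
model = "llama3.1"  # example model ID served by that endpoint
# api_key can be set here; an API key from the environment takes priority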

API client

Settings for the tell CLI when connecting to a remote server:
[api]
url = "http://localhost:3000"
credential_store = "file"
| Field | Default | Options |
| --- | --- | --- |
| url | http://localhost:3000 | Server URL |
| credential_store | "file" | "file", "keyring", or "auto" |
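
When the CLI talks to a remote deployment, you may prefer the OS keyring over a credentials file. A sketch with a placeholder URL:
[api]
url = "https://tell.example.com"
credential_store = "keyring"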

Metrics reporting

[metrics]
enabled = true
interval = "1h"
format = "human"
Controls pipeline metrics output — batches processed, bytes written, connection counts. Set format = "json" for machine-readable output.

Connectors

[connectors.github]
type = "github"
enabled = true
schedule = "0 */6 * * *"
Each connector has a name, type, and optional cron schedule. Connector-specific fields vary by type. See Connectors for details.
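
Since the table key is the connector's name, several connectors can be defined side by side. Assuming multiple instances of the same type are allowed, a sketch with an illustrative second connector:
[connectors.github]
type = "github"
enabled = true
schedule = "0 */6 * * *"  # every 6 hours

[connectors.github_nightly]  # hypothetical second instance; the table key is the connector name
type = "github"
enabled = true
schedule = "0 2 * * *"  # daily at 02:00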