Minimal config
A working pipeline with one source and one sink:
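The exact section names aren't shown on this page, so the sketch below assumes TOML tables named [sources.tcp], [sinks.clickhouse], and [routing]; the field names come from the tables later on this page.

```toml
# Minimal sketch. Section names ([sources.tcp], [sinks.clickhouse], [routing])
# are assumptions, not confirmed by this page; fields come from the tables below.

[sources.tcp]
port = 50000        # listen port (required)

[sinks.clickhouse]
type = "clickhouse"
host = "http://localhost:8123"   # ClickHouse HTTP address (required)
database = "default"

[routing]
default = ["clickhouse"]   # send unmatched traffic to the ClickHouse sink
```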
Sources
Sources define where data comes in. You can run multiple sources at the same time.
TCP
The primary source for SDK data. SDKs send FlatBuffer batches over TCP.
| Field | Default | Notes |
|---|---|---|
| port | 50000 | Listen port (required) |
| address | "::" | Bind address (IPv4+IPv6) |
| flush_interval | "100ms" | Batch flush interval |
| max_connections | 10000 | Max concurrent connections |
| forwarding_mode | false | Trust upstream Tell instances |
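As a sketch, a TCP source overriding a few of these defaults might look like this (the [sources.tcp] table name is an assumption; the fields are the ones listed above):

```toml
# Hypothetical [sources.tcp] table; field names and defaults come from the table above.
[sources.tcp]
port = 50000              # required listen port
address = "0.0.0.0"       # bind IPv4 only instead of the dual-stack "::" default
flush_interval = "50ms"   # flush batches more often than the 100ms default
max_connections = 20000   # raise the concurrent-connection cap from 10000
forwarding_mode = false   # only set true when fed by trusted upstream Tell instances
```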
HTTP
REST API source for webhook integrations and browser clients.
| Field | Default | Notes |
|---|---|---|
| port | 8080 | Listen port |
| max_payload_size | 10MB | Max request body |
| cors_enabled | false | Enable for browser clients |
| trust_proxy | false | Trust X-Forwarded-For |
| tls_cert_path | — | TLS certificate path |
| tls_key_path | — | TLS private key path |
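A sketch of an HTTP source serving browser clients behind a reverse proxy; the [sources.http] table name, the TLS paths, and the payload-size value format are assumptions:

```toml
# Hypothetical [sources.http] table; paths and the payload-size format are illustrative.
[sources.http]
port = 8080
max_payload_size = "10MB"                  # max request body (value format assumed)
cors_enabled = true                        # needed for browser clients
trust_proxy = true                         # honor X-Forwarded-For behind the proxy
tls_cert_path = "/etc/tell/tls/cert.pem"   # illustrative path
tls_key_path = "/etc/tell/tls/key.pem"     # illustrative path
```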
Syslog
Collect logs from syslog-compatible systems (RFC 3164/5424). Set workspace_id explicitly, since syslog clients don't authenticate with API keys.
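A sketch of a syslog source; the [sources.syslog] table name and the port field are assumptions, and the workspace ID is a placeholder:

```toml
# Hypothetical [sources.syslog] table. workspace_id is required because syslog
# clients don't authenticate with API keys; the port field is illustrative.
[sources.syslog]
port = 5514                         # illustrative, not documented on this page
workspace_id = "your-workspace-id"  # placeholder; set to the target workspace
```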
Sinks
Sinks define where data goes. Each sink has a name and a type.
ClickHouse (recommended)
| Field | Default | Notes |
|---|---|---|
| host | — | ClickHouse HTTP address (required) |
| database | "default" | Database name |
| batch_size | 50000 | Rows per insert |
| flush_interval | "5s" | Flush interval |
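A sketch of a ClickHouse sink tuned for larger, less frequent inserts; the [sinks.clickhouse] table name and the type value are assumptions, the fields are the ones above:

```toml
# Hypothetical [sinks.clickhouse] table; fields come from the table above.
[sinks.clickhouse]
type = "clickhouse"
host = "http://clickhouse:8123"   # ClickHouse HTTP address (required)
database = "telemetry"            # defaults to "default"
batch_size = 100000               # rows per insert (default 50000)
flush_interval = "10s"            # default "5s"
```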
Disk
Binary and plaintext file sinks for local storage.
| Field | Default | Notes |
|---|---|---|
| path | — | Output directory (required) |
| rotation | "daily" | "hourly" or "daily" |
| compression | "none" | "none" or "lz4" |
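A sketch of a disk sink with hourly rotation and lz4 compression; the sink name and the type value are assumptions:

```toml
# Hypothetical [sinks.archive] table of type "disk"; fields come from the table above.
[sinks.archive]
type = "disk"
path = "/var/lib/tell/archive"   # output directory (required)
rotation = "hourly"              # "hourly" or "daily" (default "daily")
compression = "lz4"              # "none" or "lz4" (default "none")
```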
Parquet
Columnar storage for data warehousing. Supported compression codecs: snappy, zstd, lz4, uncompressed.
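A sketch of a Parquet sink; only the codec names are documented here, so the sink name, type value, and path field are assumptions:

```toml
# Hypothetical [sinks.warehouse] table of type "parquet". Only the codec names
# are documented on this page; the path field is assumed by analogy with the disk sink.
[sinks.warehouse]
type = "parquet"
path = "/var/lib/tell/parquet"   # assumed field
compression = "zstd"             # snappy, zstd, lz4, or uncompressed
```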
Arrow IPC
Fast columnar storage for hot data, readable with DuckDB, PyArrow, or Polars.
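A sketch of an Arrow IPC sink; no fields are documented on this page, so every name and value below is an assumption:

```toml
# Hypothetical [sinks.hot] table of type "arrow_ipc"; field names and values
# are assumptions, not documented on this page.
[sinks.hot]
type = "arrow_ipc"
path = "/var/lib/tell/arrow"   # assumed output directory field
```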
Forwarder
Send data to another Tell instance for edge-to-cloud deployments. The api_key must be exactly 32 hex characters.
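A sketch of a forwarder sink on an edge instance; the host and port field names are assumptions, and the api_key value is a 32-character hex placeholder:

```toml
# Hypothetical [sinks.central] table of type "forwarder"; host and port field
# names are assumptions.
[sinks.central]
type = "forwarder"
host = "tell.example.com"                      # assumed field name
port = 50000                                   # assumed: the central instance's TCP source port
api_key = "0123456789abcdef0123456789abcdef"   # placeholder; must be exactly 32 hex characters
```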
Routing
Routing connects sources to sinks. Data from a source goes through matching rules and is delivered to the configured sinks.

- default: sinks for traffic that doesn't match any rule
- match: filter by source (exact name) or source_type ("tcp", "syslog")
- sinks: where to send matched data (must exist in [sinks])
- transformers: transforms to apply in order before writing
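A sketch of one rule plus a default; the [[routing.rules]] table name and the match syntax are assumptions, and the sink and transform names refer to the placeholder examples above:

```toml
# Hypothetical routing layout: syslog traffic goes through a "redact" transform
# to the "archive" sink; everything else falls through to "clickhouse".
[routing]
default = ["clickhouse"]               # sinks for traffic that matches no rule

[[routing.rules]]                      # table name assumed
match = { source_type = "syslog" }     # or match on an exact source name
sinks = ["archive"]                    # must exist in [sinks]
transformers = ["redact"]              # applied in order before writing
```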
Transforms
Transforms modify data in routing rules before it reaches the sinks. Available transforms: pattern_matcher, redact, filter, reduce. See Transforms for configuration details.
Global defaults
Tune pipeline-wide defaults in [global]:
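No [global] keys are listed on this page, so the block below is purely illustrative; the two keys simply mirror per-source and per-sink fields documented above:

```toml
# Purely illustrative: the real [global] keys are not listed on this page.
[global]
flush_interval = "5s"   # hypothetical pipeline-wide flush default
batch_size = 50000      # hypothetical pipeline-wide batch default
```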