Read events, logs, or raw lines from a file and push them into Tell’s pipeline. Use this for one-time imports, replaying exported data, or ingesting log files from disk.
[sources.file]
path = "/var/log/app.log"
mode = "lines"
Tell reads the file line by line, batches the content, and routes it through the pipeline like any other source.

Read modes

The file source supports three modes, each producing different batch types:
| Mode | Output | Use case |
| --- | --- | --- |
| `lines` | Syslog batches | Raw log files, one line per message |
| `jsonl` | Log or Event batches | Structured JSON, one JSON object per line |
| `binary` | Original batches | Re-import data previously exported by the disk sink |

Lines mode

Each line becomes a syslog message. Empty lines are skipped.
[sources.file]
path = "/var/log/nginx/access.log"
mode = "lines"
one_shot = true
This reads the entire file, batches every 500 lines, and exits. Use transforms downstream to parse structured fields from raw lines.
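As a rough illustration of what a downstream transform might extract from such raw lines, here is a standalone Python sketch that parses nginx's default "combined" log format with a regex. This is not a Tell transform definition, just the kind of structured fields you would pull out of each line:

```python
import re

# Regex for nginx's default "combined" access-log format (illustrative).
COMBINED = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+)'
)

line = '203.0.113.7 - - [15/Jun/2025:10:00:00 +0000] "GET /pricing HTTP/1.1" 200 512'
fields = COMBINED.match(line).groupdict()
# fields["method"] == "GET", fields["status"] == "200"
```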

JSONL mode

Parse each line as a JSON object and produce typed batches. Set format to control the schema:

Log format

[sources.file]
path = "logs.jsonl"
mode = "jsonl"
format = "log"
service = "api"
Each line is a JSON log entry:
{"level": "error", "message": "connection refused", "service": "api", "timestamp": 1718467200000}
{"level": "info", "message": "request completed", "data": {"status": 200, "duration_ms": 42}}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `message` | string | Yes | Log message text. |
| `level` | string | No | `debug`, `info`, `warn`, `error`. Default: `info`. |
| `timestamp` | integer | No | Unix milliseconds. Default: current time. |
| `source` | string | No | Source location (e.g., `app.js:42`). |
| `service` | string | No | Service name. Falls back to the config `service` value. |
| `session_id` | string | No | Session UUID. |
| `type` | string | No | Log type. Default: `log`. |
| `data` | object | No | Additional properties merged into the payload. |
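To generate a file in this shape for testing, any tool that emits one JSON object per line will do. A minimal Python sketch (the entries below are sample data, not output from a real system):

```python
import json
import time

# Sample log entries matching the schema in the table above.
entries = [
    {"level": "error", "message": "connection refused", "service": "api",
     "timestamp": int(time.time() * 1000)},
    {"level": "info", "message": "request completed",
     "data": {"status": 200, "duration_ms": 42}},
]

# JSONL: one JSON object per line, newline-terminated.
with open("logs.jsonl", "w") as f:
    for entry in entries:
        f.write(json.dumps(entry) + "\n")
```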

Event format

[sources.file]
path = "events.jsonl"
mode = "jsonl"
format = "event"
Each line is a JSON event:
{"type": "page_view", "device_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890", "properties.page": "/pricing"}
{"type": "purchase", "device_id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890", "properties.amount": 99}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `type` | string | Yes | Event type. |
| `device_id` | string | Yes | Device UUID. |
| `event` | string | No | Event name. |
| `session_id` | string | No | Session UUID. |
| `user_id` | string | No | User identifier. |
| `group_id` | string | No | Group identifier. |
| `timestamp` | integer | No | Unix milliseconds. Default: current time. |
| `service` | string | No | Service name. |
Additional fields are flattened into event properties. Parse errors in JSONL mode are logged and skipped — a malformed line won’t stop the import.
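Conceptually, the flattening separates the known fields above from everything else, which lands in the event's properties. A Python sketch of that behavior (illustrative only, not Tell's actual implementation; the `properties.` prefix handling mirrors the example lines above):

```python
import json

# Known top-level event fields from the table above.
KNOWN = {"type", "device_id", "event", "session_id", "user_id",
         "group_id", "timestamp", "service"}

def split_properties(line: str) -> tuple[dict, dict]:
    """Split a JSONL event line into known fields and extra properties."""
    obj = json.loads(line)
    fields, props = {}, {}
    for key, value in obj.items():
        if key in KNOWN:
            fields[key] = value
        else:
            # Extra fields become properties; strip a "properties." prefix.
            props[key.removeprefix("properties.")] = value
    return fields, props

fields, props = split_properties(
    '{"type": "purchase", "device_id": "abc", "properties.amount": 99}'
)
# fields -> {"type": "purchase", "device_id": "abc"}
# props  -> {"amount": 99}
```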

Binary mode

Re-import data previously exported by the disk sink. Tell reconstructs the original batches with their metadata, including source IPs.
[sources.file]
path = "/data/tell/export/2025-06-15.bin"
mode = "binary"
This is useful for replaying historical data or migrating between Tell instances.

One-shot vs continuous

By default, the file source reads the file once and exits (one_shot = true). Set one_shot = false to keep the source running after reading:
[sources.file]
path = "test-data.jsonl"
mode = "jsonl"
format = "event"
one_shot = false

Configuration reference

[sources.file]
path = ""                # File path to read (required)
mode = "lines"           # "lines", "jsonl", or "binary"
format = "log"           # "log" or "event" (JSONL mode only)
workspace_id = "1"       # Workspace ID for routing
service = ""             # Service name for log enrichment
batch_size = 500         # Items per batch
one_shot = true          # Exit after reading the file
The file source requires the `source-file` feature flag at build time.

What’s next

  • Routing — control where file data goes after ingestion
  • Transforms — parse and enrich raw lines before storage
  • Disk sink — export data that binary mode can re-import