When you ask Tell a question, whether via tell ask or through MCP, the response comes back as blocks — a streaming JSONL format that renders as rich UI on any platform. The same JSON produces charts in your terminal, on the web dashboard, and in native apps.

How it works

An LLM generates one JSON object per line. Each object has a t field that identifies its type. Your client reads lines as they arrive and renders each block immediately — no waiting for the full response.
{"t":"text","c":"Here are your metrics:"}
{"t":"metric","title":"DAU","value":12847,"unit":"users","change":15.2}
{"t":"chart","v":"sparkline","data":[[0,10],[1,15],[2,12]]}
This renders as a text paragraph, a large number with a trend arrow, and a sparkline chart — all streaming in real time.
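
Here is a minimal sketch of that client loop in TypeScript, assuming blocks arrive on stdin in a Node.js environment; the renderers are placeholders for illustration, not the actual Tell client code:

```ts
// Sketch: read JSONL blocks line by line and dispatch on the t field.
// Assumes blocks arrive on stdin, one JSON object per line.
import * as readline from "node:readline";

interface Block {
  t: string;                 // block type discriminator
  [key: string]: unknown;    // remaining fields vary per type
}

const rl = readline.createInterface({ input: process.stdin });

rl.on("line", (line) => {
  if (line.trim() === "") return;
  const block = JSON.parse(line) as Block;
  switch (block.t) {
    case "text":
      console.log(block.c);                               // plain paragraph
      break;
    case "metric":
      console.log(`${block.title}: ${block.value} ${block.unit ?? ""}`);
      break;
    default:
      console.log(`[${block.t} block]`);                  // other renderers go here
  }
});
```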

Block types

Tell supports 8 block types:
Type      What it shows
text      Plain text paragraph
metric    Single KPI — value, unit, trend, color
metrics   2-4 KPIs in a row
chart     Line, bar, area, pie, or sparkline
table     Rows and columns with headers
status    Health indicator (healthy, degraded, down, unknown)
callout   Alert box (info, warning, error, success)
code      Source code with syntax highlighting
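
A client can type the stream as a discriminated union on t. The sketch below is inferred from the JSON examples on this page rather than a published schema, so optional markers and exact field names are assumptions:

```ts
// Discriminated union over the eight block types, inferred from the
// examples in this document. Fields not shown in the examples are assumed.
type Color = "success" | "error" | "warning" | "info" | "default";

type Block =
  | { t: "text"; c: string }
  | { t: "metric"; title: string; value: number; unit?: string; change?: number; color?: Color }
  | { t: "metrics"; items: { title: string; value: number; unit?: string }[] }
  | { t: "chart"; v: "line" | "bar" | "area" | "pie" | "sparkline"; title?: string; data: [number, number][] }
  | { t: "table"; cols: string[]; rows: (string | number)[][] }
  | { t: "status"; label: string; state: "healthy" | "degraded" | "down" | "unknown"; message?: string }
  | { t: "callout"; v: "info" | "warning" | "error" | "success"; title?: string; c: string }
  | { t: "code"; lang?: string; c: string };
```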

metric

A single number with optional context:
{"t":"metric","title":"Error Rate","value":0.12,"unit":"%","change":-5.2,"color":"success"}
  • title — what the number measures
  • value — the number itself
  • unit — display unit (users, %, ms, etc.)
  • change — percentage trend (positive = up, negative = down)
  • color — hint: success, error, warning, info, or default
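
As a rough sketch of how a terminal client might format a metric (the trend arrow rendering is an assumption, not the TUI's actual output):

```ts
// Sketch: format a metric block as a single terminal line.
function renderMetric(m: { title: string; value: number; unit?: string; change?: number }): string {
  const trend =
    m.change === undefined ? "" :
    m.change >= 0 ? ` ▲ ${m.change}%` : ` ▼ ${Math.abs(m.change)}%`;
  return `${m.title}: ${m.value}${m.unit ? " " + m.unit : ""}${trend}`;
}

// renderMetric({ title: "Error Rate", value: 0.12, unit: "%", change: -5.2 })
// → "Error Rate: 0.12 % ▼ 5.2%"
```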

metrics

Multiple KPIs side by side:
{"t":"metrics","items":[
  {"title":"DAU","value":12847,"unit":"users"},
  {"title":"WAU","value":45230,"unit":"users"},
  {"title":"MAU","value":128450,"unit":"users"}
]}
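
A similar sketch for laying out a metrics row, assuming a simple pipe-separated join for terminal output:

```ts
// Sketch: render a metrics block as one row of KPIs.
function renderMetrics(block: { items: { title: string; value: number; unit?: string }[] }): string {
  return block.items
    .map((m) => `${m.title} ${m.value.toLocaleString()}${m.unit ? " " + m.unit : ""}`)
    .join("  |  ");
}

// In an en-US locale, the example above renders as:
// "DAU 12,847 users  |  WAU 45,230 users  |  MAU 128,450 users"
```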

chart

Data visualization with five chart variants:
{"t":"chart","v":"line","title":"User Growth","data":[[1704067200000,10000],[1704153600000,10500],[1704240000000,11200]]}
  • v — chart variant: line, bar, area, pie, or sparkline
  • data — array of [x, y] pairs (x is typically a timestamp in milliseconds)
In the TUI, sparklines render as compact Unicode bars (▂▃▅▆▇).
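
One way to approximate that rendering is to scale the y values onto a fixed ramp of block characters; this is a sketch of the idea, not the TUI's implementation:

```ts
// Sketch: map [x, y] pairs to a Unicode sparkline by scaling y into a ramp.
const RAMP = ["▁", "▂", "▃", "▄", "▅", "▆", "▇", "█"];

function sparkline(data: [number, number][]): string {
  const ys = data.map(([, y]) => y);
  const min = Math.min(...ys);
  const max = Math.max(...ys);
  const span = max - min || 1;   // avoid divide-by-zero on flat data
  return ys
    .map((y) => RAMP[Math.min(RAMP.length - 1, Math.floor(((y - min) / span) * RAMP.length))])
    .join("");
}

// sparkline([[0, 10], [1, 15], [2, 12]]) → "▁█▄"
```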

table

Structured data with headers:
{"t":"table","cols":["Service","Requests","P99"],"rows":[["api",52000,"45ms"],["auth",31000,"23ms"]]}
Cell values can be strings or numbers.
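
A sketch of rendering a table block as aligned plain text, with column widths taken from the widest cell:

```ts
// Sketch: render a table block with padded, space-aligned columns.
function renderTable(t: { cols: string[]; rows: (string | number)[][] }): string {
  const all = [t.cols, ...t.rows.map((r) => r.map(String))];
  const widths = t.cols.map((_, i) => Math.max(...all.map((row) => row[i].length)));
  return all
    .map((row) => row.map((cell, i) => cell.padEnd(widths[i])).join("  "))
    .join("\n");
}
```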

status

Health indicators for systems or components:
{"t":"status","label":"API Gateway","state":"healthy","message":"All operational"}
States: healthy (green), degraded (yellow), down (red), unknown (gray).
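
A sketch of mapping those states to colored terminal indicators, assuming ANSI escape codes:

```ts
// Sketch: map status states to a colored terminal indicator (ANSI codes).
const STATE_STYLE: Record<string, { dot: string; color: string }> = {
  healthy:  { dot: "●", color: "\x1b[32m" },   // green
  degraded: { dot: "●", color: "\x1b[33m" },   // yellow
  down:     { dot: "●", color: "\x1b[31m" },   // red
  unknown:  { dot: "●", color: "\x1b[90m" },   // gray
};

function renderStatus(s: { label: string; state: string; message?: string }): string {
  const { dot, color } = STATE_STYLE[s.state] ?? STATE_STYLE.unknown;
  return `${color}${dot}\x1b[0m ${s.label}${s.message ? ": " + s.message : ""}`;
}
```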

callout

Highlighted notices:
{"t":"callout","v":"warning","title":"Rate Limit","c":"You're at 80% quota"}
Variants: info, warning, error, success.
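
A minimal sketch of a callout renderer for plain-text output; the bracketed prefix is an assumed presentation:

```ts
// Sketch: prefix a callout with its variant tag.
function renderCallout(b: { v: "info" | "warning" | "error" | "success"; title?: string; c: string }): string {
  return `[${b.v.toUpperCase()}] ${b.title ? b.title + ": " : ""}${b.c}`;
}

// → "[WARNING] Rate Limit: You're at 80% quota"
```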

code

Source code with optional syntax highlighting:
{"t":"code","lang":"sql","c":"SELECT event_name, COUNT(*) FROM events_v1 GROUP BY event_name"}

Streaming

Blocks stream as JSONL — one complete JSON object per line. The parser buffers partial data and emits blocks as each line completes. If a line can’t be parsed, it’s reported as an error without crashing the stream. This means LLM output renders progressively as it’s generated.
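
The sketch below illustrates that buffering behavior, assuming chunks arrive over an arbitrary text stream; the callback names are placeholders, not the parser's real API:

```ts
// Sketch: buffer partial chunks, emit a block per complete line, and
// report unparseable lines without stopping the stream.
class BlockStreamParser {
  private buffer = "";

  constructor(
    private onBlock: (block: unknown) => void,
    private onError: (line: string, err: unknown) => void,
  ) {}

  push(chunk: string): void {
    this.buffer += chunk;
    let newline: number;
    while ((newline = this.buffer.indexOf("\n")) >= 0) {
      const line = this.buffer.slice(0, newline).trim();
      this.buffer = this.buffer.slice(newline + 1);
      if (line === "") continue;
      try {
        this.onBlock(JSON.parse(line));
      } catch (err) {
        this.onError(line, err);   // bad line reported, stream continues
      }
    }
  }
}

// const parser = new BlockStreamParser(render, logError);
// stream.on("data", (chunk) => parser.push(chunk.toString()));
```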

Where blocks appear

  • tell ask — CLI responses rendered in the terminal
  • MCP tools — AI assistant responses in editors and chat interfaces
  • TUI — interactive dashboard views
  • Web dashboard — rendered as React components