AI Agent — Data Sources

The AI agent ingests log signals via pluggable sources. Every source implements the same contract — pull new signals since a cursor, return the next cursor — so they can be mixed freely in one deployment.

Source           Type string      Best for
File             file             Local files, container stdout via volume, fixtures
Elasticsearch    elasticsearch    ELK, Elastic Cloud, OpenSearch
Loki             loki             Grafana Loki self-hosted, Grafana Cloud Logs
CloudWatch Logs  cloudwatchlogs   AWS Lambda, ECS, EKS, EC2

How sources are configured

Sources are configured in a separate file, agent_sources.yaml, that sits next to your main config.yaml. The file is optional; when present, it REPLACES any inline agent.sources from the main config.

# agent_sources.yaml
sources:
  - name: my-source        # unique, used in cursor keys & admin views
    type: file             # one of: file | elasticsearch | loki | cloudwatchlogs
    enable: true
    file:                  # block name MUST match `type`
      path: /var/log/app.log

Multiple sources are supported — each runs on its own goroutine with an independent cursor.
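A mixed deployment could look like the following sketch. The second source's block keys (the `loki.url` field) and the hostname are assumptions for illustration, not documented options:

```yaml
# agent_sources.yaml — two sources running side by side
sources:
  - name: app-file
    type: file
    enable: true
    file:
      path: /var/log/app.log
  - name: prod-loki
    type: loki
    enable: true
    loki:
      url: http://loki:3100   # placeholder endpoint; field name assumed
```

Each entry gets its own goroutine and its own cursor key, so a slow or failing source never stalls the others.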

Cursor & ordering

Every source is cursor-based:

  • Pull(ctx, since) returns signals strictly after since, plus the new cursor the worker should pass back on the next tick.
  • The worker stores the cursor in Redis under versus:agent:cursor:<source> (RFC3339Nano timestamp) and falls back to in-memory state when Redis is unavailable. The file source uses a sidecar .cursor file with the byte offset instead.
  • On first start (no cursor), the agent backfills agent.lookback worth of history (default 5m).
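Assuming the main config nests this knob under an agent block (as the dotted agent.lookback name suggests — the exact layout is an assumption), the backfill window could be widened like so:

```yaml
# config.yaml (sketch — key layout inferred from the agent.lookback name)
agent:
  lookback: 15m   # history to backfill on first start; default is 5m
```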

This makes restarts safe: the agent resumes exactly where it left off, with no signal processed twice or skipped.

Try it locally

The runnable docker-compose example ships with Versus + Redis + Loki + Elasticsearch + Grafana + Kibana so you can experiment with all source types in a single docker compose up.