Use the `arc-atlas` CLI (or `python -m atlas.cli.export`) to convert those records into JSONL files that plug directly into the training stack (`load_runtime_traces`, `sessions_to_rl_records`).
1. Enable Postgres persistence
Add a `storage` block to your SDK config, then call `atlas.core.run(..., stream_progress=True)` as usual. Each session, step result, and intermediate event is written to Postgres.
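As a rough sketch, the persistence block might look like the following; the exact key names (such as `database_url`) are assumptions, so check your SDK configuration reference for the authoritative fields:

```yaml
# Hypothetical SDK config fragment — key names are illustrative assumptions.
storage:
  database_url: postgresql://atlas:atlas@localhost:5432/atlas
```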
2. Export to JSONL
Start Postgres before exporting (e.g., `docker compose up -d postgres` or `brew services start postgresql`) so the CLI can connect successfully. If another tool owns the `atlas` command on your system, run the exporter with `python -m atlas.cli.export ...` or adjust `PATH` so `arc-atlas` resolves first.

Optional filters:

- `--session-id 42` exports a single session.
- `--limit 25` pulls the most recent 25 sessions.
- `--pretty` emits indented JSON (useful for debugging, but larger on disk).
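A typical invocation combining the filters above might look like this; the connection and output flag names are assumptions for illustration, so confirm them with `arc-atlas --help`:

```shell
# Export the 25 most recent sessions as indented JSON.
# --database-url and --output are assumed flag names — verify locally.
arc-atlas \
  --database-url postgresql://atlas:atlas@localhost:5432/atlas \
  --output traces.jsonl \
  --limit 25 \
  --pretty
```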
Each line of the exported file is a single JSON object matching the `AtlasSessionTrace` schema.
Tip: Compress large exports with `xz` or `gzip`; the loader streams line-by-line, so you can decompress on the fly if desired.
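The streaming pattern behind that tip can be sketched with the standard library alone; the file name and the `session_id` field are illustrative placeholders, not part of the real export schema:

```python
import gzip
import json

def stream_traces(path):
    """Yield one session dict per line from a gzip-compressed JSONL export."""
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:  # skip blank lines defensively
                yield json.loads(line)

# Demo: write a tiny compressed export, then stream it back line-by-line.
with gzip.open("traces.jsonl.gz", "wt", encoding="utf-8") as fh:
    fh.write(json.dumps({"session_id": 1}) + "\n")
    fh.write(json.dumps({"session_id": 2}) + "\n")

sessions = list(stream_traces("traces.jsonl.gz"))
print([s["session_id"] for s in sessions])  # [1, 2]
```

Because each record occupies exactly one line, the whole file never needs to fit in memory.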
3. Feed the training stack
Point the training stack at the exported file via the runtime-traces data config (`configs/data/runtime_traces.yaml`) described in the top-level quickstart. The exported schema matches the training adapters, so no custom glue code is required.
Troubleshooting
| Error | Likely cause | Fix |
|---|---|---|
| `database connection refused` | Postgres URL unreachable | Verify host/port and ensure the server is running. |
| Empty JSONL file | No sessions stored | Confirm the `storage` block is enabled and runs completed successfully. |
| Missing rewards in JSON | Judges disabled | Ensure your `rim` block activates the judges you expect. |
Compatibility aliases
The `atlas-export` and `atlas.export` entry points remain for existing scripts, but new projects should prefer `arc-atlas` or `python -m atlas.cli.export`.