
# amodal chat

Open an interactive terminal chat with your agent. The chat UI is a React-based TUI built with Ink.

```shell
amodal chat
```

## Modes

### Local mode (default)

Boots a local runtime server from your repo and connects to it:

```shell
amodal chat
```

### Remote mode

Connect to an already-running server:

```shell
amodal chat --url http://localhost:3847
amodal chat --url https://my-agent.amodal.ai
```

### Snapshot mode

Load the agent from a snapshot file (built with `amodal build`):

```shell
amodal chat --config snapshot.json
```
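The snapshot workflow can be sketched as a two-step sequence. This is a hedged example: it assumes `amodal build` writes `snapshot.json` to the working directory by default; check `amodal build --help` for the actual output location and options.

```shell
# Build a snapshot of the agent from your repo
# (assumption: the default output file is snapshot.json)
amodal build

# Chat against the snapshot, skipping the local runtime server boot
amodal chat --config snapshot.json
```

Because the snapshot is a plain file, it can be committed, shared, or deployed and then loaded anywhere without access to the original repo.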

## Options

| Flag | Description |
| --- | --- |
| `--url <remote>` | Connect to a remote server |
| `--config <file>` | Load from a snapshot file |
| `--tenant-id <id>` | Tenant identifier |
| `--port <number>` | Local server port |
| `--resume <id\|latest>` | Resume a previous session |
| `--fullscreen` | Use the alternate terminal buffer |
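The flags above can be combined. A couple of illustrative invocations, using only flags documented in the table (the port number and tenant id are placeholder values):

```shell
# Resume the most recent session, rendered in the alternate terminal buffer
amodal chat --resume latest --fullscreen

# Boot the local server on a specific port for a specific tenant
# (4100 and acme-corp are hypothetical example values)
amodal chat --port 4100 --tenant-id acme-corp
```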

## Features

- **Streaming responses** — see the agent think in real time
- **Tool call display** — watch tool invocations as they happen
- **Skill activation** — see which reasoning framework is active
- **Session resume** — pick up where you left off
- **Session browser** — navigate previous conversations
- **Markdown rendering** — formatted output in the terminal
- **Responsive layout** — adapts to terminal size