
# Write an arka-deck worker

A worker is a single-shot headless LLM process. It receives structured JSON, produces structured JSON, and exits. No chat, no multi-turn, no direct user interface.


Workers handle tasks that need an LLM but no interaction: analyze a project context, draft a note, propose a squad composition.

Worker cycle:

  1. arka-deck’s system invokes it with a JSON payload (via the for-workers.ts port)
  2. The worker calls the configured LLM with its prompt + payload
  3. It returns structured JSON
  4. The parent addon receives the result and uses it
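The cycle above can be sketched in TypeScript. This is a hypothetical illustration of the contract, not arka-deck's actual internals: the names `invokeWorker` and `LlmFn` are assumptions.

```typescript
// Hypothetical sketch of the worker cycle (invokeWorker and LlmFn are
// illustrative names, not arka-deck's real API).
type LlmFn = (prompt: string, payload: unknown) => Promise<string>;

async function invokeWorker(
  prompt: string,
  payload: unknown,
  llm: LlmFn,
): Promise<unknown> {
  // Single-shot: one LLM call with prompt + payload, no follow-up turns.
  const raw = await llm(prompt, payload);
  // The runtime parses the completion back into structured JSON, then exits.
  return JSON.parse(raw);
}

// Usage with a stubbed LLM:
invokeWorker(
  "Suggest artifacts.",
  { project: { name: "demo" } },
  async () => '{"suggestions": []}',
).then((result) => console.log(result)); // logs { suggestions: [] }
```

The key point the sketch captures: the worker process owns only the prompt and payload; parsing and process lifecycle belong to the runtime.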

Existing workers and their parent addon:

| Worker | Parent addon | Status | Role |
| --- | --- | --- | --- |
| `cortex-favorites-suggester` | `cortex-actions` | active | Proposes 5-15 Cortex artifacts relevant to a project |
| `memory-note-writer` | `memory-local` | active | Drafts an L1-L5 memory note from a session transcript |
| `governance-policy-designer` | `gouvernance-lite` | active | Assists project governance policy design |
| `squad-creator` | `squad-leader` | active | Proposes a squad composition from a mission |

A worker is declared by a manifest.json. The runtime implementation (LLM invocation, JSON parsing) is handled by arka-deck — the worker itself contains only its declaration and prompt.

```
workers/<name>/
├── manifest.json   # required
├── prompt.json     # if the prompt is local (optional — otherwise via Cortex)
└──                 # worker description
```

```json
{
  "name": "my-worker",
  "version": "1.0.0",
  "description": "What the worker does in one sentence.",
  "status": "active",
  "input": "Expected JSON payload description.",
  "output": "Produced JSON description.",
  "addon_parent": "my-addon",
  "runtime": "claude-headless",
  "model": "haiku",
  "service": {
    "famille_cortex": "SERVICE",
    "sous_famille_cortex": "service/service/my_worker",
    "tool_name": "my_worker_template",
    "prompt_source": "cortex://atoms/my_worker_prompt",
    "prompt_atom_id": "atom:my_worker_prompt",
    "preferred_model": "haiku",
    "model_override_via": "Settings > Services (user config)"
  }
}
```
| Field | Type | Description |
| --- | --- | --- |
| `name` | string | kebab-case identifier — must match the folder name |
| `version` | string | Semver |
| `description` | string | What the worker does |
| `status` | string | `"active"` / `"draft"` |
| `input` | string | Input JSON description (prose format or pseudo-schema) |
| `output` | string | Output JSON description |
| `addon_parent` | string | Name of the addon that owns this worker (`null` if standalone) |
| `runtime` | string | See available runtimes below |
| `service.prompt_source` | string | Prompt source — see formats below |
| Runtime | Description |
| --- | --- |
| `claude-headless` | Claude call via Anthropic SDK — ideal for short analysis tasks |
| `llm-json` | Multi-provider LLM call in strict JSON mode — returns a parsed JSON object |
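To make the `llm-json` guarantee concrete, here is a minimal sketch of the kind of extraction such a runtime has to perform, since models sometimes wrap their JSON in prose or markdown fences. The helper `parseStrictJson` is a hypothetical illustration, not arka-deck's implementation.

```typescript
// Hypothetical sketch: extract and parse the JSON object from a raw
// LLM completion, tolerating markdown fences and surrounding prose.
function parseStrictJson(completion: string): Record<string, unknown> {
  // Strip optional ```json fences around the payload.
  const unfenced = completion.replace(/```(?:json)?/g, "").trim();
  // Fall back to the first {...} span if extra prose surrounds it.
  const start = unfenced.indexOf("{");
  const end = unfenced.lastIndexOf("}");
  if (start === -1 || end === -1) throw new Error("no JSON object found");
  return JSON.parse(unfenced.slice(start, end + 1));
}

parseStrictJson('Here you go:\n```json\n{"level": "L2"}\n```');
// → { level: "L2" }
```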
| Format | Example | Description |
| --- | --- | --- |
| `cortex://atoms/<id>` | `cortex://atoms/notewriter_note_prompt` | Public Cortex atom — fetched at runtime |
| Local path | `workers/squad-creator/prompt.json` | Local JSON file in the repo |

The Cortex format is preferable for active workers: the prompt is versioned in Cortex and can be updated without modifying the repo.
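Distinguishing the two formats is a simple prefix check. The sketch below shows one way to model it; the `PromptSource` type and `resolvePromptSource` name are assumptions for illustration, not arka-deck's actual API.

```typescript
// Hypothetical model of the two prompt_source formats documented above.
type PromptSource =
  | { kind: "cortex-atom"; atomId: string } // fetched at runtime from Cortex
  | { kind: "local-file"; path: string };   // JSON file in the repo

function resolvePromptSource(source: string): PromptSource {
  const cortexPrefix = "cortex://atoms/";
  if (source.startsWith(cortexPrefix)) {
    return { kind: "cortex-atom", atomId: source.slice(cortexPrefix.length) };
  }
  return { kind: "local-file", path: source };
}

resolvePromptSource("cortex://atoms/notewriter_note_prompt");
// → { kind: "cortex-atom", atomId: "notewriter_note_prompt" }
```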


```json
{
  "name": "cortex-favorites-suggester",
  "runtime": "claude-headless",
  "model": "haiku",
  "input": "JSON: { project: { name, description }, agents: [{label, sourceId}], recentSessions: [{summary, agentLabel}], memorySummary?: string, libraryTree: { modes, blocs: { types: { skill, expertise, tool, methode, scope } } } }",
  "output": "JSON: { suggestions: [{ ref: { type: 'atome'|'bloc', id, label }, justification }] } — 5 to 15 items"
}
```

```json
{
  "name": "memory-note-writer",
  "runtime": "llm-json",
  "model": "service-assigned",
  "input": "JSON: { projectPath, sessionId, agentId?, agentLabel?, transcript, recentNotes, profileMemory, projectMemory }",
  "output": "JSON: { level: 'L1'|'L2'|'L3'|'L4'|'L5', title, summary?, content, tags?, actions?, decisions? }"
}
```

```json
{
  "name": "squad-creator",
  "runtime": "llm-json",
  "model": "gemini-2.5-flash",
  "addon_parent": "squad-leader",
  "input": "Mission in free text, optional constraints, available profile catalogue",
  "output": "List of recommended agents (profile + justification), designated leader, strategy summary"
}
```

A worker must be declared in its parent addon’s manifest.json:

```json
{
  "name": "my-addon",
  "workers": ["my-worker"]
}
```

The addon invokes the worker via the for-workers.ts port (use-case side):

```typescript
const result = await deps.workers.invoke('my-worker', {
  projectPath,
  // ... JSON payload per the worker's manifest
});
```

The JSON result is directly usable by the addon.
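Since the worker cannot persist anything itself, the addon typically validates the result's shape before acting on it. A minimal sketch, using `cortex-favorites-suggester`'s documented output contract (5 to 15 suggestions); the `checkSuggestions` helper is illustrative, not part of arka-deck.

```typescript
// Shape mirrors cortex-favorites-suggester's documented output.
interface Suggestion {
  ref: { type: "atome" | "bloc"; id: string; label: string };
  justification: string;
}

// Hypothetical guard: the parent addon checks the contract (5-15 items)
// before persisting, since the worker cannot write .arka-deck/ itself.
function checkSuggestions(result: unknown): Suggestion[] {
  const items = (result as { suggestions?: unknown }).suggestions;
  if (!Array.isArray(items)) throw new Error("missing suggestions array");
  if (items.length < 5 || items.length > 15) {
    throw new Error(`expected 5-15 suggestions, got ${items.length}`);
  }
  return items as Suggestion[];
}
```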


The manifest declares a preferred model. The user can override it from Settings → Services.

| Case | Recommended model |
| --- | --- |
| Short analysis, classification task | `haiku` |
| Structured drafting, extraction | `gemini-2.5-flash` or `haiku` |
| Complex task, reasoning | `sonnet` |

Rule of thumb: start with `haiku`. A worker run is single-shot, and its cost scales directly with the chosen model.


  • A worker cannot start a multi-turn conversation
  • A worker cannot directly call another worker
  • A worker cannot write to .arka-deck/ — the parent addon persists the result
  • draft workers can evolve without notice