For AI agents: Read this file to learn how to install, configure, use, and extend openrappter. This is the single source of truth.
openrappter is a dual-runtime (Python + TypeScript) AI agent framework. It uses GitHub Copilot as the cloud AI backbone: your agent data (memory, config, state) stays local in ~/.openrappter/. Copilot handles inference; everything else runs on the user's machine.
- Repo: https://github.com/kody-w/openrappter
- License: MIT
- TypeScript Version: 1.8.0
- Python Version: 1.8.0
| Requirement | Check Command | Notes |
|---|---|---|
| Node.js 18+ | `node --version` | TypeScript runtime |
| Python 3.10+ | `python3 --version` | Python runtime |
| GitHub Copilot CLI | `copilot --version` | Required: provides AI-powered routing via the Copilot SDK |
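The version checks in the table can also be scripted. The helper below is a hypothetical sketch (not part of openrappter): it parses the first `X.Y.Z`-style version printed by a command and compares the major component against a minimum.

```python
import re
import subprocess

def meets_minimum(version_output: str, minimum_major: int) -> bool:
    """Parse the first X.Y.Z-style version in a command's output and
    check the major component against a minimum (e.g. Node 18+)."""
    match = re.search(r"(\d+)\.(\d+)(?:\.(\d+))?", version_output)
    if not match:
        return False
    return int(match.group(1)) >= minimum_major

def check(cmd: list[str], minimum_major: int) -> bool:
    """Run one of the check commands from the table; False if the tool
    is missing entirely."""
    try:
        out = subprocess.run(cmd, capture_output=True, text=True).stdout
    except FileNotFoundError:
        return False
    return meets_minimum(out, minimum_major)

# e.g. check(["node", "--version"], 18) and check(["python3", "--version"], 3)
```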
```bash
git clone https://github.com/kody-w/openrappter.git
cd openrappter
```

See data sloshing, agents, and chaining in action (no build step, no API keys):

```bash
./quickstart.sh
```

This installs TypeScript dependencies (if needed) and runs a guided 5-step tour showing data sloshing, ShellAgent, MemoryAgent, and agent-to-agent chaining. Takes ~5 seconds. After the demo, continue below for the full setup.
```bash
cd typescript
npm install
npm run build
```

```bash
cd python
pip install -e .

# If pip version is old or editable mode fails:
pip install .

# Or run directly without installing:
python3 -m openrappter.cli --status
```

To run openrappter fully, start all three components:
```bash
# Terminal 1: Interactive CLI (chat mode)
cd typescript
npm run dev

# Terminal 2: Gateway (WebSocket backend for UI)
cd typescript
npx tsx src/index.ts --daemon   # ws://127.0.0.1:18790

# Terminal 3: Web UI
cd typescript/ui
npm run dev                     # http://localhost:3000
```

The gateway must be running before the UI can connect. The CLI can run independently.
openrappter includes a web-based chat dashboard built with Lit + Vite that connects to the gateway via WebSocket.
The gateway is the WebSocket backend that the UI connects to. Start it first:
```bash
cd typescript
npm run build                 # Build first (if not done)
node dist/index.js --daemon   # Start gateway on ws://127.0.0.1:18790

# Or in development mode:
npx tsx src/index.ts --daemon
```

The gateway runs on port 18790 by default. Set OPENRAPPTER_PORT to change it.
In a second terminal:
```bash
cd typescript/ui
npm install   # First time only
npm run dev   # Starts Vite on http://localhost:3000
```

Open http://localhost:3000 in your browser. The UI auto-connects to the gateway and supports:
- Chat with streaming responses
- Markdown rendering in assistant messages
- Sidebar navigation (Chat, Channels, Sessions, Agents, Skills, Cron, Config, Devices, Health, Logs)
- Auto-reconnect if the gateway restarts
```bash
cd typescript/ui
npm run build   # Outputs to typescript/dist/ui/
```

| Variable | Default | Description |
|---|---|---|
| `OPENRAPPTER_PORT` | `18790` | Gateway WebSocket port |
| `OPENRAPPTER_TOKEN` | (none) | Auth token for gateway connections |
| `OPENRAPPTER_MODEL` | (default) | AI model override |
| `OPENRAPPTER_HOME` | `~/.openrappter` | Data directory for config, memory, skills |
| `OPENAI_API_KEY` | (none) | OpenAI provider API key (optional) |
| `ANTHROPIC_API_KEY` | (none) | Anthropic provider API key (optional) |
| `OLLAMA_URL` | (none) | Ollama server URL (optional) |
| `LOG_LEVEL` | (default) | Logging verbosity |
Run these commands after install. All must succeed before proceeding.
```bash
cd typescript

# Status check: expect "Agents: 2 loaded" (Shell + Memory; Assistant is the orchestrator)
node dist/index.js --status

# Memory store
node dist/index.js "remember that I installed openrappter"

# Memory recall
node dist/index.js "recall openrappter"

# Shell test
node dist/index.js "ls"
```

```bash
cd python

# Status check: expect agents_loaded: 7
python3 -m openrappter.cli --status

# List agents
python3 -m openrappter.cli --list-agents

# Memory test (use the --task flag; positional messages are TypeScript-only)
python3 -m openrappter.cli --task "remember that Python works"
```

```bash
node dist/index.js [options] [message]
```

```bash
openrappter [options]                 # If pip-installed
python3 -m openrappter.cli [options]  # Direct
```

| Option | Description |
|---|---|
| `[message]` | Send a single message (TypeScript only, as a positional arg) |
| `-t, --task <task>` | Run a task and exit |
| `-s, --status` | Show agent status |
| `--list-agents` | List all discovered agents |
| `--exec <agent> <query>` | Execute a specific agent directly |
| `-e, --evolve <n>` | Run N evolution ticks |
| `-d, --daemon` | Run as background daemon |
| `-v, --version` | Show version |
| `-h, --help` | Show help |
| `onboard` | Run interactive setup wizard (TypeScript) |
| Command | Description |
|---|---|
| `/help` | Show help |
| `/agents` | List available agents |
| `/status` | Show agent status |
| `/quit` | Exit |
| Agent | Name | Description |
|---|---|---|
| ManageMemory | ManageMemory | Stores important information to memory for future reference. Accepts content, importance, memory_type, tags. |
| ContextMemory | ContextMemory | Recalls and provides context based on stored memories of past interactions. |
| Shell | Shell | Executes shell commands and file operations. Actions: bash, read, write, list. |
| LearnNew | LearnNew | Creates new agents from natural language descriptions. Generates code, writes to agents/, and hot-loads. |
| Agent | Name | Description |
|---|---|---|
| Assistant | Assistant | Copilot SDK-powered orchestrator that routes user queries to agents via tool calling. |
| MemoryAgent | Memory | Stores and recalls facts in persistent memory. Actions: remember, recall, list, forget. |
| ShellAgent | Shell | Executes shell commands and file operations. Actions: bash, read, write, list. |
Agents are routed via the Copilot SDK using tool calling: the Assistant agent determines the best agent for each query.
```bash
# Memory keywords: remember, store, save, recall, memory, forget
openrappter --task "remember that the deploy command is npm run deploy"
openrappter --task "recall deploy"

# Shell keywords: run, execute, bash, ls, cat, read file, list dir
openrappter --task "ls"
openrappter --task "read README.md"

# Direct agent execution
openrappter --exec Shell "ls -la"
openrappter --exec ManageMemory "save this fact"

# TypeScript equivalents
node dist/index.js "remember my API endpoint is /v2/users"
node dist/index.js --exec Shell "ls"
```

Agents are auto-discovered by file naming convention. Drop a file in the agents/ directory and the registry finds it; no manual registration needed.
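To illustrate the keyword lists above, here is a toy router. It is purely hypothetical: the real Assistant routes via Copilot SDK tool calling, not keyword matching, and the keyword sets are taken from the comments above.

```python
MEMORY_KEYWORDS = {"remember", "store", "save", "recall", "memory", "forget"}
SHELL_KEYWORDS = {"run", "execute", "bash", "ls", "cat"}

def route_by_keywords(query: str) -> str:
    """Toy keyword-only fallback router; the real Assistant uses
    Copilot SDK tool calling to pick an agent."""
    words = set(query.lower().split())
    if words & MEMORY_KEYWORDS:
        return "Memory"
    if words & SHELL_KEYWORDS or query.strip().startswith(("read ", "list ")):
        return "Shell"
    return "Assistant"  # no keyword hit: let the orchestrator decide
```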
Create `python/openrappter/agents/my_agent.py`:

```python
from openrappter.agents.basic_agent import BasicAgent
import json

class MyAgent(BasicAgent):
    def __init__(self):
        self.name = 'MyAgent'
        self.metadata = {
            "name": self.name,
            "description": "Describe what this agent does",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {"type": "string", "description": "User input"}
                },
                "required": []
            }
        }
        super().__init__(name=self.name, metadata=self.metadata)

    def perform(self, **kwargs):
        query = kwargs.get('query', '')
        # self.context has enriched signals from data sloshing (see Section 10)
        return json.dumps({"status": "success", "result": query})
```

Create `typescript/src/agents/MyAgent.ts`:
```typescript
import { BasicAgent } from './BasicAgent.js';
import type { AgentMetadata } from './types.js';

export class MyAgent extends BasicAgent {
  constructor() {
    const metadata: AgentMetadata = {
      name: 'MyAgent',
      description: 'Describe what this agent does',
      parameters: {
        type: 'object',
        properties: {
          query: { type: 'string', description: 'User input' }
        },
        required: []
      }
    };
    super('MyAgent', metadata);
  }

  async perform(kwargs: Record<string, unknown>): Promise<string> {
    const query = kwargs.query as string;
    // this.context has enriched signals from data sloshing (see Section 10)
    return JSON.stringify({ status: 'success', result: query });
  }
}
```

After creating, rebuild TypeScript (`npm run build`); Python agents are hot-loaded automatically.
- Extend `BasicAgent`: do not implement from scratch
- Implement `perform()`: this is called by the orchestrator after context enrichment
- Return a JSON string: always `{"status": "success|error", ...}`
- Metadata format: OpenAI tools format with `name`, `description`, `parameters`
- File naming: `*_agent.py` (Python) or `*Agent.ts` (TypeScript) for auto-discovery
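The return-value contract above can be checked mechanically. The validator below is a hypothetical helper (not an openrappter API) that enforces "JSON string with a valid `status` field":

```python
import json

def validate_agent_result(raw: str) -> dict:
    """Check an agent's perform() return against the contract above:
    a JSON string encoding an object whose 'status' is 'success' or 'error'."""
    result = json.loads(raw)  # raises ValueError if not valid JSON
    if not isinstance(result, dict):
        raise TypeError("agent result must be a JSON object")
    if result.get("status") not in ("success", "error"):
        raise ValueError("missing or invalid 'status' field")
    return result
```

Running it over `MyAgent().perform(query="hi")` from the example above would return the decoded dict, while a payload without `status` raises.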
The LearnNew agent can create agents from natural language:
```bash
openrappter --exec LearnNew "create an agent that fetches weather data"
# Generates code, writes to agents/, hot-loads, installs dependencies if needed
```

Memory persists across sessions. Python stores memory in `~/.openrappter/memory.json`. TypeScript uses SQLite at `~/.openrappter/memory.db`.
```bash
# Python
openrappter --task "remember that the database is PostgreSQL"

# TypeScript
node dist/index.js "remember that the database is PostgreSQL"
```

```bash
openrappter --task "recall database"
node dist/index.js "recall database"
```

```bash
node dist/index.js "forget database"
```

```json
{
  "mem_1707100000000": {
    "message": "the database is PostgreSQL",
    "theme": "general",
    "timestamp": "2025-02-05T10:00:00.000Z"
  }
}
```

| File | Purpose |
|---|---|
| `~/.openrappter/config.json` | Configuration settings (also supports config.json5, config.yaml) |
| `~/.openrappter/memory.json` | Persistent memory store (Python) |
| `~/.openrappter/memory.db` | Persistent memory store (TypeScript, SQLite) |
| `~/.openrappter/state.json` | Agent state data |
| `~/.openrappter/skills/` | Installed ClawHub/RappterHub skills |
| `~/.openrappter/sessions/` | Session transcripts (JSONL) |
| `~/.openrappter/workspace/` | Per-agent workspaces |
Every agent call is automatically enriched with contextual signals before perform() runs. Agents never execute "blind." Access via self.context (Python) or this.context (TypeScript).
| Signal | Keys | Description |
|---|---|---|
| Temporal | time_of_day, day_of_week, is_weekend, quarter, fiscal, likely_activity, is_urgent_period | Time awareness |
| Query Signals | specificity, hints, word_count, is_question, has_id_pattern | What the user is asking |
| Memory Echoes | message, theme, relevance | Relevant past interactions |
| Behavioral | prefers_brief, technical_level, frequent_entities | User patterns |
| Orientation | confidence, approach, hints, response_style | Synthesized action guidance |
| Upstream Slush | source_agent, plus agent-declared signals | Live data from the previous agent in a chain |
```python
# Python: in perform()
time = self.get_signal('temporal.time_of_day')
confidence = self.get_signal('orientation.confidence')
is_brief = self.get_signal('behavioral.prefers_brief', False)

# Access upstream agent signals (when chained)
upstream = self.context.get('upstream_slush', {})
prev_agent = upstream.get('source_agent')
```

```typescript
// TypeScript: in perform()
const time = this.getSignal('temporal.time_of_day');
const confidence = this.getSignal('orientation.confidence');
const isBrief = this.getSignal('behavioral.prefers_brief', false);

// Access upstream agent signals (when chained)
const upstream = this.context?.upstream_slush;
const prevAgent = upstream?.source_agent;
```

Agents can return a `data_slush` field in their JSON output: curated signals extracted from live results. The framework automatically extracts this and makes it available for downstream chaining.
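The dotted-path lookup that `get_signal()` performs can be sketched as follows. This is an assumption about its behavior (default on a missing path, dot-separated keys), not the actual BasicAgent implementation:

```python
def get_signal(context: dict, path: str, default=None):
    """Dotted-path lookup over the enriched context, e.g.
    'temporal.time_of_day' reads context['temporal']['time_of_day'].
    A sketch of what get_signal() plausibly does; the real version
    is a method on BasicAgent."""
    node = context
    for part in path.split("."):
        if not isinstance(node, dict) or part not in node:
            return default
        node = node[part]
    return node
```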
```python
# Agent A: return data_slush with curated signals
def perform(self, **kwargs):
    weather = fetch_weather(kwargs.get('query'))
    return json.dumps({
        "status": "success",
        "result": weather,
        "data_slush": {   # curated signal package
            "source_agent": self.name,
            "temp_f": 65,
            "condition": "cloudy",
        }
    })

# Chain: A's data_slush feeds into B's context
result_a = agent_a.execute(query="Smyrna GA")
result_b = agent_b.execute(
    query="...",
    upstream_slush=agent_a.last_data_slush  # B sees A's signals
)
# Inside B: self.context['upstream_slush'] == {"source_agent": "WeatherPoet", "temp_f": 65, ...}
```

```
User Input → execute() → slosh() enriches context → merge upstream_slush
           → perform() → extract data_slush → last_data_slush → next agent
```
```bash
# Search agents
openrappter rappterhub search "git automation"

# Install an agent
openrappter rappterhub install kody-w/git-helper

# List installed
openrappter rappterhub list

# Uninstall
openrappter rappterhub uninstall kody-w/git-helper
```

openrappter is compatible with ClawHub skills from OpenClaw:

```bash
openrappter clawhub search "productivity"
openrappter clawhub install author/skill-name
openrappter clawhub list
```

Installed skills are loaded from `~/.openrappter/skills/` and prefixed with `skill:` in the agent registry.
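A minimal sketch of that loading step, assuming skills live one-per-directory under `~/.openrappter/skills/` (the function name is hypothetical, and the real loader presumably also reads each skill's manifest):

```python
from pathlib import Path

def discover_skills(skills_dir: Path) -> list[str]:
    """List installed skill directories and prefix each with 'skill:'
    as the names appear in the agent registry."""
    if not skills_dir.is_dir():
        return []
    return sorted(f"skill:{p.name}" for p in skills_dir.iterdir() if p.is_dir())
```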
```
┌────────────────────────────────────────────────────────────┐
│  User Input (CLI / Web UI / Channels)                      │
└──────────────────────────┬─────────────────────────────────┘
                           ▼
┌────────────────────────────────────────────────────────────┐
│  Assistant (Copilot SDK tool-calling orchestrator)         │
│  Routes queries to agents via @github/copilot-sdk          │
└──────────────────────────┬─────────────────────────────────┘
                           ▼
┌────────────────────────────────────────────────────────────┐
│  Agent Registry (auto-discovery from agents/ directory)    │
│  Python: *_agent.py    TypeScript: *Agent.ts               │
└──────────────────────────┬─────────────────────────────────┘
                           ▼
┌────────────────────────────────────────────────────────────┐
│  Data Sloshing (context enrichment layer)                  │
│  Temporal + Memory + Behavioral + Query signals            │
│  + upstream data_slush from previous agent                 │
└──────────────────────────┬─────────────────────────────────┘
                           ▼
┌────────────────────────────────────────────────────────────┐
│  Agent.perform(): executes with enriched context           │
└──────────────────────────┬─────────────────────────────────┘
                           ▼
┌─────────────────────┐    ┌────────────────────────────────┐
│ GitHub Copilot SDK  │    │ ~/.openrappter/                │
│ (cloud AI backbone) │    │ config | memory | sessions     │
│ Inference layer     │    │ Local-first data storage       │
└─────────────────────┘    └────────────────────────────────┘
```
```
openrappter/
├── python/
│   ├── openrappter/
│   │   ├── cli.py                      # Entry point & orchestrator
│   │   ├── clawhub.py                  # ClawHub compatibility
│   │   ├── rappterhub.py               # RappterHub client
│   │   └── agents/
│   │       ├── basic_agent.py          # Base class (extend this)
│   │       ├── shell_agent.py          # Shell commands
│   │       ├── manage_memory_agent.py  # Store memories
│   │       ├── context_memory_agent.py # Recall memories
│   │       └── learn_new_agent.py      # Generate new agents
│   └── pyproject.toml
├── typescript/
│   ├── src/
│   │   ├── index.ts                    # Entry point
│   │   ├── config/                     # YAML/JSON/JSON5 config with Zod validation
│   │   ├── gateway/                    # WebSocket gateway server
│   │   ├── memory/                     # Content chunker, embeddings, hybrid search
│   │   ├── channels/                   # CLI, Slack, Discord, Telegram, Signal, iMessage, etc.
│   │   ├── providers/                  # Model integrations (Anthropic, OpenAI, Ollama, Copilot)
│   │   ├── storage/                    # SQLite & in-memory storage adapters
│   │   └── agents/
│   │       ├── BasicAgent.ts           # Base class (extend this)
│   │       ├── Assistant.ts            # Copilot SDK orchestrator
│   │       ├── AgentRegistry.ts        # Auto-discovery
│   │       ├── ShellAgent.ts           # Shell commands
│   │       ├── MemoryAgent.ts          # Memory store/recall
│   │       ├── broadcast.ts            # Multi-agent broadcast (all/race/fallback)
│   │       ├── router.ts               # Rule-based agent routing
│   │       ├── subagent.ts             # Nested agent invocation
│   │       └── types.ts                # Shared type definitions
│   ├── ui/                             # Web dashboard (Lit + Vite)
│   │   ├── src/
│   │   │   ├── main.ts                 # UI entry point
│   │   │   ├── components/             # Lit web components (app, chat, sidebar, etc.)
│   │   │   └── services/               # Gateway client, markdown renderer
│   │   ├── package.json
│   │   └── vite.config.ts
│   ├── package.json
│   └── tsconfig.json
├── docs/                               # GitHub Pages site
└── skills.md                           # This file
```
```bash
cd typescript
rm -rf node_modules dist
npm install
npm run build
```

```bash
cd python
pip install -e .

# If editable mode fails (old pip):
pip install .

# Or run directly:
python3 -m openrappter.cli --status
```

The pyproject.toml requires Python >=3.10. If the system Python is older, use Homebrew or pyenv:
```bash
# macOS
brew install python@3.11
/opt/homebrew/bin/python3.11 -m pip install .

# Or use pyenv
pyenv install 3.11
pyenv local 3.11
```

openrappter requires the Copilot SDK for AI-powered agent routing:

```bash
npm install -g @githubnext/github-copilot-cli
github-copilot-cli auth
```

```bash
# Reset memory
rm ~/.openrappter/memory.json

# Reset all config
rm -rf ~/.openrappter
```

openrappter supports multiple config file formats, loaded from ~/.openrappter/:
| Format | File | Notes |
|---|---|---|
| JSON5 | `config.json5` | Primary; supports comments and trailing commas |
| YAML | `config.yaml` | Alternative |
| JSON | `config.json` | Fallback |
| Profile | `config.{profile}.json5` | Per-profile overrides |
Config files support ${VAR_NAME} environment variable substitution. All schemas are validated with Zod.
| Section | Purpose |
|---|---|
| `models` | AI model provider configuration (copilot, anthropic, openai, ollama, gemini, bedrock) |
| `agents` | Agent-specific settings, workspaces, sandbox options |
| `channels` | Enable/configure messaging channels (iMessage, Slack, Discord, Telegram, etc.) |
| `gateway` | WebSocket server settings (port, bind, auth) |
| `memory` | Memory provider and chunking settings |
| `cron` | Scheduled task configuration |
```bash
# Install & demo
git clone https://github.com/kody-w/openrappter.git && cd openrappter
./quickstart.sh                                  # Guided demo (no build needed)
cd typescript && npm install && npm run build    # Full TypeScript build
cd ../python && pip install .                    # Python

# Start everything (three terminals)
cd typescript && npm run dev                     # Terminal 1: CLI chat
cd typescript && npx tsx src/index.ts --daemon   # Terminal 2: gateway at ws://127.0.0.1:18790
cd typescript/ui && npm install && npm run dev   # Terminal 3: UI at http://localhost:3000

# Status
node dist/index.js --status                      # TypeScript
python3 -m openrappter.cli --status              # Python

# Use
node dist/index.js "remember X"                  # Store memory
node dist/index.js "recall X"                    # Recall memory
node dist/index.js "ls"                          # Shell command
openrappter --task "remember X"                  # Python memory
openrappter --exec Shell "ls"                    # Python shell
openrappter --list-agents                        # List agents

# Create agents
# Python: add *_agent.py to python/openrappter/agents/
# TypeScript: add *Agent.ts to typescript/src/agents/, rebuild

# RappterHub
openrappter rappterhub search "topic"
openrappter rappterhub install author/agent
```

Agent Notes:
- Both runtimes can be installed independently; pick the user's preferred language
- TypeScript must be rebuilt after adding new agents (`npm run build`)
- Python agents are hot-loaded automatically
- All agent data is local-first in `~/.openrappter/`
- Copilot provides cloud AI inference; no separate API keys needed
- This skill should complete in under 5 minutes