by macstenk
skill-exporter – OpenClaw Skill
skill-exporter is an OpenClaw Skills integration for coding workflows. Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration.
Skill Snapshot
| Field | Value |
|---|---|
| name | skill-exporter |
| description | Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration. OpenClaw Skills integration. |
| owner | macstenk |
| repository | macstenk/skill-exporter |
| language | Markdown |
| license | MIT |
| topics | |
| security | L1 |
| install | `openclaw add @macstenk/skill-exporter` |
| last updated | Feb 7, 2026 |
Maintainer
macstenk

SKILL.md frontmatter:

    name: skill-exporter
    description: Export Clawdbot skills as standalone, deployable microservices. Use when you want to dockerize a skill, deploy it to Railway or Fly.io, or create an independent API service. Generates Dockerfile, FastAPI wrapper, requirements.txt, deployment configs, and optional LLM client integration.
    license: MIT
    compatibility: Requires python3. Works with any AgentSkills-compatible agent.
    metadata:
      author: MacStenk
      version: "1.0.0"
    clawdbot:
      emoji: "📦"
      requires:
        bins:
          - python3
Skill Exporter
Transform Clawdbot skills into standalone, deployable microservices.
Workflow
Clawdbot Skill (tested & working)
↓
skill-exporter
↓
Standalone Microservice
↓
Railway / Fly.io / Docker
Usage
Export a skill
python3 {baseDir}/scripts/export.py \
--skill ~/.clawdbot/skills/instagram \
--target railway \
--llm anthropic \
--output ~/projects/instagram-service
Options
| Flag | Description | Default |
|---|---|---|
| `--skill` | Path to skill directory | required |
| `--target` | Deployment target: railway, fly, docker | docker |
| `--llm` | LLM provider: anthropic, openai, none | none |
| `--output` | Output directory | ./<skill-name>-service |
| `--port` | API port | 8000 |
Targets
`railway` → Generates `railway.json`, optimized Dockerfile, health checks
`fly` → Generates `fly.toml`, multi-region ready
`docker` → Generic Dockerfile, `docker-compose.yml`
LLM Integration
When --llm is set, generates llm_client.py with:
- Caption/prompt generation
- Decision making helpers
- Rate limiting and error handling
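The generated `llm_client.py` is not shown in this listing, so the following is only a hypothetical sketch of the rate-limiting and error-handling helpers it describes. The class name, parameters, and the injected `call_fn` are illustrative assumptions, not the actual generated code:

```python
import time

class RateLimitedClient:
    """Hypothetical sketch of helpers a generated llm_client.py might
    contain: spacing out requests plus retry-with-backoff around a
    provider call. The provider call is injected so the sketch stays
    provider-agnostic and testable."""

    def __init__(self, call_fn, min_interval=1.0, max_retries=3):
        self.call_fn = call_fn            # actual LLM call, injected
        self.min_interval = min_interval  # seconds between requests
        self.max_retries = max_retries
        self._last_call = 0.0

    def generate(self, prompt):
        # Rate limiting: wait until min_interval has passed since the last call
        wait = self.min_interval - (time.monotonic() - self._last_call)
        if wait > 0:
            time.sleep(wait)
        # Error handling: retry with exponential backoff, re-raise on final failure
        for attempt in range(self.max_retries):
            try:
                self._last_call = time.monotonic()
                return self.call_fn(prompt)
            except Exception:
                if attempt == self.max_retries - 1:
                    raise
                time.sleep(2 ** attempt * 0.1)

# Demo with a stub in place of a real provider call
client = RateLimitedClient(lambda p: f"caption for: {p}", min_interval=0.0)
result = client.generate("sunset photo")
```

Injecting the provider call keeps the rate-limit and retry logic independent of whether `--llm` selected anthropic or openai.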
What Gets Generated
<skill>-service/
├── Dockerfile
├── docker-compose.yml
├── api.py              # FastAPI wrapper
├── llm_client.py       # If --llm specified
├── requirements.txt
├── .env.example
├── railway.json        # If --target railway
├── fly.toml            # If --target fly
└── scripts/            # Copied from original skill
    └── *.py
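The generated `api.py` is described as a FastAPI wrapper around the copied `scripts/`. The wrapper itself is not shown here, so this sketch covers only a plausible invocation core, using an assumed stdin/stdout JSON convention between the wrapper and a skill script; the function name and convention are illustrative, not the actual generated code:

```python
import json
import os
import subprocess
import sys
import tempfile

def run_skill_script(script_path, payload):
    """Assumed invocation layer: send the request payload to the skill
    script as JSON on stdin, parse JSON from its stdout. A FastAPI
    route handler could call this and return the result directly."""
    proc = subprocess.run(
        [sys.executable, script_path],
        input=json.dumps(payload),
        capture_output=True,
        text=True,
        check=True,  # raise if the script exits non-zero
    )
    return json.loads(proc.stdout)

# Demo: a throwaway stand-in for a copied skill script
stub = (
    "import sys, json; "
    "data = json.load(sys.stdin); "
    "print(json.dumps({'echo': data['msg']}))"
)
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write(stub)
    path = f.name
out = run_skill_script(path, {"msg": "hi"})
os.unlink(path)
```

Keeping the script boundary at the process level means the exported service does not need the original skill's code to be importable, only runnable.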
Requirements
The source skill must have:
- `SKILL.md` with valid frontmatter
- At least one script in `scripts/`
- Scripts should be callable (functions, not just inline code)
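To illustrate the "callable, not just inline code" requirement, here is a hypothetical skill script (names are invented for this example): the logic lives in a function a wrapper can import and call, while a `__main__` guard keeps it usable from the CLI.

```python
# Hypothetical scripts/caption.py inside a source skill.

def generate_caption(image_path: str, tone: str = "casual") -> str:
    """Build a caption for the given image (stub logic for illustration)."""
    return f"A {tone} caption for {image_path}"

if __name__ == "__main__":
    # Still runnable directly inside Clawdbot; an exported API wrapper
    # would instead import and call generate_caption().
    print(generate_caption("photo.jpg"))
```

A script that only runs top-level statements has nothing for the generated wrapper to expose as an endpoint.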
Post-Export
- Copy `.env.example` to `.env` and fill in secrets
- Test locally: `docker-compose up`
- Deploy: `railway up` or `fly deploy`
Permissions & Security
Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.
FAQ
How do I install skill-exporter?
Run `openclaw add @macstenk/skill-exporter` in your terminal. This installs skill-exporter into your OpenClaw Skills catalog.
Does this skill run locally or in the cloud?
OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.
Where can I verify the source code?
The source repository is available at https://github.com/openclaw/skills/tree/main/skills/macstenk/skill-exporter. Review commits and README documentation before installing.
