inference-sh – OpenClaw Skill
inference-sh is an OpenClaw Skills integration for coding workflows.
6.9k stars · 3.8k forks · Security L1
Updated Feb 7, 2026 · Created Feb 7, 2026 · Topic: coding
Skill Snapshot
| Field | Value |
|---|---|
| name | inference-sh |
| description | OpenClaw Skills integration. |
| owner | okaris |
| repository | okaris/inference-sh (path: inference-sh) |
| language | Markdown |
| license | MIT |
| topics | coding |
| security | L1 |
| install | openclaw add @okaris/inference-sh:inference-sh |
| last updated | Feb 7, 2026 |
Maintainer
okaris
File Explorer
6 files
- inference-sh/
  - references/
    - app-discovery.md (1.7 KB)
    - authentication.md (804 B)
    - cli-reference.md (2.7 KB)
    - running-apps.md (2.5 KB)
  - SKILL.md (3.1 KB)
SKILL.md
```yaml
name: inference-sh
description: |
  Run 100+ AI models via inference.sh CLI - image generation, video creation,
  TTS, music, transcription, and more. Use when running AI apps, generating
  images/videos/audio, or working with models like FLUX, Veo, Whisper, SDXL.
  Triggers: inference.sh, infsh, ai model, run ai, serverless ai, ai api
allowed-tools: Bash(infsh *)
```
inference.sh
Run AI models in the cloud with a simple CLI. No GPU required.
Quick Start
```sh
# Install CLI
curl -fsSL https://cli.inference.sh | sh

# Login
infsh login

# Run an image generation app
infsh app run falai/flux-dev-lora --input '{"prompt": "a cat astronaut"}'
```
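When driving the CLI from a script, it helps to build the `infsh app run` invocation in one place so the input JSON is always serialized correctly. This is a sketch, not part of the skill itself; it assumes only the `infsh app run <app> --input <json>` and `--no-wait` interface shown above.

```python
import json
import subprocess

def build_run_cmd(app, inputs, wait=True):
    """Build the argv for `infsh app run`, passing inputs as inline JSON."""
    cmd = ["infsh", "app", "run", app, "--input", json.dumps(inputs)]
    if not wait:
        cmd.append("--no-wait")  # return immediately instead of waiting for the result
    return cmd

def run_app(app, inputs, wait=True):
    """Invoke the CLI and return its raw stdout (requires a prior `infsh login`)."""
    result = subprocess.run(build_run_cmd(app, inputs, wait),
                            capture_output=True, text=True, check=True)
    return result.stdout

# The same call as the quick start above, expressed as an argv list:
cmd = build_run_cmd("falai/flux-dev-lora", {"prompt": "a cat astronaut"})
```

Serializing with `json.dumps` avoids shell-quoting mistakes that creep in when the inline JSON is assembled by string concatenation.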
Quick Reference
| Task | Command |
|---|---|
| Install CLI | `curl -fsSL https://cli.inference.sh \| sh` |
| Login | `infsh login` |
| Check auth | `infsh me` |
| List all apps | `infsh app list` |
| Search apps | `infsh app list --search "flux"` |
| Filter by category | `infsh app list --category image` |
| Get app details | `infsh app get falai/flux-dev-lora` |
| Generate sample input | `infsh app sample falai/flux-dev-lora --save input.json` |
| Run app | `infsh app run falai/flux-dev-lora --input input.json` |
| Run with inline JSON | `infsh app run falai/flux-dev-lora --input '{"prompt": "hello"}'` |
| Run without waiting | `infsh app run <app> --input input.json --no-wait` |
| Check task status | `infsh task get <task-id>` |
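For long-running jobs, `--no-wait` plus `infsh task get` supports a simple polling loop. The sketch below assumes only those two commands from the table; the status names and the way the status appears in `infsh task get` output are not documented here, so the parsing is illustrative, not authoritative.

```python
import subprocess
import time

# Assumed terminal status names; check your CLI's actual output.
TERMINAL = {"completed", "succeeded", "failed", "cancelled"}

def is_terminal(status):
    """True once a task status means polling can stop (status names are assumptions)."""
    return status.lower() in TERMINAL

def poll_task(task_id, interval=5.0):
    """Poll `infsh task get <task-id>` until the task reaches a terminal state."""
    while True:
        out = subprocess.run(["infsh", "task", "get", task_id],
                             capture_output=True, text=True, check=True).stdout
        # Hypothetical: scan the output for a terminal status token.
        status = next((word for word in out.split() if is_terminal(word)), None)
        if status:
            return status
        time.sleep(interval)
```

In practice you would pair this with `--no-wait`: submit the run, capture the task id it prints, then hand the id to `poll_task`.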
Categories
| Category | Command | Examples |
|---|---|---|
| Image | `infsh app list --category image` | FLUX, SDXL, Gemini, Grok, Seedream |
| Video | `infsh app list --category video` | Veo, Seedance, Wan, LTX, OmniHuman |
| Audio | `infsh app list --category audio` | TTS, Whisper, music generation |
| Text | `infsh app list --category text` | Search, OCR, code execution |
Workflow
1. Find an app: `infsh app list --search "your query"`
2. Get details: `infsh app get user/app-name`
3. Generate sample: `infsh app sample user/app-name --save input.json`
4. Edit input: modify `input.json` as needed
5. Run: `infsh app run user/app-name --input input.json`
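The steps above can be scripted end to end: generate a sample input, override the fields you care about, and run. This sketch assumes only the commands listed in the workflow; `merge_overrides` is a hypothetical helper for the "edit input" step, and `user/app-name` stands in for a real app id.

```python
import json
import subprocess

def sample_input(app, path="input.json"):
    """Step 3: have the CLI write a sample input file, then load it."""
    subprocess.run(["infsh", "app", "sample", app, "--save", path], check=True)
    with open(path) as f:
        return json.load(f)

def merge_overrides(sample, overrides):
    """Step 4 (hypothetical helper): shallow-merge your edits into the sample."""
    merged = dict(sample)
    merged.update(overrides)
    return merged

def run_workflow(app, overrides, path="input.json"):
    """Steps 3-5: sample, edit, run."""
    merged = merge_overrides(sample_input(app, path), overrides)
    with open(path, "w") as f:
        json.dump(merged, f, indent=2)
    subprocess.run(["infsh", "app", "run", app, "--input", path], check=True)
```

Starting from the CLI's own sample (rather than hand-writing the JSON) keeps required fields you might not know about intact.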
Related Skills
```sh
# Image generation (FLUX, Gemini, Grok, Seedream)
npx skills add inference-sh/skills@ai-image-generation

# Video generation (Veo, Seedance, Wan, OmniHuman)
npx skills add inference-sh/skills@ai-video-generation

# LLMs (Claude, Gemini, Kimi, GLM via OpenRouter)
npx skills add inference-sh/skills@llm-models

# Web search (Tavily, Exa)
npx skills add inference-sh/skills@web-search

# AI avatars & lipsync (OmniHuman, Fabric, PixVerse)
npx skills add inference-sh/skills@ai-avatar-video

# Twitter/X automation
npx skills add inference-sh/skills@twitter-automation

# Model-specific
npx skills add inference-sh/skills@flux-image
npx skills add inference-sh/skills@google-veo

# Utilities
npx skills add inference-sh/skills@image-upscaling
npx skills add inference-sh/skills@background-removal
```
Reference Files
README.md
No README available.
Permissions & Security
Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.
Requirements
- OpenClaw CLI installed and configured.
- Language: Markdown
- License: MIT
- Topics: coding
FAQ
How do I install inference-sh?
Run `openclaw add @okaris/inference-sh:inference-sh` in your terminal. This installs inference-sh into your OpenClaw Skills catalog.
Does this skill run locally or in the cloud?
OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.
Where can I verify the source code?
The source repository is available at https://github.com/openclaw/skills/tree/main/skills/okaris/inference-sh. Review commits and README documentation before installing.
