inference-sh – OpenClaw Skill

by okaris

inference-sh is an OpenClaw Skills integration for coding workflows.

6.9k stars · 3.8k forks · Security L1
Updated Feb 7, 2026 · Created Feb 7, 2026 · Topic: coding

Skill Snapshot

  name:         inference-sh
  description:  OpenClaw Skills integration.
  owner:        okaris
  repository:   okaris/inference-sh (path: inference-sh)
  language:     Markdown
  license:      MIT
  topics:       coding
  security:     L1
  install:      openclaw add @okaris/inference-sh:inference-sh
  last updated: Feb 7, 2026

Maintainer

okaris maintains inference-sh in the OpenClaw Skills directory.
File Explorer (6 files)

  inference-sh/
    references/
      app-discovery.md   (1.7 KB)
      authentication.md  (804 B)
      cli-reference.md   (2.7 KB)
      running-apps.md    (2.5 KB)
    SKILL.md             (3.1 KB)
SKILL.md

---
name: inference-sh
description: |
  Run 100+ AI models via inference.sh CLI - image generation, video creation,
  TTS, music, transcription, and more. Use when running AI apps, generating
  images/videos/audio, or working with models like FLUX, Veo, Whisper, SDXL.
  Triggers: inference.sh, infsh, ai model, run ai, serverless ai, ai api
allowed-tools: Bash(infsh *)
---

inference.sh

Run AI models in the cloud with a simple CLI. No GPU required.

Quick Start

# Install CLI
curl -fsSL https://cli.inference.sh | sh

# Login
infsh login

# Run an image generation app
infsh app run falai/flux-dev-lora --input '{"prompt": "a cat astronaut"}'

Quick Reference

Task                   Command
Install CLI            curl -fsSL https://cli.inference.sh | sh
Login                  infsh login
Check auth             infsh me
List all apps          infsh app list
Search apps            infsh app list --search "flux"
Filter by category     infsh app list --category image
Get app details        infsh app get falai/flux-dev-lora
Generate sample input  infsh app sample falai/flux-dev-lora --save input.json
Run app                infsh app run falai/flux-dev-lora --input input.json
Run with inline JSON   infsh app run falai/flux-dev-lora --input '{"prompt": "hello"}'
Run without waiting    infsh app run <app> --input input.json --no-wait
Check task status      infsh task get <task-id>
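Running without waiting pairs naturally with polling: submit with --no-wait, then check the task until it finishes. The output format of `infsh task get` isn't documented in this snapshot, so the helper below is a generic sketch (the function name `poll_status` and the "completed" status string are assumptions), demonstrated with a stub standing in for the real CLI call.

```shell
# Hypothetical helper: re-run a status command until it prints "completed".
# With the real CLI, the command would be something like:
#   infsh task get <task-id>
# (its exact output format is an assumption; check the CLI reference).
poll_status() {
  cmd="$1"
  tries="${2:-10}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    status="$($cmd)"
    if [ "$status" = "completed" ]; then
      echo "task completed"
      return 0
    fi
    i=$((i + 1))
    # sleep 5   # back off between polls in real use
  done
  echo "timed out" >&2
  return 1
}

# Demo with a stub standing in for the real CLI call:
poll_status "echo completed"
```

In real use you would capture the task id printed by `infsh app run ... --no-wait` and pass `infsh task get <task-id>` as the command.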

Categories

Category  Command                          Examples
Image     infsh app list --category image  FLUX, SDXL, Gemini, Grok, Seedream
Video     infsh app list --category video  Veo, Seedance, Wan, LTX, OmniHuman
Audio     infsh app list --category audio  TTS, Whisper, music generation
Text      infsh app list --category text   Search, OCR, code execution
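The category table above maps directly to a small wrapper. The sketch below (the `list_category` function is hypothetical, not part of the CLI) validates the category name and prints, rather than runs, the corresponding command so it works without `infsh` installed.

```shell
# Hypothetical convenience wrapper around the category listings above.
# list_category image  ->  infsh app list --category image
list_category() {
  case "$1" in
    image|video|audio|text)
      # Print the command instead of executing it, so the sketch
      # works even where the infsh CLI is not installed.
      echo "infsh app list --category $1"
      ;;
    *)
      echo "unknown category: $1" >&2
      return 1
      ;;
  esac
}

list_category video
```

To actually run the listing, replace `echo` with `exec` or pipe the printed command to `sh`.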
Workflow

  1. Find an app: infsh app list --search "your query"
  2. Get details: infsh app get user/app-name
  3. Generate sample: infsh app sample user/app-name --save input.json
  4. Edit input: modify input.json as needed
  5. Run: infsh app run user/app-name --input input.json
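Step 4 is the only manual step, and a typo there wastes a cloud run. A minimal sketch of steps 4-5, assuming the app accepts a `prompt` field (the authoritative schema comes from `infsh app sample`): hand-edit the file, then validate it locally before submitting.

```shell
# Sketch of step 4: hand-edit the input file.
# The "prompt" field is an assumption; get the real schema with
#   infsh app sample user/app-name --save input.json
cat > input.json <<'EOF'
{
  "prompt": "a cat astronaut floating above Earth"
}
EOF

# Catch JSON syntax errors locally before step 5 submits the task.
python3 -m json.tool input.json > /dev/null && echo "input.json ok"
```

Once the file validates, step 5 is `infsh app run user/app-name --input input.json`.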

Related Skills

# Image generation (FLUX, Gemini, Grok, Seedream)
npx skills add inference-sh/skills@ai-image-generation

# Video generation (Veo, Seedance, Wan, OmniHuman)
npx skills add inference-sh/skills@ai-video-generation

# LLMs (Claude, Gemini, Kimi, GLM via OpenRouter)
npx skills add inference-sh/skills@llm-models

# Web search (Tavily, Exa)
npx skills add inference-sh/skills@web-search

# AI avatars & lipsync (OmniHuman, Fabric, PixVerse)
npx skills add inference-sh/skills@ai-avatar-video

# Twitter/X automation
npx skills add inference-sh/skills@twitter-automation

# Model-specific
npx skills add inference-sh/skills@flux-image
npx skills add inference-sh/skills@google-veo

# Utilities
npx skills add inference-sh/skills@image-upscaling
npx skills add inference-sh/skills@background-removal

Reference Files

README.md

No README available.

Permissions & Security

Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.

Requirements

  • OpenClaw CLI installed and configured.
  • Language: Markdown
  • License: MIT
  • Topics: coding

FAQ

How do I install inference-sh?

Run openclaw add @okaris/inference-sh:inference-sh in your terminal. This installs inference-sh into your OpenClaw Skills catalog.

Does this skill run locally or in the cloud?

OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.

Where can I verify the source code?

The source repository is available at https://github.com/openclaw/skills/tree/main/skills/okaris/inference-sh. Review the commits, SKILL.md, and reference files before installing.