skills$openclaw/local-whisper

by araa47

local-whisper – OpenClaw Skill

local-whisper is an OpenClaw Skills integration for coding workflows: local speech-to-text using OpenAI Whisper. It runs fully offline after the initial model download and offers high-quality transcription across multiple model sizes.

6.8k stars · 3.5k forks · Security L1
Updated Feb 7, 2026 · Created Feb 7, 2026 · coding

Skill Snapshot

name: local-whisper
description: Local speech-to-text using OpenAI Whisper. Runs fully offline after model download. High quality transcription with multiple model sizes. OpenClaw Skills integration.
owner: araa47
repository: araa47/local-whisper
language: Markdown
license: MIT
topics:
security: L1
install: openclaw add @araa47/local-whisper
last updated: Feb 7, 2026

Maintainer

araa47


Maintains local-whisper in the OpenClaw Skills directory.

File Explorer
4 files
.
scripts/
  transcribe.py (2.5 KB)
_meta.json (278 B)
SKILL.md (1.4 KB)
SKILL.md

---
name: local-whisper
description: Local speech-to-text using OpenAI Whisper. Runs fully offline after model download. High quality transcription with multiple model sizes.
metadata: {"clawdbot":{"emoji":"🎙️","requires":{"bins":["ffmpeg"]}}}
---

Local Whisper STT

Local speech-to-text using OpenAI's Whisper. Fully offline after initial model download.

Usage

# Basic
~/.clawdbot/skills/local-whisper/scripts/local-whisper audio.wav

# Better model
~/.clawdbot/skills/local-whisper/scripts/local-whisper audio.wav --model turbo

# With timestamps
~/.clawdbot/skills/local-whisper/scripts/local-whisper audio.wav --timestamps --json
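A downstream script might consume the `--json` output like the sketch below. The exact schema is an assumption based on openai-whisper's result dictionary, which carries a "text" field plus a "segments" list; check the skill's actual output before relying on it.

```python
import json

# Hypothetical --json output from local-whisper; the schema is an assumption
# modeled on openai-whisper's result dict ("text" plus "segments").
raw = '''
{
  "text": "hello world",
  "segments": [
    {"start": 0.0, "end": 1.2, "text": "hello world"}
  ]
}
'''

result = json.loads(raw)
print(result["text"])
for seg in result["segments"]:
    # Each segment carries start/end times in seconds and the segment text.
    print(f'{seg["start"]:.1f}-{seg["end"]:.1f}: {seg["text"].strip()}')
```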

Models

Model      Size    Notes
tiny       39M     Fastest
base       74M     Default
small      244M    Good balance
turbo      809M    Best speed/quality
large-v3   1.5GB   Maximum accuracy
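To make the size/quality tradeoff in the table above concrete, a small helper (hypothetical, not part of the skill) could pick the largest model that fits a download budget:

```python
# Hypothetical helper, not shipped with the skill: approximate download
# sizes in MB, taken from the model table above (ascending order).
MODEL_SIZES_MB = {
    "tiny": 39,
    "base": 74,
    "small": 244,
    "turbo": 809,
    "large-v3": 1500,
}

def pick_model(budget_mb: float) -> str:
    """Return the largest model whose download fits within budget_mb."""
    fitting = [m for m, mb in MODEL_SIZES_MB.items() if mb <= budget_mb]
    return fitting[-1] if fitting else "tiny"

print(pick_model(300))  # small
```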

Options

  • --model/-m — Model size (default: base)
  • --language/-l — Language code (auto-detect if omitted)
  • --timestamps/-t — Include word timestamps
  • --json/-j — JSON output
  • --quiet/-q — Suppress progress

Setup

The skill uses a uv-managed virtual environment at .venv/. To reinstall it:

cd ~/.clawdbot/skills/local-whisper
uv venv .venv --python 3.12
uv pip install --python .venv/bin/python click openai-whisper torch --index-url https://download.pytorch.org/whl/cpu
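Since the skill's metadata declares a dependency on the ffmpeg binary, a quick preflight check (a sketch, not part of the skill) can confirm it is available before transcribing:

```python
import shutil

def check_ffmpeg() -> bool:
    """Return True if the ffmpeg binary required by the skill is on PATH."""
    return shutil.which("ffmpeg") is not None

if not check_ffmpeg():
    print("ffmpeg not found; install it before running local-whisper")
```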
README.md

No README available.

Permissions & Security

Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.

Requirements

  • OpenClaw CLI installed and configured.
  • Language: Markdown
  • License: MIT
  • Topics:

FAQ

How do I install local-whisper?

Run openclaw add @araa47/local-whisper in your terminal. This installs local-whisper into your OpenClaw Skills catalog.

Does this skill run locally or in the cloud?

OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.

Where can I verify the source code?

The source repository is available at https://github.com/openclaw/skills/tree/main/skills/araa47/local-whisper. Review commits and README documentation before installing.