
by kevin37li

mlx-whisper – OpenClaw Skill

mlx-whisper is an OpenClaw Skills integration for coding workflows. Local speech-to-text with MLX Whisper (Apple Silicon optimized, no API key).

7.0k stars · 5.1k forks · Security level L1
Updated Feb 7, 2026 · Created Feb 7, 2026 · Topic: coding

Skill Snapshot

name: mlx-whisper
description: Local speech-to-text with MLX Whisper (Apple Silicon optimized, no API key). OpenClaw Skills integration.
owner: kevin37li
repository: kevin37li/mlx-whisper
language: Markdown
license: MIT
topics: coding
security: L1
install: openclaw add @kevin37li/mlx-whisper
last updated: Feb 7, 2026

Maintainer

kevin37li
Maintains mlx-whisper in the OpenClaw Skills directory.

File Explorer (2 files)

  • _meta.json (277 B)
  • SKILL.md (1.6 KB)
SKILL.md

name: mlx-whisper
version: 1.0.0
description: Local speech-to-text with MLX Whisper (Apple Silicon optimized, no API key).
homepage: https://github.com/ml-explore/mlx-examples/tree/main/whisper
metadata: {"clawdbot":{"emoji":"🍎","requires":{"bins":["mlx_whisper"]},"install":[{"id":"pip","kind":"pip","package":"mlx-whisper","bins":["mlx_whisper"],"label":"Install mlx-whisper (pip)"}]}}

MLX Whisper

Local speech-to-text using Apple MLX, optimized for Apple Silicon Macs.

Quick Start

mlx_whisper /path/to/audio.mp3 --model mlx-community/whisper-large-v3-turbo

Common Usage

# Transcribe to text file
mlx_whisper audio.m4a -f txt -o ./output

# Transcribe with language hint
mlx_whisper audio.mp3 --language en --model mlx-community/whisper-large-v3-turbo

# Generate subtitles (SRT)
mlx_whisper video.mp4 -f srt -o ./subs

# Translate to English
mlx_whisper foreign.mp3 --task translate
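The commands above are easy to script for batch work. A minimal Python sketch that assembles an `mlx_whisper` invocation from the flags shown above (the helper name `transcribe_cmd` is illustrative, not part of mlx-whisper):

```python
import shlex

def transcribe_cmd(audio_path, model="mlx-community/whisper-large-v3-turbo",
                   fmt="txt", out_dir="./output", language=None):
    """Build an mlx_whisper command line using the flags shown above."""
    cmd = ["mlx_whisper", str(audio_path), "--model", model,
           "-f", fmt, "-o", out_dir]
    if language:
        cmd += ["--language", language]  # optional language hint
    return cmd

# Print the command for one file; pass the list to subprocess.run to execute.
print(shlex.join(transcribe_cmd("audio.m4a", language="en")))
```

Passing the list form to `subprocess.run` avoids shell-quoting issues with paths that contain spaces.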

Models (download on first use)

Model                                   Size     Speed    Quality
mlx-community/whisper-tiny              ~75MB    Fastest  Basic
mlx-community/whisper-base              ~140MB   Fast     Good
mlx-community/whisper-small             ~470MB   Medium   Better
mlx-community/whisper-medium            ~1.5GB   Slower   Great
mlx-community/whisper-large-v3          ~3GB     Slowest  Best
mlx-community/whisper-large-v3-turbo    ~1.6GB   Fast     Excellent (Recommended)
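Since models download on first use, it can help to pick one by available disk space. A hypothetical helper that mirrors the table above (the sizes are rough download sizes, and `pick_model` is not part of mlx-whisper):

```python
# Approximate model sizes from the table above, in MB (assumption:
# these are rough download sizes, not exact on-disk figures).
MODEL_SIZES_MB = {
    "mlx-community/whisper-tiny": 75,
    "mlx-community/whisper-base": 140,
    "mlx-community/whisper-small": 470,
    "mlx-community/whisper-medium": 1500,
    "mlx-community/whisper-large-v3": 3000,
    "mlx-community/whisper-large-v3-turbo": 1600,
}

def pick_model(max_mb):
    """Return the recommended model that fits within max_mb of space."""
    candidates = [m for m, s in MODEL_SIZES_MB.items() if s <= max_mb]
    if not candidates:
        return "mlx-community/whisper-tiny"  # smallest fallback
    turbo = "mlx-community/whisper-large-v3-turbo"
    if turbo in candidates:
        return turbo  # prefer the recommended turbo model when it fits
    return max(candidates, key=lambda m: MODEL_SIZES_MB[m])

print(pick_model(2000))
```

With ~2GB available this picks the turbo model; below that it falls back to the largest model that fits.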

Notes

  • Requires Apple Silicon Mac (M1/M2/M3/M4)
  • Models cache to ~/.cache/huggingface/
  • Default model is mlx-community/whisper-tiny; use --model mlx-community/whisper-large-v3-turbo for best results
README.md

No README available.

Permissions & Security

Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.

Requirements

  • OpenClaw CLI installed and configured.
  • Language: Markdown
  • License: MIT
  • Topics: coding

FAQ

How do I install mlx-whisper?

Run openclaw add @kevin37li/mlx-whisper in your terminal. This installs mlx-whisper into your OpenClaw Skills catalog.

Does this skill run locally or in the cloud?

OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.

Where can I verify the source code?

The source repository is available at https://github.com/openclaw/skills/tree/main/skills/kevin37li/mlx-whisper. Review commits and README documentation before installing.