mlx-stt – OpenClaw Skill
mlx-stt is an OpenClaw Skills integration for coding workflows: speech-to-text with MLX (Apple Silicon) and GLM-ASR-Nano-2512, running locally.
Skill Snapshot
| Field | Value |
| --- | --- |
| name | mlx-stt |
| description | Speech-To-Text with MLX (Apple Silicon) and GLM-ASR-Nano-2512 locally. OpenClaw Skills integration. |
| owner | guoqiao |
| repository | guoqiao/mlx-stt |
| language | Markdown |
| license | MIT |
| topics | |
| security | L1 |
| install | `openclaw add @guoqiao/mlx-stt` |
| last updated | Feb 7, 2026 |
Maintainer

guoqiao

Skill frontmatter (from SKILL.md):

```yaml
name: mlx-stt
description: Speech-To-Text with MLX (Apple Silicon) and GLM-ASR-Nano-2512 locally.
metadata: {"openclaw":{"always":true,"emoji":"🦞","homepage":"https://github.com/guoqiao/skills/blob/main/mlx-stt/mlx-stt/SKILL.md","os":["darwin"],"tags":["latest","asr","stt","speech-to-text","audio","glm","glm-asr","glm-asr-nano-2512","glm-asr-nano-2512-8bit","macOS","MacBook","Mac mini","Apple Silicon","mlx","mlx-audio"],"requires":{"bins":["brew"]}}}
```
⚠️ Deprecation Notice
This skill is deprecated and will no longer receive updates.
Please use mlx-audio-server instead, which replaces this skill and provides improved functionality.
MLX STT
Speech-to-text (ASR/transcription) with MLX (Apple Silicon) and GLM-ASR-Nano-2512, running locally.
Free and accurate. No API key required. No server required.
Requirements
- `mlx`: macOS with Apple Silicon
- `brew`: used to install dependencies if not available
Installation
```
bash ${baseDir}/install.sh
```
This script uses `brew` to install the following CLI tools if they are not already available:

- `ffmpeg`: converts audio formats when needed
- `uv`: installs Python packages and runs the Python script
- `mlx_audio`: does the real job
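The contents of `install.sh` are not shown on this page, but its described behavior (use `brew` to install tools only when they are missing) can be sketched as follows. The `ensure_tool` helper is a hypothetical name, not taken from the actual script:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of install.sh's described behavior:
# install a CLI tool via Homebrew only when it is not already on PATH.
set -eu

ensure_tool() {
  tool="$1"
  if ! command -v "$tool" >/dev/null 2>&1; then
    brew install "$tool"
  fi
}

# The real script would cover the tools listed above, e.g.:
#   ensure_tool ffmpeg
#   ensure_tool uv
# (mlx_audio is a Python package, so uv can fetch it at run time.)
ensure_tool bash  # harmless demo: bash is already present, so brew is skipped
echo "tooling check complete"
```

Checking with `command -v` before calling `brew install` keeps the script idempotent: re-running it on a machine that already has the tools is a no-op.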
Usage
To transcribe an audio file, run the `mlx-stt.py` script:

```
uv run ${baseDir}/mlx-stt.py <audio_file_path>
```
- On first run, it downloads the model from Hugging Face; the default is mlx-community/GLM-ASR-Nano-2512-8bit (about 2.5 GB).
- The transcript is printed to stdout.
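A guarded end-to-end invocation might look like the sketch below. The input file name `meeting.m4a` and the 16 kHz mono conversion step are illustrative assumptions (the skill does not document required input formats), and `${baseDir}` is the skill's install directory as used in the docs above:

```shell
#!/usr/bin/env bash
# Hypothetical usage sketch: convert audio with ffmpeg, then transcribe.
# Skips cleanly when the input file or uv is unavailable.
AUDIO="meeting.m4a"  # hypothetical input file

if [ -f "$AUDIO" ] && command -v uv >/dev/null 2>&1; then
  # Convert to 16 kHz mono WAV (an assumed-safe input format for ASR).
  ffmpeg -y -i "$AUDIO" -ar 16000 -ac 1 meeting.wav
  # Transcript goes to stdout, so it can be redirected to a file.
  uv run "${baseDir}/mlx-stt.py" meeting.wav > transcript.txt
  cat transcript.txt
else
  echo "skipping: input audio or uv not available"
fi
```

Because the result is plain stdout, it composes with ordinary shell redirection and pipelines.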
Permissions & Security
Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.
FAQ
How do I install mlx-stt?
Run `openclaw add @guoqiao/mlx-stt` in your terminal. This installs mlx-stt into your OpenClaw Skills catalog.
Does this skill run locally or in the cloud?
OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.
Where can I verify the source code?
The source repository is available at https://github.com/openclaw/skills/tree/main/skills/guoqiao/mlx-stt. Review commits and README documentation before installing.
