
by guoqiao

mlx-stt – OpenClaw Skill

mlx-stt is an OpenClaw Skills integration for coding workflows. Speech-To-Text with MLX (Apple Silicon) and GLM-ASR-Nano-2512 locally.

3.8k stars · 1.2k forks · Security L1
Updated Feb 7, 2026 · Created Feb 7, 2026 · Topic: coding

Skill Snapshot

- name: mlx-stt
- description: Speech-To-Text with MLX (Apple Silicon) and GLM-ASR-Nano-2512 locally. OpenClaw Skills integration.
- owner: guoqiao
- repository: guoqiao/mlx-stt
- language: Markdown
- license: MIT
- topics:
- security: L1
- install: `openclaw add @guoqiao/mlx-stt`
- last updated: Feb 7, 2026

Maintainer

guoqiao

Maintains mlx-stt in the OpenClaw Skills directory.
File Explorer (4 files)

- _meta.json (797 B)
- install.sh (382 B)
- mlx-stt.py (3.4 KB)
- SKILL.md (1.5 KB)
SKILL.md

name: mlx-stt
description: Speech-To-Text with MLX (Apple Silicon) and GLM-ASR-Nano-2512 locally.
metadata: {"openclaw":{"always":true,"emoji":"🦞","homepage":"https://github.com/guoqiao/skills/blob/main/mlx-stt/mlx-stt/SKILL.md","os":["darwin"],"tags":["latest","asr","stt","speech-to-text","audio","glm","glm-asr","glm-asr-nano-2512","glm-asr-nano-2512-8bit","macOS","MacBook","Mac mini","Apple Silicon","mlx","mlx-audio"],"requires":{"bins":["brew"]}}}

⚠️ Deprecation Notice

This skill is deprecated and will no longer receive updates.

Please use mlx-audio-server instead, which replaces this skill and provides improved functionality.

MLX STT

Speech-To-Text/ASR/Transcribe with MLX (Apple Silicon) and GLM-ASR-Nano-2512 locally.

Free and accurate. No API key required. No server required.

Requirements

  • mlx: macOS with Apple Silicon
  • brew: used to install dependencies if they are not already available
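
Because the skill only runs on Apple Silicon Macs and relies on brew, it can help to check both before installing. A minimal sketch; the `is_apple_silicon` helper is hypothetical and not part of the skill:

```shell
#!/usr/bin/env sh
# Hypothetical prerequisite check (illustrative only).
# On Apple Silicon Macs, `uname -s` reports Darwin and `uname -m` reports arm64.
is_apple_silicon() {
  [ "$1" = "Darwin" ] && [ "$2" = "arm64" ]
}

if is_apple_silicon "$(uname -s)" "$(uname -m)"; then
  echo "Apple Silicon Mac detected"
else
  echo "warning: this skill requires macOS on Apple Silicon" >&2
fi

# brew is also required, since install.sh uses it for missing tools
command -v brew >/dev/null 2>&1 || echo "warning: Homebrew not found" >&2
```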

Installation

bash ${baseDir}/install.sh

This script uses brew to install the following CLI tools if they are not already available:

  • ffmpeg: converts audio formats when needed
  • uv: installs Python packages and runs the Python script
  • mlx_audio: performs the actual transcription
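
The contents of install.sh are not shown on this page, but based on the description above it roughly amounts to installing each missing tool with brew. A hypothetical sketch; the `ensure_tool` helper is illustrative, not the actual script:

```shell
#!/usr/bin/env sh
# Hypothetical sketch of install.sh: install a tool via brew only if missing.
ensure_tool() {
  tool="$1"
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool already installed"
  else
    echo "installing $tool via brew"
    brew install "$tool"
  fi
}

# The real script would ensure the tools listed above, e.g.:
# ensure_tool ffmpeg   # audio format conversion
# ensure_tool uv       # Python package management and script running
ensure_tool sh         # demo: sh is always present, so nothing is installed
```

Checking `command -v` first keeps the script idempotent: re-running it never reinstalls tools that are already on the PATH.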

Usage

To transcribe an audio file, run the mlx-stt.py script:

uv run ${baseDir}/mlx-stt.py <audio_file_path>

  • On first run, the script downloads the model from Hugging Face (default: mlx-community/GLM-ASR-Nano-2512-8bit, roughly 2.5 GB).
  • The transcript is printed to stdout.
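
Since the transcript goes to stdout, batch transcription is just a shell loop around the documented command. A sketch assuming recordings live in ~/recordings and baseDir points at the skill directory; the `transcript_path` helper is hypothetical:

```shell
#!/usr/bin/env sh
# Hypothetical batch wrapper around the documented mlx-stt.py invocation.
# Derive the transcript path by swapping the audio extension for .txt.
transcript_path() {
  printf '%s.txt\n' "${1%.*}"
}

for f in "$HOME"/recordings/*.m4a; do
  [ -e "$f" ] || continue   # skip when the glob matches nothing
  uv run "${baseDir:-.}/mlx-stt.py" "$f" > "$(transcript_path "$f")"
done
```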
README.md

No README available.

Permissions & Security

Security level L1: Low-risk skills with minimal permissions. Review inputs and outputs before running in production.


FAQ

How do I install mlx-stt?

Run `openclaw add @guoqiao/mlx-stt` in your terminal. This installs mlx-stt into your OpenClaw Skills catalog.

Does this skill run locally or in the cloud?

OpenClaw Skills execute locally by default. Review the SKILL.md and permissions before running any skill.

Where can I verify the source code?

The source repository is available at https://github.com/openclaw/skills/tree/main/skills/guoqiao/mlx-stt. Review commits and README documentation before installing.