by samuelfaj · Codex Skill · ★ 493
Indexed by AgentSkillsHub · Auto-synced every 8h
## distill

Agent command outputs are one of the biggest sources of token waste. Logs, test results, stack traces… thousands of tokens sent to an LLM just to answer a simple question. 🔥 distill compresses command outputs into only what the LLM actually needs. Save up to 99% of tokens without losing the signal.

### How to use

You can also point distill at OpenAI-compatible providers such as LM Studio, Jan, LocalAI, vLLM, SGLang, llama.cpp-compatible servers, MLX-based servers, and Docker Model Runner.

Add the following to your global agent instructions file:

```md
CRITICAL: Pipe every non-interactive shell command through `d` unless raw output is explicitly required.
CRITICAL: Your prompt to `d` must be fully explicit. State exactly what you want to know and exactly what the output must contain. If you want only filenames, say so. If you want JSON, say so. Do not ask vague questions.
```

Example:

```
npm audit 2>&1 | d
```
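The pipe-and-ask pattern above can be sketched in TypeScript. This is a minimal illustration of the general idea (cap raw command output to a small excerpt and pair it with an explicit question before it reaches the model), not distill's actual implementation; the function name, constant, and truncation strategy here are hypothetical.

```typescript
// Hypothetical sketch of the distillation idea: bound the raw output and
// attach a fully explicit question, so the LLM sees far fewer tokens.
// buildDistilledPrompt and MAX_CHARS are illustrative, not distill's API.

const MAX_CHARS = 2000; // rough stand-in for a token budget

function buildDistilledPrompt(rawOutput: string, question: string): string {
  // Keep the head and tail of long output; the middle of big logs is
  // usually noise (progress bars, repeated stack frames).
  let excerpt = rawOutput;
  if (rawOutput.length > MAX_CHARS) {
    const half = Math.floor(MAX_CHARS / 2);
    excerpt =
      rawOutput.slice(0, half) +
      "\n...[truncated]...\n" +
      rawOutput.slice(-half);
  }
  // The explicit question mirrors the "fully explicit prompt" rule above.
  return (
    `Answer ONLY this question about the command output below.\n` +
    `Question: ${question}\n---\n${excerpt}`
  );
}

// Example: a noisy multi-megabyte log shrinks to a bounded prompt.
const noisyLog = "ERROR in src/app.ts\n" + "progress...\n".repeat(10_000);
const prompt = buildDistilledPrompt(
  noisyLog,
  "List only the file paths that contain errors, as a JSON array."
);
console.log(prompt.length < noisyLog.length); // prompt size is bounded, input is not
```

The head-plus-tail truncation is only one possible compression strategy; the point is that the question is stated explicitly and the payload is capped, regardless of how large the original output was.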
| Field | Value |
| --- | --- |
| Stars | 493 |
| Forks | 29 |
| Language | TypeScript |
| Category | Codex Skill |
| Quality Score | 54.208/100 |
| Open Issues | 2 |
| Last Updated | 2026-04-25 |
| Created | 2026-03-06 |
| Platforms | claude-code, cli, codex, node |
| Est. Tokens | ~10k |
Looking for a distill alternative? If you're comparing distill with other codex skill tools, these 2 projects are the closest alternatives on Agent Skills Hub — ranked by topic overlap, star count, and community traction.
- The leading, most token-efficient MCP server for GitHub source code exploration via tree-sitter AST parsing
- htop for your Claude Code sessions — real-time cost, cache efficiency, model comparison, and smart alerts
distill distills large CLI outputs into small answers for LLMs, saving tokens. It is categorized as a Codex Skill with 493 GitHub stars.
distill is primarily written in TypeScript. It covers topics such as claude-code, codex, llm.
You can find installation instructions and usage details in the distill GitHub repository at github.com/samuelfaj/distill. The project has 493 stars and 29 forks, indicating an active community.
The top alternatives to distill on Agent Skills Hub include jcodemunch-mcp and claudetop. Each offers a different approach to the same problem space; compare them side-by-side by stars, quality score, and community activity.