by Chen-zexi · LLM Plugin · ★ 480
Last updated: 2026-01-25 · Indexed by AgentSkillsHub · Auto-synced every 8h
A command-line interface tool for serving LLMs using vLLM.
| Field | Value |
| --- | --- |
| Stars | 480 |
| Forks | 26 |
| Language | Python |
| Category | LLM Plugin |
| License | MIT |
| Quality Score | 42.75/100 |
| Open Issues | 3 |
| Last Updated | 2026-01-25 |
| Created | 2025-08-14 |
| Platforms | cli, python |
| Est. Tokens | ~431k |
Looking for a vllm-cli alternative? If you're comparing vllm-cli with other LLM plugin tools, these six projects are the closest alternatives on Agent Skills Hub, ranked by topic overlap, star count, and community traction.
- Control Gmail, Google Calendar, Docs, Sheets, Slides, Chat, Forms, Tasks, Search & Drive with AI - Comprehensi…
- The LLM Anti-Framework
- Your autonomous engineering team in a CLI. Point Zeroshot at an issue, walk away, and return to production-gra…
- A curated collection of top-tier penetration testing tools and productivity utilities across multiple domains.
- OpenAI and Anthropic compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, …)
- List of awesome hosting sorted by minimal plan price
vllm-cli is a command-line interface tool for serving LLMs using vLLM. It is categorized as an LLM Plugin with 480 GitHub stars.
vllm-cli is primarily written in Python. It covers topics such as llm, llm-inference, and llm-tools.
You can find installation instructions and usage details in the vllm-cli GitHub repository at github.com/Chen-zexi/vllm-cli. The project has 480 stars and 26 forks, indicating an active community.
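As a rough sketch of what a first session could look like: the commands below assume the package is published on PyPI as `vllm-cli`, that the CLI exposes a `serve` subcommand, and use a hypothetical model ID; the actual interface may differ, so check the README in the repository.

```shell
# Install the CLI (assumes a PyPI package named vllm-cli).
pip install vllm-cli

# Serve a model. The `serve` subcommand, --port flag, and model ID
# are assumptions for illustration; see github.com/Chen-zexi/vllm-cli.
vllm-cli serve meta-llama/Llama-3.1-8B-Instruct --port 8000

# vLLM backends expose an OpenAI-compatible API, so a quick smoke test is:
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "meta-llama/Llama-3.1-8B-Instruct",
       "messages": [{"role": "user", "content": "Hello"}]}'
```

This requires a GPU environment supported by vLLM; consult the repository for hardware and configuration details.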
vllm-cli is released under the MIT license, making it free to use and modify according to the license terms.
The top alternatives to vllm-cli on Agent Skills Hub include google_workspace_mcp, mirascope, and zeroshot. Each offers a different approach to the same problem space; compare them side-by-side by stars, quality score, and community activity.