lm-proxy — LLM Plugin by Nayjest

by Nayjest · LLM Plugin · ★ 117

Indexed by AgentSkillsHub · Auto-synced every 8h

About lm-proxy

OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI; use as a library or a standalone service.

ai · anthropic · api-proxy · fastapi · google-ai · language-models · llm · llm-api · llm-gateway · llm-inference
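Because the gateway is OpenAI-compatible, any client that speaks the standard `/v1/chat/completions` request shape can target it unchanged. A minimal sketch of that payload, assuming the proxy runs locally; the host, port, and model name below are illustrative, not taken from the project's docs:

```python
import json

# Standard OpenAI-style chat-completions payload. Any OpenAI SDK or plain
# HTTP client can POST this to an OpenAI-compatible proxy as-is.
payload = {
    "model": "gpt-4o-mini",  # hypothetical model name; the proxy maps it to a provider
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello."},
    ],
    "temperature": 0.2,
}

# The serialized body would be POSTed to the proxy's chat-completions
# endpoint, e.g. http://localhost:8000/v1/chat/completions (URL assumed).
body = json.dumps(payload)
print(json.loads(body)["model"])
```

The point of the OpenAI-compatible contract is exactly this: swapping providers behind the proxy requires no change to the client-side request.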

Quick Facts

Stars: 117
Forks: 13
Language: Python
Category: LLM Plugin
License: MIT
Quality Score: 40.75/100
Open Issues: 10
Last Updated: 2026-04-29
Created: 2025-05-24
Platforms: python
Est. Tokens: ~100k

lm-proxy alternative? Top 6 similar tools

Looking for an lm-proxy alternative? If you're comparing lm-proxy with other LLM Plugin tools, these 6 projects are the closest alternatives on Agent Skills Hub, ranked by topic overlap, star count, and community traction.

  • anthropic-max-router by nsxdavid · ⭐ 52

    Dual API router (Anthropic + OpenAI compatible) for Claude MAX Plan - Use flat-rate billing with ANY AI too…

  • NadirClaw by NadirRouter · ⭐ 449

    Open-source LLM router & AI cost optimizer. Routes simple prompts to cheap/local models, complex ones to premi…

  • ccproxy by starbaser · ⭐ 390

    Build mods for Claude Code: Hook any request, modify any response, /model "with-your-custom-model", intelligen…

  • NadirClaw by doramirdor · ⭐ 336

    Open-source LLM router & AI cost optimizer. Routes simple prompts to cheap/local models, complex ones to premi…

  • holysheep-cli by holysheep123 · ⭐ 292

    🐑 One command to configure all AI coding tools — Claude Code, Codex, Gemini CLI, Cursor, Aider & more

  • dario by askalf · ⭐ 182

    Local LLM router. One endpoint for Claude Max/Pro, OpenAI, OpenRouter, Groq, Ollama, LiteLLM, any OpenAI-compa…
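What lm-proxy and all six alternatives share is one core idea: a single OpenAI-compatible endpoint that routes each request to a backend provider based on the requested model. A minimal sketch of prefix-based routing; the routing table and provider names here are entirely hypothetical, and real gateways load this mapping from configuration:

```python
# Hypothetical prefix-to-provider table; a real gateway reads this from config.
ROUTES = {
    "claude-": "anthropic",
    "gemini-": "google",
    "gpt-": "openai",
}

def route(model: str, default: str = "openai") -> str:
    """Pick a backend provider from the requested model name's prefix."""
    for prefix, provider in ROUTES.items():
        if model.startswith(prefix):
            return provider
    return default

print(route("claude-3-5-sonnet"))  # -> anthropic
print(route("local-llama"))        # falls back to the default provider
```

Cost-optimizing routers like NadirClaw extend this dispatch step with heuristics (prompt complexity, token budget) instead of a static prefix match, but the request/response contract on both sides stays OpenAI-shaped.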

More LLM Plugin Tools

Explore other popular LLM Plugin tools:

View all LLM Plugin tools →


Frequently Asked Questions

What is lm-proxy?

lm-proxy is an OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch): a lightweight, extensible Python/FastAPI project usable as a library or as a standalone service. It is categorized as an LLM Plugin with 117 GitHub stars.

What programming language is lm-proxy written in?

lm-proxy is primarily written in Python. It covers topics such as ai, anthropic, and api-proxy.

How do I install or use lm-proxy?

You can find installation instructions and usage details in the lm-proxy GitHub repository at github.com/Nayjest/lm-proxy. The project has 117 stars and 13 forks, indicating an active community.

What license does lm-proxy use?

lm-proxy is released under the MIT license, making it free to use and modify according to the license terms.

What are the best alternatives to lm-proxy?

The top alternatives to lm-proxy on Agent Skills Hub include anthropic-max-router, NadirClaw, and ccproxy. Each offers a different approach to the same problem space; compare them side by side by stars, quality score, and community activity.

View on GitHub → Browse LLM Plugin tools