
Introducing llms.py 🚀

We’re excited to announce llms.py - a super lightweight CLI tool and OpenAI-compatible server that acts as a configurable gateway over multiple Large Language Model (LLM) providers.

🎯 OpenRouter but Local

llms.py is designed as a unified gateway that seamlessly connects you to multiple LLM providers through a single, consistent interface. Whether you're using cloud APIs or local models, llms.py provides intelligent routing and automatic failover so your AI workflows connect to your chosen providers in your preferred priority order - whether you're optimizing for cost, performance, or availability.
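Because llms.py exposes an OpenAI-compatible server, any client that speaks the standard chat completions format can talk to it. The sketch below is a minimal illustration assuming a local instance listening on port 8000 and the usual /v1/chat/completions endpoint; the port, model id, and endpoint path are assumptions for illustration rather than confirmed defaults from this post.

```python
# Minimal sketch: send a standard OpenAI-style chat completions request to a
# locally running llms.py gateway. The base URL, port (8000) and model id are
# assumptions for illustration, not confirmed defaults.
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed local llms.py server address

payload = {
    "model": "gpt-4o-mini",  # hypothetical model id; use one your configured providers expose
    "messages": [
        {"role": "user", "content": "Summarize what an LLM gateway does in one sentence."}
    ],
}

req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    print(body["choices"][0]["message"]["content"])
```

Because the request shape is the standard OpenAI one, existing OpenAI client libraries can also be pointed at the gateway by overriding their base URL.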

Getting Started: https://llmspy.org/docs/getting-started

GitHub: https://github.com/ServiceStack/llms
