Open Source AI Gateway
Open-source AI gateway for managing multiple LLM providers, with built-in analytics, guardrails, rate limiting, and caching.
An open-source AI gateway designed for managing multiple LLM providers such as OpenAI, Anthropic, Gemini, Ollama, Mistral, and Cohere. It offers built-in analytics, guardrails, rate limiting, caching, and administrative controls. It supports both HTTP and gRPC interfaces.
1. Configure the Config.toml file with your API keys and model settings.
2. Run the Docker container, mounting the Config.toml file.
3. Use curl commands to make API requests to the gateway, specifying the LLM provider.
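As a sketch of step 1, a minimal Config.toml might look like the following. The section and key names here (`[openai]`, `api_key`, `model`) are illustrative assumptions, not the gateway's documented schema; consult the project's sample configuration for the actual keys.

```toml
# Hypothetical Config.toml sketch -- section and key names are
# assumptions for illustration, not the gateway's real schema.
[openai]
api_key = "sk-..."            # your OpenAI API key
model = "gpt-4o"              # default model for this provider

[anthropic]
api_key = "..."               # your Anthropic API key
model = "claude-3-5-sonnet"   # default model for this provider
```

For steps 2 and 3, the file can be mounted with Docker's standard `-v` flag (e.g. `docker run -v $(pwd)/Config.toml:/app/Config.toml -p 8080:8080 <image>`) and requests sent with `curl`; the exact mount path, port, and the way the provider is selected in a request depend on the gateway's own documentation.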
Choose this if you want an open-source AI gateway that gives you flexibility and control over your AI integrations. It is well suited to developers who prefer to self-host rather than rely on a managed service.