Why Use AI Gateway?
- One SDK for All Models: Use the OpenAI SDK to access GPT, Claude, Gemini, and 100+ other models
- No Rate Limits: Skip provider tier restrictions and use credits with 0% markup
- Always Online: Automatic failover across providers keeps your app running
- Unified Observability: Track usage, costs, and performance across all providers in one dashboard
How It Works
The AI Gateway sits between your application and LLM providers, acting as a unified translation layer:
- You make one request - Use the OpenAI SDK format, regardless of which provider you want
- We translate & route - Helicone converts your request to the correct provider format (Anthropic, Google, etc.)
- Provider responds - The LLM provider processes your request
- We log & return - You get the response back while we capture metrics, costs, and errors
Every request goes through a single gateway endpoint: https://ai-gateway.helicone.ai
With credits, we manage provider API keys for you. Your requests automatically work with OpenAI, Anthropic, Google, and 100+ other providers without signing up for each one.
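For example, the same OpenAI-format request body can target models from different providers; only the model string changes, and the gateway handles the provider-specific translation, logging, and (with credits) the provider API keys. A minimal sketch using fetch; the model identifiers and the OpenAI-compatible /v1/chat/completions path are illustrative assumptions, so check the model registry for the exact names:

```typescript
// Sketch only: assumes the gateway exposes an OpenAI-compatible
// /v1/chat/completions endpoint and that the model identifiers below
// exist in the model registry (both are illustrative assumptions).
const GATEWAY_URL = "https://ai-gateway.helicone.ai/v1/chat/completions";

async function ask(model: string, prompt: string) {
  const res = await fetch(GATEWAY_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // With credits, only a Helicone API key is needed; no provider keys.
      Authorization: `Bearer ${process.env.HELICONE_API_KEY}`,
    },
    // OpenAI chat-completions request format, regardless of provider
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  return res.json();
}

async function main() {
  // Only the model string changes between providers; the gateway translates
  // each request to the provider's native API and logs the response.
  await ask("gpt-4o-mini", "Hello from an OpenAI model");
  await ask("claude-3-5-sonnet", "Hello from an Anthropic model");
}

main();
```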
Quick Example
Add two lines to your existing OpenAI code to unlock 100+ models with automatic observability:
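A sketch of what that looks like with the official OpenAI SDK for TypeScript; only the baseURL and apiKey lines differ from a standard OpenAI setup, and the model name is an illustrative placeholder rather than a confirmed identifier:

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://ai-gateway.helicone.ai", // changed line 1: point the SDK at the gateway
  apiKey: process.env.HELICONE_API_KEY,      // changed line 2: authenticate with your Helicone key
});

async function main() {
  // Any model in the registry works here; "claude-3-5-sonnet" is a
  // placeholder, so check the model registry for exact identifiers.
  const response = await client.chat.completions.create({
    model: "claude-3-5-sonnet",
    messages: [{ role: "user", content: "Hello, world" }],
  });

  console.log(response.choices[0].message.content);
}

main();
```

Everything else in your code stays the same; requests are logged to your Helicone dashboard automatically.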
Helicone vs OpenRouter
Helicone offers a complete platform for production AI applications, while OpenRouter focuses on simple model access.
| Feature | Helicone | OpenRouter |
|---|---|---|
| Pricing | 0% markup | 5.5% markup |
| Observability | Full-featured (sessions, users, custom properties, cost tracking) | Basic (requests/costs per model only) |
| Session Tracking | ✅ | ❌ |
| Prompt Management | ✅ | ❌ |
| Caching | ✅ | ❌ |
| Custom Rate Limits | ✅ | ❌ |
| LLM Security | ✅ | ❌ |
| Open Source | ✅ | ❌ |
| BYOK | ✅ | ✅ |
| Automatic Fallbacks | ✅ | ✅ |
Migrating from OpenRouter?
See our OpenRouter migration guide for step-by-step instructions.
Next Steps
- Get Started in 5 Minutes: Set up AI Gateway and make your first request
- Browse Model Registry: See all supported models and provider formats
- Provider Routing: Configure automatic routing and fallbacks for reliability
- Prompt Integration: Deploy and manage prompts through the gateway
Want to integrate a new model provider into the AI Gateway? Check out our tutorial for detailed instructions.