OpenRouter Integration
Integrate Helicone with OpenRouter, a unified API for accessing multiple LLM providers. Monitor and analyze AI interactions across various models through a single, streamlined interface.
OpenRouter provides a single API endpoint through which your application can call models from many different LLM providers.
You can follow their documentation here: https://openrouter.ai/docs#quick-start
Gateway Integration
Create an OpenRouter account
Log into www.openrouter.ai or create an account. Once you have an account, you can generate an API key.
Set HELICONE_API_KEY and OPENROUTER_API_KEY as environment variables
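If you are calling the API from Node/TypeScript, a minimal sketch of reading (and validating) those variables might look like this; the variable names simply mirror the step above:

```typescript
// Read the keys set in the previous step; fail fast if either is missing.
const HELICONE_API_KEY = process.env.HELICONE_API_KEY;
const OPENROUTER_API_KEY = process.env.OPENROUTER_API_KEY;

if (!HELICONE_API_KEY || !OPENROUTER_API_KEY) {
  throw new Error("Set HELICONE_API_KEY and OPENROUTER_API_KEY before running.");
}
```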
Modify the base URL and add Auth headers
Replace the following OpenRouter URL with the Helicone Gateway URL:
https://openrouter.ai/api/v1/chat/completions
-> https://openrouter.helicone.ai/api/v1/chat/completions
Then add the following authentication headers:
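A sketch of the headers object: the standard `Authorization` header still carries your OpenRouter key, while `Helicone-Auth` carries your Helicone API key so the request is logged to your Helicone account.

```typescript
// Authentication headers for requests sent through the Helicone gateway.
const headers = {
  "Content-Type": "application/json",
  Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`, // OpenRouter key, unchanged
  "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`, // Helicone key for logging
};
```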
Now you can access all the models on OpenRouter with a simple fetch call:
Example
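A minimal sketch of a chat completion request through the gateway; the model name `openai/gpt-4o` is only an illustrative choice, and any model available on OpenRouter should work the same way:

```typescript
const response = await fetch(
  "https://openrouter.helicone.ai/api/v1/chat/completions",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    },
    body: JSON.stringify({
      model: "openai/gpt-4o", // any model exposed by OpenRouter
      messages: [{ role: "user", content: "Say hello!" }],
    }),
  }
);

const data = await response.json();
console.log(data.choices[0].message.content);
```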
We now also support streaming responses from OpenRouter.
Note: usage data and cost calculations while streaming are only offered for OpenAI and Anthropic models. For non-stream requests, usage data and cost calculations are available for all models.
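A minimal streaming sketch, assuming the OpenAI-style server-sent-event response format (this simply prints the raw event chunks as they arrive):

```typescript
const response = await fetch(
  "https://openrouter.helicone.ai/api/v1/chat/completions",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    },
    body: JSON.stringify({
      model: "openai/gpt-4o", // illustrative model choice
      messages: [{ role: "user", content: "Tell me a short story." }],
      stream: true, // ask OpenRouter to stream tokens as they are generated
    }),
  }
);

// Read the server-sent-event stream chunk by chunk and print it as it arrives.
const reader = response.body!.getReader();
const decoder = new TextDecoder();
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  process.stdout.write(decoder.decode(value));
}
```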
For more information on how to use headers, see the Helicone Headers docs. For more information on how to use OpenRouter, see the OpenRouter docs.