This integration method is maintained but no longer actively developed. For the best experience and latest features, use our new AI Gateway with unified API access to 100+ models.
Set your Helicone and OpenAI API keys as environment variables:

HELICONE_API_KEY=<your-helicone-api-key>
OPENAI_API_KEY=<your-openai-api-key>

Then point the OpenAI provider at Helicone's proxy and add the Helicone-Auth header:
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});

// Use openai to make API calls
const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
});
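
streamText returns its result object immediately and streams tokens as they arrive. As a minimal sketch of consuming that stream (using the result's textStream async iterable from the AI SDK):

// Print the completion as it streams in
for await (const chunk of response.textStream) {
  process.stdout.write(chunk);
}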

Configuring Helicone Features with Headers

Enable Helicone features through headers, which you can set when the client is initialized or on individual requests.

Configure Client

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Cache-Enabled": "true",
  },
});

Generate Text

const response = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: {
    "Helicone-User-Id": "[email protected]",
    "Helicone-Session-Id": "uuid",
    "Helicone-Session-Path": "/chat",
    "Helicone-Session-Name": "Chatbot",
  },
});

Stream Text

const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: {
    "Helicone-User-Id": "[email protected]",
    "Helicone-Session-Id": "uuid",
    "Helicone-Session-Path": "/chat",
    "Helicone-Session-Name": "Chatbot",
  },
});
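
The session headers above tie individual requests together into a single Helicone session. As a rough sketch, reusing the openai client configured above (the sessionId variable and askChatbot helper are illustrative, not part of Helicone or the AI SDK), you can generate one UUID per conversation and send it with every request in that conversation:

import { generateText } from "ai";

// One UUID per conversation, reused across every request in that session
// (crypto.randomUUID() is available globally in modern Node and browsers)
const sessionId = crypto.randomUUID();

async function askChatbot(prompt: string) {
  const { text } = await generateText({
    model: openai("gpt-4o"),
    prompt,
    headers: {
      "Helicone-User-Id": "[email protected]",
      "Helicone-Session-Id": sessionId,
      "Helicone-Session-Path": "/chat",
      "Helicone-Session-Name": "Chatbot",
    },
  });
  return text;
}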

Using with Existing Custom Base URLs

If you’re already using a custom base URL for an OpenAI-compatible vendor, you can proxy your requests through Helicone by setting the Helicone-Target-URL header to your existing vendor’s endpoint.

Example with Custom Vendor

import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

const openai = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Target-URL": "https://your-vendor-api.com/v1", // Your existing vendor's endpoint
  },
});

// Use openai to make API calls - requests will be proxied to your vendor
const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
});
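
When proxying to another vendor, you will usually authenticate with that vendor's key rather than your OpenAI key. A hedged sketch, assuming the vendor accepts an OpenAI-style bearer token and that VENDOR_API_KEY is an environment variable you define yourself:

import { createOpenAI } from "@ai-sdk/openai";

const vendor = createOpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  apiKey: process.env.VENDOR_API_KEY, // sent to the vendor as the Authorization bearer token
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Target-URL": "https://your-vendor-api.com/v1",
  },
});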

Example with Multiple Vendors

You can also dynamically set the target URL per request:

const response = streamText({
  model: openai("gpt-4o"),
  prompt: "Hello world",
  headers: {
    "Helicone-Target-URL": "https://your-vendor-api.com/v1", // Override for this request
  },
});

This approach allows you to:
  • Keep your existing vendor integrations
  • Add Helicone monitoring and features
  • Switch between vendors without changing your base URL (see the sketch after this list)
  • Maintain compatibility with your current setup
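
As a sketch of that last routing point, reusing the openai client from the examples above (the TARGET_URLS map, vendor names, and streamFromVendor helper are illustrative, not part of Helicone or the AI SDK), you can keep one Helicone-proxied client and choose the OpenAI-compatible endpoint per request:

import { streamText } from "ai";

// Illustrative map of OpenAI-compatible endpoints; replace with your own vendors
const TARGET_URLS = {
  vendorA: "https://vendor-a.example.com/v1",
  vendorB: "https://vendor-b.example.com/v1",
} as const;

function streamFromVendor(vendor: keyof typeof TARGET_URLS, prompt: string) {
  return streamText({
    model: openai("gpt-4o"),
    prompt,
    headers: {
      "Helicone-Target-URL": TARGET_URLS[vendor], // routes this request to the chosen vendor
    },
  });
}

const reply = streamFromVendor("vendorA", "Hello world");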