Manual Logger with Streaming Support
Helicone’s Manual Logger provides powerful capabilities for tracking LLM requests and responses, including streaming responses. This guide shows you how to use the @helicone/helpers package to log streaming responses from various LLM providers.
Installation
First, install the @helicone/helpers package:
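For example, with npm:

```bash
npm install @helicone/helpers
```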
Basic Setup
Initialize the HeliconeManualLogger with your API key:
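A minimal sketch, assuming your key is available in the HELICONE_API_KEY environment variable:

```typescript
import { HeliconeManualLogger } from "@helicone/helpers";

// Create one logger instance and reuse it across requests
const helicone = new HeliconeManualLogger({
  apiKey: process.env.HELICONE_API_KEY!,
});
```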
Streaming Methods

The HeliconeManualLogger provides several methods for working with streams:

1. logBuilder (New)
The recommended method for handling streaming responses with improved error handling:
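A sketch of the flow, assuming the OpenAI SDK; the function name and model are placeholders:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function chatStream(question: string): Promise<ReadableStream> {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  // Tie a log builder to this request body
  const logBuilder = helicone.logBuilder(body);

  try {
    const response = await openai.chat.completions.create(body);
    // Wrap the provider stream; chunks are recorded as the client reads them
    return logBuilder.toReadableStream(response);
  } catch (error) {
    logBuilder.setError(error); // captures the failure for the log
    throw error;
  } finally {
    // Sends once the stream has been consumed; in serverless environments,
    // defer this instead (e.g. with Vercel's after, shown below)
    void logBuilder.sendLog();
  }
}
```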
2. logStream

A flexible method that gives you full control over stream handling:
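A sketch using the stream-splitting (tee) pattern: one branch goes to the client, the other is handed to Helicone via the result recorder:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function chatWithLogStream(question: string) {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  const response = await openai.chat.completions.create(body);
  // Split the stream: one branch for the client, one for logging
  const [clientStream, loggingStream] = response.tee();

  helicone.logStream(body, async (resultRecorder) => {
    // Helicone consumes this branch and records chunks and timing
    resultRecorder.attachStream(loggingStream.toReadableStream());
  });

  return clientStream;
}
```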
3. logSingleStream

A simplified method for logging a single ReadableStream:
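A sketch of the same pattern with less ceremony; logSingleStream takes the request body and a ReadableStream and handles the rest:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function chatWithSingleStream(question: string) {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  const response = await openai.chat.completions.create(body);
  const [clientStream, loggingStream] = response.tee();

  // Fire-and-forget: Helicone consumes the logging branch as it streams
  helicone.logSingleStream(body, loggingStream.toReadableStream());

  return clientStream;
}
```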
4. logSingleRequest

For logging a single request with a response body:
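A sketch for non-streaming calls, assuming logSingleRequest takes the request body and the stringified response:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function chatOnce(question: string) {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
  };

  const response = await openai.chat.completions.create(body);

  // Log the request body together with the completed response
  await helicone.logSingleRequest(body, JSON.stringify(response));

  return response;
}
```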
Next.js App Router with LogBuilder (Recommended)

The new logBuilder method provides better error handling and simplified stream management:
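A sketch of a route handler, assuming Next.js 15’s after from next/server; the route path and model are placeholders:

```typescript
// app/api/chat/route.ts
import { after } from "next/server";
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(request: Request) {
  const { question } = await request.json();
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  const logBuilder = helicone.logBuilder(body);

  try {
    const response = await openai.chat.completions.create(body);
    // Stream to the client while recording chunks for the log
    return new Response(logBuilder.toReadableStream(response));
  } catch (error) {
    logBuilder.setError(error); // tracked with the proper error status code
    return new Response("Error generating response", { status: 500 });
  } finally {
    // Send the log after the response has finished streaming
    after(async () => {
      await logBuilder.sendLog();
    });
  }
}
```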
The logBuilder approach offers several advantages:
- Better error handling with the setError method
- Simplified stream handling with toReadableStream
- More flexible async/await patterns with sendLog
- Proper error status code tracking
Examples with Different LLM Providers
OpenAI
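A sketch of a complete streaming call with the OpenAI SDK; the model and prompt are placeholders:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user" as const, content: "Write a haiku about logging." }],
  stream: true as const,
};

const response = await openai.chat.completions.create(body);
const [clientStream, loggingStream] = response.tee();

// Log one branch without blocking the other
helicone.logSingleStream(body, loggingStream.toReadableStream());

for await (const chunk of clientStream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```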
Together AI
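Together AI’s API is OpenAI-compatible, so one option is to point the OpenAI SDK at Together’s base URL; the model name is a placeholder:

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const together = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: "https://api.together.xyz/v1",
});

const body = {
  model: "meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo",
  messages: [{ role: "user" as const, content: "Tell me a fun fact." }],
  stream: true as const,
};

const response = await together.chat.completions.create(body);
const [clientStream, loggingStream] = response.tee();

helicone.logSingleStream(body, loggingStream.toReadableStream());

for await (const chunk of clientStream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```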
Anthropic
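A sketch with the Anthropic SDK; it assumes Anthropic’s streaming response also exposes tee() and toReadableStream(), which you should verify against your SDK version:

```typescript
import Anthropic from "@anthropic-ai/sdk";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

const body = {
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{ role: "user" as const, content: "Hello, Claude" }],
  stream: true as const,
};

const response = await anthropic.messages.create(body);
const [clientStream, loggingStream] = response.tee();

// Log one branch; consume the other
helicone.logSingleStream(body, loggingStream.toReadableStream());

for await (const event of clientStream) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}
```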
Next.js API Route Example
Here’s how to use the manual logger in a Next.js API route:
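A sketch of a pages-directory API route that streams the completion to the caller while logging a tee’d copy; the route path is a placeholder:

```typescript
// pages/api/chat.ts
import type { NextApiRequest, NextApiResponse } from "next";
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: req.body.question }],
    stream: true as const,
  };

  const response = await openai.chat.completions.create(body);
  const [clientStream, loggingStream] = response.tee();

  // Log one branch while streaming the other back to the caller
  helicone.logSingleStream(body, loggingStream.toReadableStream());

  res.setHeader("Content-Type", "text/plain; charset=utf-8");
  for await (const chunk of clientStream) {
    res.write(chunk.choices[0]?.delta?.content ?? "");
  }
  res.end();
}
```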
Next.js App Router with Vercel’s after Function

For the Next.js App Router, you can use Vercel’s after function to log requests without blocking the response:
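A sketch assuming Next.js 15’s after from next/server; note that toReadableStream() on the client branch emits newline-delimited JSON chunks:

```typescript
// app/api/chat/route.ts
import { after } from "next/server";
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function POST(request: Request) {
  const { question } = await request.json();
  const body = {
    model: "gpt-4o-mini",
    messages: [{ role: "user" as const, content: question }],
    stream: true as const,
  };

  const response = await openai.chat.completions.create(body);
  const [clientStream, loggingStream] = response.tee();

  // Defer logging until after the response has been sent
  after(async () => {
    await helicone.logSingleStream(body, loggingStream.toReadableStream());
  });

  return new Response(clientStream.toReadableStream());
}
```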
Logging Custom Events
You can also use the manual logger to log custom events:
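A sketch of a custom tool event; the _type/toolName body shape follows Helicone’s custom-event examples but should be checked against the current docs, and getWeather is a hypothetical helper:

```typescript
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });

// Hypothetical tool call we want to record alongside LLM traffic
async function getWeather(location: string) {
  return { location, temperature: 18, unit: "C" };
}

const result = await helicone.logRequest(
  {
    _type: "tool",
    toolName: "weather",
    input: { location: "San Francisco" },
  },
  async (resultRecorder) => {
    const weather = await getWeather("San Francisco");
    // Record whatever the tool returned as the "response" body
    resultRecorder.appendResults(weather);
    return weather;
  }
);
```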
Advanced Usage: Tracking Time to First Token

The logStream, logSingleStream, and logBuilder methods automatically track the time to first token, which is a valuable metric for understanding LLM response latency:
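No extra instrumentation is required; a sketch with logSingleStream (model and prompt are placeholders):

```typescript
import OpenAI from "openai";
import { HeliconeManualLogger } from "@helicone/helpers";

const helicone = new HeliconeManualLogger({ apiKey: process.env.HELICONE_API_KEY! });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const body = {
  model: "gpt-4o-mini",
  messages: [{ role: "user" as const, content: "Stream a short poem." }],
  stream: true as const,
};

const response = await openai.chat.completions.create(body);
const [clientStream, loggingStream] = response.tee();

// No timing code needed: time to first token is recorded automatically
// as Helicone consumes the logging branch of the stream
helicone.logSingleStream(body, loggingStream.toReadableStream());
```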