Python Manual Logger
Logging calls to custom models is supported via the Helicone Python SDK.

1. Install the Helicone helpers package
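For example, from PyPI (the package name here is an assumption; check PyPI for the current distribution name):

```bash
pip install helicone-helpers
```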
2. Set `HELICONE_API_KEY` as an environment variable
You can also set the Helicone API key in your code (see below).
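For example, set it in your shell (`export HELICONE_API_KEY=...`) or programmatically before creating the logger; the key value here is a placeholder:

```python
import os

# Equivalent to exporting HELICONE_API_KEY in your shell
os.environ["HELICONE_API_KEY"] = "sk-helicone-..."  # placeholder value
```

Alternatively, pass the key directly to the `HeliconeManualLogger` constructor in the next step.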
3. Create a new HeliconeManualLogger instance
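A minimal sketch, assuming the class is exposed from a `helicone_helpers` module and that the constructor accepts `api_key` and an optional `headers` dict (verify both against the installed package):

```python
import os
from helicone_helpers import HeliconeManualLogger  # import path assumed

helicone_logger = HeliconeManualLogger(
    api_key=os.environ["HELICONE_API_KEY"],  # or pass your key directly
    headers={},  # optional extra Helicone headers, e.g. Helicone-Property-* values
)
```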
4. Define your operation and make the request
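A sketch of the full flow with the OpenAI SDK; the `append_results` method on the recorder is an assumption, so check the package for the exact name:

```python
import os
from openai import OpenAI
from helicone_helpers import HeliconeManualLogger  # import path assumed

client = OpenAI()  # reads OPENAI_API_KEY from the environment
helicone_logger = HeliconeManualLogger(api_key=os.environ["HELICONE_API_KEY"])

request = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Say hello"}],
}

def operation(result_recorder):
    # Make the actual call to the provider
    response = client.chat.completions.create(**request)
    # Record the response body so Helicone can log it (method name assumed)
    result_recorder.append_results(response.model_dump())
    return response

# log_request times the operation, sends the log to Helicone,
# and returns whatever the operation returned
response = helicone_logger.log_request(
    request=request,
    operation=operation,
    provider="openai",
)
print(response.choices[0].message.content)
```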
API Reference
HeliconeManualLogger
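A type sketch of the surface described on this page; the constructor arguments are assumptions carried over from the setup steps above:

```python
from typing import Any, Callable, Dict, Optional

class HeliconeManualLogger:
    def __init__(self, api_key: str, headers: Optional[Dict[str, str]] = None) -> None: ...

    def log_request(
        self,
        request: Dict[str, Any],
        operation: Callable,                        # receives a HeliconeResultRecorder
        additional_headers: Optional[Dict[str, str]] = None,
        provider: Optional[str] = None,             # "openai", "anthropic", or None
    ) -> Any: ...

    def send_log(
        self,
        provider: Optional[str],
        request: Dict[str, Any],
        response: Any,                              # dict or string
        options: Dict[str, Any],                    # a LoggingOptions dictionary
    ) -> None: ...
```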
LoggingOptions
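A hedged sketch of the timing dictionary passed to `send_log`; the field names below are assumptions modeled on the timing information this reference describes, so verify them against the installed package:

```python
from typing import Dict, Optional, TypedDict

class LoggingOptions(TypedDict, total=False):
    start_time: float                        # assumed: when the request started (seconds)
    end_time: float                          # assumed: when the request finished (seconds)
    additional_headers: Dict[str, str]       # assumed: extra Helicone headers
    time_to_first_token_ms: Optional[float]  # assumed: TTFT for streamed responses
```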
log_request
Parameters

- `request`: A dictionary containing the request parameters
- `operation`: A callable that takes a HeliconeResultRecorder and returns a result
- `additional_headers`: Optional dictionary of additional headers
- `provider`: Optional provider specification ("openai", "anthropic", or None for custom)
send_log
Parameters

- `provider`: Optional provider specification ("openai", "anthropic", or None for custom)
- `request`: A dictionary containing the request parameters
- `response`: Either a dictionary or string response to log
- `options`: A LoggingOptions dictionary with timing information
HeliconeResultRecorder
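The recorder is handed to your operation by `log_request`; you record the provider's response on it and return whatever value your caller needs. A usage sketch, with `append_results` assumed as the recording method:

```python
from helicone_helpers import HeliconeResultRecorder  # import path assumed

def operation(result_recorder: HeliconeResultRecorder):
    response = {"content": "Hello from my model"}
    # What you record here becomes the response body in the Helicone log
    result_recorder.append_results(response)  # method name assumed
    # The return value is passed through as the result of log_request
    return response
```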
Advanced Usage Examples
Direct Logging with String Response
For direct logging of string responses:
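A sketch using `send_log` with a plain string response; the `options` field names are assumptions (see the LoggingOptions note above):

```python
import os
import time
from helicone_helpers import HeliconeManualLogger  # import path assumed

helicone_logger = HeliconeManualLogger(api_key=os.environ["HELICONE_API_KEY"])

request = {"model": "my-custom-model", "prompt": "What is the capital of France?"}

start_time = time.time()
response_text = "The capital of France is Paris."  # response produced by your model
end_time = time.time()

helicone_logger.send_log(
    provider=None,           # custom model, no built-in provider integration
    request=request,
    response=response_text,  # a string response is accepted directly
    options={"start_time": start_time, "end_time": end_time},  # field names assumed
)
```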
Streaming Responses

For streaming responses with Python, you can use the `log_request` method with time to first token tracking:
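A sketch with the OpenAI streaming API: the chunks are collected inside the operation and recorded once the stream completes (`append_results` assumed). How time to first token is surfaced may differ between SDK versions:

```python
import os
from openai import OpenAI
from helicone_helpers import HeliconeManualLogger  # import path assumed

client = OpenAI()
helicone_logger = HeliconeManualLogger(api_key=os.environ["HELICONE_API_KEY"])

request = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Write a short poem."}],
    "stream": True,
}

def operation(result_recorder):
    stream = client.chat.completions.create(**request)
    parts = []
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            parts.append(chunk.choices[0].delta.content)
    completion = "".join(parts)
    # Record the assembled completion so Helicone can log it (method name assumed)
    result_recorder.append_results({"content": completion})
    return completion

completion = helicone_logger.log_request(
    request=request,
    operation=operation,
    provider="openai",
)
```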
Using with Anthropic
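A sketch with the Anthropic SDK, passing `provider="anthropic"` so Helicone can parse the response; the model name is illustrative and `append_results` is assumed:

```python
import os
import anthropic
from helicone_helpers import HeliconeManualLogger  # import path assumed

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
helicone_logger = HeliconeManualLogger(api_key=os.environ["HELICONE_API_KEY"])

request = {
    "model": "claude-3-5-sonnet-latest",  # illustrative model name
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello, Claude"}],
}

def operation(result_recorder):
    response = client.messages.create(**request)
    result_recorder.append_results(response.model_dump())  # method name assumed
    return response

response = helicone_logger.log_request(
    request=request,
    operation=operation,
    provider="anthropic",
)
```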
Custom Model Integration
For custom models that don't have a specific provider integration:
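A sketch that wraps a hypothetical HTTP inference endpoint; the URL and payload are placeholders, `provider` is left as None, and `append_results` is assumed:

```python
import os
import requests
from helicone_helpers import HeliconeManualLogger  # import path assumed

helicone_logger = HeliconeManualLogger(api_key=os.environ["HELICONE_API_KEY"])

request = {"model": "my-custom-model", "prompt": "Summarize this document."}

def operation(result_recorder):
    # Hypothetical endpoint for your own model server
    resp = requests.post("https://my-model.example.com/generate", json=request, timeout=30)
    body = resp.json()
    result_recorder.append_results(body)  # method name assumed
    return body

result = helicone_logger.log_request(
    request=request,
    operation=operation,
    provider=None,  # no built-in provider integration
)
```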
Direct Stream Logging

For direct control over streaming responses, you can use the `send_log` method to manually track time to first token:
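A sketch that consumes an OpenAI stream directly and reports timing through `send_log`; the `options` field names are assumptions (see the LoggingOptions reference above):

```python
import os
import time
from openai import OpenAI
from helicone_helpers import HeliconeManualLogger  # import path assumed

client = OpenAI()
helicone_logger = HeliconeManualLogger(api_key=os.environ["HELICONE_API_KEY"])

request = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Stream a haiku."}],
    "stream": True,
}

start_time = time.time()
time_to_first_token_ms = None
parts = []

for chunk in client.chat.completions.create(**request):
    if time_to_first_token_ms is None:
        # First chunk received: record time to first token in milliseconds
        time_to_first_token_ms = (time.time() - start_time) * 1000
    if chunk.choices and chunk.choices[0].delta.content:
        parts.append(chunk.choices[0].delta.content)

end_time = time.time()

helicone_logger.send_log(
    provider="openai",
    request=request,
    response={"content": "".join(parts)},
    options={  # field names assumed; see the LoggingOptions reference
        "start_time": start_time,
        "end_time": end_time,
        "time_to_first_token_ms": time_to_first_token_ms,
    },
)
```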