What You’ll Build
An AI debate simulator where:
- Users can select topics and watch AI-generated debates
- Four different integration methods showcase flexibility
- Helicone provides complete observability for all approaches
- Both streaming and non-streaming responses are supported
Prerequisites
- Next.js project with TypeScript
- Vercel AI Gateway API key from your Vercel dashboard
- Helicone API key from the Helicone dashboard
Setup
Install the required dependencies:
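A likely package set, assuming the examples below use the AI SDK, its gateway provider, and the OpenAI SDK; adjust to match your project:

```bash
# Assumed packages for the examples in this guide
npm install ai @ai-sdk/gateway openai
```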
Integration Methods
1. Vercel AI SDK (Non-Streaming)
Create a basic debate generation endpoint using the Vercel AI SDK:
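A minimal sketch, assuming an App Router route at `app/api/debate/route.ts`, the `createGateway` provider from `@ai-sdk/gateway` with its `baseURL` pointed at your Helicone proxy (the `HELICONE_GATEWAY_URL` variable and model ID are placeholders), and Helicone custom-property headers matching the filters used later in this guide:

```typescript
// app/api/debate/route.ts — sketch; file path, env var names, and model ID are assumptions
import { generateText } from "ai";
import { createGateway } from "@ai-sdk/gateway";

// Assumption: Helicone logs traffic by sitting in front of the Vercel AI Gateway,
// so baseURL points at your Helicone proxy URL (check your Helicone dashboard for the exact value).
const gateway = createGateway({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: process.env.HELICONE_GATEWAY_URL,
  headers: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    // Custom property: appears as property:Method in Helicone filters
    "Helicone-Property-Method": "vercel-ai",
  },
});

export async function POST(req: Request) {
  const { topic } = await req.json();

  const { text } = await generateText({
    model: gateway("openai/gpt-4o"),
    system: "You are moderating a debate between two AI personas, Pro and Con.",
    prompt: `Generate a short debate on the topic: "${topic}".`,
    // Per-request property: appears as property:Topic in Helicone
    headers: { "Helicone-Property-Topic": topic },
  });

  return Response.json({ debate: text });
}
```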
2. Vercel AI SDK (Streaming)
Enable real-time debate streaming for a better user experience:
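A sketch under the same assumptions (route path and env vars are placeholders); it uses `streamText` and returns the AI SDK's plain-text stream so the client can render tokens as they arrive:

```typescript
// app/api/debate/stream/route.ts — sketch; same assumed env vars as the non-streaming route
import { streamText } from "ai";
import { createGateway } from "@ai-sdk/gateway";

const gateway = createGateway({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: process.env.HELICONE_GATEWAY_URL, // assumed Helicone proxy in front of the Vercel AI Gateway
  headers: { "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}` },
});

export async function POST(req: Request) {
  const { topic } = await req.json();

  const result = streamText({
    model: gateway("openai/gpt-4o"),
    system: "You are moderating a debate between two AI personas, Pro and Con.",
    prompt: `Generate a short debate on the topic: "${topic}".`,
    headers: {
      "Helicone-Property-Method": "vercel-ai-stream",
      "Helicone-Property-Topic": topic,
      "Helicone-Property-Stream": "true",
    },
  });

  // Stream tokens to the client as plain text while the model generates
  return result.toTextStreamResponse();
}
```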
3. OpenAI SDK (Non-Streaming)
Use the OpenAI SDK directly with Vercel AI Gateway routing:
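A minimal sketch assuming the `openai` package's `baseURL` and `defaultHeaders` options point the client at the same assumed Helicone proxy; the route path and property values are illustrative:

```typescript
// app/api/debate/openai/route.ts — sketch; URLs, env var names, and property values are assumptions
import OpenAI from "openai";

// The OpenAI SDK is pointed at the assumed Helicone proxy, which forwards
// requests to the Vercel AI Gateway's OpenAI-compatible endpoint.
const client = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: process.env.HELICONE_GATEWAY_URL,
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Property-Method": "openai-sdk",
  },
});

export async function POST(req: Request) {
  const { topic } = await req.json();

  const completion = await client.chat.completions.create(
    {
      model: "openai/gpt-4o", // Vercel AI Gateway style model ID
      messages: [
        { role: "system", content: "You are moderating a debate between two AI personas, Pro and Con." },
        { role: "user", content: `Generate a short debate on the topic: "${topic}".` },
      ],
    },
    // Per-request property for topic analytics in Helicone
    { headers: { "Helicone-Property-Topic": topic } }
  );

  return Response.json({ debate: completion.choices[0].message.content });
}
```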
4. OpenAI SDK (Streaming)
Enable streaming with the OpenAI SDK:
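A sketch of the streaming variant under the same assumed client configuration; it iterates the OpenAI SDK's chunk stream and re-emits the deltas as a plain-text response:

```typescript
// app/api/debate/openai/stream/route.ts — sketch; same assumed client configuration as above
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: process.env.HELICONE_GATEWAY_URL,
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    "Helicone-Property-Method": "openai-sdk-stream",
    "Helicone-Property-Stream": "true",
  },
});

export async function POST(req: Request) {
  const { topic } = await req.json();

  const stream = await client.chat.completions.create(
    {
      model: "openai/gpt-4o",
      stream: true,
      messages: [
        { role: "system", content: "You are moderating a debate between two AI personas, Pro and Con." },
        { role: "user", content: `Generate a short debate on the topic: "${topic}".` },
      ],
    },
    { headers: { "Helicone-Property-Topic": topic } }
  );

  // Re-emit the completion deltas as a plain-text streaming response
  const encoder = new TextEncoder();
  const body = new ReadableStream({
    async start(controller) {
      for await (const chunk of stream) {
        controller.enqueue(encoder.encode(chunk.choices[0]?.delta?.content ?? ""));
      }
      controller.close();
    },
  });

  return new Response(body, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```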
Frontend Integration
Create a debate interface that supports all integration methods:
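A simplified client component, assuming the four route paths from the sketches above; streaming endpoints are read chunk by chunk, non-streaming endpoints return JSON:

```tsx
// app/debate/page.tsx — sketch; endpoint paths match the route sketches above
"use client";

import { useState } from "react";

const ENDPOINTS: Record<string, { url: string; streaming: boolean }> = {
  "vercel-ai": { url: "/api/debate", streaming: false },
  "vercel-ai-stream": { url: "/api/debate/stream", streaming: true },
  "openai-sdk": { url: "/api/debate/openai", streaming: false },
  "openai-sdk-stream": { url: "/api/debate/openai/stream", streaming: true },
};

export default function DebatePage() {
  const [topic, setTopic] = useState("Should AI systems explain their decisions?");
  const [method, setMethod] = useState("vercel-ai-stream");
  const [debate, setDebate] = useState("");

  async function runDebate() {
    setDebate("");
    const { url, streaming } = ENDPOINTS[method];
    const res = await fetch(url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ topic }),
    });

    if (streaming && res.body) {
      // Append text chunks as they arrive from the streaming endpoints
      const reader = res.body.getReader();
      const decoder = new TextDecoder();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        setDebate((prev) => prev + decoder.decode(value, { stream: true }));
      }
    } else {
      const data = await res.json();
      setDebate(data.debate);
    }
  }

  return (
    <main>
      <input value={topic} onChange={(e) => setTopic(e.target.value)} />
      <select value={method} onChange={(e) => setMethod(e.target.value)}>
        {Object.keys(ENDPOINTS).map((m) => (
          <option key={m} value={m}>{m}</option>
        ))}
      </select>
      <button onClick={runDebate}>Start debate</button>
      <pre>{debate}</pre>
    </main>
  );
}
```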
Monitoring in Helicone
View comprehensive analytics for your debate simulator:
- Method Comparison: Compare performance across integration methods
- Topic Analytics: See which debate topics are most popular
- Stream vs Non-Stream: Analyze latency and user experience differences
- Cost Tracking: Monitor costs per debate and integration method
Custom Filters
Use Helicone’s property filters to analyze:
- Performance by integration method: `property:Method = "vercel-ai-stream"`
- Popular topics: Group by `property:Topic`
- Streaming usage: Filter by `property:Stream = "true"`