Skip to main content

Langfuse Integration Guide

This document describes how to set up and use Langfuse for observability and debugging of the Vibe AI Browser Agent.

What is Langfuse?

Langfuse is an open-source LLM observability and analytics platform that helps you:

  • Track all LLM calls and their performance
  • Debug agent execution issues
  • Analyze token usage and costs
  • Monitor response times and errors
  • View detailed traces of multi-step agent workflows

Architecture

The integration uses:

  • Langfuse SDK (langfuse) - Core client library
  • Langfuse LangChain (langfuse-langchain) - LangChain callback handler

All LLM calls made through the agent are automatically traced and sent to Langfuse.

Deployment Options

Deploy Langfuse to your Azure account using the provided script:

# Deploy Langfuse to Azure
./scripts/deploy-langfuse-azure.sh

This will:

  1. Create a PostgreSQL Flexible Server
  2. Deploy Langfuse as an Azure Container Instance
  3. Configure networking and security
  4. Save credentials to .env.langfuse

Cost Estimate: ~$30-50/month for basic usage (1 CPU, 2GB RAM, 32GB storage)

Option 2: Langfuse Cloud

Use the hosted version at https://cloud.langfuse.com:

  1. Create an account at https://cloud.langfuse.com
  2. Create a new project
  3. Get your API keys from Settings > API Keys
  4. Run ./scripts/configure-langfuse.sh

Cost: Free tier available, paid plans start at $59/month

Option 3: Self-Hosted

Deploy Langfuse using Docker:

# Clone Langfuse
git clone https://github.com/langfuse/langfuse.git
cd langfuse

# Start with Docker Compose
docker-compose up -d

Then run ./scripts/configure-langfuse.sh with your self-hosted URL.

Configuration

Step 1: Deploy Langfuse

Choose one of the deployment options above.

Step 2: Configure the Extension

Run the configuration script:

./scripts/configure-langfuse.sh

This will:

  1. Prompt for your Langfuse URL and API keys
  2. Update the .env file with Langfuse credentials
  3. Save configuration to .env.langfuse for future reference

Step 3: Load Environment Variables

export $( < .env )

Step 4: Build and Test

# Build the extension
npm run build

# Run tests
npm test

Usage

Viewing Traces

  1. Visit your Langfuse URL (e.g., http://your-instance.eastus.azurecontainer.io:3000)
  2. Navigate to your project
  3. Click on "Traces" to see all agent executions
  4. Click on a trace to view detailed information:
    • LLM calls and responses
    • Tool executions
    • Token usage
    • Execution time
    • Error details

Analyzing Agent Behavior

Langfuse provides several views to analyze agent behavior:

Traces View: See the full execution flow

  • All LLM calls in chronological order
  • Input/output for each step
  • Execution time breakdown

Sessions View: Group related agent invocations

  • Track multi-turn conversations
  • Analyze session-level metrics

Metrics Dashboard: Aggregate statistics

  • Total token usage
  • Average response time
  • Error rates
  • Cost analysis

Debugging Failed Evaluations

When an evaluation test fails, use Langfuse to:

  1. Find the trace for that test run
  2. View the timeline of LLM calls
  3. Identify where the agent got stuck or looped
  4. Check token usage to see if context window was exhausted
  5. Examine error messages and stack traces

Example workflow:

# Run evaluation
npm run eval

# If it fails, check Langfuse for the trace
# Look for:
# - Repeated tool calls (doom loop)
# - High token usage (context exhaustion)
# - Long response times (timeout)
# - Error messages

Managing Azure Deployment

Use the management script to control your Azure deployment:

# Check status
./scripts/manage-langfuse-azure.sh status

# View logs
./scripts/manage-langfuse-azure.sh logs

# Restart container
./scripts/manage-langfuse-azure.sh restart

# Stop container (to save costs)
./scripts/manage-langfuse-azure.sh stop

# Start container
./scripts/manage-langfuse-azure.sh start

# Delete entire deployment
./scripts/manage-langfuse-azure.sh delete

Environment Variables

The following environment variables control Langfuse integration:

# Required for Langfuse to work
LANGFUSE_PUBLIC_KEY=pk-lf-... # Your project public key
LANGFUSE_SECRET_KEY=sk-lf-... # Your project secret key
LANGFUSE_BASE_URL=https://... # Langfuse instance URL

# Optional (for Azure deployment management)
LANGFUSE_RESOURCE_GROUP=... # Azure resource group name
LANGFUSE_CONTAINER_GROUP=... # Azure container group name
LANGFUSE_POSTGRES_SERVER=... # PostgreSQL server name

Troubleshooting

Langfuse not logging traces

  1. Check that environment variables are set:

    echo $LANGFUSE_PUBLIC_KEY
    echo $LANGFUSE_SECRET_KEY
    echo $LANGFUSE_BASE_URL
  2. Check browser console for Langfuse errors:

    [Langfuse] Integration enabled for LLM observability
  3. Verify Langfuse is accessible:

    curl $LANGFUSE_BASE_URL/api/health

Azure deployment issues

  1. Check container logs:

    ./scripts/manage-langfuse-azure.sh logs
  2. Verify container is running:

    ./scripts/manage-langfuse-azure.sh status
  3. Check PostgreSQL connectivity:

    az postgres flexible-server show \
    --resource-group vibe-langfuse-rg \
    --name <your-postgres-server>

High costs

If Azure costs are too high:

  1. Stop the container when not in use:

    ./scripts/manage-langfuse-azure.sh stop
  2. Use Langfuse Cloud free tier instead

  3. Reduce PostgreSQL tier:

    az postgres flexible-server update \
    --resource-group vibe-langfuse-rg \
    --name <your-server> \
    --sku-name Standard_B1ms

Best Practices

  1. Use descriptive trace names: Langfuse automatically uses the agent prompt as trace name
  2. Tag traces: Add metadata to traces for easier filtering
  3. Monitor token usage: Set up alerts for high token usage
  4. Regular cleanup: Archive or delete old traces to save storage
  5. Cost monitoring: Enable Azure Cost Management alerts

Security

  • Store .env.langfuse securely and never commit to git
  • Use Azure Key Vault for production credentials
  • Restrict PostgreSQL firewall rules to specific IPs
  • Enable SSL for PostgreSQL connections
  • Use Azure Private Link for container-to-database communication

Additional Resources

Langfuse Deployment Options - Comprehensive Comparison

What Happened with the Initial Deployment

Issues Encountered:

  1. Region restrictions: eastus, westus2 don't support PostgreSQL Flexible Server
  2. Provider registration: Required manual registration of Microsoft.DBforPostgreSQL and Microsoft.ContainerInstance
  3. Script timeout: PostgreSQL was created successfully but script timed out waiting for response
  4. Container API issues: Missing --os-type parameter in Azure CLI command

Root Cause: Azure subscription wasn't pre-configured with all required resource providers. This is common with new subscriptions.

What's Working Now (ALL FIXED):

  • ✅ PostgreSQL Flexible Server created and ready in centralus
  • ✅ Database created
  • ✅ Firewall configured
  • ✅ Container Instances deployment fixed (added --os-type Linux)
  • ✅ App Service deployment fixed (added WEBSITES_PORT=3000)
  • ✅ Container Apps deployment script created (RECOMMENDED)

Deployment Option Comparison

1. Azure Container Instances (ACI) ⭐ Current Approach

What It Is: Serverless containers without managing VMs

Pros:

  • ✓ Simplest Azure-native option
  • ✓ No cluster management
  • ✓ Pay per second
  • ✓ Fast startup (<1 minute)
  • ✓ Good for development/staging

Cons:

  • ✗ Limited scalability
  • ✗ No auto-scaling
  • ✗ Basic networking
  • ✗ Less suitable for production

Cost: ~$35/month (1 vCPU, 2GB RAM) + $12/month (PostgreSQL) Total: ~$47/month

Best For: Development, testing, small teams


What It Is: Managed Kubernetes-based platform (built on AKS/KEDA)

Pros:

  • ✓ Auto-scaling (0 to N replicas)
  • ✓ Built-in ingress with SSL
  • ✓ Better networking (VNet integration)
  • ✓ Revision management (blue/green deployments)
  • ✓ Can add PostgreSQL as managed add-on
  • ✓ Production-ready
  • ✓ No cluster management
  • ✓ Scale to zero (cost savings)

Cons:

  • ✗ Slightly more complex than ACI
  • ✗ Requires VNet understanding

Cost:

  • Free tier: 180,000 vCPU-seconds, 360,000 GiB-seconds/month
  • After free tier: ~$50-80/month (with PostgreSQL)

Best For: Production, scalable applications, cost optimization

Deployment:

# Create Container App with PostgreSQL add-on
az containerapp env create \
--name langfuse-env \
--resource-group vibe-langfuse-rg \
--location centralus

az postgres flexible-server create # (already done)

az containerapp create \
--name langfuse \
--resource-group vibe-langfuse-rg \
--environment langfuse-env \
--image ghcr.io/langfuse/langfuse:3 \
--target-port 3000 \
--ingress external \
--env-vars NODE_ENV=production \
--secrets database-url=... \
--cpu 1 --memory 2Gi \
--min-replicas 0 --max-replicas 3

3. Azure Kubernetes Service (AKS) ⭐⭐⭐ Enterprise

What It Is: Fully managed Kubernetes cluster

Pros:

  • ✓ Full Kubernetes features
  • ✓ Production-grade
  • ✓ Advanced networking/security
  • ✓ Multi-app hosting
  • ✓ Helm charts support
  • ✓ Complete control

Cons:

  • ✗ Most expensive
  • ✗ Requires Kubernetes expertise
  • ✗ Overhead for single app
  • ✗ More complex management

Cost:

  • Control plane: Free
  • Worker nodes: ~$73/month (2x Standard_B2s) minimum
  • PostgreSQL: $12/month Total: ~$85/month minimum

Best For: Large organizations, multiple applications, Kubernetes experts

Deployment:

az aks create \
--resource-group vibe-langfuse-rg \
--name langfuse-cluster \
--node-count 2 \
--node-vm-size Standard_B2s \
--enable-managed-identity

kubectl apply -f langfuse-deployment.yaml

4. VM with k3s ⭐⭐ Cost-Effective Advanced

What It Is: Lightweight Kubernetes (k3s) on a single VM

Pros:

  • ✓ Full Kubernetes capabilities
  • ✓ Lower cost than AKS
  • ✓ Can host multiple apps
  • ✓ Complete control
  • ✓ Good learning environment

Cons:

  • ✗ Manual VM management
  • ✗ Manual k3s maintenance
  • ✗ Single point of failure
  • ✗ Manual backups needed

Cost:

  • VM (Standard_B2s): ~$30/month
  • PostgreSQL: $12/month Total: ~$42/month

Best For: Learning Kubernetes, cost-conscious teams, tech-savvy users

Deployment:

# Create VM
az vm create \
--resource-group vibe-langfuse-rg \
--name langfuse-vm \
--image Ubuntu2204 \
--size Standard_B2s \
--admin-username azureuser \
--generate-ssh-keys

# SSH and install k3s
ssh azureuser@<vm-ip>
curl -sfL https://get.k3s.io | sh -

# Deploy Langfuse
kubectl apply -f langfuse.yaml

5. Azure App Service (Containers) ⚠️ Limited

What It Is: PaaS for web applications

Pros:

  • ✓ Simple deployment
  • ✓ Built-in CI/CD
  • ✓ Auto-scaling
  • ✓ Good for web apps

Cons:

  • ✗ Less flexible for containers
  • ✗ More expensive than ACI
  • ✗ Limited to HTTP/HTTPS
  • ✗ Not ideal for Langfuse

Cost: ~$55/month (Basic tier) + $12/month (PostgreSQL)

Best For: Traditional web apps, not recommended for Langfuse


6. Langfuse Cloud ⭐⭐⭐ Easiest

What It Is: Official Langfuse SaaS

Pros:

  • ✓ Zero infrastructure management
  • ✓ 5-minute setup
  • ✓ Free tier (50k observations/month)
  • ✓ Automatic updates
  • ✓ Professional support
  • ✓ High availability

Cons:

  • ✗ Data not in your Azure
  • ✗ Less control
  • ✗ Ongoing subscription cost

Cost:

  • Free: 50k observations/month
  • Hobby: $59/month (500k observations)
  • Pro: $299/month (5M observations)

Best For: Fast setup, teams without DevOps, testing phase

Setup:

# 1. Go to https://cloud.langfuse.com
# 2. Create account
# 3. Create project
# 4. Get API keys
./scripts/configure-langfuse.sh

Recommendation Matrix

Use CaseBest OptionWhy
Quick TestingLangfuse CloudNo infrastructure, 5 minutes
DevelopmentAzure Container AppsAuto-scale, cost-effective
Production (Small Team)Azure Container AppsBest balance of features/cost
Production (Enterprise)AKSFull control, multi-app support
Cost-ConsciousVM + k3sMost cost-effective self-hosted
Learning K8sVM + k3sFull K8s features, low cost
Current ACIMigrate to Container AppsBetter features, similar complexity

My Recommendation: Azure Container Apps

Why Container Apps?

  1. Best Balance: Production features without cluster complexity
  2. Cost Optimization: Scale to zero when not in use
  3. Future-Proof: Built on Kubernetes, can migrate to AKS later
  4. Better than ACI: Auto-scaling, better networking, SSL
  5. Easier than AKS: No cluster management, no node maintenance

Migration Path:

Current: ACI (working PostgreSQL)

Short term: Fix ACI deployment (1 hour)

Better: Migrate to Container Apps (2 hours)

Future: Migrate to AKS if needed (when team grows)

Cost Comparison (Monthly)

OptionComputeDatabaseTotalNotes
Langfuse CloudIncludedIncluded$0-299Free tier available
ACI$35$12$47Current approach
Container Apps$0-50*$12$12-62*Free tier + scale-to-zero
VM + k3s$30$12$42Requires management
AKS$73+$12$85+Minimum 2 nodes
App Service$55$12$67Not ideal for containers

Next Steps - Choose Your Path

Path A: Quick Win (5 minutes)

# Use Langfuse Cloud
./scripts/configure-langfuse.sh
# Enter cloud.langfuse.com URL and API keys

Path B: Fix Current ACI (30 minutes)

I'll create a fixed deployment script with proper parameters

Path C: Upgrade to Container Apps (2 hours)

I'll create a Container Apps deployment script

Path D: Full Control with k3s (4 hours)

I'll create a VM + k3s deployment script

Which path do you want to take?

YO