Langfuse Integrations Overview
Integrate your application with Langfuse to explore production traces and metrics.
Objectives:
- Capture traces of your application
- Add scores to these traces to measure and evaluate output quality
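To make these two steps concrete, here is a minimal sketch using the low-level Langfuse Python SDK (assuming a v2-style `langfuse.trace()` / `trace.score()` interface; the trace, generation, and score names are illustrative):

```python
from langfuse import Langfuse

# Reads LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY and LANGFUSE_HOST from the environment
langfuse = Langfuse()

# 1. Capture a trace of your application (optionally with nested observations)
trace = langfuse.trace(name="qa-request", input="What is Langfuse?")
trace.generation(
    name="llm-call",
    model="gpt-4o-mini",
    input="What is Langfuse?",
    output="Langfuse is an open-source LLM engineering platform.",
)

# 2. Add a score to the trace to measure/evaluate output quality
trace.score(name="user-feedback", value=1, comment="Helpful answer")

# Events are sent asynchronously; flush before a short-lived script exits
langfuse.flush()
```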
Main Integrations
Integration | Supports | Description |
---|---|---|
SDK | Python, JS/TS | Manual instrumentation using the SDKs for full flexibility. |
OpenAI | Python, JS/TS | Automated instrumentation via a drop-in replacement for the OpenAI SDK (see the example below the table). |
Langchain | Python, JS/TS | Automated instrumentation by passing a callback handler to your Langchain application (see the example below the table). |
LlamaIndex | Python | Automated instrumentation via the LlamaIndex callback system. |
Haystack | Python | Automated instrumentation via the Haystack content tracing system. |
LiteLLM | Python, JS/TS (proxy only) | Use any LLM as a drop-in replacement for GPT. Works with Azure, OpenAI, Cohere, Anthropic, Ollama, vLLM, SageMaker, Hugging Face, Replicate (100+ LLMs). |
Vercel AI SDK | JS/TS | TypeScript toolkit designed to help developers build AI-powered applications with React, Next.js, Vue, Svelte, and Node.js. |
API | | Directly call the public API. OpenAPI spec available. |
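As a quick illustration of the OpenAI integration, the following sketch assumes the Python drop-in wrapper exposed as `langfuse.openai` (the model and generation name are placeholders):

```python
from langfuse.openai import OpenAI  # drop-in replacement for `from openai import OpenAI`

client = OpenAI()  # OPENAI_API_KEY and Langfuse credentials are read from the environment

completion = client.chat.completions.create(
    name="hello-world-generation",  # optional Langfuse-specific metadata
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello to Langfuse"}],
)
print(completion.choices[0].message.content)
```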
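Similarly, a minimal sketch of the Langchain integration, assuming the Python `CallbackHandler` from `langfuse.callback` (the prompt, chain, and model are illustrative):

```python
from langfuse.callback import CallbackHandler
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

langfuse_handler = CallbackHandler()  # Langfuse credentials are read from the environment

prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | ChatOpenAI(model="gpt-4o-mini")

# Pass the handler per invocation; all chain and LLM steps are traced automatically
chain.invoke({"topic": "observability"}, config={"callbacks": [langfuse_handler]})
```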
Packages integrated with Langfuse
Name | Type | Description |
---|---|---|
Instructor | Library | Library to get structured LLM outputs (JSON, Pydantic) |
DSPy | Library | Framework that systematically optimizes language model prompts and weights |
Mirascope | Library | Python toolkit for building LLM applications. |
Ollama | Model (local) | Easily run open source LLMs on your own machine (see the example below the table). |
Amazon Bedrock | Model | Run foundation and fine-tuned models on AWS. |
Flowise | Chat/Agent UI | JS/TS no-code builder for customized LLM flows. |
Langflow | Chat/Agent UI | Python-based UI for LangChain, designed with react-flow to provide an effortless way to experiment and prototype flows. |
Dify | Chat/Agent UI | Open source LLM app development platform with no-code builder. |
OpenWebUI | Chat/Agent UI | Self-hosted LLM chat web UI supporting various LLM runners, including self-hosted and local models. |
Promptfoo | Tool | Open source LLM testing platform. |
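Because Ollama exposes an OpenAI-compatible API, local models can be traced with the same OpenAI drop-in wrapper. A minimal sketch (the base URL and model name `llama3` are assumptions about your local setup):

```python
from langfuse.openai import OpenAI  # Langfuse drop-in wrapper for the OpenAI SDK

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # placeholder; Ollama does not check the key
)

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Why is observability useful for LLM apps?"}],
)
print(response.choices[0].message.content)
```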
Unsure which integration to choose? Ask us on Discord or in the chat.
Request a new integration
We use GitHub Discussions to track interest in new integrations. Please upvote/add to the list below if you’d like to see a new integration.
End-to-end examples
If you want to see how things work together, you can look at the end-to-end examples below. They are Jupyter notebooks that you can easily run in Google Colab or locally.
Generally, we recommend reading the get started guides for each integration first.
Integrations
- Integration Azure Openai Langchain
- Integration Dspy
- Integration Haystack
- Observability & Tracing for Instructor
- Integration Langchain
- Open Source Observability for LangGraph
- Integration Langserve
- Integration Litellm Proxy
- Integration Llama-index
- Integration Llama-index Instrumentation
- Integration Llama-index Milvus-lite
- Monitoring LlamaIndex applications with PostHog and Langfuse
- Integration Mirascope
- Integration Mistral Sdk
- Ollama Observability and Tracing for local LLMs using Langfuse
- OSS Observability for OpenAI Assistants API
- Integration Openai Sdk
- Observe OpenAI Structured Outputs with Langfuse
- Js Integration Langchain
- Js Integration Litellm Proxy
- Js Integration Openai
- Js Tracing Example Vercel Ai Sdk