Langfuse in the Enterprise
Langfuse addresses key challenges when deploying LLM-based applications within an enterprise. Because it is open source, Langfuse can be self-hosted, which helps enterprises address data security and privacy concerns. This document answers common questions about using Langfuse in an enterprise setting.
Langfuse is licensed as an open-core project. Its core tracing features are MIT-licensed and freely available, including for commercial use. Some peripheral Langfuse features are not available in the open-source version and cannot be used out of the box; please refer to the Enterprise Edition FAQ here. Reach out to enterprise@langfuse.com to discuss an enterprise license (self-hosted or cloud) for your team. Enterprise licenses start at $500/month.
Select Introductory Resources
- Overview: Slide Deck Langfuse (July 2024)
- Why Langfuse?
- Langfuse has been included in the Thoughtworks Tech Radar
- Enterprise Edition (EE) FAQ
Introduction to Langfuse
Langfuse features along the development lifecycle
Resources
- Product
  - Scope of LLM Engineering Platform
  - Positioning of Langfuse
  - Technical documentation with detailed information on all features and integrations
  - Interactive demo of the Langfuse platform
  - Changelog
  - Roadmap
  - Traces as the core of the LLMOps workflow (webinar + slides)
  - Langfuse has been included in the Thoughtworks Tech Radar
- Langfuse Cloud
  - Cloud Pricing
  - Security and Compliance - includes information on GDPR, SOC 2, and ISO 27001 compliance
- Self-hosting
  - Open Source Licensing
  - Feature availability in Enterprise Edition vs. Open Source
  - Architecture and component diagram: Contributing.md. Note that the architecture will change with Langfuse v3.
  - Langfuse Self-Hosting: EE Terms and Conditions
FAQ
We collect the most common questions and answers here. If your question is not answered below, please reach out to us at enterprise@langfuse.com.
What deployment options are available for Langfuse?
- Managed Cloud (cloud.langfuse.com); see the Pricing and Security pages for details.
- Self-hosted on your own infrastructure. Contact us if you are interested in additional support. Note that some of the infrastructure requirements will change with Langfuse v3.
What is the difference between Langfuse Cloud and the open-source version?
The Langfuse team provides Langfuse Cloud as a managed solution to simplify the initial setup of Langfuse and to minimize the operational overhead of maintaining high availability in production. Alternatively, you can choose to self-host Langfuse on your own infrastructure.
Some features are not available in the open-source version. Please refer to the overview here.
How does Authentication and RBAC work in Langfuse?
Langfuse offers a set of prebuilt roles that apply at the organization and project level to restrict access (see the RBAC documentation).
If needed, environments (production, staging, development) can be separated into different Langfuse projects. This restricts access to production and other sensitive data while making it easy to share development environments with the team and other stakeholders.
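As an illustration, the sketch below shows how an application could select a Langfuse project per environment by loading a different API key pair. It assumes the Langfuse Python SDK; the environment variable names and the `APP_ENV` switch are illustrative, not a prescribed convention.

```python
import os

from langfuse import Langfuse

# Each Langfuse project has its own API key pair, so pointing the SDK at a
# different project is just a matter of which keys are loaded. The variable
# names below are illustrative; use whatever configuration mechanism your
# stack already provides.
ENVIRONMENT = os.environ.get("APP_ENV", "development")  # e.g. production, staging

langfuse = Langfuse(
    public_key=os.environ[f"LANGFUSE_{ENVIRONMENT.upper()}_PUBLIC_KEY"],
    secret_key=os.environ[f"LANGFUSE_{ENVIRONMENT.upper()}_SECRET_KEY"],
    host=os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com"),
)
```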
Setting up SSO with Langfuse is simple. Google, GitHub, Azure AD, Okta, Auth0, and AWS Cognito are currently supported, and we can add additional providers based on your requirements. As an enterprise customer, you can also enforce SSO for your organization.
What is the easiest way to try Langfuse?
The Hobby Plan on Langfuse Cloud includes enough resources to try Langfuse for free in a non-production environment, no credit card required.
Alternatively, you can quickly spin up Langfuse on your own machine using `docker compose up` (docs).
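Once the local instance is running, you can verify it end to end by sending a first trace. The snippet below is a minimal sketch using the Python SDK's low-level client (the exact client API varies by SDK version, see the docs); it assumes the default local port 3000, and the placeholder keys must be replaced with keys created for a project in the local UI.

```python
from langfuse import Langfuse

# Point the SDK at the local docker compose deployment. The host assumes the
# default port mapping; create a project in the local UI and paste its API
# keys in place of the placeholders below.
langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="http://localhost:3000",
)

assert langfuse.auth_check()  # verifies the credentials against the instance

# Send a first trace and flush before the script exits.
langfuse.trace(name="hello-langfuse", input="ping", output="pong")
langfuse.flush()
```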
If you require security and compliance features to run a POC, please reach out to us at enterprise@langfuse.com.
Common Enterprise LLM Platform Architecture
Langfuse aims to address the challenges of debugging, monitoring, and continuously improving LLM-based applications. It focuses on observability, evaluation, and prompt management.
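To make the observability part concrete, here is a minimal sketch using the decorator-based Python SDK (the import path may differ across SDK versions). The function names and bodies are placeholders standing in for a retrieval step and an LLM call.

```python
from langfuse.decorators import observe

# Nested @observe() functions are captured as a single trace with spans,
# which is the raw material for debugging, evaluation, and monitoring.
@observe()
def retrieve_context(question: str) -> str:
    return "..."  # placeholder for e.g. a vector store lookup

@observe()
def answer_question(question: str) -> str:
    context = retrieve_context(question)
    return f"Answer based on: {context}"  # placeholder for an LLM call

answer_question("What deployment options does Langfuse offer?")
```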
Langfuse is often deployed alongside a central LLM Gateway that provides schema translation, rate limiting, and PII redaction. The LLM Gateway can be an internal service or an open-source project like LiteLLM. If you use LiteLLM, you can leverage the native integration (docs).
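For reference, the sketch below shows the SDK-side variant of that integration, where LiteLLM forwards traces to Langfuse via its callback mechanism. The placeholder keys and the model name are illustrative; a LiteLLM proxy deployment would configure the same callbacks in its config file instead (see the linked docs).

```python
import os

import litellm

# Langfuse credentials are read from the environment by the LiteLLM callback.
# Replace the placeholder values with real keys.
os.environ["LANGFUSE_PUBLIC_KEY"] = "pk-lf-..."
os.environ["LANGFUSE_SECRET_KEY"] = "sk-lf-..."
os.environ["LANGFUSE_HOST"] = "https://cloud.langfuse.com"  # or your self-hosted URL

# Register Langfuse as a logging callback for successful and failed calls.
litellm.success_callback = ["langfuse"]
litellm.failure_callback = ["langfuse"]

response = litellm.completion(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Hello from behind the gateway"}],
)
```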
Talk to us
Schedule an introduction call to discuss your specific needs and requirements.