Langfuse Documentation
Open-source LLM observability platform for monitoring AI performance and managing evaluation datasets.
Langfuse provides monitoring and observability for LLM applications, enabling teams to track performance metrics, manage evaluation datasets, and debug model behavior. AI teams use it to ensure reliability and compliance. It differentiates through its open-source availability and integrated dataset management, which support continuous model evaluation and risk assessment.
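To make the monitoring idea concrete, the sketch below models the kind of per-call trace record an LLM observability platform captures (model, latency, token counts) and aggregates it for reporting. All names and fields here are illustrative assumptions, not Langfuse's actual SDK or schema.

```python
from dataclasses import dataclass, field

# Hypothetical trace record illustrating what an LLM observability
# platform logs per model call. Field names are assumptions for
# illustration, not Langfuse's real API.

@dataclass
class Generation:
    model: str
    prompt: str
    completion: str
    latency_ms: float
    input_tokens: int
    output_tokens: int

@dataclass
class Trace:
    name: str
    generations: list = field(default_factory=list)

    def log(self, gen: Generation) -> None:
        # Append one model call to this trace.
        self.generations.append(gen)

    def total_tokens(self) -> int:
        # Aggregate token usage across all calls in the trace.
        return sum(g.input_tokens + g.output_tokens for g in self.generations)

trace = Trace(name="support-chat")
trace.log(Generation(model="gpt-4o", prompt="Hi", completion="Hello!",
                     latency_ms=420.0, input_tokens=3, output_tokens=4))
print(trace.total_tokens())  # → 7
```

In a real deployment, records like these would be sent to the observability backend, where dashboards aggregate latency and token metrics across traces.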
Adjacent tooling:
Aporia
Monitor, test, and safeguard LLMs in production with observability and guardrails.
Dataiku EU AI Act Readiness
Platform helping organizations assess and manage EU AI Act compliance risks.
DataRobot
Real-time AI governance, monitoring and compliance platform for enterprises.
Earthian AI
Enterprise risk management platform purpose-built for AI systems.
IBM watsonx.governance
Unified AI governance platform for model lifecycle management and compliance tracking.
Lakera
LLM security and guardrails for enterprise AI deployment risk management.