What is Langfuse

Langfuse is an open-source LLM engineering platform that helps teams collaboratively debug, analyze, and iterate on their LLM applications.
Introduction to Langfuse: https://langfuse.com/

How to Configure Langfuse

  1. Register for and log in to Langfuse on the official website.
  2. Create a project in Langfuse. After logging in, click New on the homepage to create your own project and give it a name. The project will be associated with your Dify applications for data monitoring.
  3. Create project API credentials. In the left sidebar of the project, click Settings, then click Create API Keys to create the project API credentials. Copy and save the Secret Key, Public Key, and Host.
  4. Configure Langfuse in Dify. Open the application you want to monitor, open Monitoring in the side menu, and select Tracing app performance on the page. Click Configure, paste the Secret Key, Public Key, and Host created in Langfuse into the configuration, and save. Once saved successfully, the status is shown on the current page; if it shows as started, the application is being monitored.
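
Before saving the credentials in Dify, you can optionally sanity-check them with the Langfuse Python SDK (pip install langfuse). This is a minimal sketch; the key values are placeholders for the ones copied from your Langfuse project settings.

```python
# Minimal credential check using the Langfuse Python SDK.
# The key values below are placeholders; use the Secret Key, Public Key,
# and Host copied from your Langfuse project settings.
from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-lf-...",             # Public Key from project settings
    secret_key="sk-lf-...",             # Secret Key from project settings
    host="https://cloud.langfuse.com",  # Host (or your self-hosted URL)
)

# auth_check() returns True when the credentials and host are valid.
print(langfuse.auth_check())
```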

Viewing Monitoring Data in Langfuse

After configuration, the debugging and production data of the application in Dify can be viewed in Langfuse.
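
Besides browsing traces in the Langfuse UI, you can pull them programmatically. Below is a minimal sketch against the Langfuse public API, assuming the standard basic-auth scheme (Public Key as username, Secret Key as password); adjust the host for a self-hosted instance.

```python
# List recent traces through the Langfuse public API.
# Assumes basic auth with the project's Public Key / Secret Key.
import requests

HOST = "https://cloud.langfuse.com"
PUBLIC_KEY = "pk-lf-..."
SECRET_KEY = "sk-lf-..."

resp = requests.get(
    f"{HOST}/api/public/traces",
    auth=(PUBLIC_KEY, SECRET_KEY),
    params={"limit": 10},  # page size; the API paginates results
)
resp.raise_for_status()

for trace in resp.json().get("data", []):
    print(trace.get("id"), trace.get("name"), trace.get("tags"))
```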

List of monitoring data

Workflow and Chatflow Trace Information

Used to trace the information of Workflows and Chatflows
Workflow → LangFuse Trace
  • workflow_app_log_id/workflow_run_id → id
  • user_session_id → user_id
  • workflow → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • Model token consumption → usage
  • metadata → metadata
  • error → level
  • error → status_message
  • [workflow] → tags
  • ["message", conversation_mode] → session_id
  • conversion_id → parent_observation_id
Workflow Trace Info
  • workflow_id - Unique ID of Workflow
  • conversation_id - Conversation ID
  • workflow_run_id - ID of this workflow run
  • tenant_id - Tenant ID
  • elapsed_time - Time taken by this run
  • status - Run status
  • version - Workflow version
  • total_tokens - Total tokens used in this run
  • file_list - List of files processed
  • triggered_from - Source that triggered this runtime
  • workflow_run_inputs - Input of this workflow
  • workflow_run_outputs - Output of this workflow
  • error - Error Message
  • query - Queries used at runtime
  • workflow_app_log_id - Workflow Application Log ID
  • message_id - Relevant Message ID
  • start_time - Start time of this runtime
  • end_time - End time of this runtime
  • workflow node executions - Workflow node runtime information
  • Metadata
    • workflow_id - Unique ID of Workflow
    • conversation_id - Conversation ID
    • workflow_run_id - ID of this workflow run
    • tenant_id - Tenant ID
    • elapsed_time - Time taken by this run
    • status - Run status
    • version - Workflow version
    • total_tokens - Total tokens used in this run
    • file_list - List of files processed
    • triggered_from - Source that triggered this runtime
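
To make the mapping above concrete, a single Dify workflow run might land in Langfuse roughly as sketched below. This is illustrative only: every value is hypothetical, and the exact shape depends on the Dify and Langfuse versions in use.

```python
# Illustrative only: a rough sketch of how the Dify workflow fields above
# could map onto a Langfuse trace. All values are hypothetical.
workflow_trace = {
    "id": "workflow_run_id-or-workflow_app_log_id",
    "user_id": "user_session_id",
    "name": "workflow",
    "input": {"query": "What is Langfuse?"},
    "output": {"answer": "An open-source LLM engineering platform."},
    "tags": ["workflow"],
    "metadata": {
        "workflow_id": "...",
        "tenant_id": "...",
        "elapsed_time": 1.3,       # seconds (hypothetical)
        "status": "succeeded",
        "version": "1.0",
        "total_tokens": 512,
    },
}
```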

Message Trace Information

Used to trace LLM conversations
Message → LangFuse Generation/Trace
  • message_id → id
  • user_session_id → user_id
  • message → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • Model token consumption → usage
  • metadata → metadata
  • error → level
  • error → status_message
  • ["message", conversation_mode] → tags
  • conversation_id → session_id
  • conversion_id → parent_observation_id
Message Trace Info
  • message_id - Message ID
  • message_data - Message data
  • user_session_id - Session ID for user
  • conversation_model - Conversation model
  • message_tokens - Message tokens
  • answer_tokens - Answer Tokens
  • total_tokens - Total Tokens from Message and Answer
  • error - Error Message
  • inputs - Input data
  • outputs - Output data
  • file_list - List of files processed
  • start_time - Start time
  • end_time - End time
  • message_file_data - File data associated with the message
  • conversation_mode - Conversation mode
  • Metadata
    • conversation_id - Conversation ID
    • ls_provider - Model provider
    • ls_model_name - Model ID
    • status - Message status
    • from_end_user_id - Sending user’s ID
    • from_account_id - Sending account’s ID
    • agent_based - Whether agent based
    • workflow_run_id - Workflow ID of this runtime
    • from_source - Message source
    • message_id - Message ID
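
The LLM generations recorded for a conversation can also be listed programmatically. Below is a minimal sketch against the Langfuse public API, assuming the observations endpoint accepts traceId and type filters; the trace ID is a placeholder.

```python
# List the LLM generations of a single trace via the Langfuse public API.
# Assumes /api/public/observations accepts traceId and type filters.
import requests

HOST = "https://cloud.langfuse.com"
PUBLIC_KEY = "pk-lf-..."
SECRET_KEY = "sk-lf-..."

resp = requests.get(
    f"{HOST}/api/public/observations",
    auth=(PUBLIC_KEY, SECRET_KEY),
    params={"traceId": "replace-with-a-trace-id", "type": "GENERATION"},
)
resp.raise_for_status()

for obs in resp.json().get("data", []):
    print(obs.get("name"), obs.get("model"), obs.get("startTime"))
```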

Moderation Trace Information

Used to track conversation moderation
Moderation → LangFuse Generation/Trace
  • user_id → user_id
  • moderation → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • metadata → metadata
  • [moderation] → tags
  • message_id → parent_observation_id
Moderation Trace Info
  • message_id - Message ID
  • user_id - User ID
  • workflow_app_log_id - Workflow application log ID
  • inputs - Input data for review
  • message_data - Message Data
  • flagged - Whether it is flagged for attention
  • action - Specific actions to implement
  • preset_response - Preset response
  • start_time - Start time of review
  • end_time - End time of review
  • Metadata
    • message_id - Message ID
    • action - Specific actions to implement
    • preset_response - Preset response

Suggested Question Trace Information

Used to track suggested questions
Suggested Question → LangFuse Generation/Trace
  • user_id → user_id
  • suggested_question → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • metadata → metadata
  • [suggested_question] → tags
  • message_id → parent_observation_id
Suggested Question Trace Info
  • message_id - Message ID
  • message_data - Message data
  • inputs - Input data
  • outputs - Output data
  • start_time - Start time
  • end_time - End time
  • total_tokens - Total tokens
  • status - Message Status
  • error - Error Message
  • from_account_id - Sending account ID
  • agent_based - Whether agent based
  • from_source - Message source
  • model_provider - Model provider
  • model_id - Model ID
  • suggested_question - Suggested question
  • level - Status level
  • status_message - Message status
  • Metadata
    • message_id - Message ID
    • ls_provider - Model Provider
    • ls_model_name - Model ID
    • status - Message status
    • from_end_user_id - Sending user’s ID
    • from_account_id - Sending Account ID
    • workflow_run_id - Workflow ID of this runtime
    • from_source - Message source

Dataset Retrieval Trace Information

Used to track knowledge base retrieval
Dataset Retrieval → LangFuse Generation/Trace
  • user_id → user_id
  • dataset_retrieval → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • metadata → metadata
  • [dataset_retrieval] → tags
  • message_id → parent_observation_id
Dataset Retrieval Trace Info
  • message_id - Message ID
  • inputs - Input Message
  • documents - Document data
  • start_time - Start time
  • end_time - End time
  • message_data - Message data
  • Metadata
    • message_id - Message ID
    • ls_provider - Model Provider
    • ls_model_name - Model ID
    • status - Model status
    • from_end_user_id - Sending user’s ID
    • from_account_id - Sending account’s ID
    • agent_based - Whether agent based
    • workflow_run_id - Workflow ID of this runtime
    • from_source - Message Source

Tool Trace Information

Used to track tool invocation
Tool → LangFuse Generation/Trace
  • user_id → user_id
  • tool_name → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • metadata → metadata
  • ["tool", tool_name] → tags
  • message_id → parent_observation_id
Tool Trace Info
  • message_id - Message ID
  • tool_name - Tool Name
  • start_time - Start time
  • end_time - End time
  • tool_inputs - Tool inputs
  • tool_outputs - Tool outputs
  • message_data - Message data
  • error - Error message, if one exists
  • inputs - Input of Message
  • outputs - Output of Message
  • tool_config - Tool config
  • time_cost - Time cost
  • tool_parameters - Tool Parameters
  • file_url - URL of relevant files
  • Metadata
    • message_id - Message ID
    • tool_name - Tool Name
    • tool_inputs - Tool inputs
    • tool_outputs - Tool outputs
    • tool_config - Tool config
    • time_cost - Time cost
    • error - Error Message
    • tool_parameters - Tool parameters
    • message_file_id - Message file ID
    • created_by_role - Created by role
    • created_user_id - Created user ID

Generate Name Trace

Used to track conversation title generation
Generate Name → LangFuse Generation/Trace
  • user_id → user_id
  • generate_name → name
  • start_time → start_time
  • end_time → end_time
  • inputs → input
  • outputs → output
  • metadata → metadata
  • [generate_name] → tags
Generate Name Trace Info
  • conversation_id - Conversation ID
  • inputs - Input data
  • outputs - Generated session name
  • start_time - Start time
  • end_time - End time
  • tenant_id - Tenant ID
  • Metadata
    • conversation_id - Conversation ID
    • tenant_id - Tenant ID

Langfuse Prompt Management

The Langfuse Prompt Management Plugin (community maintained) lets you use prompts that are managed and versioned in Langfuse in your Dify applications, enhancing your LLM application development workflow. Key features include:
  • Get Prompt: Fetch specific prompts managed in Langfuse.
  • Search Prompts: Search for prompts in Langfuse using various filters.
  • Update Prompt: Create new versions of prompts in Langfuse and set tags/labels.
This integration streamlines the process of managing and versioning your prompts, contributing to more efficient development and iteration cycles. You can find the plugin and installation instructions here.
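
Outside of the Dify plugin, the same get/create operations are also available directly through the Langfuse Python SDK. Below is a minimal sketch; the prompt name, content, label, and variables are hypothetical.

```python
# Minimal prompt-management sketch with the Langfuse Python SDK.
# The prompt name, content, label, and variables are hypothetical.
from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    host="https://cloud.langfuse.com",
)

# Create (or add a new version of) a prompt and label it for production use.
langfuse.create_prompt(
    name="support-answer",
    prompt="Answer the customer question: {{question}}",
    labels=["production"],
)

# Fetch the production version and fill in its template variables.
prompt = langfuse.get_prompt("support-answer")
print(prompt.compile(question="How do I reset my password?"))
```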