Type `/` or `{` in the prompt editor to insert special variable blocks or upstream node variables into the prompt as context content.
The `result` of knowledge retrieval needs to be configured in the context variable within the LLM node for association and assignment. After association, inserting the context variable at the appropriate position in the prompt incorporates the externally retrieved knowledge into the prompt.
Because its data structure contains segment reference information, this variable not only introduces external knowledge into the prompt context for LLM responses but also supports the application's citation and attribution feature.
Refer to File Upload for guidance on building a Chatflow/Workflow application with file upload functionality.

Conversation History

To achieve conversational memory in text completion models (e.g., gpt-3.5-turbo-instruct), Dify designed the conversation history variable in the original Prompt Expert Mode (now discontinued). This variable is carried over to the LLM node in Chatflow and is used to insert chat history between the AI and the user into the prompt, helping the LLM understand the context of the conversation.
gpt-4.
JSON Schema Editor
fields such as `name`, `email`, and `age`, without nested structures
Note: Object and array type fields can contain child fields.
Note: Deleting an object or array removes all its child fields.
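As a rough illustration (not output produced by Dify), a flat schema with the `name`, `email`, and `age` fields described above might look like:

```json
{
  "type": "object",
  "properties": {
    "name":  { "type": "string" },
    "email": { "type": "string", "format": "email" },
    "age":   { "type": "integer" }
  },
  "required": ["name", "email", "age"]
}
```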
“I need a JSON Schema for user profiles with username (string), age (number), and interests (array).”
(e.g., `order_details`, `product_lists`), as well as advanced keywords such as `pattern` (regex matching) or `oneOf` (multiple type support)
“I need a JSON Schema for user profiles with username (string), age (number), and interests (array).”
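A schema satisfying that request could look roughly like the following; this is one possible shape, and an actual AI-generated schema may differ:

```json
{
  "type": "object",
  "properties": {
    "username":  { "type": "string" },
    "age":       { "type": "number" },
    "interests": {
      "type": "array",
      "items": { "type": "string" }
    }
  }
}
```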
Connect the `result` of the knowledge retrieval node into the context variable of the LLM node. The `result` variable output by the Knowledge Retrieval node also includes segment reference information; you can view the source of the information through the Citation and Attribution feature.
Insert the `text` output of the document extractor node into the prompt of the LLM node.