Constructing an `LLMModelConfig` by hand is not recommended. Instead, let users choose the model they want in the UI. To support this, update the tool's parameter list with the configuration below, adding a `model` parameter:
```yaml
identity:
  name: llm
  author: Dify
  label:
    en_US: LLM
    zh_Hans: LLM
    pt_BR: LLM
description:
  human:
    en_US: A tool for invoking a large language model
    zh_Hans: 用于调用大型语言模型的工具
    pt_BR: A tool for invoking a large language model
  llm: A tool for invoking a large language model
parameters:
  - name: prompt
    type: string
    required: true
    label:
      en_US: Prompt string
      zh_Hans: 提示字符串
      pt_BR: Prompt string
    human_description:
      en_US: the prompt text sent to the model
      zh_Hans: 发送给模型的提示内容
      pt_BR: the prompt text sent to the model
    llm_description: the prompt to send to the large language model
    form: llm
  - name: model
    type: model-selector
    scope: llm
    required: true
    label:
      en_US: Model
      zh_Hans: 使用的模型
      pt_BR: Model
    human_description:
      en_US: Model
      zh_Hans: 使用的模型
      pt_BR: Model
    llm_description: which Model to invoke
    form: form
extra:
  python:
    source: tools/llm.py
```
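For reference, the value delivered for a `model-selector` parameter is a ready-to-use model configuration that can be passed straight to `model_config`. A minimal sketch of what `tool_parameters.get('model')` might look like at runtime; the concrete keys and values here are assumptions for illustration, mirroring the fields of `LLMModelConfig`:

```python
# Hypothetical runtime value of tool_parameters.get('model') after the
# user picks a model in the UI; the exact contents depend on the chosen
# provider and model and are shown here only for illustration.
model_selection = {
    'provider': 'openai',                        # assumed provider
    'model': 'gpt-4o-mini',                      # assumed model name
    'mode': 'chat',                              # chat-style invocation
    'completion_params': {'temperature': 0.7},   # optional generation settings
}
```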
Note that this example sets the `model` parameter's `scope` to `llm`, so the user can only select models of the `llm` type. The code from the earlier example can then be rewritten as follows:
```python
from collections.abc import Generator
from typing import Any

from dify_plugin import Tool
from dify_plugin.entities.tool import ToolInvokeMessage
from dify_plugin.entities.model.message import SystemPromptMessage, UserPromptMessage


class LLMTool(Tool):
    def _invoke(self, tool_parameters: dict[str, Any]) -> Generator[ToolInvokeMessage]:
        # The model-selector value chosen by the user in the UI is passed
        # straight through as the model configuration.
        response = self.session.model.llm.invoke(
            model_config=tool_parameters.get('model'),
            prompt_messages=[
                SystemPromptMessage(
                    content='you are a helpful assistant'
                ),
                UserPromptMessage(
                    # 'prompt' matches the parameter name declared in the YAML above
                    content=tool_parameters.get('prompt')
                )
            ],
            stream=True
        )
        # Relay the model output as text messages, chunk by chunk.
        for chunk in response:
            if chunk.delta.message:
                assert isinstance(chunk.delta.message.content, str)
                yield self.create_text_message(text=chunk.delta.message.content)
```
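If streaming output is not needed, the same call can be made with `stream=False`. A brief sketch, assuming the non-streaming call returns a single `LLMResult`-style object whose `message.content` holds the full completion rather than a generator of chunks:

```python
# Sketch of a non-streaming variant; assumes stream=False returns one
# result object with the complete message instead of a chunk generator.
result = self.session.model.llm.invoke(
    model_config=tool_parameters.get('model'),
    prompt_messages=[
        SystemPromptMessage(content='you are a helpful assistant'),
        UserPromptMessage(content=tool_parameters.get('prompt')),
    ],
    stream=False,
)
yield self.create_text_message(text=result.message.content)
```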