Model Plugins
Introduction to the basic concepts and structure of model plugins. Model plugins allow Dify to call various models from different providers (such as OpenAI, Anthropic, Google, etc.), including large language models (LLMs), text embeddings, speech-to-text, and other types.
Model plugins enable the Dify platform to call the models offered by a specific provider. For example, after installing the OpenAI model plugin, the Dify platform can call OpenAI models such as GPT-4, GPT-4o-2024-05-13, etc.
Model Plugin Structure
To facilitate understanding the concepts involved in developing model plugins, here is a brief introduction to the structure within model plugins:
- Model Provider: Companies that develop large models, such as OpenAI, Anthropic, Google, etc.
- Model Categories: Depending on the model provider, there are categories like Large Language Models (LLM), Text Embedding models, Speech-to-Text models, etc.
- Specific Models: claude-3-5-sonnet, gpt-4-turbo, etc.
Code hierarchy structure in plugin projects:
Taking Anthropic as an example, the model plugin structure looks like this:
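Since Anthropic provides only one model category (LLMs), a typical layout might look like the sketch below. The file and model names here are illustrative placeholders, not an exact copy of the published plugin:

```
├── _assets/                 # provider icons and other static assets
├── models/
│   └── llm/                 # single category: large language models
│       ├── llm.py           # invocation logic shared by all LLMs
│       ├── claude-3-5-sonnet-20240620.yaml   # per-model configuration
│       └── claude-3-haiku-20240307.yaml
├── provider/
│   ├── anthropic.py         # provider logic, e.g. credential validation
│   └── anthropic.yaml       # provider-level configuration
├── main.py                  # plugin entry point
├── manifest.yaml            # plugin metadata
└── requirements.txt         # Python dependencies
```

Each model category gets one subdirectory under models/, containing the shared invocation code plus one YAML file per concrete model.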
Taking OpenAI as an example, since it supports multiple model types, there are multiple model categories, structured as follows:
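Because OpenAI exposes several model categories, the models/ directory gains one subdirectory per category. Again, the names below are an illustrative sketch rather than the actual plugin contents:

```
├── _assets/
├── models/
│   ├── llm/                     # large language models
│   │   ├── llm.py
│   │   ├── gpt-4.yaml
│   │   └── gpt-4o-2024-05-13.yaml
│   ├── text_embedding/          # text embedding models
│   │   ├── text_embedding.py
│   │   └── text-embedding-3-small.yaml
│   └── speech2text/             # speech-to-text models
│       ├── speech2text.py
│       └── whisper-1.yaml
├── provider/
│   ├── openai.py
│   └── openai.yaml
├── main.py
├── manifest.yaml
└── requirements.txt
```

The pattern is the same as the single-category case, repeated once per model category the provider supports.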
Model Configuration
Model plugins define model behavior and properties through configuration files. For detailed model design rules and configuration formats, please refer to the Model Design Rules document and Model Schema specifications.
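As a hedged illustration of what such a per-model configuration file can contain, the sketch below declares a model's identity, properties, and parameter rules. The field names follow Dify's model schema, but the exact set of supported fields and values is defined by the Model Design Rules and Model Schema documents referenced below, which take precedence over this example:

```yaml
model: gpt-4                 # model identifier used when calling the provider
label:
  en_US: GPT-4               # display name shown in the Dify UI
model_type: llm              # category: llm, text-embedding, speech2text, etc.
model_properties:
  mode: chat                 # chat or completion style invocation
  context_size: 8192         # maximum context window in tokens
parameter_rules:             # user-tunable invocation parameters
  - name: temperature
    use_template: temperature
  - name: max_tokens
    use_template: max_tokens
    default: 512
    min: 1
    max: 8192
```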
Further Reading
- Quick Integration of a New Model - Learn how to add new models for already supported providers
- Model Design Rules - Learn detailed specifications for model configuration
- Model Schema - Gain a deeper understanding of model plugin architecture
- General Specification Definitions - Learn how to define plugin metadata
- Basic Concepts of Plugin Development - Return to the plugin development getting started guide