Once you have downloaded the `dify` CLI tool and copied it to the `/usr/local/bin` path, you can run the following command to create a new plugin project:
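A sketch of the scaffolding invocation, assuming the CLI exposes a `plugin init` subcommand (check `dify --help` for the exact name in your installed version):

```shell
dify plugin init
```

The tool then typically walks you through the project settings (plugin name, author, description) interactively.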
Follow the prompts and select the `LLM` type plugin. The provider's configuration file is placed under the `/providers` path. Here's an example of the `anthropic.yaml` configuration file for Anthropic:
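The manifest below is a trimmed sketch following the predefined-model pattern; field names should be verified against the current Dify provider schema:

```yaml
provider: anthropic
label:
  en_US: Anthropic
icon_small:
  en_US: icon_s_en.svg
icon_large:
  en_US: icon_l_en.svg
supported_model_types:
  - llm
configurate_methods:
  - predefined-model
provider_credential_schema:
  credential_form_schemas:
    - variable: anthropic_api_key
      label:
        en_US: API Key
      type: secret-input
      required: true
      placeholder:
        en_US: Enter your API Key
```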
If a vendor offers custom models, for example OpenAI provides fine-tuned models, you need to add the `model_credential_schema` field. The following is sample code for the OpenAI family of models:
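A sketch of what the extra field looks like; the nested keys here follow the credential-form pattern used elsewhere in the provider schema and should be checked against the current schema definition:

```yaml
model_credential_schema:
  model:
    label:
      en_US: Model Name
    placeholder:
      en_US: Enter your model name
  credential_form_schemas:
    - variable: openai_api_key
      label:
        en_US: API Key
      type: secret-input
      required: true
```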
Create a Python file, e.g. `anthropic.py`, in the `/providers` folder and implement a class that inherits from the `__base.provider.Provider` base class, e.g. `AnthropicProvider`. The following is the Anthropic sample code:
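As a self-contained sketch of the pattern: the `Provider` base and `CredentialsValidateFailedError` below are stand-ins for the SDK classes, not the real imports, and the check performed is simplified for illustration.

```python
from abc import ABC, abstractmethod


class CredentialsValidateFailedError(Exception):
    """Stand-in for the SDK's credential-validation error type."""


class Provider(ABC):
    """Stand-in for the __base.provider.Provider base class, which
    requires every provider to implement credential validation."""

    @abstractmethod
    def validate_provider_credentials(self, credentials: dict) -> None:
        ...


class AnthropicProvider(Provider):
    def validate_provider_credentials(self, credentials: dict) -> None:
        # A real plugin would delegate to the LLM model class's own
        # credential check; this sketch only verifies a key is present.
        if not credentials.get("anthropic_api_key"):
            raise CredentialsValidateFailedError("anthropic_api_key is required")
```

In a real plugin, the except-and-reraise around the underlying model call is where you would log validation failures before surfacing them to the platform.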
Inherit the `__base.model_provider.ModelProvider` base class and implement the `validate_provider_credentials` unified provider credentials validation method; see `AnthropicProvider` for reference. You can also stub out the `validate_provider_credentials` implementation first and reuse the model credential verification method once it is implemented. For other types of model providers, please refer to the following configuration methods.
For a custom model provider such as Xinference, you can skip the full implementation step. Simply create an empty class called `XinferenceProvider` and implement an empty `validate_provider_credentials` method in it.
Detailed Explanation:
• XinferenceProvider is a placeholder class used to identify custom model providers.
• While the validate_provider_credentials method won't actually be called, it must exist because its parent class is abstract and requires all child classes to implement this method. Providing an empty implementation avoids the instantiation error that would otherwise be raised for an unimplemented abstract method.