Learn about the Different Models Supported by Dify.

Dify is a development platform for building AI applications on top of LLMs. When you use Dify for the first time, go to Settings --> Model Providers to add and configure the LLM you plan to use.

Dify supports major model providers like OpenAI's GPT series and Anthropic's Claude series. Each model's capabilities and parameters differ, so select a model provider that suits your application's needs. Obtain the API key from the model provider's official website before using it in Dify.

Model Types in Dify

Dify classifies models into four types, each serving a different purpose:

  1. System Inference Models: Used in applications for tasks like chat, name generation, and suggesting follow-up questions.

    Providers include OpenAI, Azure OpenAI Service, Anthropic, Hugging Face Hub, Replicate, Xinference, OpenLLM, iFLYTEK SPARK, WENXINYIYAN, TONGYI, Minimax, ZHIPU (ChatGLM), Ollama, and LocalAI.

  2. Embedding Models: Employed for embedding segmented documents in knowledge and processing user queries in applications.

    Providers include OpenAI, ZHIPU (ChatGLM), and Jina AI (Jina Embeddings 2).

  3. Rerank Models: Improve the relevance ordering of retrieved documents, enhancing search quality in LLM applications.

    Provider: Cohere.

  4. Speech-to-Text Models: Convert spoken words to text in conversational applications.

    Provider: OpenAI.
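To illustrate how embedding models support knowledge retrieval: document segments and user queries are turned into vectors, and segments are ranked by vector similarity to the query. A minimal cosine-similarity sketch (plain Python, with toy 3-dimensional vectors standing in for real embeddings):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for embeddings of a query and two document segments.
query = [0.9, 0.1, 0.0]
segment_on_topic = [0.8, 0.2, 0.1]
segment_off_topic = [0.0, 0.2, 0.9]

# The on-topic segment scores higher, so retrieval returns it first;
# a rerank model would then refine the ordering of the candidates.
scores = {
    "on_topic": cosine_similarity(query, segment_on_topic),
    "off_topic": cosine_similarity(query, segment_off_topic),
}
```

Real embedding vectors have hundreds or thousands of dimensions, but the ranking principle is the same.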

Dify plans to add more LLM providers as technology and user needs evolve.

Hosted Model Trial Service

Dify offers trial quotas for cloud service users to experiment with different models. Set up your model provider before the trial ends to ensure uninterrupted application use.

  • OpenAI Hosted Model Trial: Includes 200 invocations for models such as GPT-3.5-turbo, GPT-3.5-turbo-16k, and text-davinci-003.

Setting the Default Model

Dify automatically selects the default model based on usage. Configure this in Settings > Model Provider.

Model Integration Settings

Choose your model in Dify's Settings > Model Provider.

Model providers fall into two categories:

  1. Proprietary Models: Developed by providers such as OpenAI and Anthropic.

  2. Hosted Models: Platforms that host third-party models, such as Hugging Face and Replicate.

Integration methods differ between these categories.

Proprietary Model Providers: Once you set a provider's API key in Dify, it connects to all of that provider's models.

Dify uses PKCS1_OAEP encryption to protect your API keys. Each user (tenant) has a unique key pair for encryption, ensuring your API keys remain confidential.

Hosted Model Providers: Integrate third-party models individually.

Specific integration methods are not detailed here.

Using Models

Once configured, these models are ready for application use.
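Applications built on a configured model can then be called over Dify's app API. A minimal sketch that only builds the request without sending it (the `/v1/chat-messages` endpoint and Bearer-token header follow Dify's app API convention; the base URL and key below are placeholders you would replace with your own):

```python
import json
import urllib.request

API_BASE = "https://api.dify.ai/v1"    # placeholder base URL
APP_API_KEY = "app-your-key-here"      # placeholder app API key

def build_chat_request(query: str, user: str) -> urllib.request.Request:
    """Build (but do not send) a chat-messages request to a Dify app."""
    payload = {
        "inputs": {},
        "query": query,
        "user": user,                  # stable ID for the end user
        "response_mode": "blocking",   # wait for the full answer
    }
    return urllib.request.Request(
        f"{API_BASE}/chat-messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {APP_API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

Sending the prepared request with `urllib.request.urlopen` (or any HTTP client) returns the model's answer as JSON.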
