Integrating AWS Bedrock Models (DeepSeek)
AWS Bedrock Marketplace is a model deployment platform that lets developers discover, evaluate, and use over 100 emerging foundation models (FMs).
This document explains how to deploy the DeepSeek model on the Bedrock Marketplace and integrate it with the Dify platform, facilitating the rapid development of AI applications powered by DeepSeek.
Prerequisites: an AWS account with access to Amazon Bedrock.
1.1 In the Bedrock Marketplace, search for DeepSeek and select any version of the DeepSeek R1 model or one of its distilled variants.
1.2 Navigate to the model details page, click "Deploy", and complete the required configuration as prompted to execute a one-click deployment.
Note: Different model versions may require different compute instance configurations, which can affect the associated costs.
1.3 After deployment completes, find the automatically generated endpoint name on the Marketplace Deployments page. Because Marketplace deployments are hosted on SageMaker, this is a standard SageMaker endpoint; you will need it when integrating with the Dify platform.
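Before configuring Dify, you can sanity-check the deployment by invoking the SageMaker endpoint directly. The sketch below uses boto3 (imported lazily, so the payload helper works without AWS credentials); the endpoint name and the chat-style request body are assumptions — check the model card on the Marketplace Deployments page for the exact request format your model version expects.

```python
import json

ENDPOINT_NAME = "deepseek-r1-distill-example"  # hypothetical endpoint name

def build_payload(prompt: str, max_tokens: int = 256) -> dict:
    """Assemble a chat-style request body (assumed format for DeepSeek R1)."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def invoke(prompt: str) -> str:
    """Call the endpoint via the SageMaker runtime (needs boto3 + AWS creds)."""
    import boto3  # imported lazily so the payload helper runs without it
    runtime = boto3.client("sagemaker-runtime")
    resp = runtime.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=json.dumps(build_payload(prompt)),
    )
    return resp["Body"].read().decode("utf-8")

print(build_payload("Hello"))
```

If this call returns a well-formed completion, the endpoint itself is healthy and any later issues are on the integration side.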
2.1 Log in to the Dify management dashboard and navigate to the Settings page.
2.2 In the Model Provider section, locate SageMaker and click the "Add Model" button at the bottom-right corner of the SageMaker card to access the configuration interface.
2.3 On the SageMaker configuration page, complete the following fields:
Model Type: Select LLM.
Model Name: Enter a custom name for the model.
SageMaker Endpoint: Enter the endpoint name obtained from the AWS Bedrock Marketplace. You can find it on the endpoint's details page.
Refer to the Marketplace Deployments page for the auto-generated Endpoint:
For Chatflow / Workflow Applications:
After completing the configuration, test the DeepSeek model within the Dify platform. Click on "Create Blank App" on the left-hand side of the Dify homepage, select either a Chatflow or Workflow application, and add an LLM node.
Refer to the screenshot below to verify that the model is generating responses correctly in the application preview.
In addition to testing with a Chatflow / Workflow app, you can also create a Chatbot app for testing.
Troubleshooting: ensure that the compute instance is configured correctly and that AWS permissions are properly set. If the issue persists, consider redeploying the model or contacting AWS customer support.
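When diagnosing a failing deployment, a quick first check is the endpoint's status in SageMaker. A minimal sketch, assuming a hypothetical endpoint name and boto3 available with suitable credentials:

```python
def endpoint_status(name: str) -> str:
    """Return the SageMaker endpoint status, e.g. 'InService' or 'Failed'."""
    import boto3  # needs AWS credentials with sagemaker:DescribeEndpoint
    sm = boto3.client("sagemaker")
    return sm.describe_endpoint(EndpointName=name)["EndpointStatus"]

# Usage (with credentials configured; endpoint name is a placeholder):
# if endpoint_status("deepseek-r1-distill-example") != "InService":
#     inspect the endpoint's CloudWatch logs, then redeploy if needed
print(callable(endpoint_status))
```

Anything other than `InService` means the problem is on the AWS side rather than in the Dify configuration.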
After configuring the model within Dify, invoke it via the provided interface to verify that the input data is processed correctly and that the output matches expectations.
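The verification above can also be scripted against the Dify app's chat-messages API using only the standard library. The base URL, API key, and exact request fields below are assumptions — check your app's API Access page in Dify for the authoritative values.

```python
import json
import urllib.request

API_BASE = "https://api.dify.ai/v1"  # or your self-hosted Dify URL
API_KEY = "app-..."                  # hypothetical app API key placeholder

def build_request(query: str, user: str = "test-user") -> urllib.request.Request:
    """Build a POST request for Dify's chat-messages endpoint (assumed fields)."""
    body = {
        "inputs": {},
        "query": query,
        "response_mode": "blocking",  # wait for the complete answer
        "user": user,
    }
    return urllib.request.Request(
        f"{API_BASE}/chat-messages",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def ask(query: str) -> str:
    """Send the request and return the model's answer (needs a valid API key)."""
    with urllib.request.urlopen(build_request(query)) as resp:
        return json.load(resp)["answer"]

req = build_request("Explain chain-of-thought prompting in one sentence.")
print(req.full_url, req.get_method())
```

A coherent answer from `ask()` confirms the whole chain: Dify app, SageMaker provider configuration, and the Bedrock Marketplace deployment behind it.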