Integrating AWS Bedrock Models (DeepSeek)
The AWS Bedrock Marketplace is a comprehensive platform for deploying large language models (LLMs). It allows developers to discover, test, and deploy over 100 emerging foundation models (FMs) seamlessly.
This guide uses the deployment of DeepSeek models as an example to demonstrate how to deploy a model on the Bedrock Marketplace and integrate it into the Dify platform, helping you quickly build AI applications based on DeepSeek models.
An AWS account with access to Bedrock.
Navigate to the Bedrock Marketplace and search for DeepSeek.
Choose a DeepSeek model based on your requirements.
Go to the Model detail page and click Deploy.
Follow the instructions to configure the deployment settings.
Note: Different model versions require different compute configurations, which affects deployment cost.
Once deployment is complete, navigate to the Marketplace Deployments page to find the auto-generated Endpoint. This endpoint is equivalent to a SageMaker endpoint and will be used for connecting to the Dify platform.
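Before moving on to Dify, you may want to confirm that the endpoint has actually reached a usable state. The sketch below is a minimal check using boto3; the endpoint name is a placeholder you would replace with the one shown on your Marketplace Deployments page, and the live call assumes AWS credentials are already configured in your environment.

```python
def endpoint_is_ready(status: str) -> bool:
    # A SageMaker endpoint only accepts traffic once it reaches InService.
    return status == "InService"

def check_endpoint(endpoint_name: str) -> bool:
    # Deferred import so the status helper above is usable without boto3 installed.
    import boto3
    sm = boto3.client("sagemaker")
    desc = sm.describe_endpoint(EndpointName=endpoint_name)
    return endpoint_is_ready(desc["EndpointStatus"])
```

Calling `check_endpoint("your-deepseek-endpoint")` returns `True` once the deployment has finished provisioning.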
Log in to the Dify management panel and go to the Settings page.
On the Model Provider page, select Amazon SageMaker.
Click Add Model and fill in the following information:
Model Type: Select LLM as the model type
Model Name: Provide a custom name for your model
SageMaker Endpoint: Enter the endpoint retrieved from the Bedrock Marketplace
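If Dify reports an error after you save the model, a quick smoke test from Python can tell you whether the endpoint itself responds. This is a sketch under assumptions: the request body below uses an OpenAI-style `messages` schema, but the exact payload format depends on the model's inference container, so check the model's documentation before relying on it.

```python
import json

def build_chat_payload(prompt: str, max_tokens: int = 256) -> bytes:
    # Assumed OpenAI-style chat schema; adjust to your model container's format.
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }
    return json.dumps(body).encode("utf-8")

def invoke(endpoint_name: str, prompt: str) -> str:
    # Deferred import: boto3 is only needed for the live call against AWS.
    import boto3
    runtime = boto3.client("sagemaker-runtime")
    resp = runtime.invoke_endpoint(
        EndpointName=endpoint_name,
        ContentType="application/json",
        Body=build_chat_payload(prompt),
    )
    return resp["Body"].read().decode("utf-8")
```

A non-empty JSON response from `invoke(...)` indicates the endpoint is healthy and the problem lies in the Dify configuration rather than the deployment.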
Open Dify and select Create a Blank App.
Select either Chatflow or Workflow.
Add an LLM node.
Verify model responses (see screenshot below for expected responses).
Note: You can also create a Chatbot application for additional testing.
Ensure that the compute instance is configured correctly and that AWS permissions are properly set. If the issue persists, consider redeploying the model or contacting AWS customer support.
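When a deployment fails, the `describe_endpoint` response from SageMaker usually includes the failure reason. The helper below is an illustrative sketch for turning that response into a one-line diagnosis; the `EndpointStatus` and `FailureReason` fields match the SageMaker DescribeEndpoint API.

```python
def summarize_endpoint(desc: dict) -> str:
    # desc is the dict returned by sagemaker.describe_endpoint(...).
    status = desc.get("EndpointStatus", "Unknown")
    if status == "Failed":
        return "Failed: " + desc.get("FailureReason", "no reason reported")
    if status in ("Creating", "Updating"):
        return f"{status}: wait for the endpoint to reach InService"
    if status == "InService":
        return "InService: the endpoint is ready to use"
    return f"{status}: check the SageMaker console for details"
```

A `Failed` status with a quota-related reason typically means the selected instance type needs a service-quota increase in your AWS account.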