The world of AI is rapidly evolving, and the ability to create intelligent agents tailored to specific needs is becoming increasingly valuable. This article will guide you through the process of setting up your own AI agent using DeepSeek (a powerful language model) and Dify (an open-source AI application development platform), all on your local machine. This approach offers the benefits of data privacy, security, and customization.
Before diving into the deployment process, ensure your system meets the following requirements:
Hardware:
Software:
Ollama simplifies the process of running LLMs locally, allowing you to deploy DeepSeek quickly and with minimal setup.
Install Ollama:
Download Ollama from the official website and follow the installation instructions for your operating system. Verify the installation by running
ollama -v
in your terminal. This should display the installed Ollama version.
Download and Run the DeepSeek Model:
ollama run deepseek-r1:7b
Alternative (Uncensored):
ollama run huihui_ai/deepseek-r1-abliterated:7b
Note: The uncensored version might provide different outputs depending on the use case.
The 7b in the command refers to the model size (7 billion parameters). Choose a model size appropriate for your hardware: smaller models require fewer resources but may offer lower performance. Use the Ollama website to search for other model sizes.
Test the Installation:
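A quick way to confirm everything works (assuming Ollama's default API port, 11434) is to list the installed models and send a short test prompt from your terminal; the prompt text here is only a placeholder:
ollama list
# Send a one-off, non-streaming request to the local Ollama API
curl http://localhost:11434/api/generate -d '{"model": "deepseek-r1:7b", "prompt": "Reply with one short sentence.", "stream": false}'
If the second command prints a JSON response containing the model's reply, the installation is working.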
Dify provides a user-friendly interface for building and managing AI applications. Integrating DeepSeek with Dify empowers you to create more sophisticated agents with knowledge retrieval capabilities.
Download the Dify Code:
git clone https://github.com/langgenius/dify.git
Navigate to the Docker Directory:
Change into the dify/docker directory within the downloaded Dify folder:
cd dify/docker
Copy the Environment Configuration File:
Copy the .env.example file and name it .env:
cp .env.example .env
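Optionally, review the defaults before starting the stack and later compare your copy against the shipped example to see what you changed; the exact variables inside .env depend on your Dify version:
# Review the configuration defaults
less .env
# Show any values you have changed relative to the example file
diff .env.example .env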
Configure Docker Proxy (Optional):
If Docker image pulls are slow or fail, edit the Docker daemon configuration file (/etc/docker/daemon.json on Linux) and add the following content:
{
"registry-mirrors": [
"https://dockerpull.org",
"https://docker.1panel.dev",
"https://docker.foreverlink.love",
"https://docker.fxxk.dedyn.io",
"https://docker.xn--6oq72ry9d5zx.cn",
"https://docker.zhai.cm",
"https://docker.5z5f.com",
"https://a.ussh.net",
"https://docker.cloudlayer.icu",
"https://hub.littlediary.cn",
"https://hub.crdz.gq",
"https://docker.unsee.tech",
"https://docker.kejilion.pro",
"https://registry.dockermirror.com",
"https://hub.rat.dev",
"https://dhub.kubesre.xyz",
"https://docker.nastool.de",
"https://docker.udayun.com",
"https://docker.rainbond.cc",
"https://hub.geekery.cn",
"https://docker.1panelproxy.com",
"https://atomhub.openatom.cn",
"https://docker.m.daocloud.io",
"https://docker.1ms.run",
"https://docker.linkedbus.com"
]
}
Restart Docker after making these changes.
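On a systemd-based Linux host, the restart and a quick verification might look like this; docker info should list the mirrors you just configured:
sudo systemctl restart docker
# Confirm the daemon picked up the mirror configuration
docker info | grep -A 5 "Registry Mirrors"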
Run Dify with Docker Compose:
docker compose up -d
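After the command completes, you can check that the Dify containers are running and watch their logs if anything fails to start (run these from the dify/docker directory):
# List the services started by Docker Compose
docker compose ps
# Follow the logs of all services; press Ctrl+C to stop
docker compose logs -f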
Now that both DeepSeek and Dify are running, the next step is to connect them so you can use DeepSeek as the language model within your Dify applications.
Access Dify in Your Browser:
Open your web browser and navigate to http://127.0.0.1.
You should see the Dify initialization page. Create an administrator account by providing the required information.
Configure the Model Provider:
Click on your avatar in the top right corner and select "Settings".
Navigate to "Model Providers" and select "Ollama".
Enter DeepSeek Model Information:
Fill out the form with the following information:
Model Name: deepseek-r1:7b
Base URL: http://<your_machine_ip>:11434
Important: Because Dify runs inside Docker containers, use host.docker.internal or your machine's actual IP address instead of 127.0.0.1. Otherwise, Dify will not be able to access the Ollama service running on your host machine.
Max Tokens: 32768
Save your configuration.
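If Dify reports that it cannot connect, a quick sanity check (assuming Ollama's default port 11434) is to query the model list endpoint from your host using the same address you entered as the Base URL; replace <your_machine_ip> accordingly:
# Should return a JSON list that includes deepseek-r1:7b
curl http://<your_machine_ip>:11434/api/tags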
With DeepSeek connected to Dify, you can now build your AI agent with a knowledge base.
Create a Knowledge Base:
Create an Agent:
Go to the "Projects" section and create a new project.
Configure the project as an "Agent".
Connect the knowledge base you created to the agent.
Define the agent's role and instructions to tailor its behavior.
This is where you leverage the knowledge base and instruct the agent on how to use it.
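As an illustration only, the role and instructions might look like the following; the product name and behavior rules are hypothetical and should be adapted to your own knowledge base:
You are a support assistant for <your product>. Answer questions using only the attached knowledge base. If the knowledge base does not contain the answer, say so rather than guessing. Keep answers concise and mention which document section you relied on.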
For more information and troubleshooting, refer to the official Ollama and Dify documentation.
By following these steps, you can establish a local environment for creating AI agents with DeepSeek and Dify, keeping all data processing on your own machine. This setup allows you to experiment with AI, customize agents for specific tasks, and keep your data secure within your local infrastructure, making it an excellent way to explore AI development.