Want to harness the power of DeepSeek AI without relying on cloud services? This guide provides a step-by-step walkthrough on how to set up and run the DeepSeek AI model locally using Ollama and Chatbox. You'll be able to run AI-powered interactions efficiently on your local machine.
Before diving in, make sure Docker is installed, and familiarize yourself with the three tools this guide relies on:
Ollama is a tool that makes it easy to run large language models locally. It is also distributed as an official Docker image that packages the runtime and its dependencies into a single container, which is the approach we'll use in this guide.
DeepSeek R1 is a family of open reasoning models known for strong performance on reasoning and analysis tasks. In this guide, we'll use the `deepseek-r1:8b` version.
ChatBox provides a user-friendly desktop interface for interacting with AI models. We'll use it as the front end for the locally hosted DeepSeek model served by Ollama.
Open your terminal and execute the following Docker command:
docker run -d --name ollama -p 11434:11434 ollama/ollama
This command does the following:

- `-d`: Runs the container in detached mode (in the background).
- `--name ollama`: Assigns the name "ollama" to the container for easy management.
- `-p 11434:11434`: Maps port 11434 on your host machine to port 11434 in the container, the default port Ollama listens on.
- `ollama/ollama`: Specifies the Docker image to use, in this case the official Ollama image.

Once the image has finished downloading and the container is running, pull the DeepSeek AI model:
docker exec -it ollama ollama pull deepseek-r1:8b
This command fetches the `deepseek-r1:8b` version of the model from the Ollama repository.
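One caveat with the basic `docker run` command above: models pulled into the container are lost if the container is removed. A commonly used variant (a sketch based on Docker's standard `-v` flag; the `/root/.ollama` path is where the Ollama image stores its models) mounts a named volume so downloads persist:

```shell
# Remove the earlier container first if you already created it:
#   docker rm -f ollama

# Re-create the container with a named volume ("ollama") so
# pulled models survive container removal and re-creation.
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama
```

With this variant, re-running the container later reuses any models already pulled into the `ollama` volume.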
Make sure that Ollama is running and that the model has been downloaded successfully:
docker exec -it ollama ollama list
If the model is listed, you are good to go.
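Besides `ollama list`, you can also confirm the model is available through the REST API that Ollama serves on port 11434, which is the same interface ChatBox will use later. The sketch below (Python, standard library only) queries the `/api/tags` endpoint and extracts the installed model names; the endpoint and response shape follow Ollama's published API, but treat the details as an assumption rather than a guaranteed interface.

```python
import json
from urllib.request import urlopen

# Default Ollama address, matching the port mapping used above.
OLLAMA_URL = "http://localhost:11434"


def parse_model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response payload."""
    return [model["name"] for model in payload.get("models", [])]


def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Ask a running Ollama instance which models it has pulled."""
    with urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))


# With the container running, this prints the pulled models:
#   print(list_local_models())
```

If `deepseek-r1:8b` appears in the returned list, the API side is ready for ChatBox.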
Go to the ChatBoxAI website and download the application for your operating system. Install the downloaded file.
Launch ChatBox and open its settings. Select Ollama as the local model provider and set the API host to `http://localhost:11434`. Then select the `deepseek-r1:8b` model from the list.
With the model running and ChatBox configured, you can now interact with the DeepSeek AI model. Ask questions, run tasks, and explore its capabilities.
For example, try asking it to write a simple Python script, and watch how the reasoning-focused r1 model works through the problem before answering.
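ChatBox is the most convenient way to chat, but you can also talk to the model programmatically through the same local endpoint. Below is a minimal sketch (Python, standard library only) that sends a prompt to Ollama's `/api/generate` endpoint; the endpoint name, the `stream` flag, and the `response` field follow Ollama's published API, but treat them as assumptions to verify against your installed version.

```python
import json
from urllib.request import Request, urlopen

OLLAMA_URL = "http://localhost:11434"


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for one complete JSON reply instead of a
    stream of partial chunks.
    """
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "deepseek-r1:8b",
             base_url: str = OLLAMA_URL) -> str:
    """Send a prompt to a running Ollama instance and return its answer."""
    body = json.dumps(build_generate_request(model, prompt)).encode()
    req = Request(f"{base_url}/api/generate", data=body,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)["response"]


# With the container running:
#   print(generate("Write a Python function that reverses a string."))
```

This is handy for scripting batch prompts against the same model that ChatBox is using, since both talk to the one Ollama instance.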
By following this guide, you've successfully set up DeepSeek AI locally using Ollama and ChatBox. This setup allows you to leverage the power of advanced AI models on your own machine, enhancing privacy and control.
Consider supporting the author with a coffee if you found this article helpful!