How to Run DeepSeek Locally: A Step-by-Step Guide

Published: March 27, 2025

Reading Time: 4 minutes

Unlike many traditional AI models, DeepSeek, the AI model released in 2025, excels at understanding context, which makes it well suited to complex real-world scenarios that call for intuitive, natural solutions.

So why would you ever run DeepSeek locally?

For one, it gives you total control over your data, providing privacy and security. In addition, local deployment can be much faster and more reliable than cloud-based services, especially in areas with poor internet connectivity.

In this article, I’ll show you how to download and run DeepSeek locally with Ollama in just a few easy steps. You can follow along, even if you’re not a tech expert.

Step-by-Step Guide to Running DeepSeek Locally

Step 1: Install Ollama

To run DeepSeek locally, you’ll need Ollama, a tool that simplifies the process of installing and managing AI models. Here’s how to get started:

Download Ollama: Visit the Ollama website and download the installer for your operating system (Windows, macOS, or Linux).

Install Ollama: Run the installer and follow the prompts to complete the installation. It’s straightforward and should only take a few minutes.

Verify Installation: Open your terminal or command prompt and type ollama --version. This will confirm that Ollama is installed correctly.
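
For Linux users (or anyone comfortable in a terminal), the install and verification steps boil down to two commands. The script below is Ollama's official Linux installer; the exact version number printed on your machine will differ.

  # Install Ollama on Linux using the official install script
  curl -fsSL https://ollama.com/install.sh | sh

  # Confirm the installation and print the installed version
  ollama --version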

Step 2: Choose Your DeepSeek Model

You can visit Ollama’s GitHub repository for a list of available models along with their sizes and the commands to download them.

The DeepSeek model comes in different sizes, each with its own set of requirements. Here’s a quick overview:

Model Version | Parameters  | GPU Memory Required
7B            | 7 Billion   | 4.7 GB
671B          | 671 Billion | 404 GB

There are other DeepSeek model versions available as well, including 1.5b, 7b, 14b, 32b, 70b, and 671b.

For beginners, the 7B version is a good starting point because it is lighter and runs comfortably on most computers. That’s the version I used for this test.
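
If you want a specific size, append its tag to the model name when you run it. The tags below match the sizes listed above; smaller tags download faster and need far less memory.

  # Smallest version, fine for low-spec machines
  ollama run deepseek-r1:1.5b

  # The 7B version used in this guide
  ollama run deepseek-r1:7b

  # Larger versions need far more memory and disk space
  ollama run deepseek-r1:70b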

Step 3: Download DeepSeek Model

Now, use Ollama to download the DeepSeek model:

  1. Open Terminal: Navigate to your terminal or command prompt.
  2. Download Command: Type ollama run deepseek-r1 to download the 7B model. The download time will depend on your internet speed.
  3. Monitor Progress: Keep an eye on the terminal as it downloads and installs the model.
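
If you prefer to download the model without immediately opening a chat session, Ollama also offers a pull command; run it the same way from your terminal.

  # Download the default deepseek-r1 model without starting a chat
  ollama pull deepseek-r1

  # Or pull a specific size by tag
  ollama pull deepseek-r1:7b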

Step 4: Verify Installation

After the download is complete, verify that DeepSeek is installed correctly:

  1. List Installed Models: Use the command ollama list to see all installed models.
  2. Check for DeepSeek: Look for DeepSeek in the list to confirm it’s ready to use.
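
Here's what that check looks like in practice. The column names in the comment reflect typical ollama list output; the exact formatting and values will vary with your Ollama version.

  # Show every model Ollama has downloaded
  ollama list

  # Look for a deepseek-r1 entry in the output; the listing usually
  # includes columns such as NAME, ID, SIZE, and MODIFIED.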

Step 5: Run DeepSeek

Now that DeepSeek is installed, you can start using it:

  1. Run DeepSeek: Type ollama run deepseek-r1 to start interacting with DeepSeek.
  2. Explore Capabilities: Try out different tasks like coding assistance, data analysis, or the odd tricky question, as I did.
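
For reference, a quick session might look like this. The prompt text is only an example; inside the interactive chat, type /bye to exit.

  # Start an interactive chat with DeepSeek
  ollama run deepseek-r1

  # Or pass a one-off prompt and get a single answer back
  ollama run deepseek-r1 "Explain recursion in one short paragraph"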

And yeah, that’s about it.

Why Run DeepSeek Locally?

  1. Privacy and Security:
    • Running DeepSeek locally means your data never leaves your machine. This matters even more in industries such as healthcare and finance, where protecting data is of utmost importance.
  2. Uninterrupted Access:
    • Local deployment avoids the rate limits, downtime, and service interruptions that some cloud services impose, so your access to DeepSeek is never cut off.
  3. Performance:
    • Local inference provides faster responses compared to cloud-based services, as it avoids API latency. This results in quicker interactions and more efficient workflow.
  4. Customization and Control:
    • Running DeepSeek locally allows you to customize the model to fit your specific needs. You can modify parameters, fine-tune prompts, and integrate the model into local applications (see the sketch after this list).
  5. Cost Efficiency:
    • Running DeepSeek locally means you never pay cloud API fees. Over time, this can lead to considerable cost savings.
  6. Offline Availability:
    • Once the model is downloaded, you can use DeepSeek without an Internet connection. This is useful for regions with poor connectivity or when offline use is needed.
  7. Legal Compliance:
    • Running DeepSeek locally helps comply with data protection regulations like GDPR by keeping data within your own infrastructure, without worrying about the headaches associated with cross-border data transfers.
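
As a concrete illustration of the customization and integration points above, here is a minimal sketch using Ollama's Modelfile format and its local HTTP API. The model name my-deepseek, the temperature value, and the system prompt are arbitrary examples, not settings recommended by DeepSeek.

  # Write a small Modelfile that customizes deepseek-r1
  printf '%s\n' \
    'FROM deepseek-r1:7b' \
    'PARAMETER temperature 0.6' \
    'SYSTEM "You are a concise assistant that answers in plain language."' \
    > Modelfile

  # Build and run the customized model
  ollama create my-deepseek -f Modelfile
  ollama run my-deepseek

  # Integrate with local applications through Ollama's HTTP API (default port 11434)
  curl http://localhost:11434/api/generate \
    -d '{"model": "deepseek-r1", "prompt": "Summarize the benefits of local inference.", "stream": false}'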

The Bottom Line

That should be everything you need to run DeepSeek locally. Finally, don’t bite off more than you can chew: take it one step at a time, have fun exploring what DeepSeek has to offer, and enjoy the benefits of faster performance, better security, and lower cost.

FAQs

  1. Can I run DeepSeek locally?
    • Yes, you can run DeepSeek locally using tools like Ollama. This allows you to control your data and enjoy faster performance without relying on cloud services.
  2. How to run DeepSeek in cmd?
    • To run DeepSeek in the command prompt (cmd), you need to install Ollama first. Then, use the command ollama run deepseek-r1:1.5b or ollama run deepseek-r1 to start an interactive session with DeepSeek.
  3. How to use DeepSeek offline?
    • DeepSeek can be used offline once it’s installed locally. After setting it up with Ollama, you can interact with DeepSeek without an internet connection.

Onome

Contributor & AI Expert