25 Jan 2025

How to watch DeepSeek R1 think

tl;dr: use together.ai with Open Web UI (formerly known as the Ollama Web UI)

DeepSeek R1 came out recently, and it's the first open-source reasoning model that's in the same league as OpenAI's o1. One of the coolest things about it is that the reasoning tokens from its chain-of-thought are visible at the start of the output, before the actual answer (the LLM simply replies with its thoughts wrapped in <think> and </think> tags).
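
For example, a raw reply might look roughly like this (the reasoning text here is just illustrative, not a real transcript):

    <think>
    The user wants 17 * 23. 17 * 20 = 340 and 17 * 3 = 51, so 340 + 51 = 391.
    </think>
    17 * 23 = 391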

Unfortunately, DeepSeek doesn't expose these thinking tokens through their own API (although they're visible if you use their proprietary chat interface). But together.ai also offers API access to DeepSeek R1, and they do expose the <think></think> info!
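
If you want to see this straight from the API before wiring up a UI, a minimal curl sketch looks something like this (assuming your key is in the TOGETHER_API_KEY environment variable; the model slug shown here, deepseek-ai/DeepSeek-R1, may differ, so double-check it against together.ai's model list):

    curl https://api.together.xyz/v1/chat/completions \
      -H "Authorization: Bearer $TOGETHER_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{
        "model": "deepseek-ai/DeepSeek-R1",
        "messages": [{"role": "user", "content": "What is 17 * 23?"}]
      }'

The assistant message in the response should start with the <think>...</think> block, followed by the final answer.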

So the way I get a ChatGPT-like experience with DeepSeek R1 is:

  • Spin up a local copy of Open Web UI (after running the command below, it'll be available at http://localhost:3981)

    docker run -d -p 3981:8080 \
      --add-host=host.docker.internal:host-gateway \
      -v open-webui:/app/backend/data \
      --name open-webui \
      --restart always \
      ghcr.io/open-webui/open-webui:main
    
  • Add together.ai to your models list in Open Web UI (go to Settings → Admin Settings → Connections, and add a new connection for together.ai with base URL https://api.together.xyz/v1 and your together.ai API key; see the quick check below to confirm the key works)

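To sanity-check the key and base URL outside the UI, you can list the available models with curl. together.ai speaks the OpenAI-compatible API, so a successful response here means the key and base URL are good (the exact response shape may differ from OpenAI's):

    curl https://api.together.xyz/v1/models \
      -H "Authorization: Bearer $TOGETHER_API_KEY"
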
And that’s it: after that, just pick DeepSeek R1 from the models list and start chatting.