Step-by-Step Guide: Installing Ollama and Open WebUI on VMware Fusion (Debian 12, M1 Mac)

💡 Setup Guide
🖥️ System Requirements
| Resource | Recommended |
|---|---|
| RAM | 16GB (Minimum: 8GB) |
| vCPUs | 4 (Minimum: 2) |
| Disk Space | 50GB+ |
| GPU | Not required (runs on CPU) |
| OS | Debian 12 |
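Once Debian is installed, you can confirm the VM actually received these resources; a quick sanity check from the shell:

```bash
nproc        # vCPU count (recommended: 4)
free -h      # total RAM (recommended: 16GB)
df -h /      # free space on the root filesystem (50GB+ recommended)
```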
1️⃣ Create a Debian 12 VM on VMware Fusion
- Download the Debian 12 ISO from the official Debian website.
- Create a new VM in VMware Fusion:
  - Choose “Install from disk or image” → select the Debian 12 ISO.
  - Set CPU: 4 cores (minimum: 2).
  - Set RAM: 16GB (minimum: 8GB).
  - Set disk size: 50GB+.
  - Set the network to “Bridged” so the VM gets its own IP on your LAN (e.g. 192.168.2.43, which this guide uses throughout).
  - Click Finish to create the VM.
- Install Debian 12:
  - Select “Standard system utilities”.
  - Choose SSH server (if you want remote access).
  - Install sudo (important for the commands below; see the note after this list).
- Update the system after installation:

```bash
sudo apt update && sudo apt upgrade -y
```
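If sudo rejects your user after the install, that is expected: the Debian installer only adds your account to the sudo group automatically when the root password was left blank. A minimal fix (replace youruser with your actual login name):

```bash
su -                        # become root with the root password
apt install -y sudo         # in case sudo was not installed yet
usermod -aG sudo youruser   # "youruser" is a placeholder for your login name
exit                        # log out and back in so the group change applies
```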
2️⃣ Install Docker & Portainer
- Install required packages:

```bash
sudo apt install -y curl ca-certificates gnupg
```

- Install Docker:

```bash
curl -fsSL https://get.docker.com | sudo bash
```

- Enable Docker to start on boot:

```bash
sudo systemctl enable --now docker
```

- Install Portainer:

```bash
docker volume create portainer_data
docker run -d \
  --name portainer \
  --restart=always \
  -p 8000:8000 -p 9000:9000 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest
```

- Access the Portainer web UI: open http://192.168.2.43:9000 in a browser and set up an admin password.
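Before moving on, it is worth confirming that Portainer actually came up; a minimal check from inside the VM:

```bash
# The Portainer container should show an "Up" status.
docker ps --filter name=portainer
# Probe the web UI locally; any HTTP status line means it is listening.
curl -sI http://localhost:9000 | head -n 1
```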
3️⃣ Deploy Ollama + Open WebUI in Portainer
Create a Stack in Portainer
- Go to the Portainer UI (http://192.168.2.43:9000).
- Navigate to “Stacks” → “Add stack”.
- Enter a name (e.g. deepseek-coder).
- Paste the following docker-compose.yml:

```yaml
version: "3.8"

networks:
  deepseek_ollama-net:
    external: true

services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: unless-stopped
    ports:
      - "11434:11434"
    networks:
      - deepseek_ollama-net
    volumes:
      - ollama_data:/root/.ollama
    environment:
      - OLLAMA_HOST=0.0.0.0 # allow connections from outside the container
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

  webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: ollama-webui
    restart: unless-stopped
    ports:
      - "3000:8080"
    networks:
      - deepseek_ollama-net
    volumes:
      - webui_data:/app/backend/data
    environment:
      - OLLAMA_API_BASE_URL=http://ollama:11434
    depends_on:
      - ollama
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "3"

volumes:
  ollama_data:
  webui_data:
```
- Note that deepseek_ollama-net is declared as an external network, so it must exist before deployment; create it first if needed (see the sketch below).
- Click “Deploy the stack” and wait for it to start.
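If you prefer the shell over Portainer, the same stack can be deployed with the Docker Compose plugin (which the get.docker.com script typically installs); a minimal sketch, assuming the compose file above is saved as docker-compose.yml in the current directory:

```bash
# The stack declares its network as external, so create it first.
docker network create deepseek_ollama-net

# Deploy the stack and check that both services come up.
docker compose up -d
docker compose ps
```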
4️⃣ Verify & Configure Open WebUI
Check if the containers are running
Run:

```bash
docker ps
```

Expected output:

```
CONTAINER ID   IMAGE                                STATUS         PORTS
xxxxxxxxxx     ollama/ollama:latest                 Up (running)   0.0.0.0:11434->11434/tcp
xxxxxxxxxx     ghcr.io/open-webui/open-webui:main   Up (running)   0.0.0.0:3000->8080/tcp
```

Load DeepSeek-Coder-v2 into Ollama

```bash
docker exec -it ollama ollama pull deepseek-coder-v2
```

The download is several gigabytes, so wait for it to complete. Then check that the model is available:

```bash
docker exec -it ollama ollama list
```

Expected output: a model list that includes deepseek-coder-v2.

Restart Open WebUI

```bash
docker restart ollama-webui
```

Open WebUI in the browser
Visit:
👉 http://192.168.2.43:3000
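Before testing over the network, you can also run a one-shot prompt directly inside the container as a quick smoke test (assumes the pull above completed):

```bash
# Run a single prompt against the model; the command prints the reply and exits.
docker exec -it ollama ollama run deepseek-coder-v2 "Say hello in one sentence."
```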
5️⃣ Test the AI Model
Option 1: API Test via cURL

```bash
curl http://192.168.2.43:11434/api/generate -d '{
  "model": "deepseek-coder-v2",
  "prompt": "Write a Python function to check for prime numbers.",
  "stream": false
}'
```

Option 2: Use Open WebUI
- Go to Open WebUI (http://192.168.2.43:3000).
- Go to “Settings” → “Connections”.
- Set the API URL to http://192.168.2.43:11434.
- Save & refresh the page.
- Try asking a question!
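With "stream": false, the API returns one JSON object whose "response" field holds the generated text; if jq is installed, you can extract just that field, as in this sketch:

```bash
# Pipe the non-streaming reply through jq to print only the generated text.
curl -s http://192.168.2.43:11434/api/generate -d '{
  "model": "deepseek-coder-v2",
  "prompt": "Write a Python function to check for prime numbers.",
  "stream": false
}' | jq -r '.response'
```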
6️⃣ Troubleshooting
Check if containers are running
```bash
docker ps
```

Check if Ollama has models

```bash
docker exec -it ollama ollama list
```

If the list is empty, run:

```bash
docker exec -it ollama ollama pull deepseek-coder-v2
```

Check if the Ollama API is working

```bash
curl http://192.168.2.43:11434/api/tags
```

Expected output: JSON listing the installed models, e.g. {"models":[{"name":"deepseek-coder-v2:latest", ...}]}
Check Open WebUI logs

```bash
docker logs ollama-webui
```

🎯 Summary
✅ Debian 12 installed on VMware Fusion
✅ Docker & Portainer installed
✅ Ollama & Open WebUI deployed via Portainer
✅ DeepSeek-Coder-v2 installed & tested
✅ Web interface working on http://192.168.2.43:3000
Now you have a fully working LLM environment on Debian 12! 🎉🚀

