
Ollama Troubleshooting Guide

Common Issue: If you encounter "Address Already in Use" errors, this guide will help you resolve the issue.

Fixing "Address Already in Use" Errors

If you encounter Error: listen tcp 127.0.0.1:11434: bind: address already in use, follow the steps below to resolve the issue.

Step-by-Step Instructions

Step 1: Check What's Using the Port

lsof -ti:11434

This will show the process IDs (PIDs) using port 11434.
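
If you want more detail than just the PIDs (for example, the command name and which user owns it), a fuller listing can confirm that it really is Ollama holding the port:

# Show command name, PID, and user for whatever is listening on port 11434
lsof -i :11434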

Step 2: Kill Existing Processes

# Kill specific processes (replace PID with actual process ID from Step 1)
kill -9 [PID]

# Or kill all Ollama processes at once
pkill -f ollama
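
If you prefer a single command, the lookup and the kill can be combined. This assumes the port is actually in use, so lsof returns at least one PID:

# Find and force-kill whatever is listening on port 11434 in one step
kill -9 $(lsof -ti:11434)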

Step 3: If Ollama Keeps Restarting

The Ollama desktop app may automatically restart the service. Try:

macOS:

# Quit the Ollama application entirely
osascript -e 'quit app "Ollama"'

# Or force quit if the above doesn't work
pkill -f "Ollama.app"

Windows:

# Kill Ollama processes
taskkill /F /IM ollama.exe

# Or stop the Ollama service
net stop ollama
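
You can also check the port directly on Windows; netstat shows the owning process ID in the last column, which you can then pass to taskkill:

# List anything bound to port 11434 (the last column is the PID)
netstat -ano | findstr :11434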

Linux:

# Kill all Ollama processes
pkill -f ollama

# Or stop systemd service if applicable
sudo systemctl stop ollama
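
If systemd is managing Ollama, it will simply restart the server after a plain pkill, so it is worth checking for the unit and, if you want to stop it coming back on boot, disabling it. This assumes the service is installed under the name "ollama":

# Check whether a systemd unit is managing Ollama
systemctl status ollama

# Optionally stop it from starting automatically at boot
sudo systemctl disable ollama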

Step 4: Verify Port is Free

lsof -ti:11434

This should return nothing if the port is free.
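
As an extra check, you can probe the address directly. If the port is free, curl fails to connect; if Ollama is still up, it typically replies with "Ollama is running":

# Probe the default Ollama address; a connection error means the port is free
curl http://127.0.0.1:11434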

Step 5: Start Ollama Fresh

# Using custom start script
./start-ollama.sh

# Or start directly
ollama serve

# Or start with specific configuration
OLLAMA_ORIGINS="chrome-extension://*,http://localhost:*,https://localhost:*" ollama serve
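
The start-ollama.sh script mentioned above is not part of Ollama itself. A minimal sketch of such a wrapper, assuming you want the port check and CORS origins from this guide baked in, might look like:

#!/usr/bin/env bash
# start-ollama.sh (hypothetical helper script):
# refuse to start if the port is busy, then launch Ollama with the
# origins and host this guide recommends.
set -e

if lsof -ti:11434 >/dev/null; then
  echo "Port 11434 is already in use. See Steps 1-3 of this guide." >&2
  exit 1
fi

export OLLAMA_ORIGINS="chrome-extension://*,http://localhost:*,https://localhost:*"
export OLLAMA_HOST="127.0.0.1:11434"
ollama serve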

Quick Commands Reference

Initial Troubleshooting

ollama --help                    # Check available Ollama commands
ollama ps                        # List running Ollama models
pkill ollama                     # Attempt to kill Ollama processes

Port Investigation and Process Management

lsof -ti:11434                   # Find processes using port 11434
kill -9 [PID]                    # Force kill specific processes
pkill -f ollama                  # Kill all processes matching "ollama"
ps aux | grep ollama             # List all Ollama-related processes

Additional Tips

Persistent Restarts

If Ollama keeps restarting automatically, the desktop application (not just the ollama serve process) is almost certainly still running and respawning the server. Quit the application itself using the platform-specific commands in Step 3, then kill any leftover processes.

Multiple Instances

Sometimes multiple Ollama instances can run simultaneously:

ps aux | grep ollama

Use this to see all running instances and kill them individually.
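
The same thing can be done in one pass. This is equivalent to pkill -f ollama, but prints each PID as it goes so you can see what was still running:

# Kill every matching process, printing each PID first
for pid in $(pgrep -f ollama); do
  echo "Killing PID $pid"
  kill "$pid"
done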

Alternative Ports

You can configure Ollama to use a different port:

# Set custom port
export OLLAMA_HOST=127.0.0.1:11435
ollama serve

# Or with full configuration
OLLAMA_HOST=127.0.0.1:11435 OLLAMA_ORIGINS="chrome-extension://*" ollama serve
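
Remember that clients have to be pointed at the new port as well; both the ollama CLI and raw HTTP requests need the updated address:

# CLI against the non-default port
OLLAMA_HOST=127.0.0.1:11435 ollama list

# Raw API check against the non-default port
curl http://127.0.0.1:11435/api/tags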

Common Error Messages

Error                          Cause                    Solution
bind: address already in use   Port 11434 is occupied   Follow Steps 1-5 above
connection refused             Ollama not running       Start Ollama with ollama serve
403 Forbidden                  CORS issues              Start with the OLLAMA_ORIGINS environment variable
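
A quick way to narrow this down from the command line is to hit the API directly and look at the response: a connection error points to the second row, while an HTTP status line means Ollama is up and reachable:

# Fails to connect if Ollama is down; returns the model list (HTTP 200) if it is up
curl -i http://127.0.0.1:11434/api/tags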

Understanding CORS Issues (403 Forbidden)

What is CORS and Why Does It Happen?

CORS (Cross-Origin Resource Sharing) is a security feature built into web browsers to protect users from malicious websites.

Why Browsers Block Local Software Access

By default, web browsers prevent websites and extensions from accessing software running on your computer. This is a security feature, not a bug: without it, any web page you visit could silently send requests to services listening on localhost (such as Ollama) and read their responses, all without your knowledge.

Why Our Extension Needs Special Permission

Bookmark GPT Pro needs to communicate with Ollama (running on your computer) to provide AI features. Since this crosses the security boundary between "web content" and "local software," we need to explicitly tell Ollama to allow this connection.

The Solution: OLLAMA_ORIGINS

When you start Ollama with OLLAMA_ORIGINS="chrome-extension://*", you're essentially telling Ollama: "It's okay to accept requests from Chrome extensions, I trust them."
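
You can test whether a given origin is being accepted without involving the browser at all by replaying the request with curl. The extension ID below is just a placeholder; expect HTTP 403 while the origin is blocked and HTTP 200 once OLLAMA_ORIGINS allows it:

# Simulate a request from a Chrome extension (placeholder ID) and inspect the status line
curl -i -H "Origin: chrome-extension://your-extension-id" http://127.0.0.1:11434/api/tags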

This is safe because Ollama only listens on 127.0.0.1 (your own machine), so nothing on the internet can reach it, and OLLAMA_ORIGINS only controls which local browser contexts may talk to that local server; it does not expose Ollama to the outside world.

Key Issue Resolution

In most cases, the root cause is that the Ollama desktop application automatically restarts the server even after you kill individual processes. This is why you need to quit the application itself using the methods in Step 3.

Environment Variables

For browser extensions and web applications, make sure to set:

export OLLAMA_ORIGINS="chrome-extension://*,http://localhost:*,https://localhost:*"
export OLLAMA_HOST="127.0.0.1:11434"

Then start Ollama:

ollama serve
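
Note that these exports only apply to the shell session they are run in. To make them persistent, add them to your shell profile, or, if Ollama runs as a systemd service on Linux, put them in a drop-in override. A sketch, assuming the unit is named "ollama":

# Open a drop-in override for the service
sudo systemctl edit ollama

# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_ORIGINS=chrome-extension://*,http://localhost:*,https://localhost:*"
#   Environment="OLLAMA_HOST=127.0.0.1:11434"

# Then reload systemd and restart Ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama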

Prevention Tips

  1. Proper Shutdown: Unload running models with ollama stop [model-name] and quit Ollama cleanly before shutting down, rather than killing it mid-request
  2. Desktop App: If using the desktop app, quit it properly from the menu bar rather than force-killing processes
  3. Check Before Starting: Use lsof -ti:11434 to check that the port is free before starting Ollama (see the one-liner below)
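
For tip 3, the check and the start can be combined into a single line so that Ollama only starts when the port is actually free:

# Start Ollama only if nothing is already listening on port 11434
lsof -ti:11434 >/dev/null && echo "Port 11434 is already in use" || ollama serve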