Set up a reverse proxy for Ollama

Follow these steps to expose Ollama through a secure reverse proxy that returns the CORS headers Sidebar AI Chat expects.

1. Configure Ollama behind a reverse proxy

Run Ollama on a host you control and place a reverse proxy (Nginx or similar) in front of it. Point the proxy at the port where Ollama listens (11434 by default), require HTTPS for external traffic, and restrict access to trusted networks.
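Before adding the proxy, you can confirm Ollama is answering locally. A quick check from a shell on the Ollama host, assuming the default port:

# Confirm the Ollama API is up and note the version it reports
curl http://127.0.0.1:11434/api/version

# List the models Ollama will serve through the proxy
curl http://127.0.0.1:11434/api/tags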

Ollama enforces a strict CORS policy and rejects requests from origins it does not recognize. Requests from the Chrome extension carry a chrome-extension:// origin that Ollama blocks, so the proxy below strips the Origin header before forwarding and adds the CORS headers for the extension's ID itself.
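You can see the rejection for yourself by sending a request with the extension's origin straight to Ollama's port. The exact response depends on your Ollama version and OLLAMA_ORIGINS setting, so treat this as a rough check:

# Simulate the extension's origin against Ollama directly; unless
# OLLAMA_ORIGINS allows it, the request is refused (commonly with a 403)
curl -i http://127.0.0.1:11434/api/tags \
  -H "Origin: chrome-extension://hkpcjhbbhndcianjdlcphihidaagedgn"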

If you need to install Nginx first, see Set up Nginx on Windows, Linux, and macOS.

2. Create the Nginx configuration

Create a new server block for the proxy, for example in /etc/nginx/conf.d/ollama.conf:

server {
    listen 8080;
    server_name localhost;

    client_max_body_size 100M;

    location / {
        # Answer CORS preflight requests from the extension directly
        if ($request_method = 'OPTIONS') {
            add_header 'Access-Control-Allow-Origin' 'chrome-extension://hkpcjhbbhndcianjdlcphihidaagedgn';
            add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, DELETE';
            add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization';
            add_header 'Access-Control-Max-Age' 1728000;
            add_header 'Content-Type' 'text/plain; charset=utf-8';
            add_header 'Content-Length' 0;
            return 204;
        }

        # CORS headers for actual requests
        add_header 'Access-Control-Allow-Origin' 'chrome-extension://hkpcjhbbhndcianjdlcphihidaagedgn' always;
        add_header 'Access-Control-Allow-Methods' 'GET, POST, OPTIONS, PUT, DELETE' always;
        add_header 'Access-Control-Allow-Headers' 'DNT,User-Agent,X-Requested-With,If-Modified-Since,Cache-Control,Content-Type,Range,Authorization' always;
        add_header 'Access-Control-Expose-Headers' 'Content-Length,Content-Range' always;

        # Proxy to Ollama
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Clear the Origin header so Ollama's own CORS check does not
        # reject the chrome-extension:// origin
        proxy_set_header Origin "";

        # Disable buffering so streamed model output reaches the client immediately
        proxy_buffering off;
        proxy_cache off;

        # Increase timeouts for long-running requests
        proxy_connect_timeout 60s;
        proxy_send_timeout 300s;
        proxy_read_timeout 300s;
    }
}

3. Reload Nginx and test

Reload Nginx, point the extension at the proxy address (for example, http://localhost:8080) instead of Ollama's own port, open the sidebar, and start a test chat. You should see a streaming response from the model. If the request fails, inspect the browser console and the Nginx logs for CORS or proxy errors.
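You can also exercise the proxy from a shell to confirm the CORS headers are in place; the reload command below assumes a standard Nginx install and may differ on your system:

# Validate the configuration, then reload Nginx
sudo nginx -t
sudo nginx -s reload        # or: sudo systemctl reload nginx

# Replay the preflight request the browser would send; expect a 204
# response carrying the Access-Control-Allow-* headers from the config
curl -i -X OPTIONS http://localhost:8080/api/chat \
  -H "Origin: chrome-extension://hkpcjhbbhndcianjdlcphihidaagedgn" \
  -H "Access-Control-Request-Method: POST"

# A normal request through the proxy should return Ollama's model list
# as JSON, again with the CORS headers attached
curl -i http://localhost:8080/api/tags \
  -H "Origin: chrome-extension://hkpcjhbbhndcianjdlcphihidaagedgn"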