Fixing 405 Method Not Allowed with Open WebUI + Continue in VS Code

VS Code  ·  Continue  ·  Open WebUI  ·  Ollama  ·  API

If you are trying to use Open WebUI as an API backend for Continue in VS Code
and you keep seeing errors like:

"POST /api/responses HTTP/1.1" 405

or

"POST /api/chat/completions/responses HTTP/1.1" 405

this article may save you some time.

The Problem

At first glance it looks like an API key or routing issue. But the real cause is usually this:

Continue is using the OpenAI Responses API,
while Open WebUI only supports the classic chat completions endpoint.

Open WebUI supports these endpoints:

  • GET /api/models
  • POST /api/chat/completions

But Continue, depending on the selected provider, may try to call:

  • POST /api/responses
  • POST /api/chat/completions/responses

Open WebUI does not handle those endpoints, so the server responds with
405 Method Not Allowed.

Root Cause

The original Continue configuration used provider: openai. With that setting,
newer versions of Continue may switch to the OpenAI Responses API format,
which appends /responses to the request path.

Open WebUI does not implement the Responses API, so every request fails with a
405 before it even reaches the model.
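The difference is easiest to see in the request shapes themselves. The sketch below contrasts a classic chat completions payload with a Responses API payload (simplified; the field names follow the public OpenAI API, and the base URL is the default Open WebUI address used later in this article):

```python
# Illustrative sketch of the two request styles Continue may emit.
# Field names follow the public OpenAI API; this is not Continue's
# actual internal code.

BASE = "http://localhost:3000/api"

# Classic chat completions -- the format Open WebUI implements.
chat_request = {
    "url": f"{BASE}/chat/completions",
    "body": {
        "model": "llama3.1",
        "messages": [{"role": "user", "content": "Hello"}],
    },
}

# Responses API -- note the /responses path segment and the "input"
# field replacing "messages". Open WebUI has no route for this path,
# so the request dies with a 405 before reaching the model.
responses_request = {
    "url": f"{BASE}/responses",
    "body": {
        "model": "llama3.1",
        "input": "Hello",
    },
}
```

The model never sees the second request; the failure happens purely at the routing layer.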

The Fix

Change the provider in your Continue configuration from openai to
lmstudio:

Before

models:
  - name: OpenWebUI
    provider: openai
    model: llama3.1
    apiBase: http://localhost:3000/api
    apiKey: your_api_key

After

models:
  - name: OpenWebUI
    provider: lmstudio
    model: llama3.1
    apiBase: http://localhost:3000/api
    apiKey: your_api_key

That single change is all it takes. No proxy, no workarounds, no additional configuration.

Why This Works

The lmstudio provider in Continue uses the classic OpenAI-compatible
chat/completions request format instead of the newer Responses API style.
Open WebUI supports chat/completions, so the requests go through correctly.

Provider in Continue    Endpoint Called               Open WebUI Result
openai                  POST /api/responses           405 Error
lmstudio                POST /api/chat/completions    200 OK
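As a mental model: Open WebUI only registers handlers for the two endpoints listed earlier, and anything else falls through to a 405. A toy version of that routing (my own illustration, not Open WebUI's actual implementation):

```python
# Toy model of Open WebUI's routing behavior (illustrative only,
# not the project's real code).

SUPPORTED = {
    ("GET", "/api/models"),
    ("POST", "/api/chat/completions"),
}

def status_for(method: str, path: str) -> int:
    """Return the HTTP status a request to Open WebUI would get."""
    return 200 if (method, path) in SUPPORTED else 405

# What each Continue provider setting ends up calling:
print(status_for("POST", "/api/chat/completions"))  # lmstudio -> 200
print(status_for("POST", "/api/responses"))         # openai   -> 405
```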

Verify Open WebUI is Working

You can test the API directly with curl before touching your Continue config.

List available models:

curl -H "Authorization: Bearer YOUR_API_KEY" \
  http://localhost:3000/api/models

Test a chat completion request:

curl -X POST http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1",
    "messages": [
      { "role": "user", "content": "Hello" }
    ]
  }'

If these curl commands return a 200 OK response, Open WebUI is working correctly
and the issue is in the Continue provider setting.

Summary

If Continue in VS Code sends requests to /api/responses or
/api/chat/completions/responses and Open WebUI returns
405 Method Not Allowed, the fix is simple:

# Change this
provider: openai

# To this
provider: lmstudio

This makes Continue use the classic chat completions flow that Open WebUI supports,
instead of the newer Responses API that Open WebUI does not implement.

Note: This is not a bug in Open WebUI. It is a compatibility mismatch between
the Continue openai provider behavior, the OpenAI Responses API,
and Open WebUI’s supported endpoints. Using provider: lmstudio
is currently the simplest workaround.