Fixing 405 Method Not Allowed with Open WebUI + Continue in VS Code
Tags: Continue, Open WebUI, Ollama, LLM, API
If you are trying to use Open WebUI as an API backend for Continue in VS Code
and you keep seeing errors like:
```
POST /api/responses HTTP/1.1" 405
```

or

```
POST /api/chat/completions/responses HTTP/1.1" 405
```

this article may save you some time.
The Problem
At first glance it looks like an API key or routing issue. But the real cause is usually this:
Continue is calling the newer OpenAI Responses API endpoints,
while Open WebUI only supports the classic chat completions endpoint.
Open WebUI supports these endpoints:
```
GET /api/models
POST /api/chat/completions
```
But Continue, depending on the selected provider, may try to call:
```
POST /api/responses
POST /api/chat/completions/responses
```
Open WebUI does not handle those endpoints, so the server responds with
405 Method Not Allowed.
Root Cause
The original Continue configuration used provider: openai. With that setting,
newer versions of Continue may switch to the OpenAI Responses API format,
which appends /responses to the request path.
Open WebUI does not implement the Responses API, so every request fails with a
405 before it even reaches the model.
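To make the path difference concrete, here is a minimal shell sketch of how the two request URLs are built from the same `apiBase` used in the config examples in this article. The base URL and path suffixes are taken from the examples and logs above; nothing here is specific to Continue's internals.

```shell
# Assumed apiBase from the Continue config examples in this article
API_BASE="http://localhost:3000/api"

# Path the Responses API format produces (Open WebUI answers 405)
echo "${API_BASE}/responses"

# Classic chat completions path (supported by Open WebUI)
echo "${API_BASE}/chat/completions"
```

Only the suffix changes; the host, port, and `/api` prefix stay the same, which is why the failure looks like a routing problem at first.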
The Fix
Change the provider in your Continue configuration from openai to
lmstudio:
Before

```yaml
models:
  - name: OpenWebUI
    provider: openai
    model: llama3.1
    apiBase: http://localhost:3000/api
    apiKey: your_api_key
```

After

```yaml
models:
  - name: OpenWebUI
    provider: lmstudio
    model: llama3.1
    apiBase: http://localhost:3000/api
    apiKey: your_api_key
```

Why This Works
The lmstudio provider in Continue uses the classic OpenAI-compatible
chat/completions request format instead of the newer Responses API style.
Open WebUI supports chat/completions, so the requests go through correctly.
| Provider in Continue | Endpoint Called | Open WebUI Result |
|---|---|---|
| `openai` | `POST /api/responses` | 405 Error |
| `lmstudio` | `POST /api/chat/completions` | 200 OK |
Verify Open WebUI is Working
You can test the API directly with curl before touching your Continue config.
List available models:
```shell
curl -H "Authorization: Bearer YOUR_API_KEY" \
  http://localhost:3000/api/models
```

Test a chat completion request:
```shell
curl -X POST http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1",
    "messages": [
      { "role": "user", "content": "Hello" }
    ]
  }'
```

If both curl commands return a 200 OK response, Open WebUI is working correctly and the issue is in the Continue provider setting.
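If you want to sanity-check the request body itself before sending it, this sketch pipes the same payload through `python3 -m json.tool` to confirm it is valid JSON. The model name and message are taken from the curl example above; any other values are placeholders.

```shell
# Same chat/completions payload as the curl example above
PAYLOAD='{"model": "llama3.1", "messages": [{"role": "user", "content": "Hello"}]}'

# Exits non-zero and prints an error if the JSON is malformed
echo "$PAYLOAD" | python3 -m json.tool
```

A malformed payload typically produces a 400 rather than a 405, so this check helps separate body problems from the endpoint problem this article is about.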
Summary
If Continue in VS Code sends requests to /api/responses or
/api/chat/completions/responses and Open WebUI returns
405 Method Not Allowed, the fix is simple:
```yaml
# Change this
provider: openai

# To this
provider: lmstudio
```
This makes Continue use the classic chat completions flow that Open WebUI supports,
instead of the newer Responses API that Open WebUI does not implement.
This depends on the current Continue `openai` provider behavior, the OpenAI Responses API, and Open WebUI's supported endpoints, all of which may change in future releases. Using `provider: lmstudio` is currently the simplest workaround.