# VS Code Continue
Configure Continue, the open-source AI code assistant for VS Code, to route requests through agentgateway.
## Before you begin
- Agentgateway running at `http://localhost:3000` with a configured LLM backend.
- VS Code with the Continue extension installed.
## Example agentgateway configuration
```yaml
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  port: 3000
  models:
    - name: "*"
      provider: openAI
      params:
        apiKey: "$OPENAI_API_KEY"
```

## Configure Continue
- Edit the `~/.continue/config.json` file to add your agentgateway endpoint.
- Save the file and reload Continue in VS Code.
```json
{
  "models": [
    {
      "title": "agentgateway",
      "provider": "openai",
      "model": "gpt-4o-mini",
      "apiBase": "http://localhost:3000/v1"
    }
  ]
}
```

Review the following table to understand this configuration.
| Field | Description |
|---|---|
| `title` | Display name shown in the Continue model selector. |
| `provider` | Set to `openai` for any OpenAI-compatible endpoint. |
| `model` | The model name from your agentgateway backend configuration. |
| `apiBase` | Your agentgateway URL with the `/v1` path. |
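To see how these fields combine, the sketch below builds the same kind of OpenAI-style chat request that Continue sends to `apiBase`. It is a minimal illustration, assuming the gateway exposes the standard `/chat/completions` route under `/v1`; the constants mirror the config above:

```python
import json
import urllib.request

# Values mirroring the Continue config above (assumed local setup).
API_BASE = "http://localhost:3000/v1"
MODEL = "gpt-4o-mini"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build the OpenAI-style chat completion request Continue would send."""
    payload = {
        "model": MODEL,  # routed by agentgateway's backend config
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{API_BASE}/chat/completions",  # apiBase + standard chat route
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello, are you working?")
print(req.full_url)
```

Because the provider is `openai`, any tool that speaks this request shape can target the same `apiBase`, not just Continue.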
## Verify the connection
- Open the Continue sidebar in VS Code (`Cmd + M` on macOS, `Ctrl + M` on Windows/Linux).
- Select agentgateway from the model dropdown.
- Send a test message: “Hello, are you working?”
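The same check can be run outside VS Code to separate gateway problems from extension problems. This is a hedged sketch assuming the local gateway address and model name from the configs above; it returns `False` rather than crashing when the gateway is unreachable:

```python
import json
import urllib.error
import urllib.request

API_BASE = "http://localhost:3000/v1"  # assumed local gateway address

def check_gateway(prompt: str = "Hello, are you working?") -> bool:
    """Send one chat message through the gateway; True if a reply came back."""
    payload = {
        "model": "gpt-4o-mini",  # must match a model the backend serves
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            reply = json.load(resp)["choices"][0]["message"]["content"]
            print(reply)
            return True
    except (urllib.error.URLError, KeyError, IndexError) as exc:
        print(f"Gateway not reachable or unexpected response: {exc}")
        return False
```

If this returns `True` but Continue still fails, the issue is in `config.json` rather than the gateway.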