Windsurf
Configure Windsurf, the AI code editor by Codeium, to route requests to your LLM through your agentgateway proxy.
Before you begin
- Agentgateway running at http://localhost:3000 with a configured LLM backend.
- Windsurf installed.
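Before changing any editor settings, you can confirm the gateway is listening. A quick sketch (the gateway may return an error body for a bare GET, which still confirms the port is reachable):

```shell
# Check whether anything is answering on the agentgateway port.
# -s silences progress output, -f makes curl fail on HTTP errors.
curl -sf -o /dev/null http://localhost:3000 && echo "gateway reachable" || echo "gateway not reachable"
```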
Example agentgateway configuration
cat > /tmp/test-windsurf.yaml << 'EOF'
# yaml-language-server: $schema=https://agentgateway.dev/schema/config
llm:
  port: 3000
  models:
    - name: "*"
      provider: openAI
      params:
        apiKey: "$OPENAI_API_KEY"
EOF

Configure Windsurf
Configure Windsurf to route LLM requests through agentgateway. For more information, review the Windsurf documentation.
- Open Windsurf Settings.
  - macOS: Cmd + , or Windsurf > Settings
  - Windows/Linux: Ctrl + , or File > Preferences > Settings
- Search for Http: Proxy.
- Enter your agentgateway URL: http://localhost:3000
- Save the settings.
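Windsurf is built on VS Code, so the Http: Proxy setting in the UI corresponds to the http.proxy key. If you prefer editing settings.json directly, a minimal sketch:

```json
{
  "http.proxy": "http://localhost:3000"
}
```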
Verify the connection
- Open the Windsurf chat panel.
- Send a message such as “test”.
- Windsurf responds through your agentgateway backend.
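If Windsurf does not respond, you can test the gateway directly from a terminal. The sketch below assumes agentgateway's llm listener exposes the standard OpenAI-compatible chat completions path; the endpoint path and model name are assumptions, so adjust them to match your routes:

```shell
# Send a minimal OpenAI-compatible chat request through the gateway.
# /v1/chat/completions and "gpt-4o-mini" are assumptions; the "*" model
# pattern in the example config matches any model name.
curl -s http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "test"}]}'
```

A JSON response from the provider confirms the gateway is forwarding requests; an error here means the problem is in the gateway or backend, not in Windsurf.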