OpenAI

Configure OpenAI as an LLM provider in agentgateway.

Configuration

Review the following example configuration.
# yaml-language-server: $schema=https://agentgateway.dev/schema/config

llm:
  models:
  - name: "*"
    provider: openAI
    params:
      apiKey: "$OPENAI_API_KEY"
The following settings are available.

- name: The model name to match in incoming requests. When a client sends "model": "<name>", the request is routed to this provider. Use * to match any model name.
- provider: The LLM provider, set to openAI for OpenAI models.
- params.model: The specific OpenAI model to use. If set, this model is used for all requests. If not set, the request must include the model to use.
- params.apiKey: The OpenAI API key for authentication. You can reference environment variables with the $VAR_NAME syntax.
ℹ️ For advanced routing scenarios that require path-based routing or custom endpoints, use the traditional binds/listeners/routes configuration format. See the Routing-based configuration guide for more information.
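With this configuration, clients talk to the gateway exactly as they would to the OpenAI API, and the model field in the request body is what the name pattern matches. The following sketch builds such a request with Python's standard library; the gateway address localhost:4000 and the model gpt-4o-mini are assumptions for illustration, so substitute your configured bind and model.

```python
import json
import urllib.request

# The "model" value is matched against the provider's `name` pattern;
# with name: "*", any model name is routed to the OpenAI provider.
payload = {
    "model": "gpt-4o-mini",  # example model name (assumption)
    "messages": [{"role": "user", "content": "Hello"}],
}

# localhost:4000 is an assumed gateway address; use your configured bind.
req = urllib.request.Request(
    "http://localhost:4000/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the gateway running, sending the request would look like:
# resp = urllib.request.urlopen(req)
```

The gateway reads the model name from the standard OpenAI-compatible request body, so no client-side changes beyond the base URL are needed.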

Connect to Codex

Use agentgateway as a proxy to your OpenAI provider from the Codex client.

  1. Create an agentgateway configuration without specifying a model, so the Codex client’s model choice is used.

    cat > config.yaml << 'EOF'
    # yaml-language-server: $schema=https://agentgateway.dev/schema/config
    
    llm:
      models:
      - name: openai
        provider: openAI
        params:
          apiKey: "$OPENAI_API_KEY"
    EOF
  2. Point Codex at agentgateway through one of the following methods.

    Codex uses the OPENAI_BASE_URL environment variable to override the default OpenAI endpoint. Use a base URL that includes /v1 so requests go to /v1/responses and OpenAI does not return 404.

    export OPENAI_BASE_URL="http://localhost:4000/v1"
    codex

    To override the base URL for a single run, pass model_provider along with the provider’s name and base_url on the command line (the -c values are parsed as TOML).

    codex -c 'model_provider="proxy"' -c 'model_providers.proxy.name="OpenAI via agentgateway"' -c 'model_providers.proxy.base_url="http://localhost:4000/v1"'

    To configure the base URL permanently, add the following to your ~/.codex/config.toml. For more information, see Advanced Configuration. The name field is required for custom providers.

    [model_providers.proxy]
    name = "OpenAI via agentgateway"
    base_url = "http://localhost:4000/v1"
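To make Codex use this provider by default rather than only when -c 'model_provider="proxy"' is passed, the top-level model_provider key must also reference the provider table. A sketch of a complete ~/.codex/config.toml, reusing the proxy provider key from the snippet above:

```toml
# Select the custom provider defined below for all Codex runs.
model_provider = "proxy"

[model_providers.proxy]
name = "OpenAI via agentgateway"
base_url = "http://localhost:4000/v1"
```

Defining the provider table alone registers it; the model_provider key is what switches Codex over to it.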