Using Codex CLI with Azure OpenAI


Introduction

I'm a ChatGPT Plus subscriber, but I hit the rate limit while using Codex and was told to wait 22 hours...

No way I can wait that long!

So, I looked into whether I could use the Azure OpenAI API, at least while the restriction is in place.

Important Notes

Even if you have a gpt-5 deployment in Azure OpenAI, there are regions from which it cannot be used with Codex CLI.

Note that the deployment must support the Responses API, which is available only in a limited set of regions (check the Azure documentation for the current list).

Conclusion

You can achieve this in the following two steps:

  1. Set the API key in environment variables
  2. Write the settings in ~/.codex/config.toml

Set the API key in environment variables

Windows

To set it as a user environment variable:

setx AZURE_OPENAI_API_KEY "your-azure-openai-api-key"

To set it as a system environment variable (requires an elevated prompt):

setx AZURE_OPENAI_API_KEY "your-azure-openai-api-key" /M

Bash

export AZURE_OPENAI_API_KEY="your-azure-openai-api-key"

This only applies to the current shell session. To make it persistent, append it to ~/.bashrc:

echo 'export AZURE_OPENAI_API_KEY="your-azure-openai-api-key"' >> ~/.bashrc
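As a quick sanity check, you can confirm the variable is actually visible to child processes (the key value here is the same placeholder as above):

```shell
# Set the key (placeholder value) and confirm it is visible.
export AZURE_OPENAI_API_KEY="your-azure-openai-api-key"
# printenv exits non-zero when the variable is unset, so this doubles as a check.
printenv AZURE_OPENAI_API_KEY
```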

Write the settings in ~/.codex/config.toml

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://<YOUR_RESOURCE_NAME>.openai.azure.com/openai/v1"
env_key = "AZURE_OPENAI_API_KEY"
wire_api = "responses"

<YOUR_RESOURCE_NAME> is the name of your AI Foundry resource in the Azure Portal.
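Before pointing Codex at the endpoint, you can smoke-test the Responses API directly. This is a sketch: the resource name is a placeholder, and the curl call is left commented out because it needs a real resource and key:

```shell
# Build the same base URL that config.toml uses (placeholder resource name).
RESOURCE_NAME="your-resource-name"
BASE_URL="https://${RESOURCE_NAME}.openai.azure.com/openai/v1"

# A direct call to the Responses API would look like this
# (uncomment after substituting real values; "gpt-5" is your deployment name):
# curl -s "${BASE_URL}/responses" \
#   -H "api-key: ${AZURE_OPENAI_API_KEY}" \
#   -H "Content-Type: application/json" \
#   -d '{"model": "gpt-5", "input": "ping"}'

echo "${BASE_URL}/responses"
```

If this request returns a model response rather than a 404, the region supports the Responses API and the config.toml above should work.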

Then start Codex against Azure by running the following command:

codex --config model_provider=azure

If the /status command displays output like the following, the setup succeeded.

Check the Model section: the Provider should now read Azure. Note that the Account section still shows you signed in with your ChatGPT account.

/status
📂 Workspace
  • Path: ~
  • Approval Mode: on-request
  • Sandbox: workspace-write
  • AGENTS files: (none)

👤 Account
  • Signed in with ChatGPT
  • Login: xxxxxxxxxxxxxxxxxxxxxxxxxx
  • Plan: Plus

🧠 Model
  • Name: gpt-5
  • Provider: Azure
  • Reasoning Effort: Medium
  • Reasoning Summaries: Auto

💻 Client
  • CLI Version: 0.31.0

📊 Token Usage
  • Session ID: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
  • Input: 0
  • Output: 0
  • Total: 0 

If you want to use Azure OpenAI by default

If you don't specify model_provider, it defaults to your OpenAI account. To always connect via Azure, configure it as follows:

model = "gpt-5-codex"
model_provider = "azure"

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://<YOUR_RESOURCE_NAME>.openai.azure.com/openai/v1"
env_key = "AZURE_OPENAI_API_KEY"
wire_api = "responses"

In TOML, top-level keys such as model and model_provider must appear before any table headers like [model_providers.azure]; otherwise they are parsed as part of that table.

Even if you don't specify model, it will connect as long as gpt-5 is your only deployment. However, if you have multiple deployments and want to select one, specify it as shown above. You can also set the following options:

model_reasoning_effort = "high"
model_reasoning_summary = "detailed"
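Putting it all together, a complete ~/.codex/config.toml might look like the following sketch (the resource name is a placeholder, and gpt-5-codex assumes a deployment by that name):

```toml
# Top-level keys must come before any [table] headers.
model = "gpt-5-codex"
model_provider = "azure"
model_reasoning_effort = "high"
model_reasoning_summary = "detailed"

[model_providers.azure]
name = "Azure OpenAI"
base_url = "https://<YOUR_RESOURCE_NAME>.openai.azure.com/openai/v1"
env_key = "AZURE_OPENAI_API_KEY"
wire_api = "responses"
```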

That's it!
