Resolving Codex IDE "Token exchange failed" Error by Disabling WSL Support
TL;DR
- Symptoms: When signing in to Codex via the Cursor (VS Code) extension, the browser displays `Token exchange failed: error sending request for url (https://auth.openai.com/oauth/token)` and login fails.
- Cause: Although there was connectivity to `auth.openai.com`, the extension was starting the `codex app-server` inside WSL, and `localhost:1455` was occupied by `wslrelay.exe`.
- Solution: Resolved by turning OFF `Run Codex In Windows Subsystem For Linux`.
Solution
You can resolve this by disabling the setting that runs Codex inside WSL in the Cursor (VS Code) settings.
- Open Settings in Cursor (VS Code) (e.g., "File -> Preferences -> Settings" in the menu, or `Ctrl+,`).
- Type `codex` in the search bar.
- Locate the following setting and turn it OFF:
  - `Chatgpt: Run Codex In Windows Subsystem For Linux`
- Setting description (as displayed in Cursor):
  - Run Codex In Windows Subsystem For Linux
    Windows only: when Windows Subsystem for Linux (WSL) is installed, automatically run Codex inside WSL. Recommended for improved sandbox security and better performance.
  - Agent mode on Windows currently requires WSL.
    Changing this setting reloads VS Code to take effect.
- After changing the setting, Cursor will reload. Once it finishes, try signing in again.
My Environment
- OS: Windows
- IDE: Cursor (likely the same for other VS Code-based editors)
- Extension: Codex (`openai.chatgpt`)
- WSL: Ubuntu (Codex was configured to run inside WSL in Cursor settings)
Symptoms
After completing the sign-in from the IDE extension, at the moment the browser redirected back to `http://localhost:1455/auth/callback?...`, the following error was displayed:

```
Token exchange failed: error sending request for url (https://auth.openai.com/oauth/token)
```
This message is troublesome because it makes you suspect things like "network, proxy, or certificates."
Background Knowledge on OpenAI/Codex Authentication
1) Sign-in Methods (Codex)
There are two main sign-in methods for Codex (CLI/IDE extension):
- Sign in with ChatGPT account: A browser opens for you to log in, and upon completion, a token is sent back to your local machine.
- Sign in with API Key: A method where you configure and use an API key from the OpenAI Platform.
Note: Depending on the environment (corporate proxy, headless, remote, etc.), the browser-based method can easily get stuck, so you might be prompted to use the Device Code method instead.
2) What is OAuth's "code → token exchange"?
OAuth's "Authorization Code Flow" is, roughly speaking, a process where authorization happens in the browser, the app receives a short-lived authorization code, and the app then exchanges that code for tokens.
Since the browser returned to `http://localhost:1455/auth/callback?code=...` in this case, the flow progressed at least as far as "received the authorization code (`code`)."
(If you check the URL, the code should be there.)
The sequence is as follows:

1. Login/consent in the browser (authorization)
   The authorization server (OpenAI's Auth) redirects back to your local callback URL. At that point the URL includes `code` (the authorization code) and `state` (a random value for CSRF protection).
2. The local side (IDE/extension) exchanges the `code` for tokens (token exchange)
   The IDE/extension (more precisely, the authentication process it started) performs a POST to `https://auth.openai.com/oauth/token` and receives tokens including:
   - `access_token` (used for API calls)
   - `refresh_token` (if needed for long-term use)
   Depending on the implementation, PKCE (`code_verifier`) may also be used here.

In this case, `Token exchange failed: ... (https://auth.openai.com/oauth/token)` means that the HTTP POST in step 2 failed.
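To make step 2 concrete, the snippet below builds a standard OAuth 2.0 authorization-code grant body (per RFC 6749). The field set, the `client_id` value, and the helper name are illustrative assumptions, not taken from the Codex extension's source:

```python
from urllib.parse import urlencode

# The token endpoint from the error message.
TOKEN_URL = "https://auth.openai.com/oauth/token"

def build_token_request(code: str, code_verifier: str,
                        redirect_uri: str, client_id: str) -> str:
    """Build the x-www-form-urlencoded body for the code -> token exchange
    (standard authorization-code grant; hypothetical helper)."""
    return urlencode({
        "grant_type": "authorization_code",
        "code": code,                    # authorization code from the callback URL
        "code_verifier": code_verifier,  # PKCE verifier, if the flow uses PKCE
        "redirect_uri": redirect_uri,    # must match the redirect used in step 1
        "client_id": client_id,
    })

body = build_token_request(
    code="abc123",
    code_verifier="verifier",
    redirect_uri="http://localhost:1455/auth/callback",
    client_id="example-client-id",  # placeholder value
)
print(body)
```

The failing call in this article is a POST of a body like this to the token endpoint; the error means the request itself never got a usable response, not that the authorization code was rejected.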
Conceptual Diagram (edited from one generated by GPT-5.2 High):
3) The Role of localhost:1455 (What does it do?)
http://localhost:1455/auth/callback?... is a local callback HTTP server used to receive tokens (or a code) after browser login is complete.
Rather than just the fact that "the browser returns to 1455," the key is that the process holding 1455 must be able to POST to auth.openai.com/oauth/token afterward.
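As an illustration of what such a callback server does, here is a minimal standard-library sketch (not the extension's actual implementation; it binds port 0 instead of 1455 so it cannot collide with a real listener):

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# The callback server's only job: accept the browser redirect, pull
# `code` (and `state`) out of the query string, and hand the code to
# the process that will POST it to /oauth/token.
captured = {}

class CallbackHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        query = parse_qs(urlparse(self.path).query)
        captured["code"] = query.get("code", [None])[0]
        captured["state"] = query.get("state", [None])[0]
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"You can close this tab.")

    def log_message(self, *args):  # keep the example quiet
        pass

server = HTTPServer(("127.0.0.1", 0), CallbackHandler)
port = server.server_address[1]  # OS-assigned port; the real flow uses 1455
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate the browser redirect that normally hits localhost:1455.
urllib.request.urlopen(
    f"http://127.0.0.1:{port}/auth/callback?code=abc123&state=xyz"
).read()
server.shutdown()
print(captured["code"])
```

If another process (like `wslrelay.exe`) already holds the port, this server can never start on it, which is exactly the failure mode described later.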
Note: Failures Can Also Occur Due to Python Certificates (CA Bundle)
In my case, the cause was the WSL setting, but similar "Token exchange failed" errors can be caused by Python TLS verification (certificate chain verification) depending on the environment.
When Should You Suspect a Python-related Cause?
If any of the following apply, a Python-related cause is likely:

- The error log contains `CERTIFICATE_VERIFY_FAILED` or `unable to get local issuer certificate`.
- The log shows expressions like `_ssl.c:` (indicating the Python SSL layer is raising the exception).
- `curl` works, but a specific tool (such as Python/`requests`) fails over HTTPS.
This can happen when the OS certificate store is correct, but the CA bundle referenced by Python is different and lacks the necessary root certificates.
Why Does curl Work While Python Fails?
In many cases, the reason is the following difference:

- `curl` correctly references the OS certificate store (or a bundled CA bundle).
- Python (specifically `requests`/`urllib3`), on the other hand, may use the `certifi` CA bundle.
This occurs with special certificate chains, such as those used by corporate proxies (SSL inspection). Typical patterns:

- The callback at `localhost:1455/auth/callback` succeeds, but the subsequent call to `auth.openai.com/oauth/token` fails with an SSL certificate verification error.
- TLS/SSL inspection by an internal proxy means its CA is not trusted, causing the OAuth token POST to fail.
Checking the Python CA Bundle
```shell
python -c "import ssl; print(ssl.get_default_verify_paths())"
python -c "import certifi; print(certifi.where())"
```
The path from certifi.where() is the CA bundle that requests tends to reference.
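Going one step beyond printing paths, you can inspect a bundle file directly. The sketch below (standard library only; the demo bundle is fake) counts the certificates in a PEM file, which is handy for comparing the OS store against the bundle a tool actually uses:

```python
import ssl
import tempfile

# Where does this Python's ssl module look for CAs by default?
paths = ssl.get_default_verify_paths()
print("default cafile:", paths.cafile)
print("default capath:", paths.capath)

def count_certs(pem_path: str) -> int:
    """Count certificates in a PEM bundle (one BEGIN/END block each).

    Comparing counts (or grepping for your corporate root CA) between
    the OS store and the bundle certifi points at often explains the
    "curl works but Python fails" split.
    """
    with open(pem_path, encoding="utf-8", errors="ignore") as f:
        return f.read().count("-----BEGIN CERTIFICATE-----")

# Tiny self-contained demo with a fake two-certificate bundle:
with tempfile.NamedTemporaryFile("w", suffix=".pem", delete=False) as f:
    f.write("-----BEGIN CERTIFICATE-----\nfake\n-----END CERTIFICATE-----\n" * 2)
n = count_certs(f.name)
print(n)  # -> 2
```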
Basic Approach for Resolution
Instead of "disabling verification," it is recommended to inform Python of the correct CA bundle.
Typical examples are the environment variables `SSL_CERT_FILE` and `REQUESTS_CA_BUNDLE`.

Example (PowerShell; adjust paths for your environment):

```powershell
$env:SSL_CERT_FILE = (python -c "import certifi; print(certifi.where())")
$env:REQUESTS_CA_BUNDLE = $env:SSL_CERT_FILE
```
Then, launching the problematic tool (CLI, IDE, etc.) from the same shell may allow the environment variables to be inherited and resolve the issue.
Note:
- Since IDEs often only see the "environment variables at startup," it is best to completely restart the IDE after configuration.
- In corporate proxy environments, the handling of the internal Root CA (such as registration in the OS trust store) is also relevant.
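The "launch from the same shell, restart the IDE" advice comes down to one mechanism: child processes inherit the parent's environment at spawn time. A minimal demonstration (the CA path below is a placeholder, not a real bundle):

```python
import os
import subprocess
import sys

# Child processes inherit the parent's environment, which is why
# exporting SSL_CERT_FILE / REQUESTS_CA_BUNDLE in a shell only helps
# tools launched *from that shell* -- an already-running IDE keeps the
# environment it started with.
env = dict(os.environ, SSL_CERT_FILE="/path/to/ca-bundle.pem")  # placeholder path

out = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ.get('SSL_CERT_FILE'))"],
    env=env, capture_output=True, text=True,
).stdout.strip()
print(out)  # the child sees the value the parent passed down
```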
Why Was it Not Python in This Case?
In the troubleshooting for this article, I confirmed that curl could reach and POST to auth.openai.com (returning an HTTP 400), so the issue was not that the entire terminal was failing TLS.
Furthermore, since localhost:1455 was occupied by wslrelay.exe and the extension logs showed that codex app-server was starting inside WSL, it was the quickest route to suspect the WSL settings.
(I should have checked sooner...)
Investigation 1: Is the Network/TLS Actually Down?
In this case, I first checked whether the terminal could reach `auth.openai.com` at all.
1. TCP connectivity

   ```powershell
   Test-NetConnection auth.openai.com -Port 443
   ```

   Output:

   ```
   TcpTestSucceeded : True
   ```

2. Can the token endpoint be reached? (verification via HTTP status)
   Since `/oauth/token` is POST-only, it is normal for a `-I` (HEAD-equivalent) request to return a 405.

   ```powershell
   curl.exe -I https://auth.openai.com/oauth/token
   ```

   Output:

   ```
   HTTP/1.1 405 Method Not Allowed
   ```

3. Next, confirm that the OpenID configuration returns a 200.

   ```powershell
   curl.exe -I https://auth.openai.com/.well-known/openid-configuration
   ```

   Output:

   ```
   HTTP/1.1 200 OK
   ```

4. Confirm that a POST to `/oauth/token` is also possible.

   ```powershell
   curl.exe -sS -D - -o NUL -X POST "https://auth.openai.com/oauth/token" `
     -H "Content-Type: application/x-www-form-urlencoded" `
     --data "grant_type=authorization_code"
   ```

   Output:

   ```
   HTTP/1.1 400 Bad Request
   ```

   Since no `code` or other parameters are supplied, a 400 is returned (which is expected).
So far we have established that the problem is *not* "the OpenAI authentication endpoints cannot be reached from this PC."
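For reference, the same TCP reachability check can be done cross-platform in a few lines of Python (a rough stand-in for `Test-NetConnection -Port`, using only the standard library):

```python
import socket

def tcp_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """True if a TCP handshake to host:port completes within the timeout
    (roughly what Test-NetConnection's TcpTestSucceeded reports)."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. tcp_reachable("auth.openai.com", 443) should return True
# whenever the endpoint is reachable from this machine.
```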
Investigation 2: Who is Holding localhost:1455?
As noted earlier, `http://localhost:1455/auth/callback?...` is served by a local callback HTTP server. I monitored which process was occupying this port.
```powershell
# Poll for up to ~60 seconds (600 x 100 ms) until something listens on 1455,
# then show the connection details and the owning process.
1..600 | % {
    $c = Get-NetTCPConnection -LocalPort 1455 -ErrorAction SilentlyContinue
    if ($c) {
        $c | Select State,LocalAddress,LocalPort,RemoteAddress,RemotePort,OwningProcess | Format-Table -Auto
        $pids = $c | Select-Object -Expand OwningProcess -Unique
        $pids | % { Get-Process -Id $_ | Select-Object Id,ProcessName,Path | Format-Table -Auto }
        break
    }
    Start-Sleep -Milliseconds 100
}
```
Output:

```
State  LocalAddress LocalPort RemoteAddress RemotePort OwningProcess
-----  ------------ --------- ------------- ---------- -------------
Listen 127.0.0.1    1455      0.0.0.0       0          12528

   Id ProcessName Path
   -- ----------- ----
12528 wslrelay    C:\Program Files\WSL\wslrelay.exe
```
In other words, port 1455 was being held by the WSL relay (wslrelay.exe).
Running `codex login` in this state fails with a port conflict (as expected):

```
Error logging in: Port 127.0.0.1:1455 is already in use
```
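That error is a plain bind failure. The condition can be checked from Python as well (a standard-library sketch, not what `codex login` itself does internally):

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """True if binding host:port fails, i.e. some process (here,
    wslrelay.exe) already holds the port the login server needs."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return False
        except OSError:
            return True

# In the broken state described above, port_in_use(1455) would return
# True because wslrelay.exe is listening there.
```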
Investigation 3: Is the Extension Running Inside WSL?
Looking at the Cursor output logs (Codex extension), I found that it was starting the codex app-server inside WSL. (I should have checked this much sooner!!!)
Log:

```
[CodexMcpConnection] Spawning codex app-server
[wsl] eligible distro list: Ubuntu
[spawn-codex-process] Spawning codex process inside WSL
[spawn-codex-process] WSL command: wsl.exe -d Ubuntu -- ... codex app-server ...
```
At this point, the following picture emerged:
- `auth.openai.com` can be reached normally from the terminal.
- However, only the sign-in flow is failing.
- Moreover, the extension is running inside WSL, and `wslrelay.exe` is involved with port 1455.
Cause
Cases have been reported where enabling the "Run Codex In Windows Subsystem For Linux" setting in the Codex extension causes sign-in to fail.
- GitHub Issue: openai/codex#6413
(I was in this category.)
Solution (Restated)
Following the steps in the Solution section above, the issue was resolved simply by turning OFF `Chatgpt: Run Codex In Windows Subsystem For Linux`, letting Cursor reload, and signing in again.
(The wording in the UI may vary slightly depending on your environment or version, but turning off the setting that appears when searching for "codex" regarding "running Codex in WSL" should work.)
Minimum Checks if You Have the Same Symptom but a Different Cause
Even with the same Token exchange failed error, the following could be the cause depending on the environment (especially in corporate networks):
- Proxy/SSL Inspection (Untrusted Root CA)
- Firewall/Security Products blocking outbound communication for specific processes
- Clock skew (a badly desynchronized system clock can cause TLS failures)
In such cases, it's quickest to first verify whether "the terminal can reach the endpoint" using curl or Test-NetConnection, and then check the extension's output logs for certificate errors or timeouts.
Reference Links
- Codex Official (Authentication Overview, Device Code, etc.): Codex Authentication
- SSL Cert Verification: Official Documentation
- Codex CLI OAuth Login Fails: Failure due to SSL certificate verification error
- Token Exchange Fails Due to Custom CA in Corporate Proxy: OAuth token POST fails
- Known Issue (Sign-in failure when WSL is enabled): openai/codex #6413