OpenAI's 80% o3 API price drop has no impact on performance
OpenAI's o3 API is now far cheaper for developers, with no measurable impact on the model's performance.
On Wednesday, OpenAI announced it's cutting the price of its best reasoning model, o3, by 80%.

This means o3’s input price is now just $2 per million tokens, while the output price has dropped to $8 per million tokens.
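The new pricing is easy to reason about. A minimal sketch of a per-request cost estimate, assuming a simple linear token-based bill (the constants mirror the prices above; the function name is hypothetical, not part of OpenAI's SDK):

```python
# New o3 API pricing, per the announcement.
O3_INPUT_PRICE = 2.00   # USD per 1M input tokens
O3_OUTPUT_PRICE = 8.00  # USD per 1M output tokens

def o3_request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of a single o3 API call (hypothetical helper)."""
    return (input_tokens / 1_000_000) * O3_INPUT_PRICE \
         + (output_tokens / 1_000_000) * O3_OUTPUT_PRICE

# Example: 10,000 input tokens and 2,000 output tokens
print(f"${o3_request_cost(10_000, 2_000):.4f}")  # → $0.0360
```

At the old prices ($10 input / $40 output per million tokens), the same call would have cost five times as much.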
"We optimized our inference stack that serves o3. Same exact model—just cheaper," OpenAI noted in a post on X.
While most ChatGPT users never touch the API directly, the price drop makes API-based tools such as Cursor and Windsurf much cheaper to run.
In a post on X, the independent benchmark community ARC Prize confirmed that the o3-2025-04-16 model’s performance didn’t change after the price reduction.
"We compared the retest results with the original results and observed no difference in performance," the organization said.
This confirms that OpenAI did not swap out the o3 model to reduce the price. Instead, the company truly optimized the inference stack that powers the model.
In addition, OpenAI rolled out the o3-pro model in the API, which uses more compute to deliver better results.