Discussion on the Cost of Running ChatGPT
Alexander Jensen
March 14, 2026 at 03:12 PM
I'm curious about the operational costs involved in running ChatGPT. How much does it cost to run ChatGPT, considering the computing resources, energy consumption, and maintenance? Any insights or detailed breakdowns would be appreciated.
Comments (4)
Training large language models like GPT-3 reportedly cost OpenAI several million dollars, covering hardware, electricity, and engineering effort.
It's quite expensive to run models like ChatGPT because they require powerful GPUs running continuously. Estimates suggest it costs thousands of dollars per day just for inference servers.
From what I've read, running inference for ChatGPT on cloud infrastructure can cost around $0.03 to $0.10 per 1,000 tokens, which adds up with millions of users.
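The per-token estimate above can be turned into a rough daily figure. This is only a back-of-envelope sketch; the token volume, average request length, and per-token price below are illustrative assumptions, not published numbers:

```python
# Back-of-envelope inference cost estimate. All three inputs are
# assumptions for illustration, not figures released by OpenAI.

COST_PER_1K_TOKENS = 0.03      # assumed low end of the $0.03-$0.10 range, USD
AVG_TOKENS_PER_REQUEST = 500   # assumed combined prompt + response length
REQUESTS_PER_DAY = 10_000_000  # assumed daily request volume

daily_tokens = AVG_TOKENS_PER_REQUEST * REQUESTS_PER_DAY
daily_cost = daily_tokens / 1000 * COST_PER_1K_TOKENS

print(f"Estimated daily inference cost: ${daily_cost:,.0f}")
```

Even at the low end of the quoted per-token range, these assumptions work out to six figures per day, which is why the "thousands of dollars per day" estimates elsewhere in this thread are plausibly conservative.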
OpenAI hasn't released exact numbers, but industry insiders estimate that maintaining ChatGPT's backend can cost upwards of $10,000 per day, depending on usage.