When you ask ChatGPT a question and end it with a “please” or say “thank you” afterward, it might feel like the polite thing to do. But did you know that these tiny expressions of kindness are actually costing OpenAI millions of dollars every year?
This surprising fact was revealed by none other than OpenAI’s CEO, Sam Altman. He recently admitted that simple, polite phrases like “please” and “thank you” are not as innocent as they seem—at least not from a business perspective.
In this article, we’ll break down why these polite prompts are adding up, what it means for OpenAI’s operations, and how this might affect the future of AI tools like ChatGPT.
The main reason behind the high costs is something very few users think about: tokens.
Every time you type something into ChatGPT, it doesn’t process your message as a sentence. It breaks it down into pieces called tokens. These tokens can be words, punctuation marks, or even parts of longer words. Each token takes computing power to understand and respond to.
Now, imagine millions of people using ChatGPT every day—and many of them are writing longer prompts with extra words just to be polite. Those extra words mean more tokens. And more tokens mean more computing power, more energy, more data processing, and ultimately, higher costs for OpenAI.
According to Altman, these polite extras are not a small issue. In fact, they’re costing the company tens of millions of dollars per year. Yes, you read that right. Your kindness to a chatbot comes with a price tag.
To understand this better, it helps to take a quick look at how tokens work in practice.
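As a rough illustration (not an official OpenAI accounting), the company's open-source tiktoken library can show how a prompt splits into tokens. The short sketch below assumes the cl100k_base encoding used by GPT-3.5/GPT-4-era models and simply counts how many extra tokens a polite wrapper adds to an otherwise identical request.

```python
# pip install tiktoken  -- OpenAI's open-source tokenizer library
import tiktoken

# cl100k_base is the encoding used by GPT-3.5/GPT-4-era models
enc = tiktoken.get_encoding("cl100k_base")

blunt = "Summarize this article."
polite = "Could you please summarize this article? Thank you!"

for prompt in (blunt, polite):
    tokens = enc.encode(prompt)
    print(f"{len(tokens):2d} tokens -> {prompt!r}")
```

On a single prompt the difference is only a handful of tokens. Multiplied across millions of conversations a day, those handfuls are exactly what Altman is pointing at.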
Each token adds cost because of the energy and processing time required to handle it. And when you scale that up to global usage, even polite habits start to feel expensive.
OpenAI is already spending a huge amount just to keep ChatGPT running. Reports suggest that the company burns through around $700,000 every single day in operating costs. That adds up to over $250 million a year—and a decent chunk of that is due to the polite habits of users.
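To see how those figures fit together, here is a back-of-envelope calculation. The $700,000-per-day number comes from the reports cited above; the $40 million used for the polite extras is only a hypothetical midpoint for Altman's "tens of millions," not a figure OpenAI has published.

```python
# Back-of-envelope math using the figures cited in this article.
DAILY_OPERATING_COST = 700_000               # reported ~$700k/day to run ChatGPT
annual_cost = DAILY_OPERATING_COST * 365
print(f"Estimated annual operating cost: ${annual_cost:,}")   # $255,500,000

# Hypothetical midpoint for Altman's "tens of millions" spent on polite extras.
POLITE_EXTRAS_COST = 40_000_000
share = POLITE_EXTRAS_COST / annual_cost
print(f"Share attributable to polite prompts: {share:.0%}")   # roughly 16%
```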
With over 100 million active users and rising demand for advanced AI tools, it’s no surprise that costs are spiraling. New features like image generation and voice chat only add to the load.
And with new enterprise clients using GPT-4 Turbo at a large scale, every single token starts to matter in OpenAI’s budgeting.
Besides the financial burden, there’s another silent cost—the environment.
AI models like ChatGPT require massive amounts of energy. This energy comes from data centers packed with servers that need constant cooling. The more tokens processed, the more energy used.
So when polite users type longer prompts with “please” and “thank you,” they’re not just raising the electricity bill—they’re also increasing the carbon footprint of AI.
In fact, cooling these giant data centers often requires millions of gallons of water every day. This has raised concerns among environmentalists who believe that the growing popularity of generative AI may be unsustainable without major improvements.
Here’s where things get interesting. Even though politeness costs more, OpenAI isn’t asking people to stop.
In fact, Sam Altman believes that having conversations with AI that feel human is part of the goal. Saying “please” and “thank you” to ChatGPT is a sign that users are interacting with the model respectfully—almost like talking to a real person.
These small touches create trust and make users feel more comfortable. That’s valuable in its own way. So, while it may increase operational costs, it also improves user experience—a top priority for OpenAI.
This leaves OpenAI in a tricky position.
On one hand, the company wants to encourage natural, human-like conversations with AI. On the other, it must also manage costs and resource usage to keep the technology sustainable and profitable.
How can OpenAI keep both sides happy?
Altman’s honest admission about the cost of polite prompts is a reminder of how much we take for granted when using tools like ChatGPT.
As AI becomes more common in daily life, from writing emails to coding and customer service, the behind-the-scenes costs will only grow. Companies like OpenAI will need to find creative solutions to manage these expenses—without making users feel guilty for being polite.
The situation also raises big questions for the tech industry about how to balance natural, human-like user experiences against the cost, energy, and water that every extra token consumes.
So, the next time you type “please” into ChatGPT or say “thank you” after a helpful response, remember this: those simple words are part of a much larger conversation—one about cost, energy, and the future of AI.
Being polite may not be free anymore, but it’s still worth it.