In an era dominated by artificial intelligence (AI) technology, the cost of running these advanced systems is a topic of increasing discussion. Sam Altman, CEO of OpenAI, recently revealed a fascinating and somewhat humorous detail about the operational costs of ChatGPT.
According to Altman, seemingly polite actions such as saying “thank you” and “please” to ChatGPT are costing OpenAI millions of dollars in electricity usage. The revelation came after a social media user asked how much energy it costs OpenAI when users interact with the AI politely.
Altman’s response, “tens of millions of dollars well spent,” has sparked discussion about the operational efficiency and hidden costs of AI systems. This blog will explore the key implications of Altman’s statement, the rising costs of AI operations, and what this means for the future of AI-powered technologies.
The Hidden Costs of Politeness: A Fun But Expensive Detail
The revelation about polite phrases costing millions of dollars might sound humorous at first, but it offers an intriguing look into the unseen side of artificial intelligence. Sam Altman’s comment about the added computational load caused by users saying “please” and “thank you” speaks volumes about the complexity of AI systems.
While these polite expressions are natural in human communication, their impact on a language model, which processes every token it receives, is far from negligible.
Polite phrases such as “thank you” and “please” require additional processing. Every interaction with ChatGPT involves tokenizing the input, running the model over the full conversation context, and generating a response token by token.
When users add these courteous phrases, the extra tokens become part of that context and are processed on every exchange. This seemingly innocuous behavior raises the overall computational load, making each exchange slightly more energy-intensive.
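To get a feel for how a few extra tokens compound at scale, here is a back-of-the-envelope sketch. It assumes that inference energy scales roughly linearly with token count; every number in it (average query length, tokens added by a polite phrase, electricity price) is an illustrative assumption, not an OpenAI figure.

```python
# Back-of-the-envelope estimate of the energy cost of polite phrases.
# All numbers below are illustrative assumptions, not OpenAI data.

ENERGY_PER_QUERY_WH = 2.9        # widely cited estimate for one GPT-4 query
AVG_TOKENS_PER_QUERY = 700       # assumed average prompt + response length
EXTRA_POLITE_TOKENS = 4          # assumed tokens added by "please"/"thank you"
QUERIES_PER_DAY = 1_000_000_000  # roughly one billion daily queries
PRICE_PER_KWH_USD = 0.08         # assumed industrial electricity price

# Assume energy scales linearly with the number of tokens processed.
extra_wh_per_query = ENERGY_PER_QUERY_WH * EXTRA_POLITE_TOKENS / AVG_TOKENS_PER_QUERY
extra_kwh_per_year = extra_wh_per_query * QUERIES_PER_DAY * 365 / 1000
extra_cost_per_year = extra_kwh_per_year * PRICE_PER_KWH_USD

print(f"Extra energy per query: {extra_wh_per_query:.4f} Wh")
print(f"Extra energy per year:  {extra_kwh_per_year:,.0f} kWh")
print(f"Extra cost per year:    ${extra_cost_per_year:,.0f}")
```

Under these particular assumptions the annual figure lands well below “tens of millions,” which mainly shows how sensitive such estimates are to the assumed token counts, query volume, and per-query energy.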
This insight into how AI models consume energy sheds light on the hidden cost structure of AI-powered services. Altman’s mention of “tens of millions of dollars” is a recognition of the massive scale at which these operations function.
I wonder how much money OpenAI has lost in electricity costs from people saying “please” and “thank you” to their models.
— tomie (@tomieinlove) April 15, 2025
ChatGPT, for example, handles around a billion queries daily, and even small increases in processing demand, like polite phrases, add up quickly. While this is an interesting facet of how AI models operate, it raises larger questions about energy efficiency and sustainable AI practices.
The Rising Operational Costs of AI: Energy Consumption and Beyond
The operational costs of AI systems, particularly those built on large-scale models like ChatGPT, are a growing concern, and energy consumption is a major part of them.
Each query processed by GPT-4, for instance, requires an estimated 2.9 watt-hours of electricity, roughly ten times the energy of a standard Google search. Given that OpenAI handles over one billion queries per day, this translates into a daily energy consumption of around 2.9 million kilowatt-hours.
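The arithmetic behind that daily figure is easy to check; the 2.9 Wh and one-billion-query numbers are the estimates quoted above:

```python
# Verify the daily energy figure quoted above.
WH_PER_QUERY = 2.9               # estimated energy per GPT-4 query, in watt-hours
QUERIES_PER_DAY = 1_000_000_000  # roughly one billion queries per day

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
daily_kwh = daily_wh / 1000      # 2.9 billion Wh = 2.9 million kWh

print(f"{daily_kwh:,.0f} kWh per day")  # prints "2,900,000 kWh per day"
```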
This immense energy requirement has sparked discussions about the environmental impact of AI technologies. The rising global demand for AI and its integration into various sectors—from customer service to content creation—adds to the strain on energy grids. While AI systems can offer significant benefits in terms of efficiency and innovation, their environmental footprint cannot be overlooked.

Additionally, the operational costs associated with AI aren’t limited to energy consumption. The infrastructure needed to run these advanced models is also costly: data centers, cooling systems, high-performance hardware, and ongoing maintenance all contribute to the financial burden. For companies like OpenAI, managing these expenses while keeping AI models accessible to users is a delicate balancing act.
As the adoption of AI models continues to grow, companies like OpenAI will need to find ways to optimize the performance of their systems without increasing operational costs exponentially. This may involve developing more energy-efficient algorithms, investing in renewable energy sources, or redesigning data centers for maximum energy efficiency.
The Future of AI Sustainability
While OpenAI’s current operational costs are significant, the future of AI sustainability lies in improving energy efficiency. The energy consumption associated with AI systems like ChatGPT is unlikely to decrease on its own, and as AI usage continues to expand, so will the costs. However, there are several avenues OpenAI and other AI companies can explore to make their operations more sustainable.
One potential solution is to invest in more efficient machine learning models. Researchers are constantly working on developing AI algorithms that require less computational power while maintaining or even improving performance.
By making these models more efficient, companies can reduce the amount of energy needed to train and run them. The development of lightweight models could help lower electricity consumption significantly without compromising the user experience.

Another avenue for reducing AI’s environmental impact is through the adoption of renewable energy sources. As data centers and AI models require significant power, it is becoming increasingly important to source that power from clean energy sources. OpenAI, along with other tech giants, could lead the charge in ensuring that their operations are powered by sustainable energy, thereby reducing the carbon footprint of AI.
Additionally, the development of client-side solutions, as suggested by users in response to Altman’s post, could be a game-changer. By shifting some of the processing to the user’s device, AI systems could potentially reduce the load on central servers, leading to lower energy usage at the server side. Such innovations in AI infrastructure could play a significant role in making AI systems more energy-efficient and cost-effective.
The High Price of Politeness and the Future of AI Operations
Sam Altman’s remarks about the cost of politeness in AI interactions offer a lighthearted yet thought-provoking glimpse into the complex world of AI operations.
While saying “please” and “thank you” may seem like minor actions, their cumulative effect on computational costs cannot be ignored. OpenAI’s experience highlights the broader challenge of managing the energy consumption and infrastructure needs of AI models in an era of growing demand.

As AI technology becomes more integrated into daily life, understanding and addressing its operational costs will be crucial. Companies like OpenAI are faced with the challenge of scaling their AI models in a way that is both financially and environmentally sustainable.
Innovations in AI algorithms, infrastructure design, and renewable energy adoption will be key to ensuring that AI can continue to evolve without putting an unsustainable strain on resources.
Ultimately, while politeness may be costing OpenAI millions, the larger conversation is about the future of AI in a world where energy efficiency and sustainability are paramount.
As the technology advances, the hope is that AI can continue to deliver groundbreaking results without having a disproportionate impact on the environment or the financial stability of the companies that develop it.