Do You Add Polite Expressions Such as “Please,” “Hello,” and “Thank You” When Chatting with AI Models Like ChatGPT? If So, It Might Be Costing OpenAI Tens of Millions of Dollars.
Are humans too polite to AI? Recently, a user named @tomieinlove on the X social platform expressed curiosity in a tweet, asking, “I wonder how much money OpenAI loses in electricity costs from people saying ‘please’ and ‘thank you’ to ChatGPT?” The post garnered over 5 million views and even drew a personal reply from founder Sam Altman: “Tens of millions of dollars well spent, you never know.”
Is the tens-of-millions-of-dollars loss real? ChatGPT and Grok give different answers. The post was also shared by media company Dexerto, sparking heated discussion. One user even put the question to X’s own AI model, Grok: “How much money does OpenAI lose daily because users say ‘please’ and ‘thank you’?” The official Grok account responded directly:
“Yes, this is true: Sam Altman confirmed that polite expressions like ‘please’ and ‘thank you’ cost OpenAI millions of dollars each year in extra processing costs. Based on an estimated annual cost of $50 million, that works out to about $137,000 per day, reflecting the increased token processing volume across millions of daily interactions. However, the specific figures remain speculative, as OpenAI has not disclosed detailed data.”
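Grok's per-day figure follows directly from its own assumed annual total. A quick back-of-the-envelope check (both numbers are Grok's estimates, not disclosed OpenAI data):

```python
# Sanity-check the figures Grok cited: $50 million per year, spread
# evenly over 365 days, rounds to roughly $137,000 per day.
ANNUAL_COST_USD = 50_000_000  # Grok's assumed yearly cost of polite tokens

daily_cost = ANNUAL_COST_USD / 365
print(f"${daily_cost:,.0f} per day")  # prints "$136,986 per day"
```

The arithmetic checks out: $136,986 rounds to the $137,000 Grok quoted.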
Meanwhile, Digital Times asked ChatGPT about the original post and found that its answer differed from both Altman’s and Grok’s. ChatGPT held that Altman was responding to the user in a light-hearted manner, and that saying ‘please’ or ‘thank you’ has no substantial impact on operational costs, explaining further:
“The operational costs of AI models mainly stem from computational resources, such as computing power and server usage, and are not increased by the use of polite language. So even if you say ‘please’ or ‘thank you’ to the AI, that content does not change how the model operates or cause it to consume more electricity.”
Why does this differ from Grok’s statement? ChatGPT noted that, technically, OpenAI does incur computational costs when running large language models, and these costs fluctuate with the length, complexity, and frequency of user interactions.
“What Grok referred to as ‘lost costs’ likely relates to the extra processing burden of additional words and sentence structures when a large model runs, which raises computational demand. However, this is a very minor cost, not a large-scale economic loss. The additional cost of polite expressions is therefore negligible and does not meaningfully affect OpenAI’s overall finances.”
What is the Cost of Each ChatGPT Interaction?
According to publicly available information, the operational costs of AI models primarily comprise computational resources (such as GPUs or TPUs), servers, cloud services, and data storage and processing. Estimates suggest that, across usage scenarios, the cost per query ranges from a few cents to a few tens of cents depending on the length and complexity of the input text: a short query may cost only a few cents, while longer or more complex queries cost somewhat more.
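To see why the marginal cost of politeness is tiny at the per-query level, consider a rough sketch. Everything here is an assumption for illustration: the per-token price is a hypothetical placeholder (not OpenAI's actual rate), and whitespace splitting stands in for a real tokenizer such as BPE:

```python
# Illustrative sketch only: estimate the marginal cost of polite words
# using an assumed per-token price and a crude whitespace "tokenizer".
# Real tokenizers and OpenAI's real costs will differ.
PRICE_PER_1K_TOKENS_USD = 0.002  # hypothetical rate for illustration

def rough_token_count(text: str) -> int:
    # Whitespace split as a stand-in for a real BPE tokenizer.
    return len(text.split())

def query_cost(text: str) -> float:
    return rough_token_count(text) / 1000 * PRICE_PER_1K_TOKENS_USD

plain = "What is the capital of France?"
polite = "Please, what is the capital of France? Thank you!"

extra = query_cost(polite) - query_cost(plain)
print(f"Extra cost of politeness: ${extra:.8f} per query")
```

Under these assumptions, the three extra polite tokens add on the order of millionths of a dollar per query; the tens-of-millions figure only emerges when multiplied across billions of interactions.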
However, although ChatGPT says polite expressions do not affect costs, OpenAI has not publicly disclosed the underlying cost structure, so the actual picture remains unclear.
This article is co-published by: Digital Times
Editor in charge: Su Rowei