OpenAI released new data on Monday that shows strong growth in enterprise use of its AI tools.
The report arrived only one week after CEO Sam Altman issued an internal “code red” memo that warned employees about rising competition from Google.
According to OpenAI, ChatGPT message volume has grown eightfold since November 2024.
Workers also reported saving up to one hour per day when using the company’s enterprise tools.
These gains come at a critical moment; although OpenAI leads the U.S. enterprise market, it still faces threats from several directions.
Today, about 36% of U.S. businesses use ChatGPT Enterprise, based on data from the Ramp AI Index.
That is more than double Anthropic’s 14.3% share. However, most of OpenAI’s revenue still comes from consumer subscriptions.
That consumer base is the same one Google's Gemini now targets, which is why OpenAI must strengthen enterprise adoption to balance its business model.
The company also faces new pressure from Anthropic, which earns most of its revenue from business clients.
Additional pressure comes from open-weight model providers, which appeal to enterprises seeking more control and lower costs.
Massive Commitments
OpenAI has committed $1.4 trillion to infrastructure spending over the next few years. As a result, enterprise growth is now essential.
During a briefing, Ronnie Chatterji, OpenAI’s chief economist, placed the moment in historical context.
He noted that consumer adoption matters. However, he said the largest economic gains appear when businesses adopt new technologies at scale.
He compared this development to earlier eras, including the rise of the steam engine.
Reasoning Token Use
The new report shows that companies using the OpenAI API are now consuming 320 times more reasoning tokens than they were one year ago.
These tokens support complex tasks, such as problem-solving and analysis. However, higher reasoning-token usage also raises energy consumption and cost.
Because of this, some analysts now question whether this rate of growth is sustainable for many companies.
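The report does not explain the mechanics behind that figure, so the sketch below shows how an API customer might track its own reasoning-token consumption. It is a minimal illustration, assuming the official openai Python SDK, an illustrative reasoning-capable model name ("o3-mini"), and usage field names taken from the current Chat Completions schema; none of these details come from OpenAI's report.

```python
# Minimal sketch: logging reasoning-token usage per API call.
# Assumptions (not from the report): the official `openai` Python SDK,
# the model name "o3-mini" as a stand-in for any reasoning model, and the
# current Chat Completions usage schema for field names.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def ask(prompt: str) -> str:
    response = client.chat.completions.create(
        model="o3-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    usage = response.usage
    # Reasoning tokens are billed as output tokens, so they drive cost
    # even though they never appear in the visible reply text.
    details = getattr(usage, "completion_tokens_details", None)
    reasoning = getattr(details, "reasoning_tokens", 0) if details else 0
    print(f"output tokens: {usage.completion_tokens}, "
          f"of which reasoning: {reasoning}")
    return response.choices[0].message.content


if __name__ == "__main__":
    ask("Outline the main risks in this quarter's supply plan.")
```

Because reasoning tokens are billed like any other output, logging this figure per request is the simplest way for a company to judge whether its own usage is growing at a sustainable rate.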
Custom GPT Use
OpenAI also reported sharp growth in the use of custom GPTs. These tools let companies store internal knowledge or automate specific workflows.
Their use increased 19-fold over the past year, and they now account for 20% of enterprise messages.
The company highlighted BBVA, the Spanish banking group, which uses more than 4,000 custom GPTs across its teams.
During the briefing, COO Brad Lightcap said this trend shows how teams adapt AI to their own needs.
He said companies are no longer relying on a single general tool. Instead, they create many small assistants that improve daily operations.
Time Savings and Upskilling
OpenAI stated that workers save between 40 and 60 minutes each day when using its enterprise tools.
However, the report did not measure time spent learning the systems, prompting effectively, or correcting mistakes.
Still, workers report strong skill gains. Three-quarters of employees surveyed said AI now helps them complete tasks they could not do before.
Coding stood out as a major example. OpenAI reported a 36% increase in coding messages from teams outside engineering, IT, and research.
This suggests that employees in other roles are now experimenting with technical tasks.
However, this carries some risk. More “vibe coding,” in which non-experts lean on AI to generate code they cannot fully review, can introduce bugs and security issues.
When asked about this, Lightcap pointed to Aardvark, a new security-focused agent that detects vulnerabilities. The tool is still in private beta.
Underutilized AI
Despite rising usage, OpenAI found that even heavy enterprise users rely more on basic features than advanced ones.
Many avoid tools for data analysis, reasoning, or search. Lightcap said this gap exists because companies need time to adjust their workflows.
He added that deeper adoption requires a mindset shift and more integration with company data.
Varied AI Adoption
The report also highlighted a widening gap in AI use among workers. Some “frontier” workers use advanced tools more often and save more time.
Others lag behind and use AI for only simple tasks. Lightcap said many firms still view AI as traditional software.
Others treat it as a core system, more like an operating platform. This difference in approach creates uneven adoption across industries.
OpenAI leaders framed the divide as an opportunity for slower adopters to advance.
Yet, for workers training AI systems that may someday automate parts of their roles, the idea of “catching up” may feel more like a countdown.

