Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition…OpenAI’s gigawatt arms race is underway in Abilene, Texas…Nscale announces record-breaking $1.1 billion Series B…OpenAI and Databricks strike AI agent deal…Trump administration will offer Elon Musk’s xAI to federal agencies.
Sam Altman stood outside Building 2 at OpenAI, Oracle, and SoftBank’s flagship Stargate data center in Abilene, Texas. He, along with the cluster of journalists peppering him with questions, looked small against the backdrop of the sprawling 800-acre site, swarming with thousands of construction workers and dotted with spools of fiber cable, steel beams, water pipes, and heavy machinery.
As I reported on Tuesday, we were there for a media event to tout the progress of their high-profile and ambitious “Stargate” AI infrastructure project. They announced an expansion of the Abilene site, plus plans to build five massive new data center complexes across the U.S. over the next several years. Altogether, the initiative represents hundreds of billions of dollars in investment, a project of mind-boggling scale. In Abilene alone, a crew of 6,400 workers has already flattened hills by moving mountains of soil and laid enough fiber optic cable to wrap the Earth 16 times.
“We cannot fall behind in the need to put the infrastructure together to make this revolution happen,” Altman told reporters during the media event, which also included Clay Magouyrk, one of Oracle’s two new CEOs, as well as Texas Senator Ted Cruz. “What you saw today is just a small fraction of what this site will eventually be — and this site is just a small fraction of what we’re building. All of that still won’t be enough to serve even the demand of ChatGPT,” he added, referring to OpenAI’s flagship product.
Building AI with brute industrial force
Altman and OpenAI have been relentless in their drive to “scale compute.” By this, they don’t mean chasing the next algorithmic breakthrough or elegant line of code. They mean brute industrial force: millions of chips, sprawling campuses wired with fiber, and gigawatts of electricity, along with the gallons of water needed to help cool all that equipment. To OpenAI, scaling compute means piling on ever more of this horsepower, betting that sheer scale, not software magic, is what will unlock not just artificial general intelligence (AGI), which the company defines as “highly autonomous systems that outperform humans at most economically valuable work,” but what it calls artificial superintelligence (ASI), which would hypothetically surpass human capabilities in all domains.
That’s why OpenAI keeps pointing to a number: 10 gigawatts of capacity across the Stargate project sites. Ten gigawatts, enough to power roughly 7.5 million homes or an entire regional grid, marks a shift in how AI capacity is measured. At this scale, Altman explained to me with a quick handshake before the press gaggle, companies like OpenAI don’t even bother counting GPUs anymore. The unit of measure has become gigawatts: how much electricity the entire fleet of chips consumes. That number is shorthand for the one thing that matters: how much compute the company can keep running.
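For a rough sense of scale, here is a back-of-envelope sketch of that homes comparison, assuming an average U.S. household draws on the order of 1.3 kW of continuous power (roughly 11,000 kWh a year); the per-home figure is my assumption, not one from OpenAI or the event.

```python
# Back-of-envelope: how many average U.S. homes could 10 GW supply?
# Assumption (not from the article): an average home draws ~1.3 kW continuously.
capacity_gw = 10
avg_home_draw_kw = 1.3

# Convert GW to kW, then divide by the assumed per-home load.
homes_supported = capacity_gw * 1_000_000 / avg_home_draw_kw
print(f"~{homes_supported / 1e6:.1f} million homes")  # ~7.7 million, close to the cited 7.5 million
```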
That’s why it was so striking to come home from Texas and read Alex Heath’s Sources the very next day. In it, Heath revealed an internal Slack note Altman had shared with employees on the same day I saw him in Abilene. Altman spelled out what he called OpenAI’s “audacious long-term goal”: to build not 10, not 100, but a staggering 250 gigawatts of capacity by 2033. In the note, he disclosed that OpenAI started the year at “around” 230 megawatts of capacity and is “now on track to exit 2025 north of 2GW of operational capacity.”
To put that into perspective: 250 gigawatts would be about a quarter of the entire U.S. electrical generation capacity, which hovers around 1,200 GW. And Altman isn’t just talking about electricity. The number is shorthand for the entire industrial system required to use it: the chips, the data centers, the cooling and water, the networking fiber and high-speed interconnects to tie millions of processors into supercomputers.
‘A new core bet’ for OpenAI
Heath reported that Altman’s Slack note announced OpenAI is “formalizing the industrial compute team,” led by Peter Hoeschele, who reports to president Greg Brockman. “The mission is simple: create and deliver massive usable compute as fast as physics allows, to power us through ASI,” Altman wrote. “In several years, I think this could be something like a gigawatt per week, although that will require us to completely reimagine how we build compute.”
“Industrial compute should be considered a new core bet (like research, consumer devices, custom chips, robotics, applications, etc.) which will hire and operate in the way it needs to run at maximum effectiveness for the domain,” Altman continued. “We’ve already invested hundreds of billions of dollars, and doing this right will cost trillions. We will need support from team members across OpenAI to help us move fast, unlock projects, and clear the path for the buildout ahead.”
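As a rough sanity check of the pace those numbers imply (my own arithmetic, not from the note): going from roughly 2 GW at the end of 2025 to 250 GW by 2033 averages out to around 30 GW a year, which is why the ramp would eventually have to approach the gigawatt-per-week cadence Altman mentions.

```python
# Rough arithmetic on the buildout pace implied by the Slack note.
# Assumptions (mine, not OpenAI's plan): ~2 GW exiting 2025, 250 GW goal by 2033.
start_gw, target_gw = 2, 250
years = 2033 - 2025  # ~8 years of buildout

avg_gw_per_year = (target_gw - start_gw) / years
avg_gw_per_week = avg_gw_per_year / 52
print(f"~{avg_gw_per_year:.0f} GW/year, or ~{avg_gw_per_week:.2f} GW/week on average")
# ~31 GW/year, ~0.60 GW/week -- consistent with a ramp toward the
# "gigawatt per week" figure Altman cites for the later years.
```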
A quarter of the U.S. power grid. Trillions in cost. Does that sound bonkers to you? It does to me, which is exactly why I hopped on a plane to Dallas, rented a car, and drove three hours through rolling hills and ranches to Abilene to see for myself. The scale of this one site is staggering. Imagining it multiplied by dozens is nearly impossible.
I told Altman that the scene in Abilene reminded me a bit of a tour I recently took of Hoover Dam, one of the great engineering feats of the 20th century, which produces about 2 gigawatts of power at capacity. In the 1930s, Hoover Dam was a symbol of American industrial might: concrete, turbines, and power on a scale no one had imagined.
Altman acknowledged that “people like to pick their historical analogies” and thought the “vibe was right” to compare Stargate to Hoover Dam. It wasn’t his own personal favorite, however: “A recent thing I’ve thought about is airplane factories,” he said. “The history of what went into airplane factories, or container ships, the whole industry that came around those,” he said. “And certainly, everything that went into the Apollo program.”
The need for public awareness
That’s when I realized: whether you think Altman’s goals make sense, seem nuts, or feel downright reckless really comes down to what you believe about AI itself. If you think supercharged versions of AI will change everything (and mostly for the good, like curing cancer), or you’re a China hawk who wants to win the new AI ‘cold war’ with China, then Altman’s empire of data centers looks like a necessary bet. If you’re skeptical, it looks like the biggest boondoggle since America’s grandest infrastructure follies: think California’s long-awaited high-speed rail. If you’ve read Karen Hao’s Empire of AI, you may also be shouting that scaling isn’t inevitable, and that building a ‘compute empire’ risks centralizing power, draining resources, and sidelining efficiency and safety. And if you think AGI will kill us all, like Eliezer Yudkowsky? Well, you won’t be a fan.
No one can predict the future, of course. My bigger concern is that there isn’t nearly enough public awareness of what’s happening here. I don’t mean just in Abilene, with its mesquite shrubland ground into dust, or even OpenAI’s expanding Stargate ambitions around the U.S. and beyond. I mean the vast, almost unimaginable infrastructure buildout across Big Tech: the buildout that’s propping up the stock market, fueling a data center arms race with China, and reshaping energy, land, and labor around the world. Are we sleepwalking into the equivalent of an AI industrial revolution (not a metaphorical one, but in terms of actually building physical stuff) without really reckoning with its costs versus its benefits?
Even Sam Altman doesn’t think enough people understand what he’s talking about. “Do you feel like people understand what ‘compute’ is?” I asked him outside of Building 2. That is, does the average citizen really grok what Altman is saying about the physical manifestation of these mega data centers?
“No, that’s why we wanted to do this,” he said about the Abilene media event. “I don’t think when you hit the button on ChatGPT…you think of walking the halls here.”
Of course, Hoover Dam, too, was divisive, controversial, and considered risky. But I wasn’t alive when it was built. This time I could see the dust rising in Abilene with my own eyes, and while Altman talked about walking the newly built halls filled with racks of AI chips, I walked away unsettled about what comes next.
FORTUNE ON AI
Sam Altman’s AI empire will consume as much power as New York City and San Diego combined. Experts say it’s ‘scary’ – by Eva Roytburg
Exclusive: Startup using AI to automate software testing in the age of ‘vibe coding’ receives $20 million in new venture funding – by Jeremy Kahn
OpenAI plans to build 5 giant U.S. ‘Stargate’ data centers, a $400B challenge to Meta and Microsoft in the relentless AI arms race – by Sharon Goldman
AI IN THE NEWS
Nscale announces record-breaking $1.1 billion Series B. UK cloud infrastructure company Nscale announced a $1.1 billion funding round, the largest in UK and European history. The Series B, led by Aker ASA with participation from NVIDIA, Dell, Fidelity, Point72, and others, will accelerate Nscale’s rollout of “AI factory” data centers across Europe, North America, and the Middle East. The company, which recently unveiled partnerships with Microsoft, NVIDIA, and OpenAI to establish Stargate UK and launched Stargate Norway with Aker, says the funding will expand its engineering teams and GPU deployment pipeline as it races to deliver sovereign, energy-efficient AI infrastructure at massive scale.
OpenAI and Databricks strike AI agent deal. OpenAI and data platform Databricks struck a multiyear deal expected to generate about $100M, the Wall Street Journal reported, making OpenAI’s models (including GPT-5) natively available inside Databricks so enterprises can build AI agents on their own data “out of the box.” The partnership includes joint research, and OpenAI COO Brad Lightcap says the two aim to “far eclipse” the contracted figure. It targets a key adoption barrier (reliable, data-integrated agents), tapping Databricks’ 20,000+ customers and $4B ARR footprint (Mastercard is already using Databricks-built agents for onboarding and support). The move sits alongside Databricks’ model partnerships (e.g., Anthropic) and a broader vendor push (Salesforce, Workday) to pair agent tooling with customer data, as OpenAI ramps up its infrastructure ambitions with Oracle and SoftBank.
Trump administration will offer Elon Musk’s xAI to federal agencies. According to the Wall Street Journal, Elon Musk’s xAI will be available to federal agencies through the General Services Administration for just 42 cents, part of a broader effort to bring top AI systems into government. The arrangement mirrors similar nominal-fee deals with Google (47 cents), OpenAI ($1), and Anthropic ($1), meaning Washington is now working with all four major U.S. model makers, each of which also has $200M Pentagon contracts. Officials say the low-cost access is less about revenue than securing a foothold in government AI adoption, where automating bureaucratic processes is seen as a major opportunity. The move also highlights a thaw in Musk’s relationship with the White House, while underscoring the administration’s push to foster competition among frontier AI providers.
AI CALENDAR
Oct. 6-10: World AI Week, Amsterdam
Oct. 21-22: TedAI San Francisco. Apply to attend here.
Nov. 10-13: Web Summit, Lisbon.
Nov. 26-27: World AI Congress, London.
Dec. 2-7: NeurIPS, San Diego
Dec. 8-9: Fortune Brainstorm AI San Francisco. Apply to attend here.
EYE ON AI NUMBERS
$923 Billion
That’s how much total U.S. economic output AI-driven capital expenditures in fiscal year 2025 will generate, according to new economic modeling results released by IMPLAN.
The analysis, based on reported 2025 capital expenditure estimates, found that Amazon, Alphabet, Microsoft, and Meta are set to spend a record $364B on AI-driven capital expenditures in fiscal 2025, more than all new U.S. commercial construction in 2023. The modeling showed that these dollars will generate $923B in total U.S. economic output, support 2.7M jobs, and add $469B to GDP.
Every $1 invested, the report said, yields $2.50 in impact, rippling from construction and chip manufacturing to retail and local services. For policymakers, it’s a reminder: Big Tech’s AI buildout isn’t just about data centers; it’s reshaping the broader U.S. economy.
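A quick check of that multiplier against the figures above (my own arithmetic on the numbers IMPLAN reports, not part of its model):

```python
# Quick check of the multiplier implied by the IMPLAN figures cited above.
capex_billion = 364    # reported AI-driven capex by Amazon, Alphabet, Microsoft, and Meta
output_billion = 923   # modeled total U.S. economic output
gdp_billion = 469      # modeled contribution to GDP

multiplier = output_billion / capex_billion
print(f"Output per $1 invested: ~${multiplier:.2f}")                     # ~$2.54, i.e. the ~$2.50 cited
print(f"GDP share of total output: {gdp_billion / output_billion:.0%}")  # ~51%
```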
