Welcome to Eye on AI, with AI reporter Sharon Goldman. In this edition: Data centers in space are feasible, but not ready for launch…Accenture links promotions to AI logins…AI pioneer Fei-Fei Li’s startup World Labs raises $1 billion. Nvidia’s deal with Meta signals a new era in computing power.
The AI industry is on a power trip, literally, and it’s getting desperate. Data centers already account for roughly 4% of U.S. electricity use, a share expected to more than double by 2030 as running and training AI models increasingly require gigawatts of power. Analysts project global data-center power demand could rise as much as 165% by the end of the decade, even as new generation and transmission infrastructure lag years behind need. In response, hyperscalers are scrambling: cutting deals to build their own gas plants, exploring small nuclear reactors, and seeking power wherever they can find it.
Against that backdrop, it’s not surprising that some of the industry’s biggest players are starting to look to outer space for a solution.
In a feature story published this morning, I dig into how, even as tech companies are on track to spend more than $5 trillion globally on Earth-based AI data centers by the end of the decade, Elon Musk is arguing the future of AI computing power lies in space, powered by solar energy. Musk has suggested that the economics and engineering could align within just a few years, even predicting that more AI computing capacity could be in orbit than on Earth within five.
The idea of orbital data centers itself isn’t new. As far back as 2015, Fortune was already asking the question: What if we put servers in space?
What’s changed is the urgency. Today’s power crunch has pushed the concept back into serious conversation, with startups like Starcloud getting attention and Big Tech leaders like former Google CEO Eric Schmidt, Alphabet CEO Sundar Pichai, and Amazon’s Jeff Bezos all turning their attention to the possibilities of launching data centers into orbit.
However, while Musk and other bulls argue that space-based AI computing could become cost-effective relatively quickly, many experts say anything approaching meaningful scale remains decades away. Constraints around power generation, heat dissipation, launch logistics, and cost still make it impractical, and for now, the overwhelming share of AI investment continues to flow into terrestrial infrastructure. Small-scale pilots of orbital computing may be feasible in the next few years, they argue, but space remains a poor substitute for Earth-based data centers for the foreseeable future.
It’s not hard to understand the appeal, though: Talking with sources for this story, it became clear that the idea of data centers in space is not science fiction; the physics largely checks out. “We know how to launch rockets; we know how to put spacecraft into orbit; and we know how to build solar arrays to generate power,” Jeff Thornburg, a SpaceX veteran who led development of SpaceX’s Raptor engine, told me. “And companies like SpaceX are showing we can mass-produce space vehicles at lower cost.”
The problem is that everything else, from building massive solar arrays to lowering launch costs, moves much more slowly than today’s AI hype cycle. Still, Thornburg said that in the long run, the energy pressures driving interest in orbital data centers are unlikely to disappear. “Engineers will find ways to make this work,” he said. “Long term, it’s just a matter of how long is it going to take us.”
FORTUNE ON AI
Google CEO Sundar Pichai says AI spending still makes sense despite bubble fears – by Beatrice Nolan
Bill Gates pulls out of India’s AI summit at the last minute, in the latest blow to an event dogged by organizational chaos – by Beatrice Nolan
Elon Musk is pushing to build data centers in space. But they won’t solve AI’s power problems anytime soon – by Sharon Goldman
Who is OpenClaw creator Peter Steinberger? The millennial developer caught the attention of Sam Altman and Mark Zuckerberg – by Eva Roytburg
Exclusive: Bain and Greylock bet $42 million that AI agents can finally fix cybersecurity’s messiest bottleneck – by Lily Mae Lazarus
AI IN THE NEWS
Accenture links promotions to AI logins. Accenture is beginning to track senior employees’ use of its internal AI tools and to factor that data into leadership promotion decisions, highlighting how even AI-heavy consultancies are struggling to get top staff to change how they work. According to internal communications seen by the Financial Times, promotion to leadership roles will now require “regular adoption” of AI tools, with Accenture monitoring individual log-ins for some senior managers as part of this summer’s talent reviews. The move reflects a broader challenge across consulting and accounting firms, where executives say senior partners are much more resistant to AI adoption than junior staff, prompting a “carrot and stick” approach. While Accenture says it has trained more than 550,000 employees in generative AI and is reorganizing around an AI-centric “Reinvention Services” unit, the policy has drawn internal criticism, including claims that some tools are unreliable, and underscores the widening gap between AI ambition and day-to-day enterprise use.
Nvidia’s deal with Meta signals a new era in computing power. A new Wired story argues that Nvidia’s latest deal with Meta marks a shift in how AI computing power is being built. It’s not just about buying more powerful GPUs to train AI models; companies now need a full stack of chips to run them at scale. Alongside billions of dollars’ worth of Nvidia GPUs, Meta is also buying Nvidia’s Grace CPUs, making it the first major tech company to publicly commit to these chips at scale. Analysts say the move reflects how newer AI systems, especially so-called “agentic” AI that runs tasks continuously, rely heavily on traditional CPUs to coordinate data, manage workflows, and support inference. A recent Semianalysis report underscores the point, noting that some AI data centers now require tens of thousands of CPUs just to handle the data produced by GPUs, an infrastructure burden that barely existed before the AI boom.
EYE ON AI NUMBERS
1%
According to JLL’s new North America Data Center Report, data center vacancy remains at a record-low 1% for the second consecutive year, despite unprecedented construction to support the AI boom, a “powerful statistic that challenges bubble concerns.” With 92% of capacity under development already pre-leased or owner-occupied, the report said today’s buildout “reflects sustained structural demand rather than cyclical imbalance.”
The report also pointed to more than 35 gigawatts of data center capacity under construction in North America, roughly equal to the annual electricity consumption of the UK or Italy. Today, 64% of capacity under construction is located in markets including West Texas, Tennessee, Wisconsin, and Ohio. In fact, Texas, when viewed as a single market, could overtake Northern Virginia as the world’s largest data center market by 2030, the report said.
AI CALENDAR
Feb. 16-21: AI Action Summit, New Delhi, India.
Feb. 24-26: International Association for Safe & Ethical AI (IASEAI), UNESCO, Paris, France.
March 2-5: Mobile World Congress, Barcelona, Spain.
March 16-19: Nvidia GTC, San Jose, Calif.
April 6-9: HumanX, San Francisco.
