IBM CEO says there is 'no way' spending on AI data centers will pay off

businessinsider.com

176 points by nabla9 7 hours ago


1vuio0pswjnm7 - an hour ago

One thing we saw with the dot-com bust is how certain individuals were able to cash in on the failures, e.g., low cost hardware, domain names, etc. (NB. prices may exceed $2)

Perhaps people are already thinking about how they can cash in on the floor space and HVAC systems that will be left in the wake of the failed "AI" hype.

Octoth0rpe - 6 hours ago

> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said

This doesn't seem correct to me, or at least it's built on several shaky assumptions. You would only have to 'refill' your hardware if:

- AI accelerator cards all start dying around the 5 year mark, which is possible given the heat density/cooling needs, but doesn't seem all that likely.

- Technology advances such that only the absolute newest cards can run _any_ model profitably, which only seems likely if we see some pretty radical advances in efficiency. Otherwise, assuming your hardware is stable after 5 years of burn-in, you could continue to run older models on it for only the cost of the floor space and power. Maybe you need new cards for new models for some reason (a new fp format that only new cards support, some magic amount of RAM, etc.), but there seems to be room for revenue from older/less capable models at a discounted rate.

mbreese - 6 hours ago

I would add an addendum to this -- there is no way the announced spending on AI data centers will all come to fruition. I have no doubt that there will be a massive build-out of infrastructure, but it can't reach the levels that have been announced. The power requirements alone will stop that from happening.

skeeter2020 - 6 hours ago

The interesting macro view on what's happening is to compare a mature data center operation (specifically a commoditized one) with the utility business. The margins there, and in similar industries with big infra build-out costs (e.g., rail), are quite small. Historically those businesses have not done well; I can't really imagine what happens when tech companies who've only ever known huge, juicy margins experience low single-digit returns on billions of investment.

bluGill - 6 hours ago

I question the depreciation. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they are faster, but not with the jumps the 1990s delivered).

ic_fly2 - 6 hours ago

IBM might not have a data strategy or AI plan but he isn’t wrong on the inability to generate a profit.

A bit of napkin math: NVIDIA claims 0.4 J per token for their latest generation, so a 1 GW plant at 80% utilisation could produce about 6.29 × 10^16 tokens a year.

There are ~10^14 tokens on the internet. ~10^19 tokens have been spoken by humans… so far.
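A rough sketch of that napkin math (the 0.4 J/token, 1 GW, and 80% utilisation figures are the assumptions stated above, not verified numbers):

    # Napkin math: tokens/year from a 1 GW plant at 80% utilisation,
    # assuming NVIDIA's claimed ~0.4 J per token (figures from the comment above)
    SECONDS_PER_YEAR = 365 * 24 * 3600      # ~3.15e7 s
    PLANT_POWER_W    = 1e9                  # 1 GW
    UTILISATION      = 0.80
    JOULES_PER_TOKEN = 0.4

    energy_per_year = PLANT_POWER_W * UTILISATION * SECONDS_PER_YEAR  # ~2.5e16 J
    tokens_per_year = energy_per_year / JOULES_PER_TOKEN              # ~6.3e16 tokens
    print(f"{tokens_per_year:.2e} tokens/year")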

myaccountonhn - 6 hours ago

> In an October letter to the White House's Office of Science and Technology Policy, OpenAI CEO Sam Altman recommended that the US add 100 gigawatts in energy capacity every year.

> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said.

And people think the climate concerns of AI are overblown. The US currently has ~1,300 GW of generating capacity, so adding 100 GW a year is a huge increase.
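For scale, a rough calculation (the ~1,300 GW figure is from this comment, the 100 GW/year from the quoted letter):

    # Rough scale check: 100 GW of new capacity per year against ~1,300 GW
    # of existing US capacity (figures from the comment and the quoted letter)
    US_CAPACITY_GW     = 1300
    ANNUAL_ADDITION_GW = 100
    growth_rate = ANNUAL_ADDITION_GW / US_CAPACITY_GW
    print(f"~{growth_rate:.1%} of current US capacity added every year")  # ~7.7%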

badmonster - 6 hours ago

He's right to question the economics. The AI infrastructure buildout resembles the dot-com era's excess fiber deployment - valuable long-term, but many individual bets will fail spectacularly. Utilization rates and actual revenue models matter more than GPU count.

bmadduma - 17 minutes ago

No wonder he's saying that: they lost the AI game, and no top researcher wants to work for IBM. They spent years developing Watson, and it's dead. I believe this is a company that should not exist.

pharos92 - 24 minutes ago

I find it disturbing how long people wait to accept basic truths, as if they need permission to think or believe a particular outcome will occur.

It was quite obvious that AI was hype from the get-go. An expensive solution looking for a problem.

The cost of hardware. The impact on hardware and supply chains. The impact to electricity prices and the need to scale up grid and generation capacity. The overall cost to society and impact on the economy. And that's without considering the basic philosophical questions "what is cognition?" and "do we understand the preconditions for it?"

All I know is that the consumer and general voting population lose no matter the outcome. The oligarchs, banks, government, and tech lords will be protected. We will pay the price whether it succeeds or fails.

My personal experience of AI has been poor. Hallucinations, huge inconsistencies in results.

If your day job exists within an arbitrary non-productive linguistic domain, great tool. Image and video generation? Meh. Statistical and data-set analysis? Average.

kenjackson - 6 hours ago

I don't understand the math about how we compute $80b for a gigawatt datacenter. What are the costs in that $80b? I literally don't understand how to get to that number -- I'm not questioning its validity. What percent is power consumption, versus land cost, versus building and infrastructure, versus GPUs, versus people, etc.?

pjdesno - 4 hours ago

> $8 trillion of CapEx means you need roughly $800 billion of profit just to pay for the interest

That assumes you can just sit back and gather those returns indefinitely. But half of that capital expenditure will be spent on equipment that depreciates in 5 years, so you're jumping on a treadmill that sucks up $800B/yr before you pay a dime of interest.
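A back-of-envelope sketch of that treadmill, assuming (as the quoted line implies) roughly a 10% cost of capital on the $8T and half of the spend sitting in hardware on a 5-year schedule:

    # Annual carrying cost of $8T of AI CapEx, assuming ~10% cost of capital
    # and half of the spend in 5-year hardware (assumptions from the thread,
    # not verified figures)
    TOTAL_CAPEX       = 8e12   # $8T
    COST_OF_CAPITAL   = 0.10   # implied by "$800 billion just to pay the interest"
    HARDWARE_SHARE    = 0.5    # "half of that capital expenditure"
    HARDWARE_LIFE_YRS = 5

    interest_per_year     = TOTAL_CAPEX * COST_OF_CAPITAL                     # $800B
    depreciation_per_year = TOTAL_CAPEX * HARDWARE_SHARE / HARDWARE_LIFE_YRS  # $800B
    print(f"interest:     ${interest_per_year / 1e9:.0f}B/yr")
    print(f"depreciation: ${depreciation_per_year / 1e9:.0f}B/yr")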

criddell - 6 hours ago

> But AGI will require "more technologies than the current LLM path," Krishna said. He proposed fusing hard knowledge with LLMs as a possible future path.

And then what? These always read a little like the underpants gnomes business model (1. Collect underpants, 2. ???, 3. Profit). It seems to me that the AGI business models require that one company have exclusive access to an AGI model. The reality is that it will likely spread rapidly and broadly.

If AGI is everywhere, what's step 2? It seems like everything AGI generated will have a value of near zero.

scroot - 6 hours ago

As an elder millennial, I just don't know what to say. That a once-in-a-generation allocation of capital should go towards... whatever this all will be, is certainly tragic given the current state of the world and its problems. I can't help but see it as the latest in a lifelong series of baffling high-stakes decisions of dubious social benefit that have necessarily global consequences.

Animats - 6 hours ago

How much has actually been spent on AI data centers vs. amounts committed or talked about? That is, if construction slows down sharply, what's total spend?

nashashmi - 5 hours ago

Don't worry. The same servers will be used for other computing purposes. And maybe that will be profitable. Maybe it will be beneficial to others. But this cycle of investment and loss is a form of wealth redistribution. Some benefit.

The banks and lenders always benefit.

eitally - 6 hours ago

At some point, I wonder if any of the big guys have considered becoming grid operators. The vision Google had for community fiber (Google Fiber, which mostly fizzled out due to regulatory hurdles) could be somewhat paralleled with the idea of operating a regional electrical grid.

Ekaros - 5 hours ago

How much of Nvidia's price is based on a 5-year replacement cycle? If that cycle stops or slows along with new demand, could it also affect things? Not that 5 years seems like a very long horizon now.

maxglute - 6 hours ago

How long can AI GPUs stretch? Optimistically 10 years, and we're still looking at $400B+ in profit to cover the interest. In terms of depreciating assets, silicon is closer to tulips than to rail or fiber.
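A quick sensitivity sketch on lifetime, assuming (as in the $8T discussion above) that roughly half of the spend, ~$4T, sits in accelerators; illustrative only:

    # Amortized annual replacement cost as GPU lifetime stretches, assuming
    # ~$4T of the build-out is accelerators (an assumption, not a known figure)
    GPU_CAPEX = 4e12  # $4T
    for lifetime_years in (3, 5, 10):
        annual_replacement = GPU_CAPEX / lifetime_years
        print(f"{lifetime_years:>2} yr lifetime -> ${annual_replacement / 1e9:.0f}B/yr replacement")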

bluGill - 6 hours ago

This is likely correct overall, but it can still pay off in specific cases. However, those are not blind investments; they are targeted, with a planned business model.

ninjaa - an hour ago

What does Jim Cramer have to say?

wmf - 6 hours ago

$8T may be too big an estimate. Sure, you can take OpenAI's $1.4T and multiply it by N, but the other labs do not spend as much as OpenAI.

sombragris - 44 minutes ago

"yeah, there's no way spending in those data centers will pay off. However, let me show you this little trinket which runs z/OS and which is exactly what you need for these kinds of workloads. You can subscribe to it for the low introductory price of..."

matt_s - 3 hours ago

There is something to be said about what the ROI is for normal (i.e., non-AI/tech) companies using AI. AI can help automate things; robots have been replacing manufacturing jobs for decades, but the ROI on that is easier to see and count: fewer humans in the factory, etc. There seems to be a lot of exaggeration around AI these days, and the AI companies have only begun to raise rates; they won't go down.

The AI bubble will burst when normal companies start to miss their revenue/profit goals and have to answer investor relations calls about that.

jmclnx - 6 hours ago

I guess he is looking directly at IBM's cash cow, the mainframe business.

But I think he is correct; we will see. I still believe AI will not give the CEOs what they really want: no labor, or very cheap labor.

qwertyuiop_ - 6 hours ago

The question no one seems to be answering is what the EOL will be for these newer GPUs being churned out by NVIDIA. What % of annual capital expenditures is GPU refresh? Will they be perpetually replaced as NVIDIA comes up with newer architectures and the AI companies chase the proverbial lure?

parapatelsukh - 6 hours ago

The spending will be more than paid off, since the taxpayer is the lender of last resort. There are too many funny names among the investors/creditors; a lot of mountains in Germany and similar, ya know.

devmor - 6 hours ago

I suppose it depends on your definition of "pay off".

It will pay off for the people investing in it, when the US government inevitably bails them out. There is a reason Zuckerberg, Huang, etc are so keen on attending White House dinners.

It certainly won't pay off for the American public.

oxqbldpxo - 6 hours ago

FB playbook. Act (spend) then say sorry.

verdverm - 7 hours ago

The IBM CEO is steering a broken ship and hasn't improved its course; he's not someone whose words you should take seriously.

1. They missed the AI wave (they hired me to teach Watson law only to lay me off 5 weeks later, one cause of the serious talent issues over there)

2. They bought most of their data centers (the companies that ran them); they have no idea about building and operating one, not at the scale the "competitors" are operating at