Discussion (280 Comments) - Read Original on HackerNews

timmg4 days ago
This tweet shows it as a percentage of US GDP:

https://x.com/paulg/status/2045120274551423142

Makes it a little less dramatic. But also shows what a big **'n deal the railroads were!

manquer3 days ago
GDP adjustments are warranted, but the picture is starker than either estimate suggests.

The megaprojects of previous generations all had decades-long depreciation schedules. Many 50-100+ year old railways, bridges, tunnels, dams and other utilities are still in active use with only minimal maintenance.

Amortized year over year, the current spend would dwarf everything at the reported depreciation schedule of 6(!) years for the GPUs - the largest line item.
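A back-of-the-envelope sketch of that amortization point; every capex figure below is an invented placeholder, not a sourced number, and only the depreciation horizons mirror the ones discussed here:

```python
# Straight-line amortization sketch. Capex figures are made-up placeholders;
# only the depreciation horizons reflect the discussion above.

def annual_charge(capex_billion: float, useful_life_years: float) -> float:
    """Straight-line depreciation: spend spread evenly over the asset's life."""
    return capex_billion / useful_life_years

assets = {
    "AI datacenter GPUs (6-year schedule)": (400, 6),
    "Railway build-out (75-year life)":     (400, 75),
    "Dam or bridge (100-year life)":        (400, 100),
}

for name, (capex, life) in assets.items():
    print(f"{name}: ~${annual_charge(capex, life):.0f}B/year")

# For the same nominal capex, a 6-year schedule implies roughly 12-17x the
# annual charge of 75-100 year infrastructure, which is the Y-o-Y point above.
```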

phreeza3 days ago
That's definitely true for some of them, but for others it's not so clear, like the Apollo or Manhattan projects? Those of course also have lasting impact but it's more in terms of knowledge, which at least arguably we are also accruing with these data centers.
manquer3 days ago
Not just knowledge.

RS-25 - It was designed as HG-3 during the 60s for Saturn-V and manufactured for the Space Shuttle and refurbished for SLS and just launched last month.

Vehicle Assembly Building - built for Saturn-V launches and still in active use today.

Crawler-transporters - Hans and Franz were built in 1966 for Apollo and are still used for launches.

There are plenty of other examples from Apollo program of actual hardware being repurposed and used for later missions.

In other mega space projects, Hubble is still doing active research 35 years after launch, and Voyager is still sending data close to 50 years later.

Whether they should still be used, how NASA is funded, and why that makes programs like SLS or the Shuttle so expensive is a whole other topic.

The point is that these mega projects had a long lifetime of value, albeit with higher maintenance costs for the tech-heavy ones like Apollo than, say, a bridge or a dam.

gravypod3 days ago
The side effects of spending funds on these mega projects are also something to consider. NASA spending has created a huge pile of technologies that we use day to day: https://en.wikipedia.org/wiki/NASA_spin-off_technologies.
delusional3 days ago
> NASA spending has created a huge pile of technologies that we use day to day

We're a little too early to know if that's the case here too. I do foresee a chance at a reality where AI is a dead end, but after it we have a ton of cheap GPU compute lying about, which we all rush to somehow convert into useful compute (by emulating CPUs or translating traditional algorithms into GPU-oriented ones or whatever).

Lerc3 days ago
The shovels and labour used to make those things were not depreciated.

The GPUs are the shovels, not the project. AI at any capability will retain that capability forever; it only gets reduced in value by superior developments, which are built upon technologies that the previous generation developed.

kennywinker3 days ago
Calling the GPUs the shovels is bonkers because a) shovels are cheap, GPUs are not. And b) when you build a bridge the bridge doesn’t need shovels to be passable. Without GPUs, the datacenter is useless, the model is useless, etc.

If anything, the GPUs are the steel that the bridge is made of. Each beam can be replaced, but if too many fail the bridge is impassable. A bridge with a 6-year lifespan for each beam is insane.

loandbehold3 days ago
You need to separate training and inference usage of GPUs for this analysis.
jiggawatts3 days ago
> retain that capability forever

Not really. The base training data cutoff will quickly render models useless as they fail to keep up with developments.

Translating some Farsi news articles about the war was hilarious: Gemini Pro got into a panic, and ChatGPT either accused me of spreading fake news or assumed this was some sort of fantasy scenario.

brookst3 days ago
I'm not sure tax depreciation rates are the best measure here. Those GPUs will be used for much longer than 6 years, and the returns from the businesses built on them will last an order of magnitude longer.
vmbm3 days ago
The jury is still out on this. Those tax-based depreciation schedules are largely a relic of traditional data centers, where workloads are fairly moderate compared to AI use cases. Additionally, power and rack space constraints can complicate things quite a bit. If next-gen chips are significantly more efficient and you are currently constrained by power availability, you might pull your old servers and replace them with the newer ones regardless of how much useful life you have left.
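A toy sketch of that power-constrained replacement decision; every number is an assumption invented purely for illustration, not vendor data:

```python
# Toy model of a power-limited site deciding whether to swap old accelerators
# for more efficient ones. All figures are invented assumptions.

SITE_POWER_MW = 100            # fixed power budget of the facility
OLD_KW, OLD_PERF = 1.0, 1.0    # assumed draw (kW) and relative throughput per old chip
NEW_KW, NEW_PERF = 1.2, 3.0    # assumed draw (kW) and relative throughput per new chip

def fleet_throughput(power_mw: float, chip_kw: float, chip_perf: float) -> float:
    """Total throughput achievable inside a fixed power budget."""
    n_chips = power_mw * 1000 / chip_kw
    return n_chips * chip_perf

old = fleet_throughput(SITE_POWER_MW, OLD_KW, OLD_PERF)
new = fleet_throughput(SITE_POWER_MW, NEW_KW, NEW_PERF)
print(f"old fleet: {old:,.0f}  new fleet: {new:,.0f}  ({new / old:.1f}x)")

# With 2.5x the throughput per watt in this example, replacing hardware raises
# what the power-limited site can produce, regardless of remaining accounting life.
```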
mfuzzey3 days ago
Actually, the physical lifetime (not the financial depreciation period) of AI data center GPUs is even shorter: 3 to 4 years.
elil173 days ago
I think there's more nuance to it. The real asset is the models that are being created.

Imagine this world: the bubble "pops" in a couple years. The GPUs stick around for a few more years after that. At the end, we pretty much don't train new foundation models anymore - no one wants to spend the money on the hardware needed to make a real advance.

People continue to refine, distill, and optimize the existing foundation models for the next century or two, just like people keep laying new track over old railway rights of way.

pembrook3 days ago
Only half of the rail capacity that existed during the railroad boom times was still in use by the 1970s. Lots of it was never really used at all after various railroads went bankrupt. But your point still stands.

That said, I'm pretty sure in a compute-hungry AI world you aren't going to retire GPUs every 6 years anymore. Even if compute capacity jumps such that current H100s only represent 10% of total compute available in 6 years, you're still running those H100s until they turn to dust.

I just think it's hard to compare localized railroad infrastructure to globalized AI capacity and say one was more rational than the other on a % of GDP basis until the history actually plays out.

If you compare global investment in nuclear weapons, it would dwarf the Manhattan Project and AI thus far, and yet 99.99999% of nuclear weapons investment is just "wasted" capacity in that it has never been "used." But the value it has created in other ways (MAD-enabled peace) has surely been profitable on net. Nobody would have predicted this at the time.

Playing armchair internet pessimist about the "new thing" always makes you feel smart but is usually not a good idea since you always mis-price what you don't know about the future (which is almost everything).

wr23 days ago
Also, railways always had alternative uses at the time - e.g. logistics in warfare.

What other uses do GPUs have that are critical...? lol

In addition to your points, this is why I always laugh when people make these backward comparisons. What characteristics do they actually share? Very few.

jamesknelson3 days ago
GPUs do have a use in warfare though. I mean, LLMs are basically offensive weapons disguised as software engineers.

Sure, LLMs can kind of put together a prototype of some CRUD app, so long as it doesn’t need to be maintainable, understandable, innovative or secure. But they excel at persisting until some arbitrary well defined condition is met, and it appears to be the case that “you gain entry to system X” works well as one of those conditions.

Given the amount of industrial infrastructure connected to the internet, and the ways in which it can break, LLMs are at some point going to be used as weapons. And it seems likely that they’ll be rather effective.

FWIW, people first saw TNT as a way to dye things yellow, and then as a mining tool. So LLMs starting out as chatbots and then being seen as (bad) software engineers does put them in good company.

jhide3 days ago
On the topic of warfare: wars are fought differently now. Compute will be mentioned in the same breath as total manufacturing output if a global war between superpowers erupts; in highly competitive industries this is already the case. Compute will be part of industrial mobilization in the same way that physical manufacturing or transportation capacity were mobilized in WWII. I'm not an expert on military computing, but my intuition is that FLOPS are probably even more easily fungible into wartime compute than widget makers were into weapons, and the US was able to go widgets->weapons on an unbelievable scale last time.
naasking3 days ago
> What other uses do GPU's have that are critical...? lol

GPUs are essential to every kind of scientific and engineering simulation you can think of. AI-accelerated simulations are a huge deal now.

rayiner3 days ago
Great point!
tripletao4 days ago
This seems to show the railroads peaking around 9% of GDP. While that's lower than some of the other unsourced numbers I've seen, it's much higher than the numbers I was able to find support for myself at

https://news.ycombinator.com/item?id=44805979

The modern concept of GDP didn't exist back then, so all these numbers are calculated in retrospect with a lot of wiggle room. It feels like there's incentive now to report the highest possible number for the railroads, since that's the only thing that makes the datacenter investment look precedented by comparison.

chromacity4 days ago
But doesn't that overstate it in the other direction? Talking about investments in proportion to GDP back when any estimate of GDP probably wasn't a good measure of total economic output?

We're talking about the period before modern finance, before income taxes, back when most labor was agricultural... Did the average person shoulder the cost of railroads more than the average taxpayer today is shouldering the cost of F-35? (That's another line in Paul's post.)

topspin4 days ago
The F-35 case is interesting. Lockheed Martin can, given peak rates seen in 2025, produce a new F-35 approximately every 36 hours, as they fill orders for US allies arming themselves with F-35's. US pilot training facilities are brimming with foreign pilots. It's the most successful export fighter since the F-16 and F-4, and presently the only means US allies have to obtain operational stealth combat technology.

What that means for the US is this: if the US had to fight a conventional war with a near-peer military today, the US actually has the ability to replace stealth fighter losses. The program isn't some near-dormant, low-rate production deal that would take a year or more to ramp up: it's an operating line at full-rate production that could conceivably build a US Navy squadron every ~15 days, plus a complete training and global logistics system, all on the front burner.

If there is any truth to Gen Bradley's "Amateurs talk strategy, professionals talk logistics" line, the F-35 is a major win for the US.

palmotea4 days ago
> Lockheed Martin can, given peak rates seen in 2025, produce a new F-35 approximately every 36 hours ... it's an operating line at full-rate production that could conceivably build a US Navy squadron every ~15 days, plus a complete logistics and training system, all on the front burner.

That's amazing. I had no idea the US was still capable of things like that.

I wonder if there's a way to get close to that for things that aren't new and don't have a lot of active orders. Like have all the equipment set up but idle at some facility, keep assembly teams ready and trained, then cycle through each weapon and activate a couple of these dormant manufacturing programs (at random!) every year, almost as a drill. So there's the capability to spin up, say, F-22 production quickly when needed.

Obviously it'd cost money. But it also costs a lot of money to have fighter jets when you're not actively fighting a war. Seems like manufacturing readiness would be something an effective military would be smart to pay for.

bluedino3 days ago
> Lockheed Martin can, given peak rates seen in 2025, produce a new F-35 approximately every 36 hours

Until we run out of materials

https://mwi.westpoint.edu/minerals-magnets-and-military-capa...

bombcar4 days ago
That's the problem with going too far using "money" or "GDP" - you can roughly compare the 45% of GDP spent in WWII with today - https://www.davemanuel.com/us-defense-spending-history-milit... - because even by WWII much was "financialized" in such a way that it appears in GDP (though things like victory gardens, barter, etc. would explicitly NOT be included without effort - maybe they do this?).

As you get further and further into the past you have to start trying to measure it using human labor equivalents or similar. For example, what was the cost of a Great Pyramid? How does the cost change if you consider the theory that it was somewhat of a "make work" project to keep a mainly agricultural society employed during the "down months" and prevent starvation via centrally managed granaries?

helterskelter4 days ago
You don't even need to go that far back to run into issues. When I read Pride and Prejudice, I think Mr. Darcy was one of the richest people in England at around £10,000/year, but if you calculate his wealth in today's terms it wasn't some outrageous sum (Wikipedia is telling me ~£800,000/year). The thing is that the economy was totally different back then -- labor cost practically nothing, but goods like furniture were really expensive and would be handed down for generations.

With £800K today, you may not even be able to afford the annual maintenance for his mansion and grounds. I knew somebody with a biggish yard in a small town and the garden was ~$40K/yr to maintain. Definitely not a Darcy estate either.

Thinking about it, an income of £800K is something like the interest on £10m.

chaos_emergent4 days ago
I posted just that on the Twitter thread, but then I realized that the railroads came at the beginning of the industrial revolution, when labor was a far larger portion of GDP compared to industrial production. So it kind of makes sense that the first enabling technology consumed far more GDP than current investments do, even on a marginal basis.
cousin_it3 days ago
Wild graphic. US spending on one flying killing machine (the F-35) is comparable to total spending on the Marshall plan to reconstruct Europe after WWII, or the interstate highway system, or all datacenters combined. Priorities!
marche1013 days ago
I don't think that's right - the scale is logarithmic. The Marshall Plan is 20 times as expensive.
brookst3 days ago
It’s hazardous to blend fixed and variable costs.
SlinkyOnStairs3 days ago
> Makes it a little less dramatic. But also shows what a big *'n deal the railroads were!

It also makes it more dramatic: consider the programs on the list and what they have in common.

* The Apollo program. A government-funded science project. No return on investment required.

* The Manhattan Project. A government-funded military project. No return on investment required.

* The F-35 program. A government-funded military project. No return on investment required.

* The ISS. A government-funded science project. No return on investment required.

* The Interstate Highway System. A government-funded infrastructure project. No return on investment required.

* The Marshall Plan. A government-funded foreign policy project. No return on investment required.

The actual return on investment for these projects comes in the very long term of decades: economic development, national security, scientific progress that benefits the entire country if not the entire world.

Consider the Marshall Plan in particular. It was a massive money sink, but its nature as a government project meant it could run at losses without significant economic risk and could aim for extremely long-term benefits. It kept paying dividends until January last year - 77 years.

And that dividend wasn't always obvious. Goodwill from Europe towards the US is what has prevented Europe from taking actions similar to China's against the US' Big Tech companies, many of whom relied extensively on 'dumping' to push European competitors out of business. A more hostile Europe would've taken much more protectionist measures and ended up much like China, with its own crop of tech giants.

And then there are the two programs left out: the railroads and AI datacenters. Private enterprise simply does not have the luxury of sitting on its ass waiting for benefits to materialize 50 years later.

As many other comments in this thread have already pointed out: When the US & European railroad bubbles failed, massive economic trouble followed.

OpenAI's window for showing (partial) return on investment is as short as this year, or their IPO risks failure. And if they don't deliver, similar massive economic trouble is assured.

herbst3 days ago
European railroad bubble failed?

Can you explain that? I really have no idea what you are referring to?

SlinkyOnStairs3 days ago
The search term is "Railway Mania", which predominantly describes the UK's railroad bubble, with smaller similar booms on mainland Europe. (You will have to look up French and German sources for the best info on those.)

The bubble failed in the sense that massive commitments for new railways were made, and then the 1847 economic crisis caused investment to dry up, which collapsed the bubble and put a halt to the railroad construction boom. Those railway commitments never materialized, and stock market crashes followed.

I'm also being a little cheeky with what "massive economic trouble" entails: while the stock market was heavy on railroads and crashed right into a recession, the world in the mid-1800s was much less financialized, so the consequences in absolute terms were less pronounced than a similar bubble collapse would be today. As such, the main historical comparison is structural.

(Similarly, the AI bubble is unlikely to burst "by itself" unless OpenAI's IPO is truly catastrophically bad. What's more likely is that a recession happens, the recession triggers a stock market collapse, and the two then intensify each other. And so these historical examples of similar situations may prove illustrative.)

yabutlivnWoods3 days ago
You're actually arguing those highly technical engineering projects provided nothing to humanity investing labor in them because they were not a financial success?

Just confirms my suspicion HN is not a forum for intellectual curiosity. It's been entirely subsumed by MBAs and wannabe billionaires.

SlinkyOnStairs3 days ago
> You're actually arguing those highly technical engineering projects provided nothing to humanity investing labor because they were not a financial success?

No. Re-read the comment.

I specifically say "No return on investment required" not "Has no return on investment". It didn't matter whether these projects earned back their money in the short term, or whether it takes the longer term of many decades.

The ISS hasn't earned back its $150 billion, and it won't for a pretty long time yet. That doesn't mean it's not a good thing for humanity. It just means that it'd be a bad idea to have the project run & funded by e.g. SpaceX. The project would've failed; you just can't get ROI on $150 billion within the timeframe required. SpaceX barely survived the cost of developing its rockets. (And observe how AI spending is currently crushing the profitability of the newly merged SpaceX-xAI.)

I'm not even saying "AI doesn't provide anything to humanity", I was saying that AI needs trillions of dollars in returns that do not appear to exist, and so it's likely to collapse.

dghlsakjg4 days ago
The railroads and the interstate arguably had the biggest and broadest impact, especially in second-order effects (everything west of the Mississippi would be vastly different economically without them).

I am not an AI booster, but I would not be surprised at AI having a similar enabling effect over the long term. My caveat being that I am not sure the massive data center race going on right now will be what makes it happen.

delecti4 days ago
I agree that AI will probably have bigger effects than we could possibly predict right now. But unlike past booms/bubbles, I suspect the infrastructure being built now won't be useful after it resolves. The railroads, interstate system, and dotcom fiber buildout are all still useful. AI will need to get more efficient to be useful as an established technology, so the huge datacenters will be overbuilt. And almost none of the Nvidia chips installed in datacenters this year will still be in use in 5 years, if they're even still functional.
dralley3 days ago
Early railroads didn't have a lot of standardization, so plenty of that investment did get deprecated
Danox3 days ago
The era of the AI data center will be brief because the models will get better and the computers will get more powerful, particularly on the desktop, laptop and phone/tablet. The transition will be like going from mainframe computers to personal computers.
whattheheckheck4 days ago
All of the trucks and carts and tools used to build the railroads don't exist anymore. Just like the GPUs won't either.
throwaway274484 days ago
Is there really that much inefficiency in our distribution of goods and services such that AI could have this much impact?
fyrn_4 days ago
I think the bet is more labor replacement, not saying that's particularly reasonable either
crote4 days ago
> I would not be surprised at AI having a similar enabling effect over the long term.

The big difference is that the current AI bubble isn't building durable infrastructure.

Building the railroads or the interstate was obscenely expensive, but 100+ years down the line we are still profiting from the investments made back then. Massive startup costs, relatively low costs to maintain and expand.

AI is a different story. I would be very surprised if any of the current GPUs are still in use only 20 years from now, and newer models aren't a trivial expansion of an older model either. Keeping AI going means continuously making massive investments - so it had better find a way to make a profit fast.

TeMPOraL3 days ago
GPUs are consumables, not infrastructure. Model weights are the lasting thing.

It's always like that with software. You can still run an OS or a program made 20 years ago; in some cases that program may in fact have no modern replacement available (think niche domains). Meanwhile, in those 20 years, you've probably churned through 5-10 generations of computing hardware.

ezst3 days ago
And I'm not an AI doomer, but hell no, give me another space program/station over this every single time, pretty please. We are not pioneering new engineering science or creating a pipeline of hard research and innovation that will spread into and better our everyday lives for decades to come. We are overbuilding boring data centers packed with single-purpose chips that WILL BE obsolete within a couple of years, and for what? For the unhinged hope that LLM chatbots will somehow develop intelligence, and/or that people by the billions will want to pay a hefty price for dressed-up plagiarism machines. There is no indication that LLMs are a pathway to meaningful and transformative AI. Without that, there is no technical merit to the data centers currently being built becoming future-proof infrastructure the way highways and railroad networks did. There is no economic framework in which this somehow trickles down to or directly empowers the individual. This is a sham of ludicrous proportions, a sickening waste.
dTal3 days ago
>There is no indication that LLMs are a pathway to meaningful and transformative AI.

Reality check, they are already astoundingly meaningful and transformative AI. They can converse in natural language, recall any common fact off the top of their heads, do research online and synthesize new information, translate between different human languages (and explain the nuances involved), translate a vague hand wavey description into working source code (and explain how it works), find security vulnerabilities, and draw SVGs of pelicans on bicycles. All in one singularly mind-blowing piece of tech.

The age of computers that just do what you tell them to, in plain language, is upon us! My God, just look at the front page! Are we on the same HN?

operatingthetan4 days ago
>I am not an ai-booster, but I would not be surprised at AI having a similar enabling effect over the long term. My caveat being that I am not sure the massive data center race going on right now will be what makes it happen.

Maybe? It seems as if the tech is starting to taper off already, and AI companies are panicking and gaslighting us about what their newest models can actually do. If that's the case, the industry is probably in trouble - or the world economy is.

EFreethought3 days ago
> AI companies are panicking and gaslighting us about what their newest models can actually do

I think they have been gaslighting us from the beginning.

bigfatkitten3 days ago
Bernie Madoff and his ilk made way for Sam Altman and his friends.

Like Madoff, they’re desperate to pump their Ponzi scheme for as long as they can.

maxglute3 days ago
Depreciation schedule:

Tulips: weeks

GPUs: 6 years

Fiber: 20-50 years

Rail, roads, bridges: 50-100+ years

Hyperscalers closer to tulips than other hard infra.

casey23 days ago
What rail, road or bridge in the US lasts 50 years? The maintenance of rail over 6 years costs more than replacing all the GPUs in a data center, even at their current markup.
bdangubic3 days ago
have you seen our rails, roads and bridges?!? 50 year old ones in many places are being referred to as “new ones” :)

the only reason any "maintenance" on them is expensive is corruption, which at the municipal level rivals the current administration in some places

chatmasta3 days ago
I’m surprised there is no broadband rollout or telecom network on there. I guess it’s hard to quantify the cost within a specific event?
mongol3 days ago
Indeed. Or for that matter, electrification?
hyperbovine3 days ago
The railroad buildout was a lot more, idk, tangible. Most of that money was spent employing millions of people to smelt iron, lay track, build bridges, blow up mountains, etc. It’s a lot more exciting than a few freight loads of overpriced GPUs.
wr23 days ago
Also a good point - railroads for sure brought a lot more optimism.

LLMs+Data centres on the other hand...

j-bos4 days ago
As sibling comments mentioned, it's a deceptive comparison as well. How about comparing as a percentage of gross energy output? https://www.sciencedirect.com/science/article/abs/pii/S09218...
LeCompteSftware3 days ago
It seems a little silly to put 71 years of private-and-public-sector infrastructure development alongside something highly targeted like the Manhattan Project. It might make more sense to compare the Manhattan Project to the first transcontinental railroad, as a similar targeted but enormously ambitious project amounting to a major technical milestone.

Likewise I don't think it makes sense to compare post-ChatGPT hyperscaler data center construction with all 19th-century US railroad construction. Why not include the already considerable infrastructure of pre-AI AWS/Azure? The relevant economic change isn't "AI," it's having oodles of fast compute available online and a market demanding more of it. OTOH comparing these data centers to the Manhattan Project is wrong in the opposite direction: we should really be comparing a specific headline-grabber like Stargate.

This categorization is just a confusing mishmash. The real conclusion to draw here is that we tend to spend more on long-term and broadly-defined things than we do on specific projects with specific deadlines. Indeed.

globular-toast3 days ago
Were? How else do you expect to get goods around by land?
lukeschlather4 days ago
This seems like a total category error. The Railroads are the only example that actually seems comparable, in being an infrastructure build out that's mostly done by a variety of private companies. Examples of things that would be worth comparing to the datacenter boom are factory construction and utilities (electrification in the first half of the 20th century, running water, gas pipes.)
wisemanwillhear4 days ago
For some reason this reminds me of people at work who walk up and say we did x bazillion things in n time, and then pause and expect us to express shock at how amazing that is and how much more productive they are than other teams. So what? Without a proper comparison to something equivalent I can't evaluate whether it's exceptional. I could treat each molecule as a thing and tell people how incredibly many things I eat on average per minute, but once I explained, no one would find that exceptional.
nullhole3 days ago
"Rice is great if you're really hungry and want to eat two thousand of something" - Mitch Hedberg
contingencies3 days ago
Also, if you have huge amounts of fresh water and a tropical climate... you can get three harvests per year. Unlike most staple crops.
ZeWaka3 days ago
50 story points!!!
KylerAce3 days ago
The only thing that matters
0xbadcafebee4 days ago
FWIW, railroads were the reason for some of the biggest bank collapses in history. The Panic of 1873 was literally called "The Great Depression" (until a greater depression hit). 20 years later came the Panic of 1893. Both were due to over-investment and a bubble bursting, and they took out tons of banks and businesses.

We're seeing exactly the same thing with AI, as there is massive investment creating a bubble without a payoff. We know that the value will fall over time, because software and hardware both get more efficient and cheaper. And so far there's no evidence that all this investment has generated more profit for the users of AI. It's just a matter of time until people realize this and the bubble bursts.

And when the bubble does burst, what's going to happen? Most of the investment is from private capital, not banks. We don't know where all that private capital is coming from, so we don't know what the externalities will be when it bursts. (As just one possibility: if it takes out the balance sheets of hyperscalers and tech unicorns, and they collapse, who's standing on top of them that collapses next? About half the S&P 500 - so 30% of US households' wealth - but also every business built on top of those mega-corps, and all the people they employ) Since it's not banks failing, they probably won't be bailed out, so the fallout will be immediate and uncushioned.

tracerbulletx3 days ago
Have you seen video of a slime mold searching for food? It grows like crazy in a bunch of simultaneous search paths, expending tons of energy following a rough directional gradient looking for food. Once one of the branches finds the food all of the other search paths shrivel up and die off. I think slime molds are much better analogies for these situations than bubbles.
wr23 days ago
Lol, a bit dramatic at the end. There will be a correction in stocks that were priced for growth related to AI.

But what I see is the two big costs for America:

1) Less money being invested into risky AI projects in general, in both public (via cash flows from operations) and private markets

2) The large tech firms who participated in large capex spend related to AI projects won't be trusted with their cash balances - aka having to return more cash and therefore less money for reinvestment

All the hype and fanfare that draws in investment comes with a cost - you gotta deliver. People have an asymmetric relationship between gains and losses.

keeda4 days ago
> We're seeing exactly the same thing with AI, as there is massive investment creating a bubble without a payoff.

...

> And so far there's no evidence that all this investment has generated more profit for the users of AI.

If you look around a bit, you will find evidence for both. Recent data finds pretty high success in GenAI adoption even as "formal ROI measurement" -- i.e. not based on "vibes" -- becomes common: https://knowledge.wharton.upenn.edu/special-report/2025-ai-a... (tl;dr: about 75% report positive RoI.)

The trustworthiness, salience and nuances of this report are worth discussing, but unfortunately reports like this get no airtime in the HN and media echo chamber.

Preliminary evidence, sure, but given that this weird, entirely unprecedented technology is about 3+ years old and people are still figuring it out (something that report calls out), this is significant.

0xbadcafebee3 days ago
75% report positive ROI (and the VPs are much more "optimistic" than the middle managers who are closer to the work) - but how much ROI? 1%? The fact that they don't quote a figure at all is pretty telling. And that's the ROI of the people buying the AI services, which are often heavily subsidized. If it costs a billion dollars to give a mid-sized company a 1% ROI, that doesn't sound sustainable.

I would love to see another report that isn't a year old with actual ROI figures...

SlinkyOnStairs3 days ago
> The trustworthiness, salience and nuances of this report are worth discussing, but unfortunately reports like this get no airtime in the HN and media echo chamber.

It honestly just isn't that interesting. (Being most notable for people misunderstanding and misrepresenting the chart on page 46 of the report as being "ROI" rather than "ROI measurement")

In terms of ROI figures, it's really just a survey with the question "Based on internal conversations with colleagues and senior leadership, what has been the return on investment (ROI) from your organization's Gen AI initiatives to date?".

This doesn't mean much. It's not even dubiously-measured ROI data, it's not ROI data at all, it's just what the leadership thinks is true.

And that's a worrying thing to rely on, as it's well documented (and measured by the report's next question) that there's a significant discrepancy in how high level leadership and low-level leadership/ICs rate AI "ROI".

One of the main explanations for that discrepancy is Goodhart's law. A large number of companies are simply demanding AI productivity as a "target" now, with accusations of "worker sabotage" being thrown around readily. That makes good economy-wide data on AI ROI very hard to get.

jeffbee4 days ago
The other categorical error is that the American people paid the railroads a monumental subsidy to get the job done. We gave them almost 10% of the territory.
lenerdenator4 days ago
Given the size of some of these data centers, the incentives packages that local governments often give their developers, and the impact on the electric grid that can, in some cases, raise costs for other ratepayers, I'd say the comparison could be similar.

The one Google's putting in KC North is 500 acres [0] and there were $10 billion in taxable revenue bonds put up by the Port Authority to help with the cost.

This for a company that could pay for that in cash right now.

[0] https://fox4kc.com/news/google-confirms-its-behind-new-data-...

jeffbee4 days ago
That's the opposite of a subsidy. KC stakes nothing of value and gets a defined revenue for the next 25 years.
therobots9274 days ago
The problem is that once built, railroads provided economic value right off the bat.

I would love to hear about the economic value being generated by these LLMs. I think a couple years is enough time for us to start putting some actual numbers to the value provided.

lukeschlather4 days ago
Equating this buildout with LLMs is also a category error. Waymo (self-driving cars) depends on the same infrastructure, and there are a variety of other robotics programs which are actually functioning, you can see them in operation. They all require a lot of GPUs to train and run the models which operate the robotics.
therobots9274 days ago
What % of GPUs are running self driving software or robotics?

And what is the ROI on either of those right now?

throwaway274484 days ago
It's not clear that Waymo is an improvement over existing infrastructure so much as ensuring that fewer humans benefit from each car ride (which was already pathetically low).
Danox3 days ago
Is Waymo a good example when Google has people in the third world sitting at a screen operating the vehicle from the other side of the world? How can its performance be trusted?
JumpCrisscross4 days ago
> once built, railroads provided economic value right off the bat

If they were laid on a sensible route, completed on budget and time, and savvily operated. Many railroads went bust.

stefan_4 days ago
"Infrastructure build out"? Everything put into these datacenters is worthless well before 10 years have gone by.

We aren't even getting infrastructure out of it; they are just powering it with gas turbines.

jeffbee4 days ago
This isn't true and you can easily prove it to yourself by renting a Sandy Bridge CPU or a TPUv2 from Google today.
negura4 days ago
Regardless, it's true that AI-related spending is the largest mobilization of capital in history.
Danox3 days ago
And it's probably useless at the end of the day because everything will reduce down from a centralized location to your desktop/laptop/tablet/phone. OpenAI's, Microsoft's, Meta's, Google's and Oracle's dreams of a centralized computing location will not hold up.
spprashant3 days ago
I think all misgivings about AI would go away fast if it solved one important problem for humanity: carbon nanotubes for space elevators, sustainable nuclear fusion, or something of that ilk.
sixhobbits3 days ago
Personalized medical treatment looks like the most promising candidate so far, would that do it for you?
pona-a3 days ago
There's a video by Siliconversations [0] about it. Medicine is first and foremost limited by high-quality data, not intelligence. If OpenAI built a superhuman AGI tomorrow, it would not change a thing about the state of cancer treatment, at least not for a while.

Trying to design a cancer cure by setting a trillion alight on AI is like trying to achieve UBI by funneling citizens' taxes into Polymarket, so they may operate their free supermarket.

[0] https://www.youtube.com/watch?v=ijTxAfFUHkY

hephaes7us3 days ago
I don't think the above poster is talking about finding novel treatments, but rather that they're talking about aiding in diagnosis and navigating existing treatment options.

We always wish that our doctors would stay up to date on all of the current medical literature as they practice, and some of them do. In theory, AI systems could greatly accelerate a person's ability to retrieve and extract insights from the current body of knowledge.

Of course, that is highly fraught, but, in theory, I think I see what they're going for.

zozbot2343 days ago
How can we be sure of that when we don't even know what improved "intelligence" might look like in this context? Especially given the increased importance of "big data" (genomics, proteomics, metabolomics etc.) to the field and the sheer amount of obscure data that's currently buried in all sorts of archival sources and might be resurfaced with some "intelligence".
spprashant3 days ago
Yes. But unfortunately that domain suffers from ambiguity which LLMs are bad at.

Medical treatment has never been about asking questions and getting perfect answers. Excellent doctors and nurse practitioners have a great intuition for which questions to ask based on cues during patient assessment.

therobots9273 days ago
What exactly does “personalized medical treatment” entail?

Writing prescriptions?

Ok, I can see how AI could theoretically do that (assuming it doesn’t hallucinate and kill a bunch of people). Oh and don’t think it’ll be so easy to give AI the legal authority to prescribe controlled substances. And insurance companies may take issue with expensive prescriptions written by a chat bot.

Perform surgeries? Stitch wounds?

That’s decades away. And that also opens a legal can of worms. Maybe the AI lawyers can figure something out.

jnpnj3 days ago
maybe he's referring to people doing research on their own variant of disease through LLMs to find cures
whiplash4513 days ago
You could take the same issue with all productivity changes that came before AI (typewriters, laptops, etc)
therobots9273 days ago
Well it can build a killer front end and that’s great at impressing investors.
operatingthetan4 days ago
Is this an appropriate spend and risk? I'm starting to feel as if we have been collectively glamoured by AI and are not making sound decisions on this.
staplor3 days ago
I've had similar thoughts, but I've come around to this buildout being rational. All of the big AI labs are still jockeying for compute and are having trouble keeping up with inference demand.
theonemind3 days ago
It doesn't seem like it to me. I like watching Ed Zitron rant about it on YouTube. It's fun.
therobots9273 days ago
Same. He’s very knowledgeable about this and very skeptical. Not to mention hilarious.

I’m getting my popcorn ready for the bubble pop.

therein4 days ago
I really dislike the term hyperscaler. Comes off very insincere. They came up with it themselves, didn't they? What's the official definition supposed to be now? Companies that are setting up as many GPU/TPU server clusters as possible for a demand that's yet to exist?
rcxdude4 days ago
Hyperscale exists as a term pre-LLM hype. It mainly exists to describe the kind of datacenters that companies like Google and Amazon have been building for at least a decade now: very large, very highly integrated and customised hardware, with a focus on cloud deployment and management strategies. This is to distinguish them from just a large datacenter built with commodity server parts from a set of vendors (i.e. the kinds of servers 99% of people will be able to lay their hands on). Another way to put it is that if you're not writing your own BIOS/BMC/etc., you're probably not hyperscaling.
tim3333 days ago
Some history: https://ocient.com/blog/the-history-of-hyperscale-in-computi...

>The term “hyperscale” first emerged in the late 1990s, heralding a paradigm shift in the world of computing. It was primarily used to describe the awe-inspiring scale and capabilities of data centers...

coffeefirst4 days ago
I have concluded the entire public discourse surrounding AI has no relationship to real stuff that you can go, test, and point at.

There's a loop where everyone is saying stuff because everyone else is saying stuff, and it turns into a sort of reality-inspired fan fiction.

It's not just that it's wrong or imprecise - that I expect - it's that the folklore takes on a life of its own.

bombcar4 days ago
It always makes me think of a hyperactive toddler running around in circles, which oddly fits most thought leaders who use the term.
lenerdenator4 days ago
That's not fair to the toddlers; their crap tends to be safely contained in a diaper as opposed to their heads.
cidd4 days ago
Nobody really uses the term in the Valley except probably C-level people talking to Wall street investors.
mikrl4 days ago
Superscaler sounds too much like superscalar…
dlenski3 days ago
There's a pretty big missing case in this comparison: nuclear weapons.

The US spent ~$12 trillion in ~2024 dollars on nuclear weapons between 1940 and 1996, and the vast majority of that spending was in the 1950s and early 1960s.

https://en.wikipedia.org/wiki/Nuclear_weapons_of_the_United_...

hargup3 days ago
Justin Lebar (he built the XLA compiler and worked at OpenAI) has an amazing talk on this subject: https://youtu.be/cyJU32ivIlk?si=gYuHtzMJIvaSqcht
mattas4 days ago
Is this _actual_ spend? Like dollars actually changing hands?

Or is this "we said we are going to invest $X"? What about the circular agreements?

therobots9273 days ago
A lot of this is committed capital and the datacenters haven’t even broken ground.
uejfiweun3 days ago
Does anyone have any plans for what to do with all these chips and things once they are obsolete? I can't imagine they are all just going to go to some scrap heap.
moogly3 days ago
I have bad news for you...
losvedir4 days ago
Does anyone know what's included in "datacenter capex"? In particular, does that include spending for associated power generation? Because whether or not the AI craze pans out, if we've built a whole bunch of power plants (and especially solar, wind, hydro, etc) that would be a big win.
measurablefunc4 days ago
You can't run a data center on solar or wind (even w/ batteries included). Everything they're building runs on gas & coal like what Musk got running for xAI.
RealityVoid4 days ago
You can and _must_ if you want competitive costs. Musk famously overpaid in order to get speed of deployment.

I was reading geohot's musings about building a data center and doing so cost-effectively, and solar is _the_ way to get low energy costs. The problem is off-peak energy, but even with that... you might come out ahead.

And that dude is anything but a green fanatic. But he's a pragmatist.

rangestransform3 days ago
That’s because Rs let NIMBYs and the fossil fuel lobby call the shots, and Ds let NIMBYs and degrowthers call the shots. I bet China isn’t powering their datacenters with gas turbines
therobots9273 days ago
They aren’t building out datacenters the way the US is. The arms race is a myth.
danielmarkbruce3 days ago
It's not a project. It's just a lot of money being spent on compute across hundreds/thousands of similar projects.

An analogy would be "all the money spent on transportation infra" over some period of time.

SpicyLemonZest4 days ago
Gentle reminder that the cost of producing well-formatted graphs is much, much lower than it used to be. We grew up in a world where the mere existence of this graph would prove that someone put a great deal of effort into making it, and now it does not. I have no specific reason to doubt the information, but if you want to have reliable epistemic practices, you can no longer treat random graphs you find on social media as presumptively true.
djoldman3 days ago
Just for context, Amazon+Microsoft+Alphabet+Meta+Oracle total revenue for the 5 years ending in 2025 was...

~$6.5 trillion

philip12093 days ago
Would love to see Apple's China investment on this chart.
kerblang4 days ago
Adjusted for inflation?

edit - sorry, it is in fact adjusted, text is kinda hard to see

anigbrowl4 days ago
It literally says 'Inflation-adjusted costs' on the right side of the graph, right under the main title, FFS.
kerblang4 days ago
There's no need to be snide
throwaway274484 days ago
Further evidence that the US, for whatever reason, lacks basic ability to rationally use resources.
danielmarkbruce3 days ago
It's the richest large country by miles.
throwaway274481 day ago
That doesn't imply we know what to do with it. By all evidence, we are one of the worst countries on earth at rationally distributing resources.
danielmarkbruce1 day ago
"it" didn't down from the sky. "it" came about from good capital allocation.
tim3333 days ago
The US has mostly ended up quite prosperous if not that environmentally friendly.
guywithahat4 days ago
If you adjust for GDP railroads were much more expensive, and I don't think they're viewed as a mistake https://x.com/finmoorhouse/status/2044985790212583699?s=20
ElevenLathe4 days ago
It's not totally clear that the gigantic push to run rail lines through undeveloped parts of North America "ahead of demand" for reasons of genocide (aka "white settlement"), especially the transcontinental routes, was the smartest investment, even leaving aside the horrific crime it represents. We probably would have gotten greater ROI connecting more developed places on a piecemeal basis and extending the rail network more slowly in the West (and probably even more rapidly in the developed East) instead of founding new towns along brand-new rail lines. There is a reason the federal government was so involved in the finance of these things: left alone, private Eastern capital would not have done things the way they were done, which was chiefly to "open the frontier" aka accelerate the genocide.

I certainly think it was a mistake.

emp173443 days ago
It’s just a classic bubble. They’ve happened before, and while they are irrational, the market sorts itself eventually.
arisAlexis3 days ago
Because this is the last invention of man and they realize this
silexia3 days ago
This is the sentiment most AI experts have, and nearly half of them believe it will lead to our extinction.
negura4 days ago
As of November last year, data centre capex was only 60% of their revenues, which provides the business justification to increase investment further.
abofh3 days ago
Not if you include tax breaks as mega projects
amelius3 days ago
We could have had a space elevator by now.
claaams3 days ago
Or free education and universal healthcare and a better society. I would even take someone reanimating dinosaurs like in Jurassic Park over this.
pstuart3 days ago
We could have had non-carbon energy independence by now.
big-and-small3 days ago
China is working on this.
therobots9273 days ago
How much has china invested in GPU buildout?
therobots9273 days ago
But the AI fanatics claimed that AI would solve cold fusion, making that whole thing moot.

The only problem is, if AI doesn’t solve cold fusion, we’re back to square one. And a few trillion dollars in the hole.

danilocesar3 days ago
There’s a joke that in a couple of years, after spending trillions of dollars, burning mountains of coal to run country-sized datacenters and boiling all the oceans, we finally achieve AGI.

Then the first question we ask it is: 'How do we fix climate change?' And it answers: 'you can start by unplugging me'

boxedemp3 days ago
If we're counting, the USA was already pretty deep in the hole. Anybody that has experienced crippling debt understands there's a point of no return where you just embrace it.

And that point is right before rock bottom.

bawana3 days ago
only 20% of health care spending!
vrganj3 days ago
And 0% of people cured.
dilberx3 days ago
everything is business, less about humanity
thelastgallon3 days ago
I wonder what percentage of GDP is spent on crypto.
tim3333 days ago
Crypto's a funny one economically: while there are some real costs like mining, a lot of the money is just swapped around. A makes a coin, B pays A a million for some, but the million isn't really spent, just swapped between crypto idiots.
hashmap3 days ago
If you think datacenters are a waste (they are), wait till you hear about Department of War spending.
metalman4 days ago
we, the people, are the ultimate mega project, and it's showing
jgalt2123 days ago
Just wait until the DAOs become agentic!
cactacea4 days ago
Really shows where our priorities are at as a country. SMH