GenAI FOMO has spurred businesses to light nearly $40B on fire
theregister.com | 244 points by rntn | 3 days ago
Well-run large companies often waste a lot, in order to (1) hedge the risk of being left behind, (2) ensure they have future options in possible growth or new efficiency areas, and (3) start on long learning curves for skills and capabilities that appear likely to be a baseline necessity in the long run.
Bonfires of money.
Predictably. Because all three of those concerns require highly speculative action to properly address.
That doesn't make those reasons invalid. Failures are expected, especially in the early days, and are not a sign of spurious bets or of being starry-eyed about industry upheavals. The minimal return is still experience gained and a ramped-up institutional focus.
How many of us here speed up our overall development by coding early on new projects before we have complete clarity? Writing code we will often throw away?
Agreed. Smug dismissal of new ideas is such a lazy shortcut to trying to look smart. I'd much rather talk to someone enthusiastic about something than someone who does nothing but sit there and say "this thing sucks" every time something happens even if person #2 is incidentally right a lot of the time.
Smug acceptance of new ideas is such a lazy shortcut to trying to look smart. I'd much rather talk to someone who has objectively analyzed the concept rather than someone who does nothing but sit there and say "this thing is great" for no real reason other than "everyone is using it".
Incidentally, "everyone" is wrong a lot of the time.
While I agree in principle, someone has to make decisions about resource allocation and decide that some ideas are better than others. That takes a finely tuned BS detector and a knack for estimating technical feasibility, which I would argue is a real skill. Being an accurate predictor of success is thus subtly different from being a default cynic when 95% of ideas aren’t going to work.
> if 95% of ideas aren’t going to work
Well, if only 95% of our ideas don't work, with a little hard work and sacrifice, we are livin' in an optimish paradise.
Or (4), because a sales team convinced them there was some magic wand they could buy that would triple their productivity.
You put my thoughts into words better than I could.
This reminds me of the exploration-exploitation trade-off in reinforcement learning: you want to maximise your long-term profits but, since your knowledge is incomplete, you must acquire new knowledge, which companies do by trying stuff. Prematurely dismissing GenAI could mean missing out on new efficiencies, which take time to be identified.
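A minimal sketch of that trade-off, as an epsilon-greedy bandit in Python (payoff numbers made up for illustration):

    import random

    def epsilon_greedy(estimates, epsilon=0.1):
        # Explore with probability epsilon, otherwise exploit the best estimate.
        if random.random() < epsilon:
            return random.randrange(len(estimates))
        return max(range(len(estimates)), key=estimates.__getitem__)

    # The agent doesn't know the true payoffs; it learns by trying stuff.
    true_payoffs = [0.2, 0.5, 0.8]
    estimates = [0.0, 0.0, 0.0]
    counts = [0, 0, 0]
    for _ in range(10_000):
        arm = epsilon_greedy(estimates)
        reward = 1.0 if random.random() < true_payoffs[arm] else 0.0
        counts[arm] += 1
        estimates[arm] += (reward - estimates[arm]) / counts[arm]  # running mean

    print(estimates)  # approaches true_payoffs; pure exploitation can get stuck

Companies trying stuff is the epsilon.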
You're giving company executives way too much credit in general. I'm sure there are unicorns out there where conscientious stewards of the company's long-term health are making measured, rational choices that may pay off in a decade, but it's a tiny minority of companies. Most are run by narcissistic short-term corporate raiders whose first priority is looting the company for their own profit and second priority is cosplaying as a once-in-a-generation genius "thought leader" in a never-ending battle to beat away the nagging (and correct) thought that they're nepo-babies who have no clue what they're doing.

These morons are burning money because they are stupid and/or because it benefits them and their buddies in the short-term. They burned money on blockchain bullshit, they burned money on Web 2.0 bullshit, they are burning money on AI, and they will be burning money on the next fad too. The fact that AI might actually turn out to be something real is complete serendipity; it has nothing to do with their insight or foresight.

The only reason they ever look smart is because they're great at taking credit for every win, everyone else immediately forgets all their losses, and op-ed writers and internet simps all compete to write the most sycophantic adulations of their brilliance. They could start finger-painting their office windows with their own feces and the Wall Street Journal would pump out op-eds saying "Here's why smearing poop on your windows is actually a Really Brilliant Business Move made by Really Good-Looking Business Winners!" Just go back and re-read your comment but think "blockchain" instead of "AI" and you'll see clearly how silly and sycophantic it really is.
> Well-run large companies [...]
Yes, of course. Incompetent leaders do incompetent things.
No argument or surprise.
The point I made was less obvious. Competent leaders can also/often appear to throw money away, but for solid reasons.
Similarly, VC sets barrels of money on fire.
And depending on how you look at it, science itself is experimentation, though it mostly results in publications in the end, which may or may not be read, but at least serve as records of areas explored.
I think your take covers science too.
Scientists and mathematicians often burn barrels of time and unpublished ideas, not to mention following their curiosities into random pursuits that give their subconscious the free space to crystallize slippery insights.
With their publishable work somehow gelling out of all that.
Well, at least AI is going to be better than the blockchain hype. No one knew what “blockchain” was, how it worked, or what it could be used for.
I had a very hard time explaining that once you put something in the chain, you can’t easily pull it back out. If you want to verify documents, all you have to do is put a hash in a database table. Which we already had.
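That really is all it takes; a minimal sketch with sqlite3 and SHA-256 (table and document names invented for illustration):

    import hashlib, sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE doc_hashes (doc_id TEXT PRIMARY KEY, sha256 TEXT)")

    def register(doc_id, content):
        # Store only the digest; the document itself can live anywhere.
        digest = hashlib.sha256(content).hexdigest()
        db.execute("INSERT INTO doc_hashes VALUES (?, ?)", (doc_id, digest))

    def verify(doc_id, content):
        row = db.execute("SELECT sha256 FROM doc_hashes WHERE doc_id = ?",
                         (doc_id,)).fetchone()
        return row is not None and row[0] == hashlib.sha256(content).hexdigest()

    register("contract-42", b"the agreed terms")
    print(verify("contract-42", b"the agreed terms"))   # True
    print(verify("contract-42", b"the altered terms"))  # False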
It has exactly one purpose: prevent any single entity from controlling the contents. That includes governments, business executives, lawyers, judges, and hackers. The only good thing is that every single piece of data can be pulled out into a different data structure once you realize your mistake.
Note, I’m greatly oversimplifying all the details and I’m not referring to cryptocurrency.
> has exactly one purpose: prevent any single entity from controlling the contents.
I'd like to propose a different characterization: "Blockchain" is when you want unrestricted membership and participation.
Allowing anybody to spin up any number of new nodes they desire is the fundamental requirement which causes a cascade of other design decisions and feedback systems. (Mining, proof-of-X, etc.)
In contrast, deterring one entity from taking over can also be done with a regular distributed database, where the nodes--and which entities operate them--are determined in advance.
Sure, blockchain development has always been deeply tied to ideas of open membership and participation. I like those ideas too.
But that's a poor definition of a blockchain. A blockchain is merely a distributed ledger with certain properties from cryptography.
If you spin up a private bitcoin network, it's a blockchain even if nobody else knows or cares about it. Now, are non-open blockchains at all useful? I suspect so, but I don't know of any great examples.
The wide space between 'membership is determined in advance' and 'literally anyone can make a million identities at a whim' is worth exploring, IMO.
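For concreteness, the "certain properties" are mostly hash-linking; a minimal sketch in Python (illustrative only; a real chain adds consensus on top):

    import hashlib, json, time

    def block_hash(block):
        # The hash covers the payload and the previous block's hash.
        body = {k: block[k] for k in ("prev", "time", "payload")}
        return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

    def make_block(prev_hash, payload):
        block = {"prev": prev_hash, "time": time.time(), "payload": payload}
        block["hash"] = block_hash(block)
        return block

    def valid(chain):
        # Every stored hash must match its block, and every link its parent.
        return all(b["hash"] == block_hash(b) for b in chain) and all(
            b["prev"] == p["hash"] for p, b in zip(chain, chain[1:]))

    chain = [make_block("0" * 64, {"genesis": True})]
    chain.append(make_block(chain[-1]["hash"], {"tx": "alice -> bob, 5"}))
    print(valid(chain))                       # True
    chain[0]["payload"] = {"genesis": False}  # tamper with history...
    print(valid(chain))                       # False, even on a private chain

Nothing in that property depends on who is allowed to run a node.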
> A blockchain is merely a distributed ledger with certain properties from cryptography.
If we charitably assume "blockchain" has some engineering meaning (and it isn't purely a word for marketing/scamming) then there's some new aspect which sets it apart from the distributed-databases we've been able to make just fine for decades.
Uncontrolled participation is that key aspect. Without that linchpin, almost all the other new stuff becomes weirdly moot or actively detrimental.
> If you spin up a private bitcoin network, it's a blockchain even if nobody else knows or cares about it.
That's practically a contradiction in terms. It may describe the ancestry of the project, but it doesn't describe what/how it's being used.
Compare: "If you make a version of Napster/Gnutella with all the networking code disabled, it's still a Peer-to-Peer file sharing client even when only one person uses it."
I interpreted "private" from the comment above yours to mean membership determined by some authority. So your example doesn't hold well, because networking would still be enabled in the file sharing fork, but on a private network rather than the open internet.
You misunderstand. The analogy wouldn't be Napster with networking code disabled. The analogy is Napster on a LAN. Only those on the LAN can access it so it's not open to the world, but nonetheless you've still got a p2p file-sharing client.
And yes. I'm using the engineering definition. I don't believe in letting a gaggle of marketers and scammers define my terms. A blockchain is a specific technology. It doesn't mean 'whatever scam is happening this week', even if said scam involves a blockchain.
I don't blame you for associating blockchains with scams and fully open projects, that's undeniably what we've seen it used for. But that's not what defines a blockchain.
"A scalpel can only be used for surgery"
"If you use a scalpel to cut a steak, it's still a scalpel."
"There must be some new aspect to scalpels! We've been able to make steak knives for decades!"
On reflection I was incorrect to mention Napster, which (at least in its most-popular incarnation) was still centralized for indexing and searching, with P2P only for bulk file data. Please pretend I said "Gnutella" alone.
> The analogy is [the P2P application] on a LAN.
The analogy is the P2P application where regular clients can only discover a Special Master Client that must be running on a fixed IP, which only permits connections if you have credentials for a user-account arranged in advance.
In each case, the system's centerpiece feature is being voided, but that feature is different between them.
1. For "Blockchain", the centerpiece is unrestricted participation. (Other decentralization is an indirect effect.)
2. For P2P file sharing, the centerpiece is how nobody needs to run an expensive/vulnerable central server, but it wasn't a contradiction in terms to have a private peer-network.
Let's use the term xype for anything being hyped without a good reason... :-)
> No one knew what “blockchain” was, how it worked, or what it could be used for.
Not the blockchain itself, but the concept of an immutable, append-only, tamper-proof ledger underpinning it is a very useful one in many contexts where the question of authenticity of datasets arises – the blockchain has given us the ledger database.
The ledger database is more than just a hash: via hash chaining it provides cryptographic proof that the data is authentic, no ability to delete or modify a record, and the entire change history of any record. All these properties make ledger databases very useful in many contexts, especially ones where official documents are involved.
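A minimal sketch of the hash-chaining idea (record shapes invented; production ledger databases add signed digests and real query layers on top):

    import hashlib, json

    class Ledger:
        # Append-only: an "update" adds a new revision, nothing is deleted.
        def __init__(self):
            self.entries = []
            self.prev_hash = "0" * 64

        def append(self, record_id, data):
            entry = {"id": record_id, "data": data, "prev": self.prev_hash}
            # The hash covers the entry and, via "prev", all earlier history.
            entry["hash"] = hashlib.sha256(
                json.dumps(entry, sort_keys=True).encode()).hexdigest()
            self.entries.append(entry)
            self.prev_hash = entry["hash"]

        def current(self, record_id):
            # Simple queries see only the latest revision...
            return [e for e in self.entries if e["id"] == record_id][-1]["data"]

        def history(self, record_id):
            # ...but the audit log retains every revision.
            return [e["data"] for e in self.entries if e["id"] == record_id]

    ledger = Ledger()
    ledger.append("cert-1", {"name": "J. Doe"})
    ledger.append("cert-1", {"name": "J. Smith"})  # a revision, not an overwrite
    print(ledger.current("cert-1"))  # {'name': 'J. Smith'}
    print(ledger.history("cert-1"))  # both revisions: the name change is provable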
I often feel that immutability is very much overrated and goes against the real world. A lot of the legal system is built on reverting things, so things being harder to revert is not actually a desirable property.
No. Official and legal records detest in-place changes, love a full history of changes, and also love being able to walk the entire history of changes back. You do not have to convince me; you can talk to the state auditors instead to get their perspective.
Consider two simple and common scenarios (there are more): citizenship certificates and the recognition of prior learning.
1. Citizenship – the person with the name of Joanna Doe in the birth certificate was issued with a citizenship certificate in the same name. As time goes by, Joanna Doe changes the legal name to Joanna Smith; the citizenship certificate is reissued with the new legal name. We do not want to truly update the existing record in the citizenship database and simply change Doe -> Smith as it will create a mismatch with the name in the birth certificate. If we use a ledger database, an update operation will create a new revision of the same record and all subsequent simple query operations will return the latest updated revision with the new name. The first revision, however, will still be retained in the table's cryptographically verifiable audit/history log.
Why should we care? Because Joanna Smith can accidentally throw their new citizenship certificate away and later they will want to renew their passport (or the driver's licence). The citizenship certificate may be restored[0] by presenting the birth certificate in the original name and the current or expired passport, but the passport is in the new name. From a random system's point of view, Joanna Doe and Joanna Smith are two distinct individuals with no apparent link between them. However, the ledger database can provide proof that it is the same person indeed because the unabridged history of name changes is available, it can be queried and relied upon.
2. Recognition of prior learning – a person has been awarded a degree at institution A. Credits from Institution A contributed to a degree at Institution B. The degree at B is revoked due to issues with source evidence (i.e. Institution A). The ledger database makes such ripple effects deterministic – a revocation event at B triggers rules that re-evaluate dependent awards and enrolments at partners, with a verifiable trail of who was notified and when. If Institution A later corrects its own records, Institution B and downstream bodies can attach a superseding record rather than overwrite, preserving full lineage. The entire drama unfolded will always be available.
2½. Recognition of prior learning (bonus) – an employer verified the degree on the hiring date. Months later it is revoked. The employer can present a ledger proof that, on the hiring date, the credential existed and was valid. It reduces dispute risk and supports fair-use decisions such as probation reviews rather than immediate termination.
All this stuff is very real and the right tech stack (i.e. the ledger DB) reduces the complexity tremendously.
[0] Different jurisdictions have different rules but the procedure is more or less similar amongst them.
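Scenario 2½ is just an "as of" query over that retained history; a toy sketch (dates and record shape invented for illustration):

    from datetime import datetime, timezone

    def revision_as_of(history, when):
        # history: (timestamp, data) pairs in chronological order.
        # Returns the revision in force at `when`, or None if none existed yet.
        current = None
        for ts, data in history:
            if ts > when:
                break
            current = data
        return current

    history = [
        (datetime(2023, 5, 1, tzinfo=timezone.utc),
         {"degree": "BSc", "status": "awarded"}),
        (datetime(2025, 2, 1, tzinfo=timezone.utc),
         {"degree": "BSc", "status": "revoked"}),
    ]
    hired = datetime(2024, 9, 1, tzinfo=timezone.utc)
    print(revision_as_of(history, hired))  # "awarded" on the hiring date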
For all that stuff, I like the blockchain implementation known as "git".
I have heard an argument for git before, and it is a viable fit for some use cases.
The problem, however, stems from the fact that the git commit history can be modified, which automatically disqualifies git in many other use cases, e.g. official government-issued documents, financial records, recognition of prior learning, and similar – anywhere where the entire, unabridged history of records is required.
It can't without destroying every subsequent commit's digest, unless you find a way to generate commits with identical SHA digests.
There is no such thing as «subsequent commits» in git.
Commits in git are non-linear and form a tree[0][1]. A commit can be easily deleted without affecting the rest of the tree. If the git commit represents a subtree with branches dangling off it, deleting such a commit will delete the entire subtree. Commits can also be moved around, detached and re-attached to other commits.
[0] https://www.baeldung.com/ops/git-objects-empty-directory#2-g...
[1] https://www.baeldung.com/ops/git-trees-commit-tree-navigatio...
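A toy model of the property both replies are circling (simplified; real git hashes structured commit objects, but the id likewise covers the parent ids):

    import hashlib

    def commit_id(tree, parents, message):
        # A commit's id covers its content and its parents' ids.
        blob = f"{tree}|{','.join(parents)}|{message}".encode()
        return hashlib.sha1(blob).hexdigest()

    a = commit_id("t1", (), "initial")
    b = commit_id("t2", (a,), "feature")
    c = commit_id("t3", (b,), "fix")

    # Dropping the leaf c (or a whole dangling subtree) changes nothing else:
    # a and b keep their ids whether or not c exists.

    # But rewriting history *behind* other commits is detectable: edit the
    # initial commit and every descendant id changes with it.
    a2 = commit_id("t1-edited", (), "initial")
    b2 = commit_id("t2", (a2,), "feature")
    print(b == b2)  # False: anyone holding the old ids can spot the rewrite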
True but irrelevant. You can't remove a commit in a way that someone else with a copy of the repo can't detect, in exactly the same way and for the same reason that you can't remove a blockchain entry without it being instantly obvious from later items.
Very much relevant. The definition of the ledger database includes immutability of datasets as a hard constraint[0]. The mere fact that the git commit history can be altered automatically disqualifies git from being an acceptable alternative to the ledger database in highly regulated environments.
If strict compliance is not a hard requirement (open source projects are a prime example), git can be used to prove provenance, provided you trust the signer’s public key or allowed-signers file.
[0] https://www.techtarget.com/searchCIO/definition/ledger-datab...
It is immutable in exactly the same way. The git commit history cannot be altered, except in the same sense that you could manually edit the backing store of a blockchain to alter the data, and with the same consequence that the alteration would be instantly noticeable in either case.