The post-GeForce era: What if Nvidia abandons PC gaming?
pcworld.com | 115 points by taubek | 4 days ago
If they do, it'll likely be part of an industry-wide push to kill off the home-built PC market. It's no secret that MS and others want the kind of ecosystem Apple has, and governments want more backdoor access to tech. And which manufacturer wouldn't want to eliminate partial upgrades and repairs? Imagine that one day the only PC you can buy has everything tightly integrated, with no user-serviceable or replaceable parts short of a high-end soldering lab. Since it's impractical to build your own, they can raise the purchase price beyond the reach of most people, and the industry finally succeeds in its rental-PC aspirations.
You're not thinking big enough. Their ultimate goal is gaming (or any computing really) available only in the cloud.
Yeah, they want a return to the TV era, where censors curtail content.
Everyone will own a presentation layer device. Anyone who can only afford the 1GB model can only get SNES quality visuals.
Snow Crash and Neuromancer have displaced the Bible as the cognitive framework of the tech rich.
I'm working on an app that generates and syncs keys 1:1 over local BT and then syncs them again to a home PC (if desired). The idea is to cut out internet middlemen and go back to something like IRC direct connect, something that also requires real-world "touch grass" effort, which complicates things for greedy data collectors.
Testing now by sharing IPs over Signal and then 1:1'ing over whatever app. You can just scaffold new protocols on top of TCP/IP again.
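For what it's worth, the "scaffold on top of TCP/IP" part really is that simple. Here's a minimal Python sketch of that kind of direct 1:1 handshake, assuming a shared key was already exchanged in person (the BT step) and the peer's IP was shared out of band (e.g. over Signal); the names and handshake format are mine, not the actual app's:

    import hashlib
    import hmac
    import secrets
    import socket

    SHARED_KEY = b"key-exchanged-in-person"  # placeholder: would come from the local BT pairing step

    def serve(port: int = 9000) -> None:
        """Listen for one direct peer connection and answer an HMAC challenge."""
        with socket.create_server(("0.0.0.0", port)) as srv:
            conn, _addr = srv.accept()
            with conn:
                challenge = conn.recv(32)
                # Prove we hold the same key that was exchanged face to face.
                conn.sendall(hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest())

    def connect(peer_ip: str, port: int = 9000) -> bool:
        """Dial a peer whose IP was shared out of band and verify its response."""
        challenge = secrets.token_bytes(32)
        with socket.create_connection((peer_ip, port)) as conn:
            conn.sendall(challenge)
            reply = conn.recv(64)
        expected = hmac.new(SHARED_KEY, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(reply, expected)

No middleman beyond plain IP routing; everything that matters (the key) was exchanged in meatspace.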
It’s not really about censorship though; it’s about having control over a rent economy where there is no ownership. It provides maximum profit potential.
Different entities have different goals but are cooperating in making this happen so they each get what they want. Global corporations get guaranteed income streams from most of the population while governments and ideological groups get control over the flow of communication between people to ensure correct think.
See, I wrote that out but then I thought, “Nah, that’s too conspiracy for this crowd.” But lo! Yeah. Not excited about the emerging status quo.
Even bigger than that, it’s all a slow march to a sort of narcissistic feudalism.
This. Your home PC is just another appliance now. Welcome to your Copilot future.
Maybe then the year of Linux (or OpenBSD?) on the desktop would finally arrive. Maybe antitrust would finally get used. Maybe parts could be salvaged from data-centre scrap.
Interesting times they would be!
Linux isn’t much use if you can’t get hold of (non-locked-down) hardware to run it on
It's already here right now, unironically. There's no need for Windows for gaming now. I just built a new rig with a 7900 XTX, and with Steam on Arch Linux everything just works with absolutely no hassle or thinking. This was the only value Windows still had, and now that's over.
I broadly concur. These days, gaming is usually very easy on Linux.
Except: If I want to kill some time being chaotic in GTA:V Online, and do that in Linux, then that is forbidden. Single player mode works beautifully, but multiplayer is verboten.
(And I'm sure that there are other games that similarly feature very deliberate brokenness, but that's my example.)
There are still some pain points with Linux distros: With some, an upgrade can leave you unable to boot into a graphical login screen. This can also happen if you leave a Linux installation, like Manjaro, alone for a year and then do an update.
That is a problem if you happen to have an Nvidia GPU, and, as the article says, if Nvidia forces the issue you won't be able to get that brand of consumer gaming GPU anymore.
Manjaro is Arch under the hood, and Arch is supposed to be updated fairly frequently.
For every Linux distro, there's always someone who tells you "you've done it wrong" whenever it fails. You never get this with anything else. It's unacceptable and I reject it.
OpenBSD is starting to look enticing.
FOSS is more divided than ever, which is an interesting situation given the timing when they should be a solid place for individuals to turn to against the centralization of control. It's quite convenient that so many petty little wars have broken out across the FOSS landscape at just the right time.
Stadia worked when conditions were good, and GeForce Now exists. No cheaters in multiplayer (though new ones always appear), so it's one way to go. They're even doing a thing with cellphones as mere presentation devices playing a full-screen video stream that you can interact with.
If they do it, all gamers will boycott LLMs. Which would be a godsend. Decades spent trying to save power, moving to LEDs, improving efficiency everywhere, and now... we're wasting terawatts on digital parrots.
I think China will then try to sell its own PC parts instead; its semiconductor industry is catching up, so who knows where it will be in a decade.
But then the US will probably reply with tariffs on PC parts (or even ban them!), which is slowly becoming the norm for US economic policy and won't reverse even after Trump.
There is definitely a part of me that feels that, with increasing RAM prices and the like, it's getting hard for people to run a home lab.
It also feels like there's now more friction in what is already a really competitive, high-friction business of building a cloud.
With RAM prices increasing, and as far as I know only likely to come down in 2027-2028 or when this bubble pops, it would be extremely expensive for a new cloud provider to enter this space.
When I say cloud provider, I don't mean the trifecta of AWS, Azure, or GCP, but rather all the other providers who bought their own hardware, colocate it in a datacenter, and sell low/mid-range VPS/VDS servers.
I had previously thought about building a cloud, but in this economy and the current situation I'd much rather wait.
The best bet right now for most people building a cloud or providing such services is probably white-labeling another provider and offering services on top that make you special.
Servers are still rather cheap, but the mood I see among providers right now is that they're willing to absorb the costs for a while so as not to create a frenzy (so prices stay low), while cautiously watching the whole situation. If recent developments keep going this way, I wouldn't be surprised if server providers raise prices, because the underlying hardware's RAM prices have increased too.
Feel the same way here. Can't help but get the vibe that big tech wants to lock consumers out, eliminate the ability to have personal computing/self-hosted computing. Maybe in tandem with governments, not sure, but it's certainly appetizing to them from a profit perspective.
The end goal is the elimination of personal ownership over any tech. They want us to have to rent everything.
Honestly, it's not so much that they want us to rent everything; it's that there's effectively an AI tax on the general public (and hobbyists), where the price of hardware and RAM keeps rising because of AI demand.
I don't think they deliberately choked the supply, but it sure happened, and that doesn't change the fact that hardware prices might increase (or already have).
That might be a bit on the paranoid side. It could just be that it's far more profitable right now for companies to sell only to data centres. That way, they don't need to spend money on advertising, or share their revenues with third-party sellers.
I doubt that this would ever happen. But...
If it does, I think it would be a good thing.
The reason is that it would finally motivate game developers to be more realistic in their minimum hardware requirements, enabling games to be playable on onboard GPUs.
Right now, most recent games (for example, many games built on Unreal Engine 5) are unplayable on onboard GPUs. Game and engine devs simply don't bother to optimize for the low end anymore, and so they end up gatekeeping games and excluding millions of devices, because recent games require a discrete GPU even at the lowest settings.
They're not targeting high-end PCs. They're targeting current-generation consoles, specifically the PS5 at 1080p. It just turns out that when you take those system requirements and put them on a PC (especially one with a 1440p or 2160p ultrawide), they translate into pretty top-of-the-line hardware. Particularly if, as a PC gamer, you expect to run at 90 fps and not the 30-40 that is typical for consoles.
1440p and 2160p are a total waste of pixels when 1080p is already at the level of human visual acuity. You can argue that 1440p is a genuine (slight) improvement for super-crisp text, but not for a game. HDR and more ray tracing/path tracing, etc. are more sensible ways of pushing quality higher.
Without disagreeing with the broad strokes of your comment, it feels like 4K should be considered standard for consoles nowadays - a very usable 4K HDR TV can be had for $150-500.
That's a waste of image quality for most people. You have to sit very close to a 4K display to perceive the full resolution. On PC you might be two feet from a huge gaming monitor, but an extremely small percentage of console players have the TV size-to-distance ratio where they would get much out of full 4K. Much better to spend the compute on higher framerate or higher detail settings.
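To put rough numbers on the size-to-distance argument: 20/20 acuity works out to roughly one arcminute per pixel, i.e. about 60 pixels per degree. A quick back-of-the-envelope sketch (the setups and numbers are my own illustrations, not measurements):

    import math

    def pixels_per_degree(horizontal_px, diagonal_in, distance_in, aspect=16 / 9):
        """Horizontal pixels packed into one degree of the viewer's visual field."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width from the diagonal
        fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
        return horizontal_px / fov_deg

    # 65" TV viewed from ~9 ft (108"): 1080p already lands near the ~60 px/deg acuity limit
    print(round(pixels_per_degree(1920, 65, 108)))  # ~65 px/deg
    print(round(pixels_per_degree(3840, 65, 108)))  # ~131 px/deg, mostly beyond what viewers can resolve
    # 32" monitor at ~2 ft (24"): the jump from 1080p to 4K is still visible here
    print(round(pixels_per_degree(1920, 32, 24)))   # ~32 px/deg, clearly under the limit
    print(round(pixels_per_degree(3840, 32, 24)))   # ~64 px/deg, right around it

So the desk-distance PC monitor is exactly the case where 4K still pays off, and the typical living-room console setup is where it mostly doesn't.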
I think higher detail is where most of it goes. A lower resolution, upscaled image of a detailed scene, at medium framerate reads to most normal people as "better" than a less-detailed scene rendered at native 4k, especially when it's in motion.
Assuming you can render natively at high FPS, 4k makes a bigger difference on rendered images than live action because it essentially brute forces antialiasing.
You wish. Games will just be published cloud-only and you can only play them via thin clients.
It's been pretty consistently shown that this just can't provide low enough latency for gamers to be comfortable with it. Every attempt at providing this experience has failed. There are few games where it is even theoretically viable.
The economics also have issues: now you have to run a bunch more datacenters full of GPUs, with an inconsistent usage curve leaving a lot of them idle at any given time. You'd have to charge a subscription to justify that, which the market would not accept.
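To make the latency point concrete, here's a rough input-to-photon budget sketch; the per-stage numbers are my own ballpark assumptions, not measurements from any particular service:

    # Rough, illustrative input-to-photon budget for streamed vs. local play at 60 fps.
    FRAME_MS = 1000 / 60  # ~16.7 ms per frame

    local = {
        "input + game loop": FRAME_MS,   # one frame of simulation
        "render + display": FRAME_MS,    # one frame of render/scanout
    }

    streamed = {
        "input + game loop": FRAME_MS,
        "render + display": FRAME_MS,
        "encode": 5,                     # hardware video encode on the server
        "network round trip": 20,        # a *good* regional connection
        "decode + jitter buffer": 10,    # client-side decode and buffering
    }

    print(f"local:    {sum(local.values()):.0f} ms")
    print(f"streamed: {sum(streamed.values()):.0f} ms")
    # Even under friendly assumptions the stream adds ~35 ms on top of local,
    # which is why it never quite feels right to latency-sensitive players.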
I'm pretty sure that current GPU demand from the AI craze can pretty much absorb the idle-time issue at major datacenters.
Not that it's good or bad, but we could probably see something akin to spot instances of GPUs being offered for gaming purposes.
I do see a lot of companies offering GPU access billed per second with instant shutdown/restart, I suppose, but overall I agree.
My brother recently came for the holidays and I played PS5 for the first time, streamed to his Mac from his room 70-100 km away. Honestly, the biggest latency factor was the Wi-Fi connection (which was tethered to his phone's carrier). Overall it was a good enough experience, but I only played Mortal Kombat for a few minutes :)
Current datacenter GPUs are optimized for LLM compute, not for real-time rendering. The economics for running such beefy GPUs just for game streaming won't add up.
Consoles and their install base set the target performance envelope. If your machine can't keep up with a 5 year old console then you should lower expectations.
And like, when have onboard GPUs ever been good? The fact that they're even feasible these days should be praised, but you're imagining some past where devs didn't leave them behind.
True. Optimization is completely dead. Long gone are the days of a game being amazing because the devs managed to pull off crazy graphics on the current hardware.
Nowadays a game is only poorly optimized if it's literally unplayable or laggy, and you're forced to constantly upgrade your hardware with no discernible performance gain otherwise.
Crazy take; in the late 90s/early 00s your GPU could be obsolete 9 months after buying it. The "optimisation" you talk about was that the CPU in the PS4 generation was so weak, and tech was moving so fast, that any PC bought from 2015 onwards could easily brute-force its way past anything built for that generation.
> Crazy take, in the late 90s/early 00s your GPU could be obsolete 9 months after buying.
Not because the developers were lazy, but because newer GPUs were that much better.
There were lazy devs back then too but I feel lazy devs have become the norm now.
I work in gamedev, historically AAA gamedev.
If you think that the programmers are unmotivated (lazy) or incompetent; you’re wrong on both counts.
The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
The issue is that games carry expectations today that they simply didn't have before.
There are very few "yearly titles" that allow you to nail down the software in a nicer way over time; it's always a mad dash to get it done, on a huge 1000+ person project that has to be permanently playable from MAIN and where unit/integration tests would be completely useless the minute they were built.
The industry will end, but not because of "lazy devs"; it's the ballooned expectations, stagnant revenue opportunity, increased team sizes, and a pathological contingent of people using games as a (bad) political vehicle without regard for the fact that they will be laid off if they can't eventually generate revenue.
---
Finally, back in the early days of games, if the game didn’t work, you assumed you needed better hardware and you would put the work in fixing drivers and settings or even upgrading to something that worked. Now if it doesn’t work on something from before COVID the consensus is that it is not optimised enough. I’m not casting aspersions at the mindset, but it’s a different mentality.
Most gamers don't have the faintest clue regarding how much work and effort a game requires these days to meet even the minimum expectations they have.
That's bullshit. I don't care about graphics, I play lots of indie games, some of them are made by a single person. There are free game engines, so basically all one needs for a successful game is just a good idea for the game.
And a friend of mine still mostly plays the goddamn Ultima Online, the game that was released 28 years ago.
And if a new game came out today that looked and played the same as Ultima Online… what would you (and the rest of gamers) think of it?
Your expectations of that game are set appropriately. Same with a lot of indie games: the expectation can be that it's in early access for a decade+. You would never accept that from, say, Ubisoft.
> The amount of care and talent is unmatched in my professional career, and they are often working from incomplete (and changing) specifications towards a fixed deadline across multiple hardware targets.
I fully agree, and I really admire the people working in the industry. When I see great games that are unplayable on the low end because of stupidly high minimum hardware requirements, I understand that game devs are simply responding to internal trends within the industry, and especially going for a practical outcome by using an established game engine (such as Unreal 5).
But at some point I hope this GPU crunch forces this same industry to allocate time and resources, either at the engine or at the game level, to truly optimize for a realistic low end.
The lead time for a new engine is about 7 years (on the low end).
I don’t think any company that has given up their internal engine could invest 7 years of effort without even having revenue from a game to show for it.
So the industry will likely rally around Unreal and Unity, and I think a handful of the major players will release their engines under license... but Unreal will eat them alive due to its investments in dev UX (which is much, much higher than in proprietary game engines, IME). Otherwise the only engines that can really innovate are gated behind AAA publishers and their push for revenue (against investment for any other purpose).
All this to say, I'm sorry to disappoint you, but it's very unlikely.
Games will have to get smaller and have better revenues.
I'm not implying at all that every game company should develop their own in-house engine.
But maybe, just maybe, they could request Epic or Unity to optimize their engines better for the lower end.