AMD and Sony's PS6 chipset aims to rethink the current graphics pipeline
arstechnica.com | 317 points by zdw 2 days ago
Could the PS6 be the last console generation with a substantial improvement in compute and graphics? Miniaturization yields ever-diminishing returns with each shrink, and prices of electronics are going up (even sans tariffs), led by the rising cost of making chips. Alternate techniques have slowly been introduced to offset the compute deficit: first post-processing AA in the seventh generation, then "temporal everything" hacks (including TAA) in the previous generation, and finally minor use of AI upscaling in the current generation and (projected) major use of AI upscaling and frame-gen in the next.
However, I'm pessimistic about how this can keep evolving. RT already takes a non-trivial amount of the transistor budget, and now those high-end AI solutions require another considerable chunk of it. If we are already reaching the limits of what non-generative AI upscaling and frame-gen can do, I can't see where a PS7 can go other than using generative AI to interpret a very crude, low-detail frame and generate a highly detailed photorealistic scene from it. But that will, I think, require many times more transistor budget than what will likely ever be economically achievable for a whole PS7 system.
Will that be the end of consoles? Will everything move to the cloud, with a power-guzzling 4 kW machine taking care of rendering your PS7 game?
I can really only hope there is a breakthrough in miniaturization so we can return to a pace of improvement that actually gives us a new generation of consoles (and computers) that makes the transition from an SNES to an N64 feel quaint.
My kids are playing Fortnite on a PS4. It works, they are happy, and I feel the rendering is really good (but I am an old guy); normally, the only problem while playing is the stability of the Internet connection.
We also have a lot of fun playing board games and card games with simple designs; here, the gameplay is the fun factor. Yes, better hardware may bring more realism, more X or Y, but my feeling is that the real driver, long term, is the quality of the gameplay. Like the quality of the storytelling in a good movie.
Yes, that's something I failed to address in my post. I myself have also been happier playing older or just simpler games than chasing the latest AAA with cutting edge graphics.
What I see as a problem, though, is that the incumbent console manufacturers, Nintendo aside, have been chasing graphical fidelity since time immemorial as the main attraction of new console generations, and they may have a hard time convincing buyers to purchase a new system once they can't eke out meaningful gains in this area. Maybe they will successfully transition into something more akin to what Nintendo does and focus on delivering killer apps, gimmicks, and other innovations each generation.
Or perhaps they will slowly fade into irrelevance and everything will converge on PC/Steam (I doubt Microsoft can pull off whatever plan they have for the future of Xbox), any half-decent computer will run any game for decades to come, and Gabe Newell will become the richest person in the world.
Every generation thinks the current generation of graphics won't be topped, but I think you have no idea what putting realtime generative models into the rendering pipeline will do for realism. We will finally get rid of the uncanny valley effect with facial rendering, and the results will almost certainly be mindblowing.
Every generation also thinks that the uncanny valley will be conquered in the next generation ;)
The quest for graphical realism in games has been running into a diminishing-returns wall for quite a while now (see hardware raytracing: all that effort for slightly better reflections and shadows, yay?). What we need most right now is more risk-taking in gameplay by big-budget games.
I think the inevitable near future is that games are not just upscaled by AI, but they are entirely AI generated in realtime. I’m not technical enough to know what this means for future console requirements, but I imagine if they just have to run the generative model, it’s… less intense than how current games are rendered for equivalent results.
I don't think you grasp how many GPUs are used to run world-simulation models. It is vastly more compute-intensive than the currently dominant paradigm of real-time rendering of rasterized triangles.
I don't think you grasp what I'm saying? I'm talking about next token prediction to generate video frames.
Yeah, which is pretty slow due to the need to autoregressively generate each image frame token in sequence. And leading diffusion models need to progressively denoise each frame. These are very expensive computationally. Generating the entire world using current techniques is incredibly expensive compared to rendering and rasterizing triangles, which is almost completely parallelized by comparison.
Okay you clearly know 20x more than me about this, so I cannot logically argue. But the vague hunch remains that this is the future of video games. Within 3 to 4 years.
I don't think that will ever happen, due to extreme hardware requirements. What I do see happening is that only an extremely low-fidelity scene is rendered, with only basic shapes and little or no texturing, which is then filled in by AI. DLSS taken to the extreme: not just resolution but the whole stack.
I’m thinking more procedural generation of assets. If done efficiently enough, a game could generate its assets on the fly, and plan for future areas of exploration. It doesn’t have to be rerendered every time the player moves around. Just once, then it’s cached until it’s not needed anymore.
Even if you could generate real-time 4K 120hz gameplay that reacts to a player's input and the hardware doesn't cost a fortune, you would still need to deal with all the shortcomings of LLMs: hallucinations, limited context/history, prompt injection, no real grasp of logic / space / whatever the game is about.
Maybe if there's a fundamental leap in AI. It's still undecided if larger datasets and larger models will make these problems go away.
Realtime AI generated video games do exist, and they're as... "interesting" as you might think. Search YouTube for AI Minecraft
Good luck trying to tell a "cinematic story" with that approach, or even trying to prevent the player from getting stuck and not being able to finish the game, or even just to reproduce and fix problems, or even just to get consistent result when the player turns the head and then turns it back etc etc ;)
There's a reason why such "build your own story" games like Dwarf Fortress are fairly niche.
Unreal engine 1 looks good to me, so I am not a good judge.
I keep thinking there is going to be a video game crash soon, over saturation of samey games. But I'm probably wrong about that. I just think that's what Nintendo had right all along: if you commoditize games, they become worthless. We have endless choice of crap now.
In 1994, at age 13, I stopped playing games altogether. Endless 2D fighters and 2D platformers were just boring. It took playing Wave Race and GoldenEye on the N64 to drag me back in. They were truly extraordinary and completely new experiences (me and my mates never liked Doom). Anyway, I don't see this kind of shift ever happening again. In fact, talking to my 13-year-old nephew confirms what I (probably wrongly) believe: he's complaining there's nothing new. He's bored of Fortnite and Minecraft and whatever else. It's like he's experiencing what I experienced, but I doubt a new generation of hardware will change anything.
> Unreal engine 1 looks good to me, so I am not a good judge.
But we did hit a point where the games were good enough, and better hardware just meant more polygons, better textures, and more lighting. The issue with Unreal Engine 1 (or maybe just games of that era) was that the worlds were too sparse.
> over saturation of samey games
So that's the thing. Are we at a point where graphics and gameplay in 10-year-old games is good enough?
> Are we at a point where graphics and gameplay in 10-year-old games is good enough?
Personally, there are enough good games from the 32-bit generation of consoles and before to keep me from ever needing to buy a new console, and those are games from ~25 years ago. I can comfortably play them on a MiSTer (or whatever PC).
Yep, I have a MiSTer and a Steam Deck that's mainly used for emulators and old PC games. I'm still chasing old highs.
If the graphics aren’t adding to the fun and freshness of the game, then nearly. Rewatching old movies instead of seeing new ones is already a trend. Video games are already ripe for the same.
Now I'm going to disagree with myself... there came a point where movies started innovating in storytelling rather than in technical aspects (think Panavision). Anything SFX-driven is different, but the stories movies tell, and how they tell them, kept changing even after the underlying technology had settled.
I get so sad when I hear people say there’s no new games. There are so many great, innovative games being made today, more than any time in history. There are far more great games on Steam than anyone can play in a lifetime.
Even AAAs aim to create new levels of spectacle (much like blockbuster movies), even if they don’t innovate on gameplay.
The fatigue is real (and I think it’s particularly bad for this generation raised to spend all their gaming time inside the big 3), but there’s something for you out there, the problem is discoverability, not a lack of innovation.
This, so much. Anyone who says games used to be better is either not looking or has lost their sight to nostalgia.
"if you commoditize games, they become worthless"
Hmm, wrong? If everyone can make games, the floor rises, which pushes the "industry standard" for a game really high.
While I agree that if everything is A, then A doesn't mean anything, the problem is that A doesn't vanish; it just moves to a higher tier.
You probably have a point and it's not something I believe completely. My main problem I think is I have seen nothing new in games for 20 years at least.
Gunpei Yokoi said something similar here:
https://shmuplations.com/yokoi/
Yokoi: When I ask myself why things are like this today, I wonder if it isn’t because we’ve run out of ideas for games. Recent games take the same basic elements from older games, but slap on characters, improve the graphics and processing speed… basically, they make games through a process of ornamentation.
That's the Nintendo way. Avoiding the photorealism war altogether by making things intentionally sparse and cartoony. Then you can sell cheap hardware, make things portable etc.
Also, Nintendo's vision, which is "mobile gaming":
handheld devices like the Switch, Steam Deck, etc. are really the future. Phones count to some extent, but gaming on a phone vs. gaming on a handheld is a world of difference.
Give it a few generations and traditional consoles will be obsolete. I mean, we literally have a lot of people enjoying indie games on the Steam Deck right now.
It sounds like even the PS6 isn’t going to have a meaningful improvement, and that the PS5 was the last such console. The PS5 Pro was the first console focused on fake frame generation instead of real improvements in output resolution/frame rate, and per the article the PS6 is continuing that trend.
What really matters is the cost.
In the past, a game console might launch at a high price point; after a few years the price would go down, and they could release a new console at a price close to where the last one started.
Blame crypto, AI, or COVID, but there has been no price drop for the PS5, and if there were going to be a PS6 that was really better, it would probably have to cost upwards of $1000, at which point you might as well get a PC. Sure, there are people who haven’t tried Steam + an Xbox controller and think PC gaming is all unfun and sweaty, but they will come around.
Inflation. The PS5 standard at $499 in 2019 is $632 in 2025 money, which is about the same as the 1995 PS1 when adjusted for inflation: $299 (1995) is $635 (2025). https://www.usinflationcalculator.com/
Thus the PS6 should be around $699 at launch.
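The comparison above can be sketched in a few lines of Python. Note the 2025-dollar figures are the comment's own numbers (attributed to usinflationcalculator.com), not recomputed from CPI data here:

```python
# Launch prices in nominal dollars, paired with the comment's
# inflation-adjusted 2025 equivalents (per usinflationcalculator.com).
launches = {
    "PS1 (1995)": (299, 635),
    "PS5 (2019)": (499, 632),
}

for name, (nominal, in_2025) in launches.items():
    factor = in_2025 / nominal  # implied cumulative inflation multiplier
    print(f"{name}: ${nominal} nominal ~ ${in_2025} in 2025 dollars "
          f"(x{factor:.2f} cumulative inflation)")
```

Both launches land around $630–$635 in today's money, which is why a ~$699 PS6 would sit near the same real price point rather than being a dramatic hike.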
The main issue with inflation is that my salary is not inflation-adjusted. The relative price increase adjusted for inflation might be zero, but the relative price increase adjusted for my salary is not.
The phrase “cost of living increase” is used to refer to an annual salary increase designed to keep up with inflation.
Typically, you should receive at least a cost-of-living increase each year. This is standard practice at every company I’ve ever worked for, and it’s common across the industry. A true raise is the amount above and beyond the annual cost-of-living increase.
If your company has been keeping your salary fixed during this time of inflation, then you are correct that you are losing earning power. I would strongly recommend you hit the job market if that’s the case because the rest of the world has moved on.
In some of the lower wage brackets (not us tech people) the increase in wages has actually outpaced inflation.
Typically "Cost Of Living" increases target roughly inflation. They don't really keep up though, due to taxes.
If you've got a decent tech job in Canada, your marginal tax rate will be near 50%. Any new income is taxed at that rate, so that 3% COL raise is really a 1.5% raise in after-tax terms, which typically leaves you worse off.
Until you're at a very comfortable salary, you're better off job-hopping to boost your salary. I'm pretty sure the finance people are well aware they're eroding their employees' salaries over time, and are hoping you are not.
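The arithmetic in the comment above, as a hypothetical sketch (the salary and rates here are illustrative assumptions, not figures from the thread):

```python
gross = 120_000        # hypothetical pre-raise salary
col_raise = 0.03       # 3% cost-of-living raise
marginal_rate = 0.50   # tax rate on each additional dollar earned

raise_gross = gross * col_raise                # 3,600 before tax
raise_net = raise_gross * (1 - marginal_rate)  # 1,800 after tax

# Relative to gross salary, the raise is effectively halved:
print(raise_net / gross)  # 0.015, i.e. the "1.5%" in the comment
```

(Measured against prior take-home pay rather than gross, the increase is somewhat larger, since the average tax rate is below the marginal rate; but the point stands that high marginal rates shrink nominal COL raises.)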
Tax brackets also shift through time, though less frequently. So if you only get COL increases for 20 years, you’re going to end up reasonably close to the same after-tax income, barring significant changes to the tax code.
In the US, the bottom tax bracket was 10% under $19,750 in 2020, with 12% for the next bracket; in 2025 it’s 10% under $23,850, then 12% for the next bracket. https://taxfoundation.org/data/all/federal/historical-income...
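Those two bracket snapshots can be turned into a tiny comparison. Only the first two brackets cited in the comment are modeled, so this is illustrative rather than a full tax calculation:

```python
# Bottom US federal brackets from the comment: (upper bound, rate).
brackets = {
    2020: [(19_750, 0.10), (float("inf"), 0.12)],
    2025: [(23_850, 0.10), (float("inf"), 0.12)],
}

def tax(income, year):
    """Progressive tax owed on `income` using the two modeled brackets."""
    owed, lower = 0.0, 0.0
    for upper, rate in brackets[year]:
        owed += rate * max(0.0, min(income, upper) - lower)
        lower = upper
        if income <= upper:
            break
    return owed

# Same nominal income is taxed slightly less in 2025 because the
# 10% bracket widened: ~3205 owed in 2020 vs ~3123 in 2025 on $30k.
print(tax(30_000, 2020), tax(30_000, 2025))
```

The bracket shift effectively passes a small amount of inflation indexing through to after-tax income, which is the commenter's point about COL-only raises roughly holding steady over time.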