Nvidia says frame generation and upscaling will require us to rethink benchmarks

Daniel Sims

In context: PC gaming and graphics card benchmarks used to focus primarily on frame rates. Although average frame rates still draw the most attention, other factors like latency, 1% lows, upscaling, and frame generation have made performance analysis more complex. Nvidia's RTX 50 series GPUs will soon throw multi-frame rendering into the mix, and the company has suggestions (of course) for properly measuring its impact.

After unveiling its next-generation GPUs and the latest changes to DLSS upscaling at CES 2025, Nvidia is offering outlets some advice for analyzing how upcoming games perform on the new graphics cards. Although the company's presentation predictably paints the Blackwell cards in the best possible light, the newly introduced multi-frame rendering technology will undeniably require new approaches.

The introduction of DLSS and later AMD's FSR forced benchmarks to add bar charts accounting for the performance gains from upscaling. Image reconstruction has also made analyzing image quality more important.

Frame generation (see our feature: Nvidia DLSS 3: Fake Frames or Big Gains?), which inserts an interpolated frame between every pair of traditionally rendered frames, added another dimension to technical reviews. The technology makes image quality comparisons even more important, and examinations must now account for the latency it adds.

Multi-frame generation further complicates things by inserting two or three interpolated frames for every rendered frame, paced by new flip metering functionality. Nvidia says flip metering, exclusive to RTX 50 series GPUs, is necessary to maintain frame pacing consistency – another crucial benchmarking factor.
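As a rough illustration of the arithmetic involved (a simplified sketch, not Nvidia's own numbers), the displayed frame rate scales with the number of generated frames while the underlying rendered rate, and with it the game simulation, stays put:

# Simplified model: displayed frame rate when N generated frames are inserted
# after every traditionally rendered frame. The engine still simulates and
# renders at the base rate.
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    return rendered_fps * (1 + generated_per_rendered)

base = 60  # frames actually rendered by the game each second
for n in (1, 2, 3):  # DLSS 3 frame gen uses 1; multi-frame generation up to 3
    print(f"{base} rendered fps + {n} generated -> {displayed_fps(base, n):.0f} fps displayed")
# Output: 120, 180, and 240 fps displayed, all from the same 60 rendered fps.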


Tom's Hardware reports that, in a CES presentation, the company advised outlets to switch from FrameView's usual frame-time measurement to the MsBetweenDisplayChange metric, as it can more accurately account for DLSS 4, flip metering, and frame rate fluctuations.
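For readers curious what that looks like in practice, here is a minimal sketch of how a review might summarize a FrameView/PresentMon-style capture using that column; the column name and CSV layout are assumed from PresentMon conventions and may differ between tool versions:

# Summarize a FrameView/PresentMon-style CSV capture using the
# MsBetweenDisplayChange column (assumed name; layouts vary by version).
import csv
import statistics

def summarize(csv_path: str) -> None:
    frame_times_ms = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            value = float(row.get("MsBetweenDisplayChange", 0) or 0)
            if value > 0:  # skip presents that never reached the display
                frame_times_ms.append(value)
    avg_fps = 1000 / statistics.mean(frame_times_ms)
    # "1% low": average frame rate across the slowest 1% of displayed frames
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
    low_1pct_fps = 1000 / statistics.mean(slowest)
    print(f"avg: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")

# summarize("capture.csv")  # hypothetical log file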

When technical reviews are published in the coming weeks after the first RTX Blackwell cards launch in late January, fairly comparing their frame generation results to those of AMD GPUs could prove challenging, as FSR 3 still only produces one interpolated frame for every rendered frame.

A game running at 240fps using AI-generated frames might look smoother than the same game running at 120fps with fewer interpolated frames, or 60fps with only rendered frames, but all of those profiles might feel noticeably different.
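A back-of-the-envelope comparison makes the distinction concrete (a simplified model that ignores Reflex and per-game pipeline differences): the display interval shrinks, but the rate at which the game samples input does not.

# The three profiles above: display interval governs perceived smoothness,
# while the rendered rate governs how often input is sampled. Real latency
# also depends on the engine, Reflex, and frame generation buffering.
profiles = [
    ("240 fps, 3 generated frames per rendered frame", 240, 4),
    ("120 fps, 1 generated frame per rendered frame", 120, 2),
    ("60 fps, rendered frames only", 60, 1),
]
for name, displayed, multiplier in profiles:
    rendered = displayed / multiplier
    print(f"{name}: new image every {1000 / displayed:.1f} ms, "
          f"input sampled every {1000 / rendered:.1f} ms")
# All three render 60 frames per second, so they respond to input similarly
# even though the 240 fps profile looks far smoother.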

Eurogamer's early analysis of Cyberpunk 2077 running with DLSS 4 showed that multi-frame rendering adds minimal latency and visual artifacts, but it remains unclear whether other titles will exhibit similar results.

Following CES, Nvidia has provided more in-depth information about DLSS 4 and how each 50 series card compares to its direct predecessor. Most of the charts use multi-frame generation to depict performance improvements ranging between 200 and 400 percent. However, the Resident Evil 4 Remake and Horizon Forbidden West, which haven't been updated to receive multi-frame rendering support, show far more modest (and more realistic) raw performance improvements of between 15 and 30 percent.


 
DLSS 3 isn’t perfect, but let’s be real, it’s absolutely necessary for games like Cyberpunk 2077 with RT Overdrive. Without DLSS, traditional rendering would tank performance so hard it’d be unplayable for most people. And for even more demanding games, DLSS 3 is pretty much a must.

I’ve had a great experience with DLSS 3 so far, but yeah, I’m running a high-end RTX desktop and laptop, so I get that not everyone’s setup will benefit the same way. That said, it makes total sense for Nvidia to keep pushing DLSS forward, and DLSS 4 feels like the next logical step.

As the article pointed out, high frame rates don’t always mean it feels smooth, and that’s something that’ll need more refinement. Still, DLSS 4 seems like a big win to me, as it adds just a tiny bit of latency for the massive boost in frame rate, which is a trade-off I’ll happily take.

We’ll have to see how this plays out in independent benchmarks and how well games adopt the tech, but I’m optimistic. The 50-series feels like Nvidia’s vision for the future of gaming with AI doing the heavy lifting, almost like “offline rendering” for extremely detailed games. Nvidia, to me, is miles ahead of the competition when it comes to the next big thing, and you can always count on Nvidia.
 
DLSS and Frame Generation are the best things to happen to gaming, like ever. Those that say that they are 'fake frames' are clueless imbeciles that haven't a clue.
Or maybe the grown-ups want honest benchmarking to remain (whether you choose to include DLSS upscaling as an extra or not) and for tech sites to not start doing the GPU equivalent of making silly CPU video encoding benchmark charts where all the new CPUs are tested only with QuickSync on for 720p source videos and then bragging about "+300% uplifts" vs. all the older ones tested with QuickSync off with 1080p source videos, then yelling "everyone but me is a clueless imbecile!"... ;)
 
Very funny, Jensen.

Measuring performance without upscaling or frame generation is never going away. Nor should it.
You most definitely should measure it with upscaling/FG consideration, because that is what will be used, and those that are 'Native Dummies' are just that, dummies, that in their stubbornness will continue to whine and complain about poor performance that is caused all by themselves, it's pretty sad actually, they probably think the earth is the center of the universe too... #facepalm
 
DLSS 3 isn’t perfect, but let’s be real, it’s absolutely necessary for games like Cyberpunk 2077 with RT Overdrive. Without DLSS, traditional rendering would tank performance so hard it’d be unplayable for most people. And for even more demanding games, DLSS 3 is pretty much a must.

I’ve had a great experience with DLSS 3 so far, but yeah, I’m running a high-end RTX desktop and laptop, so I get that not everyone’s setup will benefit the same way. That said, it makes total sense for Nvidia to keep pushing DLSS forward, and DLSS 4 feels like the next logical step.

As the article pointed out, high frame rates don’t always mean it feels smooth, and that’s something that’ll need more refinement. Still, DLSS 4 seems like a big win to me, as it adds just a tiny bit of latency for the massive boost in frame rate, which is a trade-off I’ll happily take.

We’ll have to see how this plays out in independent benchmarks and how well games adopt the tech, but I’m optimistic. The 50-series feels like Nvidia’s vision for the future of gaming with AI doing the heavy lifting, almost like “offline rendering” for extremely detailed games. Nvidia, to me, is miles ahead of the competition when it comes to the next big thing, and you can always count on Nvidia.

DLSS is one thing, frame generation is a totally different one. I agree that DLSS is an amazing technology, the AI upscaler is amazing, but I don't have the same opinion about frame gen: it always shows artifacts, and however small they are, they're there. I have never left it on for long.

Also, whoever is into high refresh rate gaming is there because they want every bit of competitive advantage possible, and those gamers loathe the latency added by frame gen; they prefer to play on the lowest quality settings with draw distance on ultra.

On the other side of the coin are the non-competitive gamers who like things to look pretty (I'm in this camp), but then again, for me even 60 Hz is OK, and the artifacts added by frame gen are a deal-breaker when I want things to look "pretty", so it's just a no-go for me.
 
You most definitely should measure it with upscaling/FG consideration, because that is what will be used, and those that are 'Native Dummies' are just that, dummies, that in their stubbornness will continue to whine and complain about poor performance that is caused all by themselves, it's pretty sad actually, they probably think the earth is the center of the universe too... #facepalm
It's not that people are dummies for not wanting it. It's that it's being marketed as if it is native performance on their cards and charging extra for non-native performance. There are things that are great about this tech and there are things that are not great about it.

This is more about price than it is what the features are. People would be a lot more accepting of this tech if we were charged for native performance with upscaling and frame gen being bonus on top of it. The 5070 isn't a 4090. The 5070 isn't going to perform like a 4090 in games that don't have DLSS4 support.
 
DLSS and Frame Generation are the best things to happen to gaming, like ever. Those that say that they are 'fake frames' are clueless imbeciles that haven't a clue.
For games that do not employ frame generation higher frame rates bring two benefits:
1. lower latency
2. smoother display

When games run at low frame rates you start to notice:
1. that it takes time for the camera to change according to your input (latency)
2. the slide show (because there is a big visual difference between two consecutive frames)

What frame gen does is alleviate point 2 by creating frames and exacerbate point 1 by increasing the actual latency. So, in effect, for people and games that value latency, frame generation is fake frames. For people and games that are not sensitive to latency but to visual transitions, frame generation is an improvement.

Frame rate was a good indication of smoothness and latency, and render quality was usually comparable between cards at the same settings. The problem with FG is that we now need to assess latency and render quality separately. People who understand these things value the technology for what it brings (a smoother display). But we have to admit that marketing is abusing it (tricking unsuspecting buyers) by comparing normal frame rates with FG. In light of these marketing tricks, users are right to call it Fake Frames. If Nvidia had called it smooth display or whatever and only counted rendered frames, thus keeping latency comparable between frame rates, nobody would have called it Fake Frames.
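To put rough numbers on those two points (a toy model that assumes frame generation buffers about one extra rendered frame, which is an assumption, not a measured figure):

# Toy model of the trade-off: assume frame generation holds back roughly one
# rendered frame for interpolation (an assumption; the real penalty varies).
def with_frame_gen(rendered_fps: float, generated: int, base_latency_ms: float):
    frame_ms = 1000 / rendered_fps
    display_interval_ms = frame_ms / (1 + generated)  # point 2: smoother display
    latency_ms = base_latency_ms + frame_ms           # point 1: worse latency
    return display_interval_ms, latency_ms

interval, latency = with_frame_gen(rendered_fps=60, generated=1, base_latency_ms=40)
print(f"display interval {interval:.1f} ms (smoother), latency ~{latency:.1f} ms (worse)")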
 
It's not that people are dummies for not wanting it. It's that it's being marketed as if it is native performance on their cards and charging extra for non-native performance. There are things that are great about this tech and there are things that are not great about it.

This is more about price than it is what the features are. People would be a lot more accepting of this tech if we were charged for native performance with upscaling and frame gen being bonus on top of it. The 5070 isn't a 4090. The 5070 isn't going to perform like a 4090 in games that don't have DLSS4 support.
This. The technology and the features are great. The marketing and the pricing are not.
 
For games that do not employ frame generation higher frame rates bring two benefits:
1. lower latency
2. smoother display

When games run at low frame rates you start to notice:
1. that it takes time for the camera to change according to your input (latency)
2. the slide show (because there is a big visual difference between two consecutive frames)

What frame gen does is alleviate point 2 by creating frames and exacerbate point 1 by increasing the actual latency. So, in effect, for people and games that value latency, frame generation is fake frames. For people and games that are not sensitive to latency but to visual transitions, frame generation is an improvement.

Frame rate was a good indication of smoothness and latency, and render quality was usually comparable between cards at the same settings. The problem with FG is that we now need to assess latency and render quality separately. People who understand these things value the technology for what it brings (a smoother display). But we have to admit that marketing is abusing it (tricking unsuspecting buyers) by comparing normal frame rates with FG. In light of these marketing tricks, users are right to call it Fake Frames. If Nvidia had called it smooth display or whatever and only counted rendered frames, thus keeping latency comparable between frame rates, nobody would have called it Fake Frames.
Great summary of the issues here. I am not a fan of Frame Generation but hey, it is a free country, people should be allowed to pick the type of graphics they want.

But Jensen can f*** off about rethinking benchmarks. Reviews and benchmarks better include the actual framerate if they are going to measure AI frames as well.
 
Great summary of the issues here. I am not a fan of Frame Generation but hey, it is a free country, people should be allowed to pick the type of graphics they want.

But Jensen can f*** off about rethinking benchmarks. Reviews and benchmarks better include the actual framerate if they are going to measure AI frames as well.
I believe, for the reasons above, Jensen is correct that benchmarks should be done differently. Quality issues (artefacts) aside, in the FG era we can't use a single metric like frame rate for both smoothness and latency; they should be decoupled. Use frame rate for smoothness and measure the actual latency in ms. Just like we have 1% lows, we will have another metric to look at :))
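Something like this, in other words (an illustrative sketch with made-up numbers, assuming frame times from a capture tool and latency samples from something like an LDAT-style measurement):

# Decoupled report: frame times for smoothness, separate latency samples for
# feel. Both input lists here are illustrative, not real measurements.
import statistics

def report(display_times_ms, latency_samples_ms):
    avg_fps = 1000 / statistics.mean(display_times_ms)
    slowest = sorted(display_times_ms, reverse=True)[:max(1, len(display_times_ms) // 100)]
    low_1pct_fps = 1000 / statistics.mean(slowest)
    print(f"smoothness: {avg_fps:.0f} fps average, {low_1pct_fps:.0f} fps 1% low")
    print(f"latency: {statistics.mean(latency_samples_ms):.0f} ms average")

report([4.2, 4.1, 4.3, 9.8, 4.2] * 20, [52, 55, 49, 57])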
 
I believe, for the reasons above, Jensen is correct that benchmarks should be done differently. Quality issues (artefacts) aside, in the FG era we can't use a single metric like frame rate for both smoothness and latency; they should be decoupled. Use frame rate for smoothness and measure the actual latency in ms. Just like we have 1% lows, we will have another metric to look at :))
I would be tentatively ok with that.

But if we are not considering actual frame rates just so Nvidia can boast about its AI frames, that is pure BS. I don't trust Nvidia here at all, they have not acted in good faith plenty of times. (AMD is not any better but it is also not in a position to sell its own BS to the consumer).
 
DLSS and Frame Generation are the best things to happen to gaming, like ever. Those that say that they are 'fake frames' are clueless imbeciles that haven't a clue.
"are clueless imbeciles that haven't a clue"
Isn't frame gen marketed as adding 3 extra frames to a game?
I mean they brag a 5090 can do 240fps in a game.

"DLSS Multi Frame Generation generates up to three additional frames per traditionally rendered frame, working in unison with the complete suite of DLSS technologies to multiply frame rates by up to 8X over traditional brute-force rendering."

That's Nvidia bragging about its own tech: they say themselves it adds three additional frames.
Which means they are fake frames; they haven't been rendered by the game engine.

To the people saying Nvidia is so great.
"This massive performance improvement on GeForce RTX 5090 graphics cards unlocks stunning 4K 240 FPS fully ray-traced gaming."

They know that DLSS lowers the render resolution, but they will still lie and say 4K gaming when it isn't 4K; in fact, it's somewhere between 1080p and 1440p if they used the Balanced setting.
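For what it's worth, the internal render resolution can be estimated from the commonly cited DLSS scale factors (widely reported defaults, not guaranteed for every game or DLSS version):

# Internal render resolution at a 4K output for the commonly cited DLSS scale
# factors. These are widely reported defaults, not guaranteed for every game.
scale_factors = {"Quality": 0.667, "Balanced": 0.58,
                 "Performance": 0.50, "Ultra Performance": 0.333}
out_w, out_h = 3840, 2160
for mode, s in scale_factors.items():
    print(f"{mode}: {round(out_w * s)} x {round(out_h * s)}")
# Balanced works out to roughly 2227 x 1253, which is indeed between 1080p
# and 1440p.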
 
Smoothness in a game differs between people - some don't notice it once things surpass 30 fps, others up to 60 fps, and then there are the "special" people who claim they can spot differences at 120+ fps.

Just like how some people are more susceptible to the higher latency that FG and DLSS/FSR/XeSS create, while it doesn't bother others.

We can't just stop doing benchmarks like we do them now, they offer a lot of useful information to the readers and the only reason Nvidia wants them redone is to hide the fact that the performance gains we're seeing this time around are mostly tied to the DLSS4/FG being used in their slides for promotion.

Remember when the 4090 was coming out and they said 2x the performance of the 3090? Yeah, of course there was 2x the performance....once you ran DLSS3.

You want that 4090 performance from a 5070? Sorry, that only happens if you're rocking DLSS 4 and FG (unless by some miracle reviews actually show us otherwise). And with Nvidia saying the 5070 is the new 4090, that's exactly why they want benchmarks to be changed.

Personally, I don't like the janky feeling that FG and DLSS gives when you run them. I don't play any games with those "features" turned on. I wouldn't want to see benchmarks changed just to fit the narrative of Nvidia.
 
You most definitely should measure it with upscaling/FG consideration, because that is what will be used, and those that are 'Native Dummies' are just that, dummies, that in their stubbornness will continue to whine and complain about poor performance that is caused all by themselves, it's pretty sad actually, they probably think the earth is the center of the universe too... #facepalm

What amazes me is that this whole conversation mirrors the audio conversations justifying $100 cables, $1000 a/d converters, etc. While occasionally some of these items "may" provide some measurable improvement that is real, often the difference is well beyond anything that can be heard.

While high framerates can assist professional gamers or the occasional hyper-competitive player, most people really don't notice super high frame rates or how much better the water reflects from one card to the next. Artifacts can distract and need to be squashed, but most of this seems to be pixel peeping for bragging rights. There's a reason TechSpot recently ran an article dividing ray tracing games into categories based on whether ray tracing helps the visuals a lot, a little, or isn't worth it.

Not to mention that, as with photography, the goal used to be to get the most accurate representation of what you see. Now, with cell phones, the camera tweaks this, corrects that, and re-balances colors and light to create the most "pleasing" looking image. The difference with games is that, due to limits on processing power, time, and money, there is no ideal reference image to start from. The images are rendered in different ways from the exact same source material.

Lastly, I would have no problem if these generated frames were actually full 4K interpolated frames, but they are not; they are pseudo-frames that lean on persistence of vision and other traits of human vision to create a smoother-looking image, not true inserted frames.
 
Everyone should know by now Jensen always highly exaggerates the performance gain.

Even back in the day, during the reveal of the GTX 1080, he literally said it was twice as fast as the Titan X, when in reality it was nowhere near that figure.
 
Nvidia comments “translated”:
- hey, let fake frames count as real frames as our fake frame algorithm got so good and we can’t do much better on raw performance.
- hey, let us sell 2000€ cards with excellent fake frames performance and let’s try together convince people that it’s as good as real frames, shall we?!
 
Simple solution - disregard any 'benchmark' with DLSS of any sort turned on. It's a fake framerate. The game engine isn't going any faster, and input latency and response time are completely unchanged. nVidia just gets to lie and say it's much faster than it actually is.

It's not a valid benchmark if DLSS is turned on; that's like benchmarking a VHS tape. It doesn't make practical sense.
 
It's not that people are dummies for not wanting it. It's that it's being marketed as if it is native performance on their cards and charging extra for non-native performance. There are things that are great about this tech and there are things that are not great about it.

This is more about price than it is what the features are. People would be a lot more accepting of this tech if we were charged for native performance with upscaling and frame gen being bonus on top of it. The 5070 isn't a 4090. The 5070 isn't going to perform like a 4090 in games that don't have DLSS4 support.

The 5070 just isn't going to perform like a 4090 under any circumstance, period.

A 5070 at 4K with FG on isn't going to produce the same content as a 4090 at 4K without FG turned on. The 4090 is rendering the frames; the 5070 is faking it in visually obvious ways.

Even if the frames per second the monitor sees are identical, the image will not be as good.
 
Simple solution - disregard any 'benchmark' with DLSS of any sort turned on. It's a fake framerate. The game engine isn't going any faster, and input latency and response time are completely unchanged. nVidia just gets to lie and say it's much faster than it actually is.

It's not a valid benchmark if DLSS is turned on; that's like benchmarking a VHS tape. It doesn't make practical sense.
The 5070 just isn't going to perform like a 4090 under any circumstance, period.

A 5070 at 4K with FG on isn't going to produce the same content as a 4090 at 4K without FG turned on. The 4090 is rendering the frames; the 5070 is faking it in visually obvious ways.

Even if the frames per second the monitor sees are identical, the image will not be as good.
Welcome, 24 minutes in, and this is your first comment? Bold. Calling DLSS "fake framerate" is a lazy, stereotypical take. DLSS isn't about making the engine faster, it's about leveraging AI to create playable experiences in games that would otherwise choke even the best hardware.

Ignoring DLSS in benchmarks makes no sense because that’s how most gamers will actually play. It’s like testing a turbocharged car and ignoring the turbo. DLSS isn’t “cheating”, it’s maximizing efficiency. And for the vast majority of users, the tiny latency it adds is a non-issue when weighed against the performance gains.

And your take on the 5070 vs. 4090? How are you so confident when there are literally no independent benchmarks yet? All we have are Nvidia’s slides. Sure, those should be taken with some skepticism, but outright dismissing the 5070’s potential just sounds ignorant.

Honestly, your whole argument screams being stuck in the past. Tech evolves whether you like it or not, and clinging to old ideas of “native rendering only” just makes you sound like someone who hasn’t kept up with how GPUs work today. DLSS and AI-driven performance are the future, and dismissing them outright shows a complete lack of understanding of where gaming is headed. Maybe it’s time to get with the program instead of clinging to outdated notions that no longer apply.
 