Nvidia says over 80% of GeForce RTX GPU owners use DLSS

Daniel Sims

A hot potato: Upscaling has become a topic of contention since Nvidia and AMD introduced DLSS and FSR, with some users viewing the technology as a crutch for unoptimized games. After unveiling new GPUs that rely heavily on image reconstruction and frame generation for their claimed performance gains, Nvidia revealed a data point suggesting that almost all RTX GPU owners activate DLSS, presenting the figure as evidence that upscaling has become the norm.

Nvidia held a lengthy presentation at CES 2025 outlining the past and future of its DLSS neural rendering technology. One slide claimed that over 80 percent of users with RTX 20, 30, and 40 series graphics cards use DLSS in games, a figure the company presented as vindication of its extensive use of AI in video game rendering.

While the slide does not specify how Nvidia collected the data, if accurate, it suggests that gamers have overwhelmingly embraced machine learning-based upscaling. The presentation also highlighted that over 500 games and 15 of 2024's top 20 titles support DLSS. However, it did not clarify whether most RTX 40 series owners specifically use frame generation, which is part of the DLSS suite but is a separate step from upscaling.

Upscaling technologies like DLSS, FSR, and XeSS render games below a display's native resolution and use advanced techniques to reconstruct the missing pixels, significantly boosting performance. Analyses from TechSpot and other outlets demonstrate that the increased frame rates often far outweigh the minor reductions in image quality. Nvidia's newly revealed DLSS 4 aims to minimize these visual flaws even further by using GenAI transformer models.
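To put those figures in rough terms, the following is a minimal sketch in Python of how much of the native pixel count an upscaler actually renders before reconstruction. The per-axis scale factors are assumptions based on commonly cited defaults for DLSS-style quality modes; exact values vary by game, mode, and version.

```python
# Sketch: internal render resolution vs. output resolution for upscaling.
# The per-axis scale factors are assumed, commonly cited defaults, not
# figures from Nvidia's presentation; real values vary per game and version.
MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal render size for a given output size and per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    out_w, out_h = 3840, 2160  # 4K output
    native_pixels = out_w * out_h
    for mode, scale in MODES.items():
        w, h = internal_resolution(out_w, out_h, scale)
        share = (w * h) / native_pixels
        print(f"{mode:>17}: renders {w}x{h} (~{share:.0%} of native pixels)")
```

At 4K output, for instance, a Quality-style mode renders only around 44 percent of the native pixel count, which is where most of the frame-rate gain comes from before reconstruction quality even enters the picture.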

However, as games like Remnant II, Alan Wake II, and Monster Hunter Wilds now list upscaling as part of their system requirements, debate persists over whether the visual trade-offs are justified.

Additionally, Nvidia is promoting frame generation, which interpolates AI-generated frames between traditionally rendered ones without reducing latency, as a key feature of its RTX 50 series cards.
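As a back-of-the-envelope illustration of that latency caveat, the sketch below uses plain Python with illustrative numbers only; it is a simplification, not a model of Nvidia's actual pipeline. Interpolation raises the displayed frame rate, but responsiveness still tracks the traditionally rendered frames, and holding one rendered frame back so the interpolator has two endpoints adds roughly a rendered frame time of delay.

```python
# Assumed simplification of frame interpolation: displayed frame rate scales
# with the number of generated frames, but input response stays tied to the
# rendered frame rate, plus a hold-back of roughly one rendered frame.
def frame_gen_estimate(rendered_fps: float, generated_per_rendered: int = 1):
    """Return (displayed_fps, approx_holdback_ms) under this simplification."""
    rendered_frame_time_ms = 1000.0 / rendered_fps
    displayed_fps = rendered_fps * (1 + generated_per_rendered)
    return displayed_fps, rendered_frame_time_ms

if __name__ == "__main__":
    for fps in (30, 60, 120):
        shown, holdback = frame_gen_estimate(fps)
        print(f"{fps} fps rendered -> ~{shown:.0f} fps displayed, "
              f"~{holdback:.1f} ms extra hold-back")
```

In other words, a 30 fps game interpolated to 60 fps still responds like a 30 fps game, which is why generated frames are usually reported separately from rendered ones.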

Nvidia's CES presentation also outlined future plans for neural rendering, which will integrate AI-assisted techniques into internal rendering pipelines instead of just resampling the final rendered frame. In the coming years, DLSS could enable games to process more detailed materials, hair, facial expressions, and other assets with minimal performance costs.

Nvidia has previously argued that as performance gains from conventional rendering and semiconductor die shrinks slow down, enhancing visual detail and frame rates will increasingly depend on technologies like neural rendering and frame generation. These currently address the substantial computational demands of 4K resolution and ray tracing.

Moreover, AMD's and Intel's efforts to play catch-up with Nvidia instead of providing alternative solutions further validate the growing importance of AI rendering. Team Red unveiled FSR 4 upscaling at CES, too, which mimics Nvidia's machine learning approach and achieves noticeably better results than FSR 3.

Similarly, late last year, Intel introduced new GPUs that support frame generation using XeSS 2. The shift toward AI-assisted graphics will potentially require new approaches to technical analysis and benchmarking.


 
I don't think anyone is complaining about upscaling tech, I think most people really like it. The thing is, it should be optional and something that is optional shouldn't be included in the price of a product.

And yes, DLSS is better than FSR, but does anyone remember how bad FSR1 and DLSS1 were? While they've both gotten orders of magnitude better, they're both just solutions to problems that they themselves have caused.

Look at the XBONE and PS4 era of games: they looked sharp and had great animations. I feel that games on this generation of consoles look WORSE than the previous generation.

I'm not saying "NO" to ray tracing, but it still isn't ready for the mainstream, yet. Games are becoming LESS optimized and instead of the cost savings being passed onto us, they require expensive hardware to have a quality experience.

Graphics fidelity is decreasing while also increasing costs on the consumer side.

Upscaling and frame gen are great tech, and it's very much "nice to have" tech. The real problem with it is the impact it is having on the gaming industry as a whole, and that's what we should be talking about.

 
I'd say you'd have to be a graphics snob not to use it to some degree. Smooth gameplay for a minor (if not imperceptible) image quality hit will always be fine with me. I don't need uber realistic, just give me stylistic.

That said, it should never be the main focus of how much "better" a GPU is over a previous generation. Pure power should be the focus, and then you can also show off how much better your frame generation tech is after.
 
This is because so many people launch a game, just start playing, and it's on by default. If it were an option that had to be turned on, I feel like these numbers would be VASTLY different. I dislike it so much. It is a development crutch now. Why optimize when you can chuck DLSS at it? Why make more powerful cards when you can say your 5070 rivals the 4090 in power (which it doesn't)? In every game I have tried DLSS in (4070 Ti Super), there is noticeable smudginess. It is here to stay now, though.
 
This is because so many people launch a game, just start playing, and it's on by default. If it were an option that had to be turned on, I feel like these numbers would be VASTLY different. I dislike it so much. It is a development crutch now. Why optimize when you can chuck DLSS at it? Why make more powerful cards when you can say your 5070 rivals the 4090 in power (which it doesn't)? In every game I have tried DLSS in (4070 Ti Super), there is noticeable smudginess. It is here to stay now, though.
Pretty much this. To expand a bit, I could be more OK with it if
a) It had looked this mature from the get-go, since it was pretty rough at launch years ago
b) It wasn't proprietary software that requires proprietary hardware on top for no good reason
c) As a consequence of the previous point, it was supported universally, as in at the driver level, and backwards compatible with titles that are no longer supported.

As it stands, it's basically just a metastasized cancer on the PC gaming industry: Nvidia has always been a cancer, but I feel it might be untreatable and terminal, or very nearly, at this point.
 
I don't think anyone is complaining about upscaling tech, I think most people really like it.

Hunh..?

Nobody is really cheering on upscaling, as everyone understands it is a crutch that helps them get a few frames, but it's not as good as native for hardcore gaming. And anyone spending $800+ on a GPU today in 2025 expects to play at native.

Upscaling is for those people who do not wish to upgrade their video card.




 
Hunh..?

Nobody is really cheering on upscaling, as everyone understands it is a crutch that helps them get a few frames, but it's not as good as native for hardcore gaming. And anyone spending $800+ on a GPU today in 2025 expects to play at native.

Upscaling is for those people who do not wish to upgrade their video card.
As I've said in previous posts, it's not the tech that is the problem, it's the price. DLSS 4 is likely to be great, but everyone except 50 series owners is software-locked out of it.

It's the price and the narrative that nVidia is trying to spin around it that is the problem. Not the tech. And I'm not talking about someone playing at 720 trying to upscale to 1080. How I use FSR on my 6700xt is to upscale 1440p to 4k to play on my TV instead of lowering graphics settings to play natively. I don't play competitive games. The most I do is PVP in ESO or some large battles in EvE.

And framegen is harmless in single player games. I would never use frame gen, but it's nice that it's there.

It's not the tech that is the problem, it's how it is being sold and marketed to us that is leaving a bad taste in everyone's mouth.

People's only complaint about FSR is that it's not as good as DLSS, but people have many more words over nVidia's treatment of DLSS and frame gen. I want you to take a moment to think about why that might be.

It's my opinion that AMD isn't selling us fake performance. I hate the whole FSR vs DLSS debate because it sounds like people arguing over which pile of garbage smells better. The major difference being that AMD isn't charging us extra for the privilege of smelling their garbage
 
Let me guess, without even looking at the comments: "The analytics are wrong," "It defaults," "Gamers don't know what they're enabling," blah blah blah. Excuses upon excuses to dismiss the obvious; DLSS is immensely popular and a game-changer.

Don’t like it? Then turn it off. For yourself.

But clearly, the majority of gamers see the value and are embracing it. Maybe it’s time to accept that AI-driven rendering isn’t just the future, it’s the now. Scratch that, they ARE already accepting it and have been for the past few years.

Nothing, and I mean nothing, can stop progress, not even a minority of gamers being loud on forums and tech sites.
 
This stat is meaningless without methodology. Does it mean 80% of gamers have switched it on at least once, including games where it is on by default? That would include me. Or does it mean 80% of game time is played with it on? That wouldn't match my profile.

Let me guess, without even looking at the comments: "The analytics are wrong," "It defaults," "Gamers don't know what they're enabling," blah blah blah. Excuses upon excuses to dismiss the obvious; DLSS is immensely popular and a game-changer.

It's not obvious to me - I see mostly critical responses to it here and elsewhere. What is your hypothesis as to why Techspot commenters would be so different from gamers in general?

Although I do want to be clear about what the criticism is. It's not about having the option. If I want to output to my 4K TV and I don't have the render power to do it fully, it's nice to have a fallback option. The problem is in pretending the interpolated frame is the same as a rendered frame. They are not the same, and I am glad Techspot and other communities want them to be reported separately and throw side-eye when they are not ("5070 outperforms 4090" lol).

Having more options to trade quality for performance when needed is appreciated. But having that trade off taken as a given and not even worth mentioning is not.
 
As I've said in previous posts, it's not the tech that is the problem, it's the price. DLSS 4 is likely to be great, but everyone except 50 series owners is software-locked out of it.

It's the price and the narrative that nVidia is trying to spin around it that is the problem. Not the tech. And I'm not talking about someone playing at 720 trying to upscale to 1080. How I use FSR on my 6700xt is to upscale 1440p to 4k to play on my TV instead of lowering graphics settings to play natively. I don't play competitive games. The most I do is PVP in ESO or some large battles in EvE.

And framegen is harmless in single player games. I would never use frame gen, but it's nice that it's there.

It's not the tech that is the problem, it's how it is being sold and marketed to us that is leaving a bad taste in everyone's mouth.

People's only complaint about FSR is that it's not as good as DLSS, but people have many more words over nVidia's treatment of DLSS and frame gen. I want you to take a moment to think about why that might be.

It's my opinion that AMD isn't selling us fake performance. I hate the whole FSR vs DLSS debate because it sounds like people arguing over which pile of garbage smells better. The major difference being that AMD isn't charging us extra for the privilege of smelling their garbage
I agree with some of what you said, but the tech is the problem... latency. You can't use it for comp gaming. Well, you can, but you will get destroyed by players running at native with lower latency.
 
Soooo they're continuously harvesting telemetry about our GPU usage?
That's been the case for 15-20 years now. I remember a telemetry .exe that was part of the Nvidia driver back on Windows XP, causing massively high CPU usage randomly for no good reason. It was just a buggy driver release and they fixed it a week later, but Nvidia has been collecting this sort of data for a long time.
 
Let me guess, without even looking at the comments: "The analytics are wrong," "It defaults," "Gamers don't know what they're enabling," blah blah blah. Excuses upon excuses to dismiss the obvious; DLSS is immensely popular and a game-changer.

Don’t like it? Then turn it off. For yourself.

But clearly, the majority of gamers see the value and are embracing it. Maybe it’s time to accept that AI-driven rendering isn’t just the future, it’s the now. Scratch that, they ARE already accepting it and have been for the past few years.

Nothing, and I mean nothing, can stop progress, not even a minority of gamers being loud on forums and tech sites.
The problem is games running like crap without it on. My friends and I are all PC gamers. However, I am the only one who goes to the settings first before starting a new game. A LOT of PC gamers game this way. It's been that way as far back as I can remember playing with people, back in the Roger Wilco days.
 
The problem is games running like crap without it on. My friends and I are all PC gamers. However, I am the only one who goes to the settings first before starting a new game. A LOT of PC gamers game this way. It's been that way as far back as I can remember playing with people, back in the Roger Wilco days.

I ran a test yesterday: 4K with DLSS Quality and then 1440p native. There was definitely less ghosting at 1440p native. Some people are so stuck on the tech for some reason that lowering the resolution yourself is unheard of, or the "gods" gave it to us so we must tell everyone to use it.
 
How many of those 80% are just using default settings and might not even know they are using DLSS, and might not want to if asked?

Yup. And how is Nvidia defining "use DLSS"? I'm sure they're fudging this and counting every single driver installation where DLSS has been activated at least once... Heck.
 
I ran a test yesterday: 4K with DLSS Quality and then 1440p native. There was definitely less ghosting at 1440p native. Some people are so stuck on the tech for some reason that lowering the resolution yourself is unheard of, or the "gods" gave it to us so we must tell everyone to use it.

I'd say, run the resolution your panel was built for. However, if your GPU can't handle that resolution natively in gaming, then you have an issue. I struggled a little when I got an ultrawide screen (not to mention game support for ultrawide). It was 3440x1440 and I had an RTX 2070. Some games would be fine, others... not so much. I then upgraded to an RTX 3070 where I could see it wasn't struggling. 4K would still not be my option.

I wouldn't, for example, run a 4K panel at 1440p, because it'll have to interpolate pixels to do that and not all panels handle that well. Some panels are even crap at downscaling 4K to 1080p, which is just halving the horizontal and vertical pixel counts.
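To illustrate that scaling point with a quick sketch (Python, hypothetical helper names, not any panel's actual scaler logic): a 1080p signal maps onto a 4K panel as a clean 2x2 block per source pixel, while 1440p lands at a 1.5x ratio that forces the panel to interpolate.

```python
# Hypothetical sketch of why some signal resolutions map cleanly onto a panel
# and others require interpolation; not real scaler firmware behavior.
def mapping(panel: tuple[int, int], signal: tuple[int, int]) -> str:
    sx = panel[0] / signal[0]  # per-axis scale factors
    sy = panel[1] / signal[1]
    if sx == sy and sx.is_integer():
        return f"integer {int(sx)}x{int(sy)} block per source pixel (clean)"
    return f"non-integer {sx:.2f}x{sy:.2f} (panel must interpolate)"

if __name__ == "__main__":
    panel_4k = (3840, 2160)
    for signal in [(1920, 1080), (2560, 1440)]:
        print(f"{signal} on {panel_4k}: {mapping(panel_4k, signal)}")
```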
 
I don't think anyone is complaining about upscaling tech, I think most people really like it. The thing is, it should be optional and something that is optional shouldn't be included in the price of a product.

And yes, DLSS is better than FSR, but does anyone remember how bad FSR1 and DLSS1 were? While they've both gotten orders of magnitude better, they're both just solutions to problems that they themselves have caused.

Look at the XBONE and PS4 era of games: they looked sharp and had great animations. I feel that games on this generation of consoles look WORSE than the previous generation.

I'm not saying "NO" to ray tracing, but it still isn't ready for the mainstream, yet. Games are becoming LESS optimized and instead of the cost savings being passed onto us, they require expensive hardware to have a quality experience.

Graphics fidelity is decreasing while also increasing costs on the consumer side.

Upscaling and frame gen are great tech, and it's very much "nice to have" tech. The real problem with it is the impact it is having on the gaming industry as a whole, and that's what we should be talking about.


Early PS5 & Series X games actually look good and sharp. The majority of Sony's first-party games look good. It was only when Nvidia started pushing DLSS and AMD followed with FSR that a lot of newer titles started using this upscaling crap.
 
I'd say, run the resolution your panel was built for. However, if your GPU can't handle that resolution natively in gaming, then you have an issue. I struggled a little when I got an ultrawide screen (not to mention game support for ultrawide). It was 3440x1440 and I had an RTX 2070. Some games would be fine, others... not so much. I then upgraded to an RTX 3070 where I could see it wasn't struggling. 4K would still not be my option.

I wouldn't, for example, run a 4K panel at 1440p, because it'll have to interpolate pixels to do that and not all panels handle that well. Some panels are even crap at downscaling 4K to 1080p, which is just halving the horizontal and vertical pixel counts.

I know, I was only doing a test.
I run at 4k60 native all the time.
My panel is a 4k @ 120hz but fortunately I don't need to use 120hz to enjoy my games.
 
Here we go again. Am I missing something here? Isn’t DLSS pretty much highly recommended when you turn on ray tracing, especially with path tracing? Nobody’s forcing anyone to use it if not needed.

If native rendering works fine for you, then great, stick with it. But let’s be real, in a game like Starfield with no RT, only top-tier GPUs like the 7900XTX or 4090 can hit 60fps at 4K max settings. For anyone using lower-end GPUs, DLSS is needed for better performance.

Not all games are created equal, and I could go on forever debating optimizations, but at the end of the day, whether you turn DLSS on or off depends on how the game runs and what settings you're aiming for. With RT enabled, there is a performance hit, so DLSS is recommended.

On the flip side, if you’re already getting over 60fps or solid performance at native resolution, you probably don’t need it. Unless, of course, you’re chasing even higher frame rates, in which case, go for it. Nvidia Reflex is there to keep the lag minimal. That’s pretty much how I decide when to play with or without DLSS.

DLSS 3 takes things even further. CP2077 RT Overdrive? No-brainer to have DLSS 3 on. But then you have games like, say, Still Wakes the Deep, where frame generation or DLSS isn't needed at all even though it is supported.

For games pushing visual fidelity to insane levels like the Zorah tech demo, DLSS 3 seems insufficient. That's where DLSS 4 comes in to do some serious heavy lifting. With DLSS 4, CP2077 RT Overdrive can hit high triple-digit frame rates at 4K, which is something I thought we'd only see from future-future GPUs.

Of course, everything about DLSS 4 so far is based on Nvidia’s own material. No independent reviews yet. Still, given how solid Nvidia has been with their AI-driven approach, I’ve got pretty high confidence it’s gonna deliver. Meanwhile, the competition just feels like it’s not even in the picture anymore which is why to me, Nvidia is the way to go if you’re thinking about “future-proofing.”
 