Nvidia DLSS 4 Ray Reconstruction Analysis: Fixing Ugly Ray Tracing Noise

I'm glad they didn't gate the software, but let the older cards run it. Let people decide if they want to run it for the big performance hit rather than marketing making the call for them.


Actually they're still gating the software overall, if not this particular feature. You can see how in the DLSS 4 features chart in this article. They're just letting everyone run the same version number now, so as to reduce consumer backlash.
 
Actually they're still gating the software overall, if not this particular feature. You can see how in the DLSS 4 features chart in this article. They're just letting everyone run the same version number now, so as to reduce consumer backlash.
Running the same version number makes it less obvious that they're gatekeeping; it's very intentional. They know consumers are dumb; PC gaming is no longer filled with smart people who read reviews. Few people build their own computers these days, and most gaming computers are prebuilts.
 
I might be the odd one out here, but I just don’t get the hype around ray tracing, especially when it jacks up GPU prices so much. Sure, it can make things look cooler, but half the time it ends up giving everything that weird, wet look. I’m playing these games for the story or the action, not to stare at my reflection in the window.

So, why does it feel like everyone treats ray tracing as the holy grail of graphics? Is it just because we’ve been told it’s a must-have?

Maybe I’ve reached that age where slightly better visuals don’t blow my mind anymore, or maybe my 3070 Ti just isn’t delivering that “wow” factor. I upgraded from a 1070 Ti, flipped on full RT in a bunch of games, and yeah, things looked a bit nicer, but I didn’t suddenly love the games more.

Once graphics hit a certain threshold, it’s really about the gameplay for me. These incremental visual boosts might be cool, but they come with a huge cost for what feels like diminishing returns. So, seriously, what am I missing here? Why should I really care about ray tracing in games?
 
I might be the odd one out here, but I just don’t get the hype around ray tracing, especially when it jacks up GPU prices so much. Sure, it can make things look cooler, but half the time it ends up giving everything that weird, wet look. I’m playing these games for the story or the action, not to stare at my reflection in the window.

So, why does it feel like everyone treats ray tracing as the holy grail of graphics? Is it just because we’ve been told it’s a must-have?

Maybe I’ve reached that age where slightly better visuals don’t blow my mind anymore, or maybe my 3070 Ti just isn’t delivering that “wow” factor. I upgraded from a 1070 Ti, flipped on full RT in a bunch of games, and yeah, things looked a bit nicer, but I didn’t suddenly love the games more.

Once graphics hit a certain threshold, it’s really about the gameplay for me. These incremental visual boosts might be cool, but they come with a huge cost for what feels like diminishing returns. So, seriously, what am I missing here? Why should I really care about ray tracing in games?
Ray tracing was always going to be demanding, but the idea was that over a few generations even the low-end cards would be able to do it at 1080p60 native. That didn't happen. Now we're stuck in a situation where ray tracing lowers development time (i.e., costs for companies), so studios tend to rely on it and on upscaling, passing the costs onto consumers.

Ray tracing went from an exciting new tech to something people turn off to get extra FPS, because they don't want to pay $1,000+ for a graphics card that relies on upscaling and fake frames. 4K gaming "was here" when the 1080 Ti came out; ray tracing undid all of that. An RT game with upscaling, regardless of how good that upscaling is, looks worse than rasterized games made 10 years ago. I'd be perfectly happy playing games with PS4-level graphics at 4K120. Instead, we got ray tracing.

Also, the RT cores just happened to be really good at AI, which lowered Nvidia's development costs for consumer cards. RT was a way to sell server GPUs to consumers. The fact that RT made the 4K120 dream just that, a dream, was great for Nvidia, because it meant people had to upgrade more often to keep up. Now most people are playing games at basically 720p upscaled to 1440p to get playable frame rates.
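
To put rough numbers on that last point, here's a quick sketch using the commonly cited per-axis DLSS render-scale factors (approximate values, and games can override them):

```python
# Rough sketch, using approximate published DLSS per-axis render scales
# (games can and do override these).
DLSS_SCALES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 0.333}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)  -> "720p upscaled to 1440p"
print(internal_resolution(3840, 2160, "Performance"))  # (1920, 1080) -> 1080p internally at "4K"
```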

RT is just a way for Nvidia to sell us solutions to a problem they created and keep us buying cards as often as possible.
 
I might be the odd one out here, but I just don’t get the hype around ray tracing, especially when it jacks up GPU prices so much. Sure, it can make things look cooler, but half the time it ends up giving everything that weird, wet look. I’m playing these games for the story or the action, not to stare at my reflection in the window.

So, why does it feel like everyone treats ray tracing as the holy grail of graphics? Is it just because we’ve been told it’s a must-have?

Maybe I’ve reached that age where slightly better visuals don’t blow my mind anymore, or maybe my 3070 Ti just isn’t delivering that “wow” factor. I upgraded from a 1070 Ti, flipped on full RT in a bunch of games, and yeah, things looked a bit nicer, but I didn’t suddenly love the games more.

Once graphics hit a certain threshold, it’s really about the gameplay for me. These incremental visual boosts might be cool, but they come with a huge cost for what feels like diminishing returns. So, seriously, what am I missing here? Why should I really care about ray tracing in games?
It is like the overblown bloom fad of the 00s.
 
I'm glad they didn't gate the software, but let the older cards run it. Let people decide if they want to run it for the big performance hit rather than marketing making the call for them.
Agreed - nice that 30- and 40-series owners will benefit from this.

I've got a 2080 Ti (waiting for my 5090 order to come in; might be waiting several months based on reports, oh well) and this article is spot on - for the 20 series, ray tracing just isn't worth it. It's fun to turn on in a relatively static scene for a few minutes just to appreciate the visuals, but unless you like gaming at 30 fps or at lower resolutions, you get way more bang for your buck leaving RT off, turning up the resolution, and using traditional AA (DLSS Quality only if you have to).
 
It's always fascinating to me how NV tech is super awesome until the next version comes out and the prior version gets thrown under the bus. Then suddenly everyone talks about all the IQ issues with the prior version to sell the new one - rinse and repeat. And here I thought all this crap was supposed to be "better than native".

DLSS/FSR etc all have their place IMO: making an old graphics card stay relevant for longer without sacrificing too much IQ.

It's mind-blowing to me that people will pay hundreds, even thousands, of dollars for a graphics card to use an IQ-enhancing feature like RT or PT, only to have to turn on IQ-reducing features like DLSS/RR to achieve a playable frame rate.
 
It's mind-blowing to me that people will pay hundreds, even thousands, of dollars for a graphics card to use an IQ-enhancing feature like RT or PT, only to have to turn on IQ-reducing features like DLSS/RR to achieve a playable frame rate.
Spot on.

Nvidia and the gaming world want us to rely on the "holy grail," ray tracing (RT), to enhance a scene's visual quality. However, RT is so GPU-intensive that you must lower the resolution to maintain performance. What a paradox: to get more visual quality, we have to reduce the quality? WTF?

Worse yet, the game no longer runs at your monitor's native resolution and refresh rate, forcing you to rely on DLSS to upscale and generate artificial frames, both of which further reduce visual quality compared to a natively rendered scene. WTF? How did the gaming world end up in this situation where this all makes sense?
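
Just as a back-of-the-envelope illustration of how far from "native" the pipeline ends up (assuming the commonly cited render scales and a 2x frame-generation factor; this is only pixel counting, not a measure of perceived quality):

```python
# Back-of-the-envelope only: fraction of displayed pixels that were natively shaded,
# treating every generated frame as fully reconstructed.
def shaded_fraction(render_scale, fg_factor=1):
    spatial = render_scale ** 2        # per-frame share of pixels actually rendered
    return spatial / fg_factor         # spread across generated frames

print(shaded_fraction(0.667))              # DLSS Quality, no frame gen  -> ~0.44
print(shaded_fraction(0.50, fg_factor=2))  # Performance + 2x frame gen  -> 0.125
```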
 
Ray tracing can look very good if done right - but the game has to be built around it. Indiana Jones looks crisp, the lighting is amazing, and it runs quite well on older hardware.
Just pushing ray tracing "on top" often results in poorly optimized games that don't look great either.
 
My big question is this: how much do you really notice when playing these games? The way I see it, yes, when you blow up a freeze frame and are able to study it, the image improvement is there. Let's face it, Nvidia can't keep throwing bigger and more complex hardware at the problem, so they've decided to use a software patch that does do what it says. But when I'm dodging another player, a troll, alien, or whatever, while trying to get off my next shots or spells, I will readily admit I'd never see most of the "problems" we're trying to correct. Has Nvidia made ANY meaningful improvements in per-core/per-shader processing speed? All I've seen is bigger, more gigantic chips with more and more shaders and a higher price tag. Let me not forget to give them credit for adding additional core types dedicated to making their software fixes work better.

I see the same problem with Intel. They haven't put out a meaningful IPC increase in years; they just keep upping the clock speed and core count. I don't know what their proposed 56-core chip will do for the 90% of users whose software and games really can't use more than 8 cores, if that.

Everybody seems to have hit the wall of Moore's law and is looking for anything to give them an edge WITHOUT actually making the hardware any better.
 
I see an odd amount of DLSS hate.

Yall really wanna go back to TAA? You sure about that?

DLSS Q is better than native TAA. It's better than SMAA and its variants. Way better than FXAA. The only things better are SSAA and DLAA. I generally prefer DLAA over SSAA; I get better performance and better AA coverage in things like foliage with DLAA over SSAA.
 
My big question is this: how much do you really notice when playing these games? The way I see it, yes, when you blow up a freeze frame and are able to study it, the image improvement is there.

I often see errors like the boiling Tim refers to in various games, and it looks bad. It's even worse coming from turning on RT, since RT's FPS cost is high but it's supposed to be this transformative thing. RT's changes are IMO mostly subtle (with the occasional ooo, shiny!) until you turn path tracing on, so adding boiling and decimating FPS means RT is mostly dead to me. Path tracing actually looks better, but too bad nothing inexpensive can run it well enough yet.

Plus, the new transformer model in DLSS makes many non-RT things look a lot better, with less boiling and with straight horizontal and vertical lines properly rendered, whereas with DLSS 3.x and earlier the entire line would boil in and out.

The improvements are real.

Let's face it, Nvidia can't keep throwing bigger and more complex hardware at the problem, so they've decided to use a software patch that does do what it says. But when I'm dodging another player, a troll, alien, or whatever, while trying to get off my next shots or spells, I will readily admit I'd never see most of the "problems" we're trying to correct.

Yup, none of this image quality matters in PvP or shootouts, but some people play games where shootouts happen only 99% of the time, or even less! That's when image quality improvements are apparent and appreciated.
 
I see an odd amount of DLSS hate.

Yall really wanna go back to TAA? You sure about that?

DLSS Q is better than native TAA. It's better than SMAA and its variants. Way better than FXAA. The only things better are SSAA and DLAA. I generally prefer DLAA over SSAA; I get better performance and better AA coverage in things like foliage with DLAA over SSAA.

DLSS = TAA

Without TAA there is no DLSS. I wanted real alternatives to TAA; regardless of the upscaling, all TAA-based techniques have some level of blur and annoying effects/artifacts. It would be good to continue with forward+ rendering and other alternative AA methods.
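
For anyone wondering why the two get lumped together: a bare-bones TAA-style resolve looks roughly like the sketch below (minimal and simplified; real implementations add neighborhood clamping, variance clipping, sharpening, etc.), and DLSS consumes the same jittered frames and motion vectors, just swapping the hand-tuned blend for a learned one.

```python
import numpy as np

# Minimal TAA-style resolve sketch: reproject last frame's output with per-pixel
# motion vectors, then blend it with the current (jittered) frame. Simplified;
# real implementations add history clamping/clipping, sharpening, etc.
def taa_resolve(current, history, motion, alpha=0.1):
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: fetch where each pixel was last frame.
    px = np.clip(np.rint(xs - motion[..., 0]).astype(int), 0, w - 1)
    py = np.clip(np.rint(ys - motion[..., 1]).astype(int), 0, h - 1)
    reprojected = history[py, px]
    # Exponential blend: mostly history, a little of the new frame.
    return alpha * current + (1 - alpha) * reprojected
```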
 
I see an odd amount of DLSS hate.

Yall really wanna go back to TAA? You sure about that?

DLSS Q is better than native TAA. It's better than SMAA and its variants. Way better than FXAA. The only things better are SSAA and DLAA. I generally prefer DLAA over SSAA; I get better performance and better AA coverage in things like foliage with DLAA over SSAA.
I think you are missing two critical things.

First, I don't think anyone is specifically hating on DLSS upscaling. It is the fact that RT requires you to lower your resolution to hit playable frame rates, and then you have to use DLSS to get the resolution back up to where you want it.

Second, the higher the resolution - 4K and beyond - the less important anti-aliasing is. Temporal anti-aliasing (TAA) is just a graphical technique used to smooth out the jagged edges created by trying to draw straight lines on a pixelated image. The higher the resolution, the fewer jagged edges. If you just natively render at 4K, then you are going to need a lot less AA to clean up the lines. When you use DLSS, you are rendering at a lower resolution to save performance, so you need high-quality upscaling and AA to make the image look good.
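
A rough way to see that point is pixels per degree: double the resolution on the same screen at the same viewing distance and the stair-steps are half the angular size. (Simple flat-screen, on-axis geometry; the panel size and distance below are just example numbers.)

```python
import math

# Rough pixels-per-degree estimate for a flat screen viewed on-axis.
def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# Example: a 27" 16:9 panel (~23.5" wide) viewed from 30":
print(pixels_per_degree(1920, 23.5, 30))  # ~45 ppd at 1080p
print(pixels_per_degree(3840, 23.5, 30))  # ~90 ppd at 4K - jaggies half the angular size
```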

I know I sound like a broken record, but RT is the fundamental driver of all these issues. It is a solution to a problem that only creates more problems, and around we go. Its significant computational needs, for only a moderate increase in visual fidelity, force you to render lower-resolution images. So everyone is rendering 1080p and trying to view it at 4K, which requires Nvidia's AI-based upscaling and AA, DLSS. I guarantee that if it were just rendered in the full glory of 4K, you would be happier, and it would be cheaper. We are stuck on the RT bandwagon/hype train, and it looks like we can't get off.
 
I see an odd amount of DLSS hate.

Yall really wanna go back to TAA? You sure about that?

DLSS Q is better than native TAA. It's better than SMAA and its variants. Way better than FXAA. The only things better are SSAA and DLAA. I generally prefer DLAA over SSAA; I get better performance and better AA coverage in things like foliage with DLAA over SSAA.
You know that TAA is still HEAVILY used in almost every game and has nothing to do with DLSS, right?
 
Not sure many will buy into 8K - so this is a big sell for Nvidia.

But really, is it necessary? Will Unity/Unreal make it moot, except if you want pure RTX?

I.e., if I showed you a flat scene and said "imagine reflections, god rays, etc.," and you have quite good mental visualization skills, you could add them in, in your mind's eye.

So really, an AI engine could maybe do this for less cost. I don't use Journey etc., but surely if you gave it a flat image and told it where the lights are, it could approximate a ray-traced look OK.
 
Not sure many will buy into 8K - so this is a big sell for Nvidia.

But really, is it necessary? Will Unity/Unreal make it moot, except if you want pure RTX?

I.e., if I showed you a flat scene and said "imagine reflections, god rays, etc.," and you have quite good mental visualization skills, you could add them in, in your mind's eye.

So really, an AI engine could maybe do this for less cost. I don't use Journey etc., but surely if you gave it a flat image and told it where the lights are, it could approximate a ray-traced look OK.
So for the specific game and application that I play, 8K interests me. That game is EvE. The scale of those battles blown up onto a large screen creates a sort of 3D effect that has to be experienced to be understood. That said, I sit very close to a 65" 4K120 TV that I use as a "monitor" in this application to get this "experience". At those distances, the DPI is VERY low. Frankly, modern hardware could probably play modern EvE at 8K120. There are 2 MAJOR issues stopping me from buying an 8K display for this purpose.

1) And BY FAR THE LARGEST: input lag on 8K is REALLY bad. The short of it is that the more pixels there are, the more processing the display's controller has to do to change each pixel. Going from 1080p to 4K on the same display will increase latency, but nothing that people will really notice. I HAVE NOT used a single 8K display, TV or otherwise, that has acceptable levels of input lag.

2) 8K120 just isn't a thing right now. I will not throw down money for an 8K display if it is limited to 60 Hz. Whether it is because of the display or the GPU output, I will not be upgrading until 8K120 is a real thing. This is even more infuriating because the organization that controls the HDMI standard won't allow DisplayPort on TVs, and they also won't allow proper HDMI support for anything above 4K60 on Linux. There are workarounds for this, and JANK is just something you have to accept if you want to become a Linux user. The thing is, the less jank you have to put up with, the better.
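
For context, a rough bandwidth estimate (uncompressed RGB, ignoring blanking and encoding overhead, so treat these as ballpark figures) shows why 8K120 is such a stretch for current links:

```python
# Rough uncompressed video data rate, ignoring blanking and encoding overhead.
def gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(gbps(3840, 2160, 120))  # ~29.9 Gbit/s - fits HDMI 2.1's 48 Gbit/s FRL link
print(gbps(7680, 4320, 60))   # ~59.7 Gbit/s - already needs DSC or chroma subsampling
print(gbps(7680, 4320, 120))  # ~119.4 Gbit/s - far past HDMI 2.1 without heavy compression
```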

So even if I wasn't a stubborn Linux user who plays too much EvE, I still think we are VERY far off from 8K being relevant. With modern hardware basically requiring upscaling to play modern games, which is causing a regression in performance (and graphics), not progress, it's going to be a while before 8K comes close to being relevant. I could see something like a "stop-gap" 6K standard (3240p?) making sense. 6K panels can also be made from cuts of defective 8K panels, so it is theoretically a way to make more money per mm^2 than cutting a defective 8K panel into three 4K panels.

But anyway. As someone who hopes to own a 100" ~8K panel for gaming some day, I feel we're still a good bit off from that. I had a Best Buy let me hook my laptop up to a Samsung QN800D that I almost bought, but the deal breaker was the input lag. In 4K it was perfectly acceptable; you couldn't feel it. If you tried to game at 8K, it instantly jumped to probably somewhere around 100 ms, and you could definitely feel that. Even basic desktop use and viewing files felt like my mouse cursor was moving through heavy oil or something. It was a very odd experience. Again, 4K was fine, but 8K is completely off the table until input lag gets addressed.

Also, if I were to use this as a desktop monitor, I would expect 8K120. I'm not a high-refresh elitist - I don't think I can really see or feel anything past 120 - but I can 100% feel the difference when web browsing or looking over Excel spreadsheets on a 60 Hz vs. a 120 Hz monitor. It's absolutely tolerable, but I'm not paying 8K TV money for a "tolerable" experience. My only requirement for spending exceptional amounts of money is an exceptional experience, which just isn't there right now. I'd rather replace my current display with a 4K OLED TV that accepts a 240 Hz input than go 8K60. And I actually think TVs can only accept a maximum of 4K165 right now for some arbitrary reason set by the HDMI Forum, but double-check those numbers and the reason behind them, because I'm not saying that with 100% certainty.

So, for now, I'll have to deal with sitting arm's length away from my 65" 4K screen for my EvE gaming needs and being able to see individual pixels because the DPI is low. Part of me was considering a 4K120 projector for a while, because the way the light is, well, projected naturally creates a blending effect that is somehow both sharp and blurry at the same time (think CRT). But the logistics of that whole idea are, well, absurd, to put it politely.
 