In a post on GPUOpen, a site for game and graphics developers, AMD may well have let slip that it plans to take a leaf from Nvidia's book of rendering tools by including a ray tracing denoiser in its next generation of FSR. And just as importantly, it would use a neural network to do it all.
Unless you’ve been firmly sticking with an old graphics card and consciously ignoring every GPU development in the past six years, you’ll know that AMD, Intel, and Nvidia have all been furiously busy implementing techniques to improve ray tracing performance and visual quality.
The latter is greatly affected by the number of rays used to calculate the lighting, shadows, reflections, and so on. Unfortunately, even on monstrous graphics cards like AMD's RX 7900 XTX and Nvidia's RTX 4090, ray tracing is extremely demanding, so games only cast a relatively small number of rays per pixel.
That results in a very 'noisy' image—grainy in appearance and often full of white spots—so games have to carry out a process called denoising to clean it up. While the likes of Cyberpunk 2077, Black Myth: Wukong, and Alan Wake 2 employ their own denoising systems, Nvidia has an AI-powered one called Ray Reconstruction (RR).
Ray Reconstruction is all about making ray-traced images look much better and more accurate, rather than improving performance, and in Cyberpunk 2077, it's noticeably better than the game's own denoiser.
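To see why fewer rays mean more grain, it helps to know that path tracing is a Monte Carlo method: each pixel's colour is an average of randomly sampled light paths, and the error in that average only shrinks as the sample count grows. The toy C++ sketch below (purely illustrative, not taken from any real renderer) estimates a single 'pixel' value the same way and shows the noise falling as more samples are averaged:

```cpp
// Toy illustration: Monte Carlo estimates get less noisy as sample
// counts rise. Path tracers average random light paths per pixel in
// the same way; with only a handful of rays per pixel, the variance
// (grain) is large, and that's what a denoiser has to clean up.
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> dist(0.0, 1.0);

    // The "true" pixel brightness we want: the average light
    // contribution, here modelled as x^2 over [0,1], which is 1/3.
    const double truth = 1.0 / 3.0;

    for (int samples : {1, 4, 16, 64, 256}) {
        double sum = 0.0;
        for (int i = 0; i < samples; ++i) {
            double x = dist(rng);  // one random "ray"
            sum += x * x;          // its light contribution
        }
        const double estimate = sum / samples;
        std::printf("%4d rays -> estimate %.4f (error %+.4f)\n",
                    samples, estimate, estimate - truth);
    }
    return 0;
}
```

The catch is that quadrupling the ray count only halves the error, which is why brute-forcing a clean image in real time is impractical and a denoiser is needed instead.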
But the GPUOpen post makes it clear that Nvidia won’t be the only GPU vendor offering such a feature in the near future. “We are actively researching neural techniques for Monte Carlo denoising with the goal of moving towards real-time path tracing on RDNA GPUs.”
AMD's RDNA 2, 3, and 3.5 GPUs can all handle denoising right now, but only via denoisers provided by the game in question, and the standard shader cores do all the work. The fact that the research is specifically about using a neural network for the task means AMD is very much on board with Nvidia's approach of using AI to boost ray tracing results.
But does this mean that future RDNA GPUs will have dedicated hardware for doing the AI calculations? While Nvidia's RTX chips have discrete tensor cores for this job, AMD doesn't and instead uses specific instructions (Wave Matrix Multiply Accumulate, or WMMA) running on the standard shader cores.
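The work those instructions accelerate is conceptually simple: neural network inference is dominated by matrix multiply-accumulate operations. Below is a plain C++ sketch of the 16×16 tile operation that a single RDNA 3 WMMA instruction carries out across a whole 32-lane wavefront (exposed to developers via compiler builtins such as __builtin_amdgcn_wmma_f32_16x16x16_f16_w32). The scalar code here is purely illustrative of the maths, not how the hardware is actually programmed:

```cpp
// Scalar reference for the fused tile operation a WMMA instruction
// computes: d = a * b + c, where a, b, c, and d are all 16x16
// matrices. On RDNA 3, one instruction performs this across a
// wavefront, with the tiles spread over the lanes' registers.
#include <array>
#include <cstdio>

constexpr int N = 16; // WMMA tile dimension on RDNA 3

using Tile = std::array<float, N * N>;

Tile wmma_reference(const Tile& a, const Tile& b, const Tile& c) {
    Tile d{};
    for (int row = 0; row < N; ++row) {
        for (int col = 0; col < N; ++col) {
            float acc = c[row * N + col]; // start from the accumulator
            for (int k = 0; k < N; ++k) {
                acc += a[row * N + k] * b[k * N + col];
            }
            d[row * N + col] = acc;
        }
    }
    return d;
}

int main() {
    Tile a, b, c;
    a.fill(1.0f);
    b.fill(2.0f);
    c.fill(0.5f);
    const Tile d = wmma_reference(a, b, c);
    // Every element should be 16 * (1 * 2) + 0.5 = 32.5.
    std::printf("d[0][0] = %.1f\n", d[0]);
    return 0;
}
```

The appeal of dedicated matrix hardware is that it executes these fused tile operations far faster than general-purpose shader ALUs churning through one multiply-accumulate at a time.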
That lack of dedicated hardware might change in RDNA 4, for two reasons. One is the fact that Sony's PlayStation 5 Pro has dedicated hardware for accelerating the AI routines behind its new PSSR upscaler, and AMD will certainly be aware of the benefit discrete hardware brings to such tasks. The second is one of the goals listed in AMD's denoiser research: "Highly optimized performance for real-time path tracing at 4K resolution."
To me, that alone points to AMD having specific hardware for running the neural networks, because at 4K, general-purpose shader cores just aren't going to be fast enough unless one has a small mountain of them. RDNA GPUs are the only ray tracing chips in the desktop market that don't have dedicated tensor/matrix units, so it's inevitable that AMD will follow suit at some point.
Coupled with the fact that AMD has previously stated that it plans to have all its gaming devices use AI for upscaling too, I'd say there's a very good chance that RDNA 4 chips will have matrix cores used for FSR 4's AI-powered upscaling, frame generation, and denoising.
That said, AMD has always been of the mind that its FSR package should run on as many GPUs as possible—not just Radeon cards, but those from Intel and Nvidia too, as long as they have the right level of shader support.
If the new tech were exclusive to one generation of RDNA hardware, it could well backfire on AMD, given that its discrete GPU market share is pretty small. It's possible that AMD could offer a two-tier FSR 4 system, as Intel does with XeSS, where the full AI-powered package only works on RDNA 4 chips, while a slower, less impressive version is available to everyone else.
Until we know more, it's all just guesswork, of course, but Radeon fans should take comfort in the fact that AMD is working hard on making its GPUs as modern as possible.