OIDN-RTX-3080-Ti.jpg (1980x1203)

🧵 Intel's Open Image Denoise (OIDN) now supports GPU-based denoising

Anonymous No. 907834

http://www.cgchannel.com/2022/05/open-image-denoise-now-runs-on-gpu-as-well-as-cpu/

>Open-source render denoising system Open Image Denoise (OIDN) now runs on GPU as well as CPU.

>An image tweeted by Intel senior principal engineer Jim Jeffers shows an experimental build of OIDN inside Blender’s Cycles renderer on a laptop with an Nvidia GeForce RTX 3080 Ti consumer graphics card. According to Jeffers, the implementation also supports GPUs from AMD and Intel itself.

>Unlike OptiX, OIDN isn’t hardware-specific: while it was designed for Intel 64 CPUs, it supports “compatible architectures”, including AMD CPUs and, as of last year’s 1.4 update, Apple’s new M1 processors. To that list of compatible architectures, we can now include GPUs.

>According to the tweet, OIDN now runs on AMD, Nvidia and Intel’s own Xe GPUs, which now include the firm’s new Arc A-Series of discrete graphics cards. The functionality isn’t available in the current public release, Open Image Denoise 1.4.3, but Intel’s demo image, included at the top of this story, shows it implemented inside Blender’s Cycles renderer. The image shows the output of OIDN running on an Nvidia GeForce RTX 3080 Ti laptop GPU and an Intel Core i9-12900H CPU, side by side with that of OptiX running on the GPU alone.

>If implemented in public builds of software, the change should enable developers to take advantage of the full processing power of users’ machines, and to support CPU and GPU denoising with a single code base.
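For context on the "single code base" point: this is roughly what the public OIDN 1.4 C++ API looks like today on CPU. A minimal sketch, assuming the shipped 1.4 interface; the article implies the GPU path would be driven through the same filter API, with only the device selection changing once GPU device types become available.

```cpp
// Minimal sketch of the OIDN 1.4 C++ API (CPU path only in public builds).
// The GPU support described above is experimental and not in 1.4.3;
// presumably it would sit behind the same DeviceRef/FilterRef API.
#include <OpenImageDenoise/oidn.hpp>
#include <vector>

int main() {
    const int width = 1920, height = 1080;
    std::vector<float> color(width * height * 3);   // noisy beauty pass (RGB floats)
    std::vector<float> output(width * height * 3);  // denoised result

    // Default device: in 1.4 this is a CPU device.
    oidn::DeviceRef device = oidn::newDevice();
    device.commit();

    // "RT" is the generic ray-tracing denoising filter.
    oidn::FilterRef filter = device.newFilter("RT");
    filter.setImage("color",  color.data(),  oidn::Format::Float3, width, height);
    filter.setImage("output", output.data(), oidn::Format::Float3, width, height);
    filter.set("hdr", true);  // input is HDR radiance
    filter.commit();
    filter.execute();

    const char* errorMessage;
    if (device.getError(errorMessage) != oidn::Error::None)
        return 1;  // denoising failed
    return 0;
}
```

Renderers that also feed OIDN albedo and normal AOVs (via additional `setImage` calls) generally get noticeably better edge preservation than with the color pass alone.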

Anonymous No. 907840

nobody cares

Anonymous No. 907844

>>907834
If this gets up and running on AMD cards and it's fast enough to be competitive with Nvidia's OptiX denoiser, I might be persuaded to get an AMD RX 7000 GPU instead of an RTX 4000.

Anonymous No. 907891

That's great. OptiX is nice for fast previews but it's also complete trash compared to OIDN.