🧵 FP32 vs FP16 on Nvidia Ampere
Anonymous at Thu, 27 Jan 2022 06:12:02 UTC No. 878351
https://www.nvidia.com/content/PDF/
On page 14, this document seems to be saying that using 16-bit floats instead of 32-bit floats no longer gives the ~2x throughput increase it did on older architectures.
Is this accurate, or is the truth more complicated than that?
Also, why did Nvidia do this? AMD's Radeon 6000 series still gets the ~2x performance increase.
Does this change just not make a difference in games?
What about compute applications?
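If anyone wants to check on their own card, here's a rough CUDA microbenchmark sketch (my own code, not from the whitepaper; kernel names and launch config are arbitrary). It times a dependent FMA chain in FP32 against the same chain in packed half2, which does two FP16 FMAs per instruction, so equal kernel times would mean FP16 really delivers ~2x the FLOPs:

#include <cstdio>
#include <cuda_fp16.h>

constexpr int ITERS = 1 << 20;

// Long dependent FMA chain in FP32: ITERS instructions, 2*ITERS FLOPs per thread.
__global__ void fma_fp32(float *out) {
    float a = 1.0001f, b = 0.9999f, c = 0.5f;
    for (int i = 0; i < ITERS; ++i)
        c = a * c + b;
    out[blockIdx.x * blockDim.x + threadIdx.x] = c;
}

// Same chain in packed FP16: each __hfma2 does two FMAs, so 4*ITERS FLOPs per thread.
__global__ void fma_fp16(__half2 *out) {
    __half2 a = __float2half2_rn(1.0001f);
    __half2 b = __float2half2_rn(0.9999f);
    __half2 c = __float2half2_rn(0.5f);
    for (int i = 0; i < ITERS; ++i)
        c = __hfma2(a, c, b);
    out[blockIdx.x * blockDim.x + threadIdx.x] = c;
}

int main() {
    // Enough threads in flight to hide pipeline latency and measure throughput.
    const int blocks = 1024, threads = 256, n = blocks * threads;
    float *out32; __half2 *out16;
    cudaMalloc(&out32, n * sizeof(float));
    cudaMalloc(&out16, n * sizeof(__half2));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    float ms32 = 0, ms16 = 0;

    fma_fp32<<<blocks, threads>>>(out32);   // warm-up
    cudaEventRecord(start);
    fma_fp32<<<blocks, threads>>>(out32);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    cudaEventElapsedTime(&ms32, start, stop);

    fma_fp16<<<blocks, threads>>>(out16);   // warm-up
    cudaEventRecord(start);
    fma_fp16<<<blocks, threads>>>(out16);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    cudaEventElapsedTime(&ms16, start, stop);

    // The FP16 kernel does twice the FLOPs in the same instruction count,
    // so the FLOP-rate ratio is 2 * ms32 / ms16: ~2.0 means the classic
    // doubled FP16 rate, ~1.0 means FP16 runs no faster than FP32.
    printf("FP32: %.2f ms  FP16(half2): %.2f ms  FP16/FP32 FLOP ratio: %.2f\n",
           ms32, ms16, 2.0f * ms32 / ms16);
    return 0;
}

Build with something like nvcc -O2 -arch=sm_86 bench.cu (sm_86 is consumer Ampere) and compare the ratio across cards.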
Anonymous at Thu, 27 Jan 2022 07:06:23 UTC No. 878356
FUCK AMD
LONG LIVE NVIDIA CHADS!!!!