The fans are noisy even at around 50 degrees. You can set a fan curve manually, of course, but on the automatic settings it makes noise. Under load the chip heats up to 55 degrees, while the GPU hotspot (that is, the hottest point on the die, not the average) reaches 85 degrees. What explains a 30-degree gap between the average temperature and the hottest point of the chip? Gigabyte support said the card has no hotspot sensor at all, so where does AIDA64 get these readings from?

Also, while playing Far Cry 6 I saw artifacts in full screen for about half a second in one spot in the game, flying out of a tunnel. But I attribute that to software rather than hardware. Apparently it's true that Nvidia optimizes games better.

It delivers a stable 60+ FPS at maximum settings in FHD, so it's fine for gaming. At idle the temperature is 30-33 degrees. Despite the drawbacks, I think the card is good for the money I paid for it, namely 23 at the beginning of the year with a promo code and cashback.
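On the hotspot question: modern GPU dies carry many internal temperature sensors, and the driver reports both an average/edge value and the maximum across them (the "junction" or hotspot reading), which is why monitoring tools can show it even when vendor support claims there is no dedicated hotspot sensor. Below is a minimal sketch of where such a value comes from, assuming a Linux machine with an AMD card and the amdgpu driver (the `hwmon2` path is an assumption and varies per system); on Windows, AIDA64 reads the same junction sensor through the vendor driver.

```python
# Minimal sketch: reading GPU edge and junction (hotspot) temperatures
# on Linux via the amdgpu hwmon interface. The driver exposes labelled
# sensors such as "edge", "junction" and "mem" as temp*_label files,
# with values in millidegrees Celsius in the matching temp*_input files.
# NOTE: the hwmon2 path below is an assumption; enumerate /sys/class/hwmon
# on your own system to find the amdgpu entry.
from pathlib import Path

def read_gpu_temps(hwmon: Path = Path("/sys/class/hwmon/hwmon2")) -> dict[str, float]:
    temps = {}
    for label_file in sorted(hwmon.glob("temp*_label")):
        name = label_file.read_text().strip()  # e.g. "edge", "junction", "mem"
        input_file = label_file.with_name(label_file.name.replace("_label", "_input"))
        temps[name] = int(input_file.read_text()) / 1000.0  # millidegrees -> degrees C
    return temps

if __name__ == "__main__":
    for name, value in read_gpu_temps().items():
        print(f"{name:10s} {value:5.1f} C")  # "junction" is the hotspot reading
```

The 30-degree gap itself is normal: the hotspot is the worst single sensor on the die, so it sits well above the averaged "GPU temperature" under load.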