
ASRock Authorized RX 6900 16G Review

Rating: 4.8 out of 5 (Very good) · 6 reviews

Categories: 🧰 Computer Internal Components, 💻 Computer Components


Media: 4 product images. Photos by authors: 8 images.

Details

Series: RX6900XT PGD 16GO
Brand: ASRock
Graphics Coprocessor: AMD Radeon RX 6900 XT
Video Output Interface: HDMI
Chipset Brand: AMD

Description of ASRock Authorized RX 6900 16G

Memory: 16 GB GDDR6. Interface: PCI Express 4.0 x16. Ports: 3 x DisplayPort, 1 x HDMI. Maximum resolution: 7680 x 4320 pixels. Chipset: AMD.

Reviews

Global ratings: 6
  • 5 stars: 5
  • 4 stars: 1
  • 3 stars: 0
  • 2 stars: 0
  • 1 star: 0


Revain rating: 5 out of 5

Perfect purchase for me, exactly what I was looking for!

The card has the power draw of a 6800 XT but is more powerful in games. It has three 8-pin 12V connectors, which lets the chip run at 2500+ MHz. It handles any game at 3440x1440 without straining. The cooling is powerful: a huge heatsink and three fans. Weight is 1734 g; with that weight it would be prone to sagging, but the card has a rigid metal frame along the PCB, so it does not sag at all. The frame is attached to the back of the case…

Pros
  • The price is not much higher than the younger AMD models and much lower than Nvidia's
  • AMD's top card at the moment
  • Factory overclock
  • Powerful cooling system
  • Reinforcement plate in the card's frame, preventing it from sagging and bending
  • Fairly cool and energy efficient compared to its counterparts from the green camp
  • Powerful and not noisy
Cons
  • Price, because of the mining boom
  • Warranty is 2 years, not 3
  • No USB Type-C output (I do not need it, but it may come in handy for someone; the reference card has it)
  • Ethereum hashrate is at the 6800's level, which is cheaper
  • No second BIOS
  • Memory will not overclock beyond +140 (a driver-level limit), which matters for Ethereum miners

Revain rating: 4 out of 5

Not a bad product, quite normal quality.

Initially a pointless card: slightly faster than the RX 6800 (XT) and 3080 and slightly slower than the 3090, but at its price nobody needed it. It was too poor in price per frame for ordinary people, and enthusiasts had the 3090. That was the case until prices leveled out according to Ethereum megahashes. And since the memory subsystem is the same as on the RX 6800 (XT), with a 256-bit bus (compensated by the cache), they have the same mining performance. At…

Pros
  • Impressive performance: RDR2 and Valhalla at 4K/60, or 120+ fps at 2K. Cyberpunk gives 70 to 140 fps depending on a single SSR setting.
  • I really like AMD's adaptive sharpening filter (yes, a strange plus), because "soapy" TAA is everywhere now.
  • AMD, unlike Nvidia, mainly backs open technology standards, which win in the long run; think G-Sync, PhysX, HairWorks. The same will happen with DLSS and RTX. Add AMD hardware in consoles, which will also push developers toward AMD technologies. Much of it is built on Microsoft and DX12 technologies (DirectML, an analogue of DLSS). Developer tools for AMD FidelityFX already exist on consoles, so they will definitely use all of it. The flip side is that Nvidia benefits too, because AMD keeps everything open: in effect Nvidia gets its own technologies plus AMD's and Microsoft's.
  • When switching from Nvidia, I found all the same functions in AMD's software.
  • 16 GB of memory. 8 GB is not enough even at 2K, let alone 4K.
  • SAM (Resizable BAR) is already actively supported everywhere, not just on Ryzen 5000; AMD pushed the adoption of this PCI-E feature. Nvidia implemented it too, but it still works better on AMD (optimization, probably). On an Intel Z390 it works fine.
Cons
  • This particular model heats up to 80+ °C on the GPU, with the hot spot near 90 °C in stress tests, while the fans run at 50%. Keeping the temperature lower makes it very noisy, and given that practically everyone advises raising the Power Limit on these cards first thing, that is not great. The fans run at 100% in stress tests if you hold the temperature within 75 °C.
  • No DLSS, and ray-tracing performance is a cut below Nvidia (literally a generation behind; this card is at RTX 2080 level). Moreover, the DLSS analogue will work worse on AMD cards, and so will ray tracing: in pure hardware terms the card trails in RT core count and has no analogue of the tensor cores used for DLSS.
  • Bulky.
  • Still a poor value in price per frame in games.

An acquaintance bitten by the mining bug asked me for my 5700 XT for 75 thousand; I added another 80 from the money set aside for a 6900 XT and made peace with the expense. A nice bonus was 6 thousand in cashback from the marketplace (the maximum on this card). Overall everything is fine: I played with memory and GPU overclocking, undervolting and fan curves, and it does not fail at anything. I did not raise the power limit as overclockers recommend; it is already more than enough as is.

Pros
  • Actually available in stock
  • Relatively cool (84 °C hot spot, 68 °C GPU)
  • Factory overclock from 2340 MHz, in practice boosting to 2510 MHz
  • 170+ fps at WQHD in modern shooters and battle royales with settings optimized to reduce input lag
  • 6k cashback at a purchase price of 154
Cons
  • It is unclear why there are three 8-pin connectors when the power limit is set to 280 W from the factory (one 8-pin connector supplies 150 W, so three give 450 W, plus 75 W via the PCIe slot; see the quick budget check after this list)
  • No other shortcomings; it does not even whine
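
To make the reviewer's arithmetic explicit, here is a minimal sketch in Python of the power budget those three connectors provide versus the stated 280 W factory limit. The wattages are taken straight from the review; the variable names are illustrative, not anything from the card's documentation.

```python
# Power budget check using the figures quoted in the review (assumed values):
# three 8-pin PCIe power connectors at 150 W each, plus 75 W from the PCIe slot,
# against the reviewer's stated 280 W factory power limit.
EIGHT_PIN_W = 150      # spec limit per 8-pin PCIe connector
SLOT_W = 75            # spec limit for the PCIe x16 slot
POWER_LIMIT_W = 280    # factory power limit quoted in the review

connectors = 3
available_w = connectors * EIGHT_PIN_W + SLOT_W   # 3 * 150 + 75 = 525 W
headroom_w = available_w - POWER_LIMIT_W          # 525 - 280 = 245 W

print(f"Deliverable power: {available_w} W")
print(f"Headroom over the factory limit: {headroom_w} W")
```

The point of the complaint stands: the connectors can deliver far more power than the factory limit ever asks for.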

Please note that however much more efficient this card is than the 3090 in terms of heat dissipation, it can still draw 300 W+. So do not expect temperatures to be fine in a stuffy case. I have the stock 2x140 mm intake and 1x140 mm exhaust fans in a Fractal case, and they are not quite enough. A slightly opened window (possible in my situation) lowers the chip's operating temperature from ~63-64 °C to ~55-56 °C (in my conditions). In general, about the 6900: I have…

Pros
  • Cooling efficiency, even though it is only 2.5 slots. (It is not a furnace and it does not scream, but I advise setting your own fan curve right away, because the stock profile is bad: it lets the card heat up to 60 °C and only then starts cooling. It is easier to give it fan speed as soon as it is under load, without any critical sacrifice in silence; see the sketch after this review.)
  • No obvious squeaks, rustling, etc. (I do not sit closer than 80 cm to the case.)
Cons
  • The design is an acquired taste overall, but these days you are more likely to buy what is available and cheaper rather than what fits the build perfectly. The central shroud panel would be better in matte; as it is, the lighting makes it look cheap.
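
As an illustration of the custom-curve advice above, here is a minimal sketch in Python of a piecewise fan curve that ramps up earlier than a stock profile that waits until 60 °C. The temperature and fan-speed breakpoints are my own illustrative assumptions, not the reviewer's exact settings.

```python
# Illustrative fan curve: (GPU temperature in °C, fan speed in %) breakpoints.
# The exact values are assumptions for demonstration only.
CUSTOM_CURVE = [(40, 30), (50, 45), (60, 60), (70, 80), (80, 100)]

def fan_speed(temp_c, curve=CUSTOM_CURVE):
    """Linearly interpolate fan speed (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

if __name__ == "__main__":
    for t in (45, 55, 65, 75):
        print(f"{t} °C -> {fan_speed(t):.0f}% fan speed")
```

The idea is simply that the fans are already spinning meaningfully by 50 °C instead of sitting idle until 60 °C, trading a little noise for a flatter temperature profile.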

Revain rating: 5 out of 5

Perfect product for any user!

I bought it in St. Petersburg for 120k at Elko; I just got lucky, apparently someone did not pick theirs up and it was offered to me. It already costs 160k now. Until Bitcoin collapses like it did the first time, from 20 down to 4, there will be no price drop. I did not plan on AMD, I wanted a 3090, but paying 200-300k for a gaming video card is just crazy. After a month of testing, I think I will no longer go for a 3090…

Pros
  • The main advantage is that it is $500 cheaper than the 3090, and the average loss is 10 fps. RTX, frankly, is not worth a penny. And for the first time, I have no complaints about AMD's drivers.
Cons
  • I have always chosen Nvidia, though there have been occasional attempts to game on ATI/AMD. An unfortunate experience was the 1900 XTX (it burned out twice); a more successful one, the R9 270-280. Before this I played on a 1080 GameRock Edition for a couple of years. I was considering the similarly top-end 3090, but it costs nearly 300k.