AMD Radeon HD6990 Review


The tests were performed in a controlled, air-conditioned room with the temperature maintained at a constant 25°C – a comfortable environment for the majority of people reading this.

Idle temperatures were measured after sitting at the desktop for 30 minutes. Load measurements were taken by playing Crysis Warhead for 30 minutes and recording the peak temperature. We have also included Furmark results, recording maximum temperatures throughout a 30-minute stress test. All fan settings were left on automatic.

[Chart: idle and load temperature results]

1.12 Voltage setting

Let’s break down the results above. First, at the 830MHz core clock, the card voltage is set to 1.12V. Under Furmark load the fan never exceeds 50 percent, and the card maintains a temperature of 74°C when gaming.

[Images: Furmark stress test results]

This rises to 82°C with the earlier version of Furmark and 88°C with the latest v1.9.0 (set to extreme burning mode). These are not realistic ‘real world’ load conditions but are included for interest.

[Chart: idle temperatures]

The fan rotates at 28 percent when idle, which translates to a speed of around 1,400rpm. Under load this rises to 50 percent, around 2,900rpm. As documented on the previous page, this produces noise emissions of around 40 dBA.
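As an aside, the duty-cycle/RPM pairs measured above can be used to estimate the fan speed at intermediate settings. Below is a minimal sketch (assuming roughly linear behaviour between the measured points; the real fan curve is defined by the card’s BIOS, so this is purely illustrative):

```python
# Measured duty-cycle/RPM pairs from the review:
# 28% idle -> ~1,400rpm, 50% load -> ~2,900rpm, 65% peak -> ~3,700rpm.
MEASURED = [(28, 1400), (50, 2900), (65, 3700)]

def estimate_rpm(duty: float) -> float:
    """Estimate fan RPM for a duty-cycle percentage by linearly
    interpolating between the measured points (clamped to that range)."""
    points = sorted(MEASURED)
    if duty <= points[0][0]:
        return float(points[0][1])
    if duty >= points[-1][0]:
        return float(points[-1][1])
    for (d0, r0), (d1, r1) in zip(points, points[1:]):
        if d0 <= duty <= d1:
            return r0 + (r1 - r0) * (duty - d0) / (d1 - d0)

print(f"~{estimate_rpm(40):.0f}rpm at 40 percent duty")
```

This is only a rough fit; the measured points themselves suggest the curve flattens slightly at higher duty cycles.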

1.175 Voltage setting

[Images: fan speed readings at mid and full load]

When the card is set to the performance or ‘OC’ mode, the voltage increases to 1.175V. When loaded in Furmark, the card’s fan profile changes, causing a ‘shift’ in fan speed between 50 percent and 65 percent.

[Charts: Furmark temperature graphs]

The temperature graph above dips and rises as the fan speed fluctuates between 2,900rpm and 3,700rpm. This produces a rather annoying fan whine on a fairly regular basis, as the profile adjusts the fan speed to compensate for rising (and falling) temperatures. While this can be unpleasant (noise emissions peak at over 48 dBA), it isn’t really indicative of a real-world gaming scenario.

Below we have included a video showing the effectiveness of the AMD HD6990 cooler after being loaded in Furmark and returning to ‘idle’ temperatures. It takes several minutes for the card to return to an idle state.



  • Steve Ruxton

    Awesome, time to get everything I need ready and read what I’ve been waiting for.

  • Seth

    damn that is a powerful card!

  • Ned

    Very impressive card, but it’s out of reach for most people. It costs a few quid in the UK.

  • Teddy

    GTX590 up for the challenge, that’s what I want to see :)

  • John

    I wish that AMD would sort out their coolers; those fans have been used now for about 10 years. It’s really a tough decision buying one of these due to the noise levels, but no more so than all the other dual-GPU cards from AMD. At least this one isn’t running at 100°C+.

  • Tim

    The kettle is on, time for a good read, thanks.

  • KoRn

    What a mega product, I wasn’t expecting it to be so quick. Nvidia need to come up with something now :) I’m sure they will.

  • Jimbo

    Actually, when you look at the price, which will be say £550, compared to the GTX480 at £400, the performance per pound is exceptionally good for the added price difference.

  • Colin

    Well I might be very wrong here, but Nvidia will have a very tough time beating this card. The temperatures are the killer.

  • Ben

    Two GTX570s in a high-bandwidth state could possibly beat this board, but the engineering would be difficult; a single GTX570 runs hot as it is.

    I think their cooler is extremely impressive for a reference solution. Sure, it’s loud, but they aren’t letting it get into the 90°C range, which is always good to see. That’s why their older dual-GPU cards failed; this one shouldn’t.

    I could live with the noise for that level of power. No more worrying about IQ settings.

  • VeicNt

    What a fantastic product, even if it’s more expensive than my monthly rent!

  • Serge

    It will be really interesting to see what Nvidia come up with. I think they can beat this, but it’s all about the cooling. AMD struggled with this, it’s clear to see, and it’s noisy. I dread to think how loud the Nvidia card would be.

  • Raymond

    What’s next? 550W graphics cards? It’s really starting to get out of hand now. Two of these would consume 800W when gaming? The quarterly bill would be insane.

  • Gainward

    Just a heads up for anyone buying the card and wanting to remove the stock cooler: there is a small screw on the back that is covered by two stickers (it’s under the two stickers that look like a barcode). On removing them you will notice a warranty-void logo underneath. I just wanted to point it out to you all.

    It didn’t bother us too much here, seeing as ours is a sample, but I know that to some, dropping £550-ish in the UK is quite a bit of cash, and if all you are doing is having an inquisitive look it seems a shame to void your warranty :-S

  • Roger

    So if you change the BIOS switch to OC mode you void your warranty. If you take the cooler off for a look you also void it? Anything else? Plugging in a DVI cable, maybe?

  • Gainward

    “whats next? 550W graphics cards? its starting to really get out of hand now. two of these would consume 800w when gaming? the bill at the quarter would be insane”

    I also just wanted to touch on that comment. Whilst these cards seem excessive to some, you have to remember that at this moment in time they are, as I dub it, the ‘Veyron moment’, like the ‘Concorde moment’ of the past. They might not be practical for most people, but it’s like me saying to my team: look, let’s see what we can do. Not only that, but holding the crown of fastest single card (agreed, a single card but multi-GPU) goes a long way towards brand loyalty and advertising.

    An example I like to use is the GTX 560. It’s a fantastic card and in many ways better… hang with me a second. The actual raw power you get for sub-£200 is incredible; also factor in that it’s quiet and won’t eat through your electricity like a moth through Primark. But to not produce these high-end cards would be criminal. We need people to keep pushing as hard and fast (that sounds so wrong) at the boundaries (agreed, quite crudely in the 6990’s case, but hey, I don’t sit in either camp; just wait a few days for the 590 – brute but crude).

    With the reduction to 28nm, power consumption and heat will be brought down further (no need to point out that there are a lot of factors at play here; it was just a generalisation), but sure, we could see just as big and hot cards within practical reason.

    I would not say out of hand; I would say it’s progress, and with progress we can learn.

  • Mopar63

    DO NOT take off the cooler. According to AMD, this time around they have used a new thermal compound they are transitioning to that is better than anything on the market. The claim is that once this compound is broken, i.e. the cooler removed, it cannot work any more, and you will suffer higher temps when you put on another compound to replace it.

  • Brian

    The logic with the GTX560 is true. I don’t think many people can afford a 6990, or even a GTX580, but they are showcase cards of how ‘advanced’ the companies are. And I agree with the intro in the review that people always opt for a card in the range of a ‘performance leader’ higher up the chain.

    Interesting to read about the phase change paste.

  • Gainward

    “Your comment is awaiting moderation”

    You wouldn’t mind spell-checking it as well, would you ;-P

    Damn predictive text

  • Erik

    It is a mighty impressive card, but I see it as totally overkill. I’m probably one of the few. I can understand the point of it and it’s extremely quick, but I’d be happy with a 460 or 560, or even a 6870.

    Many people buy these cards and have a 1080p screen, which to me is the stupidest decision you could ever make; even a 2560×1600 screen is easily powered by a GTX570 or 6970.

  • Hakuren

    Now give us some aftermarket cooling versions for both the 6990 and 590. Drooling already.

  • Keeks

    These VGA cards drive me insane…!
