nVidia have had a rocky year, with Fermi delays costing them sales and market share. With the release of the GTX 480 and 470, people still weren't sold on the new architecture as it ran hot, drew a lot of power and didn't offer a considerable improvement over ATI's existing HD 5000 series hardware. With the release of the GTX 460, however, the public finally embraced Fermi, and rightly so: for the price, these cards are the market leaders right now, with astounding performance, low power drain, cool running temperatures and massive overclocking potential.
Today we are looking at a pair of the latest Gigabyte cards, which not only offer preconfigured overclocks but also a very impressive looking heatpipe cooler with a dual fan configuration. We are also going to test the cards in SLI and put them head to head against ATI's HD5870s running in Crossfire. Why? Well, we noticed that SLI performance with these cards is top drawer and we wanted to pit them against the high end AMD boards to see just how much performance you get for the money.
The Gigabyte cards ship clocked at 715MHz on the core and 900MHz on the memory. The GTX 460 reference clocks are 675MHz core and 900MHz memory.
GIGABYTE Ultra Durable VGA Series
- Powered by NVIDIA GeForce GTX 460 GPU
- Integrated with 1024 MB / 256-bit GDDR5 video memory
- GIGABYTE WINDFORCE™ cooling design
- GIGABYTE Ultra Durable VGA High Quality Components
- Supports Microsoft DirectX 11 and OpenGL 4.0
- NVIDIA SLI Ready
- NVIDIA Pure Video HD technology & 3D Vision Surround Ready
- Supports NVIDIA PhysX / CUDA technology
- Features Dual Link DVI-I*2/ mini-HDMI 1.3a connector
- 5.5% better performance than standard HD 5850
- 15% better performance than standard HD 5830
- 4x the DX11 tessellation performance of HD 5830
The Gigabyte GTX 460s are supplied in a box with possibly the coolest artwork I have seen this year. What is there not to like about giant blue robotic eyes?
The bundle contains a couple of power adapters if you are running an older power supply, a driver CD, a manual and a very handy mini HDMI to HDMI converter. It is not often we see these little converters included in the bundle, so bonus points to Gigabyte.
The Gigabyte cards are supplied with a rather attractive glossy black cooler with a dual fan configuration for additional cooling prowess. Gigabyte call this the WINDFORCE 2X cooling design, which is meant to lower air flow turbulence.
In Gigabytes words: “By unique design, WINDFORCE 2X even enlarges air channel on the graphics card vents and creates a more effective air flow system in chassis. This special design helps heat dissipate quickly from GPU. In addition, WINDFORCE 2X is equipped with 2 copper heat-pipes which strengthen the speed of heat dissipation.”
These fans are bolted to a heatsink with two heatpipes, a larger design than the reference model, so in theory it should be able to remove more heat. The fans blow directly onto the PCB and memory below, meaning greater cooling performance. On the downside, this hot air is pushed into the chassis rather than out the back.
The VRMs have a small dedicated heatsink, which is always good to see … they will also get a high level of airflow from the dual fan configuration.
The PCB layout follows the reference NVIDIA design, although it is finished in Gigabyte's famous blue colour, which looks great. The GF104 has been overclocked from 675MHz to 715MHz, which also means the stream processors have increased in speed from 1,350MHz to 1,430MHz. For some reason Gigabyte haven't touched the memory, which is an unusual decision for an OC board – it is running at the reference speed of 900MHz (3,600MHz effective) … but we will see how far we can crank it later.
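As a quick sanity check of those memory figures: GDDR5 transfers data four times per memory clock, which is why 900MHz is quoted as 3,600MHz effective, and on a 256-bit bus that fixes the theoretical bandwidth. A small sketch of the arithmetic (the helper names are ours, purely for illustration):

```python
# GDDR5 is quad-pumped: effective rate = 4 x memory clock.
def effective_mhz(mem_clock_mhz):
    return mem_clock_mhz * 4

# Theoretical bandwidth = bus width in bytes x effective transfer rate.
def bandwidth_gb_s(mem_clock_mhz, bus_width_bits):
    return (bus_width_bits / 8) * effective_mhz(mem_clock_mhz) * 1e6 / 1e9

print(effective_mhz(900))        # 3600 MHz effective, as quoted above
print(bandwidth_gb_s(900, 256))  # 115.2 GB/s at reference memory clocks
```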
The PCB is a 2oz copper design, which doubles the density of the copper inner layer compared to a reference designed card. Japanese solid capacitors are used throughout to offer improved conductivity. Gigabyte have also used ferrite core/metal chokes, which basically means they are able to store energy longer and prevent rapid energy loss at high frequency. These technologies lead to a claimed 10-30% improvement in power switching loss and a 5-10% reduction in GPU temperatures.
The cards have two dual link DVI connectors and a mini HDMI port – the mini HDMI converter cable is supplied, which means all you need is an HDMI cable for connectivity to your TV or monitor. They require two 6 pin power connectors, following the reference design.
Today we are concentrating on SLI performance – we have covered several GTX 460 cards in the past. With recent price drops a GTX 460 SLI configuration is well within the financial means of a dedicated enthusiast gamer. We have seen these Gigabyte boards selling for £175 inc. VAT, a very reasonable price indeed.
Our system today is built around one of our fastest KitGuru test beds, based on an overclocked Intel 6 core 970 processor. We will test at 1080p, 1920×1200 and 2560×1600 where applicable. We don't like focusing entirely on 2560×1600 because a very small percentage of gamers own one of these panels (under 1%).
Processor: Intel Core i7 970 CPU @ 4.33GHz
Cooling: Coolit Vantage
Memory: GSkill Trident 2000MHz DDR3 (6GB)
PSU: Corsair AX850
Motherboard: MSI X58A-GD65
Hard Drive: OCZ Agility 2 120GB
Case: Lian Li PC 8FIB
Monitors: LaCie 730 30 inch LED screen & Panasonic 600Hz Viera TV
Comparison cards:
MSI Cyclone N460GTX 1GB
GTX465 Reference
GTX460 Reference
HD5850 Reference
HD5830 Reference
HD5870 CrossfireX
Catalyst 10.9
Forceware 260.52
Technical Equipment:
Keithley Integra unit
Thermal Diodes
Raytek Laser Temp Gun 3i LSRC/MT4 Mini Temp
SkyTronic DSL 2 Digital Sound Level Meter (6-130dBa)
Software:
Fraps Pro
Unigine Heaven Benchmark 2.1
3DMark Vantage
Crysis
Resident Evil 5
Aliens V Predator
Battlefield: Bad Company 2
Far Cry 2
All the latest BIOS updates and WHQL drivers are used during testing. We test under real world conditions, meaning KitGuru runs all games across five closely matched runs and averages the results to get an accurate, representative figure.
Our minimum frame rate game graphs have three main zones. Frame rates are sampled over a specific 30 second interval and then mapped into a chart. These are handy reference guides to detail worst case performance of the product being reviewed.
Over 30fps is the zone most people want at all times; this means perfectly smooth frame rates with no hitching.
Between 30fps and 25fps is the KitGuru ‘Playable’ zone, although some people might notice occasional stuttering in specific scenes.
Under 25fps is classed as the KitGuru ‘Danger Zone’, which means that the game experience will be less than impressive. Settings and/or resolution would need to be lowered to help smooth out the frame rate.
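The three zones boil down to a simple classification of each game's worst case figure. A rough sketch of the idea (the function and zone names here are ours, not production KitGuru tooling):

```python
def classify_fps(fps):
    """Map a minimum frame rate onto the three KitGuru zones."""
    if fps > 30:
        return "Smooth"       # over 30fps: no hitching
    if fps >= 25:
        return "Playable"     # 25-30fps: occasional stutter possible
    return "Danger Zone"      # under 25fps: lower settings or resolution

# The worst case figure is the minimum frame rate over the sampled interval.
samples = [41, 38, 29, 33, 36]
worst = min(samples)
print(worst, classify_fps(worst))  # 29 Playable
```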
Futuremark released 3DMark Vantage on April 28, 2008. It is a benchmark based upon DirectX 10 and therefore will only run under Windows Vista (Service Pack 1 is stated as a requirement) and Windows 7. This is the first edition where the feature-restricted, free of charge version could not be used any number of times. A 1280×1024 resolution was used with performance settings.
The results fall pretty much in line with what we would expect from a GTX 460 OC in SLI. You can view the comparison on the Futuremark ORB if you would like to check against your own system.
In Unigine's words: “Unigine is a top-notch technology that can be easily adapted to various projects due to its elaborated software design and flexible toolset. A lot of our customers claim that they have never seen such extremely effective code, which is so easy to understand. It is already used in the development of different projects (mostly games).”
Heaven Benchmark is a DirectX 11 GPU benchmark based on advanced Unigine engine from Unigine Corp. It reveals the enchanting magic of floating islands with a tiny village hidden in the cloudy skies. Interactive mode provides emerging experience of exploring the intricate world of steampunk.
- Efficient and well-architected framework makes Unigine highly scalable:
- Multiple API (DirectX 9 / DirectX 10 / DirectX 11 / OpenGL) render
- Cross-platform: MS Windows (XP, Vista, Windows 7) / Linux
- Full support of 32bit and 64bit systems
- Multicore CPU support
- Little / big endian support (ready for game consoles)
- Powerful C++ API
- Comprehensive performance profiling system
- Flexible XML-based data structures
We always run the Heaven Benchmark at the same settings so all video card results are easily comparable. We run at 1080p resolution with other settings left at default: shaders high, tessellation normal, anisotropy 4x and anti-aliasing off.
nVidia cards have very strong tessellation performance and we can see with this benchmark that the SLI solution averages almost 60fps. If more games used tessellation right now, nVidia would be in a very strong position.
Crysis Warhead, like the original Crysis, is based in a future where an ancient alien spacecraft has been discovered beneath the Earth on an island east of the Philippines. The single-player campaign has the player assume the role of (Former SAS) Delta Force operator Sergeant Michael Sykes, referred to in-game by his call sign, Psycho. Psycho’s arsenal of futuristic weapons builds on those showcased in Crysis, with the introduction of Mini-SMGs which can be dual-wielded, a six-shot grenade launcher equipped with EMP grenades, and the destructive, short ranged Plasma Accumulator Cannon (PAX). The highly versatile Nanosuit returns.
In Crysis Warhead, the player fights North Korean and extraterrestrial enemies, in many different locations, such as a tropical island jungle, inside an “Ice Sphere”, an underground mining complex, which is followed by a convoy train transporting an unknown alien object held by the North Koreans, and finally, to an airfield. Like Crysis, Warhead uses Microsoft’s Direct3D 10 (DirectX 10) for graphics rendering.
Testing was performed on a custom run of the Cargo level at 1080p in DX10, gamer settings.
The SLI configuration helps to ensure that the frame rates never dip into the sub 30 zone – we really do stress these cards out on one of the most intensive levels available with several firefights and explosions up close. Generally performance would be slightly higher but we wanted to focus on a ‘worst case' scenario.
Resident Evil 5, known in Japan as Biohazard 5, is a survival horror third-person shooter video game developed and published by Capcom. The game is the seventh installment in the Resident Evil survival horror series, and was released on March 5, 2009 in Japan and on March 13, 2009 in North America and Europe for the PlayStation 3 and Xbox 360. A Windows version of the game was released on September 15, 2009 in North America, September 17 in Japan and September 18 in Europe. Resident Evil 5 revolves around Chris Redfield and Sheva Alomar as they investigate a terrorist threat in Kijuju, a fictional town in Africa.
Within its first three weeks of release, the game sold over 2 million units worldwide and became the best-selling game of the franchise in the United Kingdom. As of December 2009, Resident Evil 5 had sold 5.3 million copies worldwide since launch, becoming the best selling Resident Evil game ever made.
We tested via DX10 with 8x AA, motion blur on, shadows and textures on, at the native 1920×1200 resolution of our panel.
We then cranked the anti-aliasing up to the C16xQ setting.
Then we upped the resolution to 2560×1600, native resolution on our 30 inch panel.
Even at 2560×1600 with C16xQ AA the cards are able to power through the frames, delivering a very smooth experience.
Aliens V Predator has proved to be a big seller since its release, and Sega have taken the franchise into new territory after acquiring it from Sierra. AVP is a DirectX 11 title and delivers not only advanced shadow rendering but high quality tessellation for the cards on test today.
To test the cards we used a 1080p resolution with DX11, texture quality very high, MSAA samples 1, 16x AF, ambient occlusion on, shadow complexity high and motion blur on.
With two cards our average frame rate increases from the low 40s to around 75, which is good scaling. We will revisit this game at 2560×1600 later in the review.
Battlefield: Bad Company 2 features 15 vehicles, including new additions, such as the UH-60 Black Hawk, a quad bike, a four-man patrol boat, a personal watercraft, a ZU-23 mounted on a BTR-D armored personnel carrier, and a UAV helicopter controlled via remote computer terminals. The Mi-24 attack helicopter, dating back to Battlefield 2, has now been replaced by the more capable Mi-28, while the original is reintroduced as a transport helicopter.
It uses the Frostbite 1.5 game engine with the Destruction 2.0 feature set.
A great set of results for the Gigabyte SLI solution in Battlefield showing an edge when compared against the single card configurations. We will look at 2560×1600 performance later in the review.
Far Cry 2 (commonly abbreviated as “FC2”) is an open-ended first-person shooter developed by Ubisoft Montreal and published by Ubisoft. It was released on October 21, 2008 in North America and on October 23, 2008 in Europe and Australia. It was made available on Steam on October 22, 2008. Crytek, the developers of the original game, were not involved in the development of Far Cry 2.
Ubisoft has marketed Far Cry 2 as the true sequel to Far Cry, though the sequel has very few noticeable similarities to the original game. Instead, it features completely new characters and setting, as well as a new style of gameplay that allows the player greater freedom to explore different African landscapes such as deserts, jungles, and savannas. The game takes place in a modern-day East African nation in a state of anarchy and civil war. The player takes control of a mercenary on a lengthy journey to locate and assassinate “The Jackal,” a notorious arms dealer.
Far Cry 2 is still a popular game and the open world environment can be taxing on even the latest hardware available today. We set the game to 8x AA and 16x texture filtering, and maxed out all the other in-game settings (Ultra High).
Frame rates were often in excess of 100 fps so we decided to crank the SLI solution to 2560×1600.
Even at 2560×1600, Far Cry 2 is perfectly playable with everything set to maximum. Frame rates were mainly between 40 and 50, with a few dips into the mid 30s.
Today our review follows a slightly different structure because we want to see how far we can push the Gigabyte GTX 460 cards in SLI, then pit them head to head against a reference ATI HD5870 CrossfireX solution. While this seems like an unusual move, we thought it would be interesting, specifically as the HD5870 solution costs considerably more.
We used MSI Afterburner to overclock the cards.
We managed to push the cards to 880MHz and 885MHz respectively on the core, with one hitting 1,000MHz on the memory and the other 1,025MHz. Obviously for an SLI configuration we take the lower figure from each pair, meaning 880MHz core and 1,000MHz (4,000MHz effective) memory for the final configuration. This is a 165MHz increase on the core and 100MHz (400MHz effective) on the memory, pretty much what we would expect from a GTX 460 with a modified cooling solution.
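In other words, an SLI pair is limited by the slower card, so the usable overclock is the lower figure from each pair. The arithmetic, sketched in Python (variable names are ours):

```python
# SLI runs both cards at matched clocks, so take the lower of each pair.
core_clocks = [880, 885]   # MHz reached by card 1 and card 2
mem_clocks = [1000, 1025]

core = min(core_clocks)    # 880MHz usable core clock
mem = min(mem_clocks)      # 1000MHz usable memory clock

# Gains over the factory 715MHz core / 900MHz memory clocks.
print(core - 715, mem - 900)               # 165 100
print(round((core - 715) / 715 * 100, 1))  # 23.1 (% core overclock)
```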
While we already looked at Battlefield: Bad Company 2 earlier, we wanted to try something different by placing the manually overclocked GTX 460 SLI solution head to head with two reference clocked HD 5870 cards in the same system. Obviously there is a considerable price difference, but with the GTX 460 manually overclocked it will be interesting to see how closely the dual card solutions match up.
Both solutions deliver fantastic performance and are perfectly playable throughout. The HD5870 CFx solution has the edge, by about 7 frames per second, but that is hardly noticeable in the real world. We were surprised how well the GTX 460 SLI performed with this title, and you will note that the all important minimum frame rate scores are identical.
This time we are pitting GTX 460 SLI against HD5870 CFx in the DirectX 11 title Aliens V Predator, which is a system killer. We test at 2560×1600 with 16x AF and no AA. All other settings are configured to high.
Again the HD5870 CrossfireX configuration beats the manually overclocked Gigabyte GTX 460 solution, but by only 9 frames per second. This shows how potent the lower cost GTX 460 solution is, especially when manually overclocked to its limits.
The tests were performed in a controlled, air conditioned room with the temperature maintained at a constant 25°C – a comfortable environment for the majority of people reading this.
Idle temperatures were measured after sitting at the desktop for 30 minutes. Load measurements were acquired by playing Crysis Warhead for 30 minutes and measuring the peak temperature. We also have included Furmark results, recording maximum temperatures throughout a 30 minute stress test. All fan settings were left on automatic.
When we were taking shots with the cooler off, we noticed a lack of paste on the cores, so we cleaned them and applied Noctua NT-H1 thermal paste. This isn't the first time we have seen a graphics card with very little thermal paste applied, and replacing it really helped reduce temperatures in our stress testing. Our recommendation is to always check that your graphics card has the right amount of thermal paste applied … even then, we always replace it with high quality Noctua or Arctic Cooling paste.
You can see above that with Furmark stress testing there is a clear difference in temperatures between card 1 and card 2. One of them has had Noctua thermal paste reapplied. It isn't rocket science to work out which is which.
The cooler on these cards is exceptional, with temperatures hovering between 58-60°C when playing Crysis Warhead.
Recently we changed our method of measuring noise levels. We built a system inside a Lian Li chassis with no case fans and a fanless cooler on the CPU. We are using a heatpipe based passive power supply and an Intel SSD to keep noise levels to a minimum, and the motherboard is also passively cooled. This gives us a build with completely passive cooling, meaning we can measure the noise of just the graphics card inside the system while running looped 3DMark tests. Ambient noise in the room is around 20-25dBA. We measure from a distance of around 1 meter from the chassis and 4 feet from the ground to mirror a real world situation.
Why do this? Well, it means we can eliminate secondary noise pollution in the test room and concentrate solely on the video card. It also brings us slightly closer to industry standards, such as DIN 45635.
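Because decibels are logarithmic, sound levels don't add or subtract linearly; separating a source from the ambient floor means converting to power terms first. A sketch of the standard formula for incoherent sources (function names are ours, and this is an illustration rather than our exact measurement pipeline):

```python
import math

def combine_dba(l1, l2):
    """Sum of two incoherent sources: convert to power, add, convert back."""
    return 10 * math.log10(10 ** (l1 / 10) + 10 ** (l2 / 10))

def isolate_dba(total, ambient):
    """Estimate a source level by subtracting the ambient floor in power terms."""
    return 10 * math.log10(10 ** (total / 10) - 10 ** (ambient / 10))

print(round(combine_dba(30, 30), 1))    # 33.0: doubling the sources adds ~3dB
print(round(isolate_dba(35.8, 25), 1))  # card-only estimate above a 25dBA floor
```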
KitGuru noise guide
10dBA – Normal Breathing/Rustling Leaves
20-25dBA – Whisper
30dBA – High Quality Computer fan
40dBA – A Bubbling Brook, or a Refrigerator
50dBA – Normal Conversation
60dBA – Laughter
70dBA – Vacuum Cleaner or Hairdryer
80dBA – City Traffic or a Garbage Disposal
90dBA – Motorcycle or Lawnmower
100dBA – MP3 player at maximum output
110dBA – Orchestra
120dBA – Front row rock concert/Jet Engine
130dBA – Threshold of Pain
140dBA – Military Jet takeoff/Gunshot (close range)
160dBA – Instant Perforation of eardrum
The Gigabyte cooler has already shown how well it handles core temperatures under load, and we noticed it isn't loud at all, even when running Furmark. We recorded a maximum of around 35.8dBA with our noise meter, which is an excellent result.
To test power consumption we use a Keithley Integra unit and measure draw at the VGA card inputs, not the system wide drain. The best way to get maximum load results is with Furmark; even though it is not indicative of a real world situation, it shows the limits the card can theoretically demand. The ‘gaming’ results are measured while playing Crysis Warhead and are the more valuable figures to take from this.
These cards consume very little when idle, around 13 watts each. When gaming this rises to 128 watts, and then 172 watts under the Furmark stress test. These cards consume less power than any of the GTX 460s we have looked at to date.
The Gigabyte GTX 460 OC is a mega gaming card, and in SLI they become a potent powerhouse, capable of delivering high frame rates with any modern day engine. The Gigabyte cards have many benefits over the reference solution, the most noticeable being the fantastic cooling solution which delivers sub 60°C load results while generating a low level of noise.
With manual overclocking we were pleasantly surprised to see the GTX 460s almost matching the performance levels of an HD5870 CFx solution. Sure, they weren't quite as fast, but considering two of them cost £350 in the UK right now and two HD5870s will set you back over £600, it certainly seems like a fantastic value for money proposition. We don't think there is a better performance to cost ratio on the market today.
On the negative side, these cards clearly have huge overclocking potential – both our samples hit 1,000MHz or more on the memory – so why not sell them with higher out of the box clock speeds? We really don't understand the logic, because the cooler is clearly very capable.
We love the GTX 460 – the combination of high performance, low noise, moderate power drain and exceptional overclocking capability means that in the sub £200 bracket this is the card to get. We like the Gigabyte card because the cooler is far superior to the reference design and probably one of the best on the market. The MSI Cyclone might still have the edge, but it is close.
KitGuru says: If you have £180 to spend right now then you are spoilt for choice. What you really need to select is the GTX 460 with the best cooler, and this Gigabyte card is right at the top of the pile.
That is a fantastic board. SLi scaling is impressive, always has been.
Good all round boards, but I keep wondering if it is just too little too late. rumours on the net say ATi's next solutions are out in a month's time.
I love the 460, only card from nvidia ive rated in 2 years. 450 not so much.
SLI performance is strong. these cards overclock like crazy
ermm dont these seem a bit expensive compared to 460s at 145-150 ?
the cheaper models are normally 768mb versions though, not worth picking up
I still think the HD5850 is a better buy. its faster and with AMD you get better drivers and support.
5850 is priced higher + not always faster card, thus not the best buy… yet…
I agree with Jordan – HD5850 is quite a bit more expensive still.
Unless you get a HD5850 on a sale deal, its costing more. and if you manually OC these 460s you get HD5850 performance anyway. thats the whole selling point from nvidia.
Nvidia are panicking tho. they know ATi's new cards are coming soon. its a rush to sell as many cards as possible before everyone goes back to ATi.
everyone needs to stop calling them ATI 😉 that name is no more. unfortunately
lol yeah, i dont think anyone cares about the name change
AMD better pull their socks up some and get their new cards to market since the GTX 460’s in SLI are so close to 5870 in Crossfire for half the price