
Gigabyte GTX 460 OC SLi Review

Rating: 9.0

Nvidia has had a rocky year, with Fermi delays costing it sales and market share. Even after the release of the GTX 480 and 470, people still weren't sold on the new architecture, as it ran hot, drew a lot of power and didn't offer a considerable improvement over ATI's existing HD 5000 series hardware. With the release of the GTX 460, however, the public finally embraced Fermi, and rightly so: for the price, these cards are the market leaders right now, with astounding performance, low power draw, cool running temperatures and massive overclocking potential.

Today we are looking at a pair of the latest Gigabyte cards, which not only ship with factory overclocks but also sport a very impressive-looking heatpipe cooler with a dual-fan configuration. We are also going to test the cards in SLI and put them head to head against ATI's HD 5870s running in CrossFire. Why? Well, we noticed that SLI scaling with these cards is top drawer, and we wanted to pit them against the high-end AMD boards to see just how much performance you get for the money.

The Gigabyte cards ship clocked at 715MHz on the core and 900MHz on the memory. The GTX 460 reference clocks are 675MHz core and 900MHz memory.
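For readers wondering what that factory overclock amounts to, a quick back-of-the-envelope calculation (a simple sketch, not part of Gigabyte's published material) puts it just under 6% on the core:

```python
# Factory core overclock of the Gigabyte GTX 460 OC versus reference clocks.
reference_core_mhz = 675  # GTX 460 reference core clock
gigabyte_core_mhz = 715   # Gigabyte OC core clock

overclock_percent = (gigabyte_core_mhz - reference_core_mhz) / reference_core_mhz * 100
print(f"Factory core overclock: {overclock_percent:.1f}%")  # prints "Factory core overclock: 5.9%"
```

The memory is left at the reference 900MHz, so any real-world gains over a stock card come from that core bump alone.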

GIGABYTE Ultra Durable VGA Series

  • Powered by NVIDIA GeForce GTX 460 GPU
  • Integrated with 1024 MB / 256-bit GDDR5 video memory
  • GIGABYTE WINDFORCE™ cooling design
  • GIGABYTE Ultra Durable VGA High Quality Components
  • Supports Microsoft DirectX 11 and OpenGL 4.0
  • NVIDIA SLI Ready
  • NVIDIA Pure Video HD technology & 3D Vision Surround Ready
  • Supports NVIDIA PhysX / CUDA technology
  • Features 2x Dual-Link DVI-I / mini-HDMI 1.3a connectors
  • 5.5% better performance than standard HD 5850
  • 15% better performance than standard HD 5830
  • 4x the DX11 tessellation performance of HD 5830
