KitGuru knows that companies spend millions creating the best HD processing setup possible, only to see the final image quality shattered by a weak component somewhere in the system. We spend many hours watching High Definition content, and we never stop playing with our ATI and nVidia driver panels to push the quality as high as we can. All this fiddling might be fun, but can we add any science to the process of deciding on the best picture? HQV Benchmark 2.0 aims to do just that. Square-Eyed KitGurus investigate.
Hollywood Quality Video. You are setting yourself up with a name like that. Especially for a benchmark. But the guys at http://www.hqv.com/ did a cracking job when they launched their original benchmark back in 2007. It quickly won Wired Magazine’s Editor’s Pick as a ‘must have’ for setting up the best home cinema experience possible. Its main advantage was that it could be used across a broad cross section of systems, PC and hi-fi alike, to produce a simple number that let you compare image quality.
The point of the program is that it takes a series of ‘human’ tests (you look at the image and score it) and lets you give that experience a number. Scores from a diverse audience tend to be quite close.
HQV Benchmark 2.0 is an updated version of the original tool, consisting of various video clips and test patterns designed to evaluate motion correction, de-interlacing, decoding, noise reduction, detail enhancement and film cadence detection.
There are two versions of the program: standard definition on DVD and high definition on Blu-ray. As our audience will be concentrating on HD content, so will we.
The Blu-ray version contains a total of 39 video tests, up from 23 in the original, and the maximum score has also risen from 130 to 210. As hardware and software get more complicated, the benchmark has been tuned to keep our analysis as thorough as possible.
We will be comparing ATi, nVidia and Intel graphics solutions with the HQV 2.0 Benchmark to see which company really delivers the best viewing experience. As a portion of the testing is subjective, it is absolutely imperative that we use the best quality screen possible. Many HDTVs perform intensive processing which may artificially enhance the picture, but we need to remove this from the equation and concentrate entirely on the graphics hardware and driver output. Our initial plan was to use a television such as the Panasonic Viera NeoPDP 600hz Plasma, but while we adore its picture, there is a lot of processing going on behind the scenes, even with intelligent frame creation disabled.
For the purposes of this review we will be using one of the most expensive monitors on the planet, the LaCie 730, which retails around the £2,700 mark. It incorporates a 14-bit colour system (14-bit look-up table and 14-bit per-colour processing) which takes the 8-bit output of your graphics card’s dual-link DVI port and maps it into the monitor’s output using an overall palette with 192 times more colour shades than the graphics card can produce. The colours displayed therefore more accurately reflect the source image. The colour gamut is also 23 per cent larger than that of the Adobe RGB standard used in applications such as Photoshop.
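As a quick sanity check on those bit-depth figures (our own back-of-the-envelope arithmetic, not LaCie’s official breakdown), the shade counts work out as follows:

```python
# Per-channel shade counts for 8-bit (graphics card) and 14-bit (monitor LUT) pipelines.
shades_8bit = 2 ** 8     # 256 shades per channel over dual-link DVI
shades_14bit = 2 ** 14   # 16,384 shades per channel in the 14-bit LUT

# Each channel gains 64x more gradations.
per_channel_ratio = shades_14bit // shades_8bit

# One plausible reading of the "192 times" figure: the combined
# per-channel levels across R, G and B (3 x 16,384 = 49,152)
# versus the card's 256 levels per channel.
combined_ratio = (3 * shades_14bit) // shades_8bit

print(per_channel_ratio, combined_ratio)  # 64 192
```

However the marketing figure is derived, the practical point is the same: with 64 times as many gradations per channel, the monitor can render smoother tonal transitions than the 8-bit signal alone would suggest.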
This monitor has been carefully calibrated with LaCie’s Blue Eye Pro Proof Edition software. During our calibration we configured black point, colour temperature, adaptation and profile type. The screen also offers a native 1:1 1080p mode for perfect pixel reproduction.