nVidia offers “the constant smell of burning bridges”, says AMD


KitGuru is asked more questions about the graphics market than anything else. It seems you lot love a good fight – and the Heavyweight Championship is fought in the land of shaders. In the spirit of pushing things together until they get heated and critical mass is achieved, we sent a set of (apparently) inflammatory questions to Intel, AMD and nVidia. The Radeon folk replied first. In charge of presenting their version of the world was Richard Huddy, point man for his company’s DevRel team. In no particular order, here’s what Huddy had to say.

“I’m confused as to why you’d want to upset people. Historically, with nVidia, there’s the constant smell of burning bridges. Over time, they seem to damage relationships everywhere and, in the long term, it hampers them”, said Huddy when we asked about the differences each company has in approaching developer relations (DevRel). Bold stuff.

It’s worth mentioning here that along with a couple of geniuses called Doug and Servan, Huddy helped invent DirectX in the first place while working for a company called Rendermorphics. Shortly afterwards, Servan and Doug went to a small software company called Microsoft and Huddy joined an organisation called nVidia. Here endeth the history part.


Huddy shows his true colours

While Microsoft lays down the environment for the development of PC games with its DirectX API, there are differences in each company’s hardware and drivers, which mean each is better at some things and worse at others. Then there are the consoles. nVidia had the original Xbox graphics and the PlayStation 3. Intel had the first Xbox CPU. AMD (ATI) has the Wii and Xbox 360 graphics. A proper mish-mash of styles and abilities.

Huddy told us that many software development studios are actively trying to balance all of these combinations, “We’re open to working with anyone, but realistically, we spend most of our time with the 100 or so studios who will be able to put a game into the charts”. Why draw that distinction? “These games are going to sell in volume, so more people will be impacted if the code can be improved. It’s an intelligent use of our resources”.

How many of these top studios does AMD work with? “All of them. My team is scattered across the globe and, between us, we’re fluent in more than 20 human languages. Any developer that will impact the market, regardless of where they are based or how large a company they are, will get help from AMD”.

Just one more quick check on Huddy’s background to confirm he’s been doing this DevRel stuff long enough to have an opinion, “I spent 4 years helping to run developer relations at nVidia before moving across to ATI, and I’ve been with this team for the best part of a decade”. OK. Experience established. Let’s continue.

“With one exception, every game released through to 2012 will have been developed on AMD’s Radeon”
We asked Huddy about the ways in which differences in hardware design translate into the DevRel push. “AMD being able to deliver DirectX 11 hardware into the hands of developers a full 6 months ahead of nVidia means that, with one single known exception, every game released through to 2012 will have been developed on AMD’s Radeon hardware. That gives us a huge advantage”.


Despite the name, these Radeon things are no good at all in removing stains from your clothing at low temperatures

“Even after nVidia finally began seeding production Fermi cards, many of the DX11 game development projects were already well underway on AMD hardware. The result is that many of those games will, quite naturally, be tuned for the Radeon series”. A logical statement from Huddy.

We wanted a specific case where DevRel would be used to make the most of a company’s hardware. Huddy gave us a pearl.

“When all you have is a hammer, everything in the world looks like a nail.”

Sounds like a wise saying, but what does he mean?

“We’re speaking with every development studio in the world that’s likely to create a piece of software that makes it into the charts. All of them are telling us the same thing: nVidia is pushing a single message, and that’s tessellation”, explained Huddy.

“Tessellation is about enriching detail, and that’s a good thing, but nVidia is pushing to get as much tessellation as possible into everything”. Huddy reminds us of the problem you get when adding too much salt to your pasta. “Tessellation breaks your image components down in order to add more detail at lower levels and, when it’s done right, it can be stunning”.


Huddy says that you can have too much of a good thing, although at KitGuru we're not entirely sure if that applies to sun tan lotion models or gold bullion bars or genie wishes

Huddy then got scientific, “These days, the most typical resolution for serious gaming is 1080p. A resolution where you have 1920 dots left to right and 1080 from top to bottom. That gives you around 2 million pixels’ worth of work on the final screen. However, you actually end up working with a much larger picture, to allow for things like light sources and shadow generators that are off screen in the final image, but which still need to be accounted for. The same goes for situations where something is in front of something else, but you don’t know that at the start, so you end up doing work on pixels that may or may not make it to the final cut”.
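Huddy’s arithmetic is easy to sketch. Note that the 10% guard band in the snippet below is our own illustrative figure, not a number Huddy gave us:

```python
# Pixel budget at 1080p, as Huddy describes it.
width, height = 1920, 1080
visible = width * height
print(visible)  # 2073600 -- roughly 2 million pixels

# Renderers typically work on a larger off-screen area so that
# off-screen light sources and shadow casters are accounted for.
# The 10% guard band on each axis is purely illustrative.
guard = 1.10
worked = int(visible * guard * guard)
print(worked)  # a little over 2.5 million pixels' worth of work
```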

“Overall, the polygon [Triangle - Ed] size should be around 8-10 pixels in a good gaming environment”, said Huddy. “You also have to allow for the fact that everyone’s hardware works in quads. Both nVidia and AMD use a 2×2 grid of pixels, which are always processed as a group. To be intelligent, a triangle needs to be more than 4 pixels big for tessellation to make sense”.

Interesting, but why are we being told this? “With artificial tests like Stone Giant, which was paid for by nVidia, tessellation can be done down to the single pixel level. Even though that pixel can’t be broken away from the 3 other pixels in its quad. Doing additional processing for each pixel in a group of 4 and then throwing 75% of that work away is just sad”.
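The quad argument can be put in code. This is a toy sketch of the point Huddy is making, not anything taken from a real driver or benchmark:

```python
# GPUs shade pixels in 2x2 quads: even when a triangle covers a
# single pixel, all four pixels of its quad are processed, and the
# results for the uncovered pixels are discarded.
QUAD_SIZE = 4  # the 2x2 grid both nVidia and AMD use

def wasted_fraction(covered_pixels: int) -> float:
    """Fraction of a quad's shading work that is thrown away."""
    if not 1 <= covered_pixels <= QUAD_SIZE:
        raise ValueError("a quad holds 1 to 4 covered pixels")
    return (QUAD_SIZE - covered_pixels) / QUAD_SIZE

print(wasted_fraction(1))  # 0.75 -- the single-pixel triangle case
print(wasted_fraction(4))  # 0.0  -- quad fully covered, no waste
```

With triangles of 8-10 pixels, as Huddy recommends, most quads end up fully covered and the wasted fraction stays near zero.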

Huddy touched on benchmarks and we know he will be biased in this department. Knowing he has a built-in bias, we asked him what makes a good benchmark. “It needs to be reproducible, so you get the same results time after time. It also needs to test a new feature. What’s the point of replacing a benchmark with a new one that tests exactly the same thing? Ideally the new benchmark should add something, maybe improved visual quality, so that the user experience is better”.


Searching for the truth in graphics benchmarks gets you into deep water

So far, no mad rantings or foaming at the mouth, “If you’re testing a graphics chip, then try to avoid testing the CPU. With the new cards, it should be a Shader Model 5 test so that you can get an idea of how future games will work”, said Huddy. “Lastly, we all know that code changes constantly before launch. For a benchmark to be really good, it should be created on the final launch code”. Maybe even after the first patches have been released?

Thinking about the $2M that nVidia paid to buy Crysis 2, we asked Huddy if he thought that nVidia now has a huge DevRel team working on putting as many tessellation calls into the game as will fit.

Huddy declined to comment directly on that game, but he did speak about the Batman Arkham Asylum AA fiasco which saw 2 versions of the game released in quick succession following a global flame war across some of the most influential graphics forums on the planet.

He explained, “Batman used a deferred rendering engine. That creates a special situation for applying anti-aliasing. There are well-known ways of doing it and, as luck would have it, there is a vanilla implementation that works equally well on AMD and nVidia graphics hardware. Let me be clear, this method is well known, works well on AMD and nVidia cards and there is a minimal impact on performance for the customers who have bought the game”.

“Instead of releasing unified code that works on every customer’s card the same, nVidia got the first release of the game to detect if the installed card was a GeForce or Radeon and, if it saw a Radeon, it would turn off the AA feature”, said Huddy.

“The only person this harms is the customer who spent their hard-earned money on a brand new game, believing that Batman Arkham Asylum had been written to work as well as possible on the customer’s system”, he pointed out.


Smoother these days, apparently

“This highlights perfectly the fundamental difference in thinking between AMD and nVidia. We would never do that”, said Huddy. “nVidia needs to learn that you should always put the gamer’s experience ahead of your own ego. Issues like the deliberate and unnecessary reduction in image quality seen in the Batman Arkham Asylum situation show that nVidia is willing to single out half the market and nobble their experience. That’s just not right. You should never harm PC gamers just to make yourself look good”.

Huddy told us that the publisher re-evaluated the situation in light of the flame-war and, a short while later with no huge effort, a different version of Batman Arkham Asylum appeared on the market with perfect AA on Radeon cards. Nice.

KitGuru says: We have a follow up piece coming soon in which Huddy shares his views on why nVidia makes a bad partner all round. Informed comment or rampant vitriol? You choose. Just be assured that we’ve opened up the same ground to Intel and nVidia and we’re keen to see what they have to say about the game development market and the role that they have to play in delivering our gaming experiences.

Friendly fire below, fire and forget in the KitGuru forum.


  • Tech Head

    Wow, he didn’t pull any punches!

  • Tim

    Man, he painted nvidia badly, didn’t he? Are they going to respond to this or just take it?

  • Tech Head

    Nvidia PAID to have Stone Giant done? The system they are using is not even usable in the real world; it’s just there to enhance benchmarks for nvidia. That’s disgusting.

  • Harry

    I’m shocked. It seems nvidia are cheating a hell of a lot. Still, I’m sure AMD aren’t innocent either, mind you, but this has certainly put me off nvidia for their tactics.

  • Steve

    Nvidia hobbled Batman Arkham Asylum so it deliberately ran worse on AMD hardware? “The way it’s meant to be played”, my arse. I’m fuming.

  • Kern

    nvidia won’t reply. They never do, not unless they own the website. Sad really :(

  • John

    While the nvidia Batman hack to make it run badly on AMD cards is nasty, I still can’t believe nvidia have paid to implement a tessellation routine which doesn’t bear any relation to real-life rendering situations (Stone Giant) JUST to make their cards score higher. I really am speechless about that one.

  • AMD4EVER

    We need to burn down their HQ

  • LowFat

    Very interesting read, and I normally hate these interviews. He certainly didn’t hide behind a wall of PR crap, which is refreshing to see.

    That Stone Giant benchmark falls on the developers. If nvidia paid them to create a benchmark which doesn’t reflect a real-world tessellation experience, then why use it? I am shocked they have no self-respect on this level.

    That said, I would say both companies have ploys and little underhanded things going on. In this case, however, Stone Giant is used quite a lot in reviews, even here. Are you guys dropping it? Will they address these statements? Do nvidia have a reason for the single-pixel tessellation in this benchmark?

    I can’t see them answering anything; if they do, it will be on a handful of nvidia-friendly sites, which is pointless.

  • Jerry

    Well, this explains a lot; thanks for the interview. I wondered how the EVGA GTX460 card in your recent KitGuru review nailed the HD5870 in Stone Giant. That explains the reasons.

    Wankers.

  • Lara Crofts Thong

    This is an AMD fan site :) all the hardware here gets destroyed in review.

  • Tri Color

    Richard Huddy rocks. He always tells it like it is. While I don’t always agree with his views on things, there is very little room to fault his technical knowledge.

    I followed the Batman debate on Hexus; that was intense. Good to see it called back here.

    Stone Giant was always known as an nvidia-funded benchmark; they probably got 100k for developing it, and nvidia use it all the time when referencing tessellation performance.

    To be fair to nvidia, they aren’t really cheating. It’s rendering tessellation blocks much finer than currently needed, but that doesn’t mean it’s a driver hack. It’s still handling the same procedure through hardware, even if 75% of it isn’t used. It’s still hardware power.

  • Tim

    But it’s not a real-world representation of tessellation, Tri Color. Why spend time rendering something when 75% of it is binned in the end? Makes no sense.

  • Tri Color

    If we’re being totally objective, then most if not all synthetic benchmarks are not real world. You can’t play them; none of them have a game based around them. That’s why games like Resident Evil have benchmarks built in, to reproduce steps for measuring a game engine. Much better.

    In the case of Stone Giant, however, the tessellation is much finer than anything on the market, but that’s not to say the tessellation power WON’T be required in future. It’s a fair indication of the differences between the hardware, and many benchmarks are meant to show future possibilities. I think it’s still valid.

  • Tim

    I think you are confusing the fact that it’s rendering xx amount of tessellation with the fact that this is needed. If it’s not needed, it’s hardly a useful benchmark. Why render things that a scene doesn’t need, and that don’t in any way add more noticeable IQ or data to the image?

  • Ghhi

    “a triangle needs to be more than 4 pixels big for tessellation to make sense” (visible vs invisible)

    “Stone Giant, which was paid for by nVidia, tessellation can be done down to the single pixel level.[...] Doing additional processing for each pixel in a group of 4 and then throwing 75% of that work away is just sad”

    Of course! Really biased; it’s sad! :(

  • Ju1iet

    Intel also optimise their GMA drivers to achieve better scores in 3DMark. In the real world, on the game side, their hardware and drivers are not as good as the integrated parts from AMD or Nvidia, yet their 3DMark scores are nearly the same.

  • SiliconDoc

    That’s amazing. After telling us ATI has EVERY GAME except one being developed in a biased manner for ATI hardware, he WHINES about Batman AA and Nvidia, and a poster cites the Hexus interview but FAILS to mention that Huddy PROMISED proper AA for ATI cards in Batman, claiming to have his BEST programmer “working on it”, THEN FAILED TO DELIVER.
    So we now have EVERY GAME COMING, according to ATI’s liar, having a biased development on ATI cards, except one, and this piece of PR FUD named Huddy whines instead about Nvidia having “biased development”, something he claims “ATI would never do”, right after bragging it has done it on EVERY GAME THERE IS except one, and all the hate-filled lemmings screech about Nvidia like, frankly, brain-dead fools.
    It’s amazing what mindless blind subjects can be created with a few extremely hypocritical statements, while the gaping hole in the whole spiel is staring all the sheeple in the face.
    LOL – if Huddy isn’t laughing at the stupidity he encounters, he isn’t human.
    Congratulations folks, you’re putty in his hands.

  • usud

    More like smell of burning PCBs.

  • Sam

    Hey SiliconDoc, do you get out in the sun much at all? Or are you too busy working the global forums and news sites for nvidia?

  • Mariosti

    @SiliconDoc
    The main problem with your simple thought process is that you clearly don’t understand what you’re saying. Scalar coding for nvidia cards hurts every other processing unit – that is, the vector SPs of Radeons and the vector SSE units of today’s CPUs. If a game is optimised well for a Radeon GPU, it will still use 99% of the GeForce’s potential, and it will be better optimised to use the system CPU. Why is this possible? Because Radeons have huge processing power, much greater than their competition.

  • GForceXIII

    @SiliconDoc
    The difference is, games coded on Radeon hardware do not cripple nvidia cards.

  • kcs

    I can only agree with SiliconDoc here: a very biased and whining Huddy, bragging that almost all future games have been developed on ATI hardware while whining over one game?
    I like it when the general crowd doesn’t have a clue, and AMD is losing market share rapidly in GPU land, as yesterday’s financial results prove.
    Not the first time I’ve seen Huddy act like a child, however.

  • Ghhi

    Radeon HD 5000

    September 2009 – January 2010: one million
    April 2010: 6 million
    June 2010: 11 million
    July 2010: 16 million
    October 2010: “AMD has shipped more than 25 million DirectX 11-capable GPUs since introduction in September 2009”…

  • Faith

    nVidia, Intel and AMD all got the same basic set of questions on the same day.
    KitGuru guarantees a 100% no-spin replication of this interview with the developer relations teams from nVidia and/or Intel.
    We’re an equal opportunities tech site as TechHead’s list of recent positive nVidia reviews shows.
    Like most of our readers, we’re really interested to know how each company sees what it is doing, what their competitors do and how the whole market stacks up.
    Ferrari has had its say, now we’re waiting for McLaren and Red Bull.

  • kcs

    @Ghhi: too bad they hardly made a profit on these and they’re losing market share fast.
    AMD will be obsolete if they continue to report negative earnings, and sooner than you think.

  • bjv2370

    KitGuru is reliable, how dare you!

  • kcs

    Actually, thinking about it again, they seem to be doing quite well, Ghhi. Sorry I came off a little Nvidia-fanboyish. I’ve had that problem since I was a kid, so I have a hard time curbing it now that I realise what I’m doing.

    Next time I’ll try to keep a more objective outlook.
