Over the past week or so, we have heard a lot of rumours about nVidia's products. As wizened KitGurus, you can trust us when we say that our sources are both plentiful and well placed. We only report this kind of story when we have no reason to doubt it. That said, we do stand to be corrected when a company goes as far as to issue a definitive denial.
After carefully reading through all of our posts [we like to think we're every PR director's favourite bookmark – Ed], nVidia's headquarters has made direct contact with KitGuru about our recent revelations. Intrigued? We were.
nVidia has chosen not to contradict our stories relating to the launch of the 50 watt Fermi, GT420 or GTS455 etc, which was nice. They have also been careful to avoid saying that KitGuru's revelations about the pricing or physical specifications of the GTX465 are wrong.
However, they were adamant that at no time has production of the GTX470 card been affected. This contradicts our story that GTX470 chips have been moved to the creation of the GTX490 (and high end notebook systems from Clevo and others).
Also, they have stated categorically that all of nVidia's new cards fully support CUDA and that there are no plans to release an nVidia card in this generation that does not have CUDA support. The nVidia spokesperson we talked to was prepared to go on record as saying “These rumors are categorically incorrect.” Fair enough.
We have to say that, given our source on the story, it's possible that some misunderstanding might have occurred between the various groups involved in bringing an nVidia card to market. For example, if a pre-release driver for the GTX465 was delivered to a 'tester' and CUDA support had not been included in the code at that stage, then that could have caused a misunderstanding.
KitGuru says: Lucky we don't live in a world of instant information transfer, greased-lightning rumour mills and news editors with hair triggers. Seriously though, Santa Clara was under no obligation whatsoever to contact KitGuru about these stories and put nVidia's position forward. We really appreciate them taking the time to do so.