The development of ever-smarter artificial intelligence is set to be a major focus for top-tier tech firms over the next few years, but Google looks to have the jump on most of them. It has developed its own AI processor, designed specifically for demanding machine-learning workloads, and has been testing its effectiveness over the past few years.
It’s called the Tensor Processing Unit (TPU), a chip built specifically with machine learning in mind. That means its main focus is raw performance: it is much faster than an average chip of similar size and power draw, and it fits neatly into a hard drive bay in a data centre rack (as per Engadget).
It looks to be passively cooled too. You can imagine a huge array of these providing plenty of thinking power, with an ominous silence hanging over the whole data centre.
These chips have already proved effective, having been used to improve various parts of Google’s services, such as its mapping systems. TPUs also helped the AlphaGo system beat a Go champion earlier this year.
Although Google has no plans to make the chip a commercial product – no doubt preferring to keep its AI development hardware to itself – we should all feel the benefits over the next few years. As automated services become more nuanced, we’ll have TPUs to thank for that, at least in part.
KitGuru Says: Considering how quickly we’ve seen hardware develop for tasks like Bitcoin mining, expect AI-focused hardware to become far, far more powerful in a short space of time.