‘Waymo is using TPUs’ : SelfDrivingCars


No, it’s not bad.

A few years ago Google announced it had designed a special chip for use in its datacenters called a Tensor Processing Unit, a chip specifically optimized for machine learning applications that marks a dramatic improvement over GPUs (which most people are still using) for machine learning tasks. Many companies have made or are in the process of making similar chips, which are also referred to as AI chips, neural network processors, or inferencing chips. Apple has a small AI chip in the iPhone X, for instance.

Nvidia and Intel are both in the early stages of producing application-specific integrated circuits with inferencing capabilities, tested to automotive-grade reliability, for general use in robotaxis, and Tesla is also getting its own version ready for mass production.

It stands to reason that Waymo would be ahead of the curve on this, given that it has access to Google’s technology and Google had an early lead in producing these kinds of chips, though Ben Evans’ tweet would be the first confirmation we’ve seen to that effect.



