Tesla CEO Elon Musk has spoken publicly about Artificial Intelligence (AI) on several occasions. Now, however, he has decided to give the technology the space it needs: he has, in his own words, ‘hired’ AI and turned it into a subordinate that reports directly to him.
Interestingly, the CEO has previously clashed with Facebook’s Mark Zuckerberg and Alibaba Chairman Jack Ma over the role of AI. Yet Musk now actively deploys the technology across his ventures, including Tesla’s self-driving effort and his startup Neuralink, which is building a brain-machine interface. Even so, the CEO still has some reservations, and here’s why.
Tesla’s AI team applies cutting-edge research to train neural networks on problems ranging from perception to control. Musk tweeted at Lex Fridman, host of a popular Artificial Intelligence podcast on YouTube, saying that at Tesla, “using AI to solve self-driving isn’t just icing on the cake, it’s the cake.”
He further added, “Join AI at Tesla! It reports directly to me & we meet/email/text almost every day. My actions, not just words, show how critically I view (benign) AI.” For Musk, AI can be entrusted with certain tasks, but its work has to be critically evaluated by him.
“Our per-camera networks analyze raw images to perform semantic segmentation, object detection and monocular depth estimation. Our birds-eye-view networks take video from all cameras to output the road layout, static infrastructure and 3D objects directly in the top-down view,” says Tesla.
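Tesla has not published its architectures, but the multi-headed per-camera design it describes can be sketched as a toy model: one shared backbone feeding separate heads for semantic segmentation, object detection and monocular depth. Everything below (shapes, class count, the random-projection “heads”) is an illustrative assumption, not Tesla’s actual code.

```python
import numpy as np

# Toy sketch of a multi-task per-camera network: a shared "backbone"
# feeds three task heads. All names and shapes are illustrative
# assumptions, not Tesla's actual architecture.

NUM_CLASSES = 5  # hypothetical number of semantic classes

def backbone(image: np.ndarray) -> np.ndarray:
    """Stand-in feature extractor: 4x downsample by average pooling."""
    h, w, c = image.shape
    trimmed = image[: h - h % 4, : w - w % 4]
    return trimmed.reshape(h // 4, 4, w // 4, 4, c).mean(axis=(1, 3))

def segmentation_head(feat: np.ndarray) -> np.ndarray:
    """Per-cell class scores (random projection as a placeholder)."""
    proj = np.random.default_rng(0).standard_normal((feat.shape[-1], NUM_CLASSES))
    return feat @ proj  # (H/4, W/4, NUM_CLASSES)

def depth_head(feat: np.ndarray) -> np.ndarray:
    """Monocular depth estimate: one positive value per feature cell."""
    return np.exp(feat.mean(axis=-1))  # (H/4, W/4)

def detection_head(feat: np.ndarray) -> np.ndarray:
    """A fixed-size set of box predictions (x, y, w, h, score)."""
    proj = np.random.default_rng(2).standard_normal((feat.shape[-1], 5))
    pooled = feat.mean(axis=(0, 1))   # global-average the feature map
    return np.tile(pooled @ proj, (10, 1))  # 10 dummy boxes

image = np.random.default_rng(1).random((64, 96, 3))  # one camera frame
feat = backbone(image)
seg = segmentation_head(feat)
depth = depth_head(feat)
boxes = detection_head(feat)
print(seg.shape, depth.shape, boxes.shape)  # (16, 24, 5) (16, 24) (10, 5)
```

The point of the sketch is the structure, not the math: one pass over the camera image produces features once, and each task reads from that shared representation, which is how a single network can serve several perception tasks cheaply.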
What Tesla is using AI for
Tesla is also building custom silicon chips to power its self-driving software. These let the company account for every tiny architectural detail and improvement and push hard for more performance per watt. “Our networks learn from the most complicated and diverse scenarios in the world, iteratively sourced from our fleet of nearly 1 million vehicles in real-time. A full build of Autopilot neural networks involves 48 networks that take 70,000 GPU hours to train. Together, they output 1,000 distinct tensors (predictions) at each timestep,” the electric automobile manufacturer added.
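The quoted figures give a feel for the scale, and a little arithmetic puts them in perspective. The calculation below uses only the numbers Tesla quotes (48 networks, 70,000 GPU hours, 1,000 output tensors per timestep); the cluster size and camera frame rate are our own assumptions, not Tesla figures.

```python
# Back-of-the-envelope arithmetic on the numbers Tesla quotes.
GPU_HOURS_TOTAL = 70_000        # quoted by Tesla
NUM_NETWORKS = 48               # quoted by Tesla
OUTPUT_TENSORS_PER_STEP = 1_000 # quoted by Tesla

NUM_GPUS = 512          # assumed cluster size, not a Tesla figure
FRAMES_PER_SECOND = 36  # assumed camera frame rate, not a Tesla figure

# Wall-clock time if the whole build ran on the assumed cluster.
wall_clock_days = GPU_HOURS_TOTAL / NUM_GPUS / 24

# Average training cost per network in the 48-network build.
avg_gpu_hours_per_network = GPU_HOURS_TOTAL / NUM_NETWORKS

# Prediction throughput at the assumed frame rate.
tensors_per_second = OUTPUT_TENSORS_PER_STEP * FRAMES_PER_SECOND

print(f"{wall_clock_days:.1f} days on {NUM_GPUS} GPUs")
print(f"{avg_gpu_hours_per_network:.0f} GPU hours per network on average")
print(f"{tensors_per_second:,} prediction tensors per second")
```

Under these assumptions a full build would occupy a 512-GPU cluster for roughly a week, which is why the quote stresses that the fleet data is sourced iteratively rather than retrained from scratch on every change.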