
Tesla has scheduled its AI Day for August 19th at 5:00 PM PDT. There will be an invite-only, in-person audience attending the event at the Tesla HQ in Palo Alto, California. For those without an invitation (including myself), Tesla will be hosting a live stream on their website and on Twitter.

How to Watch AI Day

Elon Musk announced via Twitter that AI Day will be streamed for the public and press to view. The event follows a recent announcement of an NHTSA investigation into 11 Autopilot accidents dating back to 2018.

Live Stream: CLICK HERE

Live Updates (PDT)

Refresh the page occasionally for the latest updates.

5:00 PM: Live stream now displaying a cover page of the presentation.

5:47 PM: Musk has now taken the stage. 

5:48 PM: "We want to show that Tesla is much more than an EV company" - Musk

5:50 PM: Andrej Karpathy takes over the presentation. Starts by demonstrating the "Vision Component".

5:50 PM: "We are developing a synthetic animal from the ground up" -Andrej

5:53 PM: "HydraNet" allows multi-task learning, which enables feature sharing, de-coupling of tasks, and faster fine-tuning.
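The multi-task setup described above can be pictured as one shared backbone feeding several lightweight task heads, so the expensive feature extraction runs once per image. This is a minimal sketch of the idea only: the layer sizes, task names, and random weights are placeholders, not Tesla's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared backbone: one feature extractor reused by every task head.
# (Random weights stand in for trained parameters.)
W_backbone = rng.standard_normal((64, 16))

# Per-task heads that consume the shared features. Task names are illustrative.
heads = {
    "object_detection": rng.standard_normal((16, 4)),
    "lane_prediction": rng.standard_normal((16, 2)),
    "traffic_lights": rng.standard_normal((16, 3)),
}

def forward(image_features):
    """Run the shared backbone once, then each lightweight head on the result."""
    shared = np.tanh(image_features @ W_backbone)  # computed once, shared by all tasks
    return {task: shared @ W for task, W in heads.items()}

outputs = forward(rng.standard_normal((1, 64)))
```

Because the heads only see the shared features, a single task's head can be fine-tuned without retraining the backbone, which is the de-coupling benefit mentioned above.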

5:57 PM: A single neural net is needed for FSD because making individual predictions per camera was too difficult.

6:03 PM: Merging camera images together allows for greater prediction ability of the car.

6:06 PM: Added feature cache to remember certain road signs and vehicles. Allows storage of information surrounding car.
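A feature cache like the one described can be pictured as a time-windowed buffer of recent detections around the car. This is a hypothetical illustration only; the class name, window length, and sign labels are invented for the example.

```python
from collections import deque

class FeatureCache:
    """Remembers recently seen detections (e.g. road signs) for a fixed time window."""

    def __init__(self, window_s=10.0):
        self.window_s = window_s
        self._items = deque()  # (timestamp, label) pairs, oldest first

    def add(self, timestamp, label):
        self._items.append((timestamp, label))
        self._evict(timestamp)

    def _evict(self, now):
        # Drop anything older than the window.
        while self._items and now - self._items[0][0] > self.window_s:
            self._items.popleft()

    def recall(self, now):
        """Return the labels still inside the time window."""
        self._evict(now)
        return [label for _, label in self._items]

cache = FeatureCache(window_s=10.0)
cache.add(0.0, "speed_limit_45")
cache.add(8.0, "stop_sign")
remembered = cache.recall(12.0)  # the 45-mph sign has aged out of the window
```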

6:07 PM: "We prefer a spatial RNN video module" -Andrej

6:09 PM: "Multiple cars can collaborate to build an HD map" -Andrej

6:13 PM: Team is actively working on speeding up the fusion of information across space and time.

6:15 PM: Key problems with planning in action-space: non-convex and high-dimensional.

6:16 PM: *Demonstration of planning a real-life lane change*

6:17 PM: Car will run through many scenarios to choose the safest and most comfortable outcome.
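The scenario-evaluation step above can be sketched as scoring candidate trajectories on safety and comfort and picking the lowest-cost one. The cost terms, weights, and candidate names below are illustrative assumptions, not Tesla's actual planner.

```python
def choose_trajectory(candidates, w_safety=10.0, w_comfort=1.0):
    """Pick the candidate with the lowest combined cost.

    Each candidate is a dict with:
      collision_risk - estimated probability of a collision (0..1)
      max_jerk       - peak jerk along the trajectory (a comfort proxy)
    The weights are illustrative; safety is weighted to dominate comfort.
    """
    def cost(c):
        return w_safety * c["collision_risk"] + w_comfort * c["max_jerk"]
    return min(candidates, key=cost)

# Hypothetical lane-change candidates the car might simulate:
candidates = [
    {"name": "aggressive_merge", "collision_risk": 0.30, "max_jerk": 2.0},
    {"name": "wait_and_merge",   "collision_risk": 0.02, "max_jerk": 0.5},
    {"name": "abort_merge",      "collision_risk": 0.01, "max_jerk": 3.0},
]
best = choose_trajectory(candidates)
```

Re-running this selection as other cars move and the risk estimates change is one way to picture the "update decisions as variables change" behavior in the next entry.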

6:18 PM: FSD will predict other cars' actions and update decisions as variables change.

6:20 PM: FSD will optimize for smooth turns.

6:24 PM: Final architecture will include: vector space, neural net planner, explicit planning, and control.

6:25 PM: "Neural networks only set upper-bounds on parameters"

6:25 PM: *Moving onto data labeling*

6:26 PM: Tesla moved all data labeling internally as the third-party quality was poor. 

6:27 PM: "Originally used 2D labeling and now using 4D labeling in vector space"

6:28 PM: Now utilizing auto-labeling.

6:32 PM: Different cars can collect clips in the same location, letting Tesla label objects and clean up noisy data.
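One simple way to picture this multi-clip labeling idea is fusing several cars' noisy observations of the same object into one cleaner label, e.g. by averaging position estimates. The averaging scheme and coordinates below are invented for illustration, not Tesla's labeling pipeline.

```python
def aggregate_label(observations):
    """Fuse noisy (x, y) position estimates of the same object,
    collected from several cars/clips, into one cleaner label."""
    n = len(observations)
    x = sum(o[0] for o in observations) / n
    y = sum(o[1] for o in observations) / n
    return (x, y)

# Three cars saw the same stop sign at slightly different estimated positions:
obs = [(10.1, 4.9), (9.9, 5.2), (10.0, 4.9)]
label = aggregate_label(obs)  # close to (10.0, 5.0)
```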

6:32 PM: Using data offline allows for predicting vehicle and pedestrian movement despite visual occlusion.

6:34 PM: Prediction and memory have allowed for vision-only in hard-to-see scenarios.

6:35 PM: Investing heavily into the ability to create simulations to train neural networks. 

6:36 PM: Simulation is helpful when scenarios are difficult to source. Ex: a couple walking a dog on the highway.

6:40 PM: Algorithms can create scenarios allowing for scalability.
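Procedural scenario creation can be sketched as drawing scenario parameters from a seeded random generator, so scaling up is just a matter of drawing more seeds. The fields and value ranges below are invented placeholders, not Tesla's simulator schema.

```python
import random

def generate_scenario(seed):
    """Procedurally generate one driving scenario for simulation.
    Fields and ranges are illustrative placeholders."""
    rng = random.Random(seed)  # seeded, so each scenario is reproducible
    return {
        "road_type": rng.choice(["highway", "intersection", "roundabout"]),
        "weather": rng.choice(["clear", "rain", "fog", "snow"]),
        "pedestrians": rng.randint(0, 5),
        "time_of_day": rng.uniform(0.0, 24.0),
    }

# Scaling: every new seed is a new, reproducible scenario.
scenarios = [generate_scenario(seed) for seed in range(1000)]
```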

6:41 PM: Can recreate scenarios from real-world into the simulation.

6:42 PM: *Moving onto scaling data generation*

6:43 PM: FSD hardware improves latency and frame-rate.

6:44 PM: Dual computing engines in FSD.

6:45 PM: Currently running over 1 million evaluations on any code change for greater debugging. 

6:45 PM: *Moving onto Project DOJO*

6:46 PM: Consists of a Distributed Compute Architecture.

6:47 PM: Bandwidth scales with the plane of compute.

6:49 PM: Training Node is the smallest entity of scale in DOJO. Wanted to address latency and bandwidth by using a 64-bit superscalar CPU.

6:52 PM: Training nodes arranged together by abutment to allow modular system.

6:52 PM: D1 chip is manufactured in 7nm tech. 100% of the area is going to machine learning. Chip was entirely designed by the Tesla team.

6:53 PM: Is Tesla now a chip company?

6:54 PM: D1 chips allow a seamless connection between each other. 500,000 chips connected to create a compute plane.
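A compute plane in which each chip connects directly to its adjacent chips can be modeled as a 2-D mesh. The grid topology and function below are an assumption made for illustration, based only on the "abutment" description above, not on Dojo's documented interconnect.

```python
def mesh_neighbors(r, c, rows, cols):
    """Return the direct neighbors of chip (r, c) in a 2-D compute plane,
    assuming each chip talks only to the chips it abuts (up/down/left/right)."""
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    return [(r + dr, c + dc) for dr, dc in steps
            if 0 <= r + dr < rows and 0 <= c + dc < cols]

# In a 4x4 plane: a corner chip has 2 links, an interior chip has 4.
corner_links = mesh_neighbors(0, 0, 4, 4)
interior_links = mesh_neighbors(1, 1, 4, 4)
```

In such a layout, aggregate bandwidth grows with the number of chip-to-chip edges, which is one way to read the earlier note that bandwidth scales with the plane of compute.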

6:57 PM: Training tile is the unit of scale for the system, with 9 petaflops of compute. Possibly the largest multi-chip module in the industry.

6:58 PM: Training tile is fully integrated with a cooling system and power supply allowing for a high-bandwidth compute plane. 

7:00 PM: Currently running first training tiles.

7:01 PM: Tiles are seamlessly integrated while preserving bandwidth. 1 exaflop of compute per cabinet.

7:01 PM: *Moving to software*

7:02 PM: Utilized "DOJO processing unit"

7:04 PM: Will become the fastest AI training computer.

7:05 PM: Much more to be done. "Planning a 10X improvement already."

7:07 PM: Musk now speaking

7:07 PM: "Tesla is arguably the world's largest robotics company"-Musk.

7:08 PM: Tesla may build a humanoid robot, the "Tesla Bot," to take over dangerous and repetitive tasks.

7:09 PM: Bot will have eight cameras and an autopilot system making use of all the same tools in a Tesla vehicle. 

7:10 PM: Tesla Bot will be able to listen to commands.

7:11 PM: "Physical work may be a choice in the future"-Musk

7:12 PM: *Now going to Q&A*

What to Expect

The purpose of this event is for Tesla to recruit talent for its autonomous-driving and artificial-intelligence work. Tesla also mentioned in its invitations that viewers will "also get an inside-look at what’s next for AI at Tesla beyond our vehicle fleet."

Beyond the aforementioned, what AI Day will fully entail is pure speculation. One rumor circulating was the involvement of UCLA researcher Dr. Dennis Hong, after he stated he was involved in a "secret project" with Tesla. Interestingly, Dr. Hong's tweets related to the "secret project" were also deleted. One can only speculate that his involvement could touch on any of his research interests: Humanoids & Bipedal Robots, Robot Locomotion & Manipulation, Soft Actuators, Robotic Platforms, Autonomous Vehicles, Machine Design, Kinematics & Mechanisms.

What do you think Tesla has in mind for AI beyond their vehicle fleet? Let us know on Twitter @teslapodcast.


Disclosure: Brennan Ertl is long TSLA stock and derivatives.