I know Elon Musk famously said that Tesla is “very close to level five autonomy,” and that there are “no fundamental challenges,” just “many small problems,” but the truth is that those “many small problems” are what reality is made of, and there’s a great and actually fun example of one of these small problems, which was captured by a Tesla owner and shared for all of us to enjoy. So let’s enjoy it!
This is a good example of how far autonomy — especially the camera-based visual systems that Tesla is now leaning into — really has to go in its quest to understand the world well enough to drive in it without constant supervision.
In this example, which was first posted to the r/teslamotors subreddit, we see a video of a Tesla Model 3’s large dashboard screen, which shows a visualization of what the car perceives in the world around it as it operates under the semi-automated Autopilot system.
We see the car traveling at about 80 mph, identifying road lanes and other traffic, which primarily seems to be trucks, represented on screen as simple 3D box trucks.
The weird part is that the box truck directly in front of the Tesla is emitting a constant stream of traffic lights, which fly out of its back and zoom over the Tesla:
Oh boy, look at that. It looks like one of those rhythm video games, where you have to hit some control every time another traffic light flies over your car. Honestly, it looks pretty fun.
As to what is actually going on here, the original poster was kind enough to post a little clip of exactly what was causing Autopilot to manifest so many phantom traffic lights:
Ahh, that makes sense! A truck with three traffic lights! This is not something that would have fooled a human driver, but, remember, computers are very fast-thinking, mathematically gifted morons. This was more than enough to baffle it.
This particular case can likely be special-cased so the system can understand it better, at least until a visually different design of traffic-light hauler shows up.
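To be clear, we have no idea how Tesla’s actual stack handles this, but a minimal, purely hypothetical sketch of what “special-casing” could mean here is a post-detection filter: real traffic lights are bolted to poles and don’t move, so any “traffic light” that your object tracker says is doing highway speed is probably cargo, not a signal. The class names, speed threshold, and `Detection` structure below are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str        # e.g. "traffic_light", "truck" (hypothetical labels)
    speed_mps: float  # estimated world-frame speed from object tracking

# Assumed tolerance for tracker noise; a real system would tune this.
STATIONARY_TOLERANCE_MPS = 1.0

def filter_phantom_lights(detections):
    """Drop traffic-light detections that are moving, since real
    signals are fixed to the road. A moving 'light' is likely freight."""
    kept = []
    for d in detections:
        if d.label == "traffic_light" and d.speed_mps > STATIONARY_TOLERANCE_MPS:
            continue  # likely a light being hauled on a truck, not a real signal
        kept.append(d)
    return kept
```

Feeding this the truck scenario, a light cruising along at 35 m/s gets discarded while a genuinely stationary light (and the truck itself) pass through. Of course, this heuristic would then be fooled by a light swaying in the wind or a tracker glitch, which is exactly why edge cases breed more edge cases.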
Also, it’s lucky that these were traffic lights and not, say, stop signs, which could have triggered the car to stop right there in the middle of the highway — or any number of other depicted objects that could fool the system into suddenly braking.
This is a funny example, sure, but it’s also a great reminder that the world is full of unexpected “edge cases,” and if you’re going to trust the driving of your speeding car to a machine with no understanding of the natural and artificial world and all of its strange rules and vagaries and weird situations, then you’d better do your homework.