Overnight, a new video surfaced from the Computer Vision and Pattern Recognition conference (CVPR), where Ashok Elluswamy, head of Autopilot Software at Tesla, gave a presentation.
Elluswamy provided some great insight into the new developments Tesla is making in its computer-vision-based self-driving software, known as Full Self-Driving (FSD). Non-Tesla owners are often confused by the name, but a look at Tesla’s commitment to understanding the world around its vehicles and navigating it safely should leave no doubt in people’s minds: they absolutely plan to make their cars self-driving.
During the talk, we saw Tesla’s new strategy for tackling the challenges of achieving this. Tesla originally used lane boundaries as the driving boundary parameter, an approach still used in Autopilot today, but the FSD Beta uses drivable space as the limiting factor instead. This allows the car to cross the line at the appropriate place and handle complex environments in a more human-like way.
Tesla also performs depth calculations from its vision system to determine whether an object is solid and should therefore be avoided, but this approach has challenges, such as predicting through occlusions.
Elluswamy introduced occupancy networks as a solution to these challenges. This new technique takes input from all eight cameras to build a 3D representation of the volumetric occupancy around the car. That means not just identifying curbs to avoid when cornering, but actually understanding the vehicle and the space around it in three dimensions, which can be very useful for avoiding objects such as overhead signs and overhanging trees.
Effectively, this boils down to whether a space is occupied or not: the car cannot enter an occupied space, regardless of what occupies it. It could be a permanent object like a mailbox or a temporary one like a parked car. Either way, the car understands it cannot enter that space and navigates around it, while respecting the constraints of the prevailing road rules.
Nothing is static in this occupancy network; Tesla treats every object as though it can move. The system therefore predicts the flow (or movement) of occupancy over time, to understand whether the space the Tesla is about to drive through is going to become occupied.
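To make the occupancy-plus-flow idea concrete, here is a heavily simplified, purely hypothetical sketch — not Tesla’s actual implementation, and the grid size, resolution, and thresholds are all assumptions. The space around the car is modelled as a voxel grid of occupancy probabilities, with a matching grid of predicted per-cell velocities, and a cell counts as drivable only if it is free now and not about to be entered by a moving object:

```python
import numpy as np

# Hypothetical occupancy grid around the car (all parameters assumed).
GRID = (40, 40, 8)      # x, y, z voxels
CELL_SIZE = 0.5         # metres per voxel

occupancy = np.zeros(GRID)        # P(cell is occupied), in [0, 1]
flow = np.zeros(GRID + (3,))      # predicted velocity (vx, vy, vz) per cell

def is_drivable(cell, horizon_s=1.0, threshold=0.5):
    """A cell is drivable if it is free now AND no occupied neighbour's
    predicted flow carries it into this cell within the planning horizon."""
    if occupancy[cell] >= threshold:
        return False
    x, y, z = cell
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            nx, ny = x + dx, y + dy
            if (dx, dy) == (0, 0):
                continue
            if not (0 <= nx < GRID[0] and 0 <= ny < GRID[1]):
                continue
            if occupancy[nx, ny, z] >= threshold:
                vx, vy, _ = flow[nx, ny, z]
                # Velocity component pointing from the neighbour toward
                # this cell: does it cover the gap within the horizon?
                if (-dx * vx - dy * vy) * horizon_s >= CELL_SIZE:
                    return False
    return True
```

The point of the sketch is the second check: a cell that is empty right now can still be ruled out because something next to it is predicted to flow into it — which is exactly why treating every object as potentially moving matters.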
What I loved hearing in the demo was Tesla’s ability to detect objects it has no 3D assets for and still understand them as occupying unusable space. An item may not appear in the visualization, but that does not mean the car is not recognizing and responding to it. Given that the software cannot ship with an infinite library of 3D assets, there’s an interesting discussion to be had about how Tesla should represent these objects.
One of the most interesting sections of the presentation comes around the 18-minute mark, where Elluswamy talks about occlusion. Parts of the environment we drive in may be hidden from the cameras’ view, and understanding how to overcome that is a real challenge for Tesla.
One example is ‘Chuck’s turn’, which is addressed in FSD Beta v10.69, released today. The presentation shows an example of an intersection where oncoming traffic is blocked from view.
Another great example of these difficult intersections is this image from Lisa, which shows a tree on the left side of the vehicle.
To date, Tesla has used a very rudimentary implementation of creep: it edges the vehicle forward to widen the viewing angle, giving the vision system and path planner a chance to get through the intersection.
What we learned today is that the Tesla brain inside the car can understand where its view is obstructed (i.e. where a car could appear out of nowhere) and adapt to that lack of visibility. Think of it as smart creep. The car understands the distance to crossing traffic and assesses the space available for creeping. The right amount of creep improves the car’s view of oncoming traffic and allows it to navigate the environment safely.
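The smart-creep behaviour can be sketched as a simple loop — this is a toy model of the idea, not Tesla’s planner; the step size, visibility model, and distances below are invented for illustration. The car edges forward only until it can see far enough down the cross street, and never past the point where it would enter the travel lane:

```python
def smart_creep(visible_distance_m, required_distance_m,
                creep_remaining_m, step_m=0.25):
    """Creep forward in small steps until the cameras can see
    required_distance_m down the cross street, or until further creeping
    would push the car's nose into traffic.

    visible_distance_m: callable mapping metres crept so far to how far
        down the cross street the cameras can see (assumed model).
    Returns (list of creep increments, total metres crept).
    """
    steps = []
    crept = 0.0
    while visible_distance_m(crept) < required_distance_m:
        if crept + step_m > creep_remaining_m:
            break  # out of safe creep room; hold and wait instead
        crept += step_m
        steps.append(step_m)
    return steps, crept

# Hypothetical visibility model: an obstruction (like the tree in the
# example above) limits the view to 20 m, and each metre of creep
# reveals another 40 m of the cross street.
visibility = lambda crept: 20.0 + 40.0 * crept

steps, crept = smart_creep(visibility, required_distance_m=100.0,
                           creep_remaining_m=3.0)
```

The key contrast with the old “rudimentary” creep is the stopping condition: the loop terminates as soon as visibility is sufficient, rather than creeping a fixed distance regardless of what the cameras can actually see.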
This is what’s included in FSD Beta v10.69, and I believe it brings some pretty big architectural changes to how decisions are made in the car.
Later in the presentation, we learned about Tesla’s collision avoidance capabilities. We’ve seen a hint of this before with Smart Shift: on the updated Model S and Model X, Smart Shift automatically selects the driving direction based on what the cameras see.
This collision avoidance technology now goes well beyond that and appears to apply to all Teslas. It could prevent the vehicle from crashing into a building, another vehicle, or worse, a person, if a human driver accidentally hits the accelerator.
As mentioned above, Tesla leverages the space “occupancy” data and translates it into collision probabilities. In other words, if you hit the accelerator while facing a wall, the probability of a collision is nearly 100%, and the car won’t let you do it.
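A minimal sketch of how occupancy could be turned into a collision probability — again purely illustrative, with a simplifying independence assumption that real systems would not make: sample the cells the car would sweep through on its current trajectory and combine their occupancy probabilities, vetoing the pedal input when a collision is near-certain.

```python
def collision_probability(trajectory_cells, occupancy_lookup):
    """P(at least one swept cell is occupied), treating cells as
    independent -- a simplifying assumption for illustration only."""
    p_clear = 1.0
    for cell in trajectory_cells:
        p_clear *= 1.0 - occupancy_lookup(cell)
    return 1.0 - p_clear

def should_veto_pedal(trajectory_cells, occupancy_lookup, threshold=0.95):
    """Override accelerator input when collision is near-certain
    (threshold is an assumed value)."""
    return collision_probability(trajectory_cells, occupancy_lookup) >= threshold

# Toy 2D occupancy map: a wall dead ahead of the car at y = 3.
wall = {(-1, 3), (0, 3), (1, 3)}
lookup = lambda cell: 1.0 if cell in wall else 0.0

straight_ahead = [(0, 1), (0, 2), (0, 3)]  # drives into the wall
swerve = [(0, 1), (1, 2), (2, 3)]          # steers around it
```

With this framing, the dramatic evasive manoeuvre described below is just the planner picking the candidate trajectory whose collision probability is lowest, rather than the one closest to the driver’s input.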
The image above is from a simulation of a distracted driver stepping on the accelerator without steering. The car detects that a collision is imminent and plots a path that navigates safely through the environment. Unlike the gradual lane corrections found in today’s cars, which use just a few degrees of steering, in this example the Tesla changes trajectory dramatically, crossing into the correct lane and proceeding safely.
Elluswamy concluded his presentation by explaining that if Tesla successfully implements all the techniques he described, it could build a car that never needs to crash.
This work is clearly not complete, and more effort is needed. The final slide of the presentation pitched engineers on joining Tesla to build a car that never crashes!
This will be a very interesting challenge for Tesla, now that they’re entering the insurance business. There’s still a chance other cars will run into Teslas, so insurance will likely still be needed, but imagine the day when every Tesla with HW3 or above gets a software update that makes it uncrashable (as a result of the driver’s own inputs). It’s wild to think about.
Tesla wants to make a car that doesn’t crash: Ashok Elluswamy, Head of Autopilot Software, presents at CVPR 22.