Monday, March 28, 2022

Self-Driving Car Challenges

Mercedes recently announced a Level 3 self-driving system for which it assumes legal liability. The levels of autonomy are typically defined by the situations in which the car can operate without human intervention, but being willing to assume legal liability is a huge step forward. Levels of autonomy would remain a marketing slogan unless they can be tested in court. The cost of liability will be factored into the insurance premium, so self-driving only becomes economically viable if that premium can be lower than a human driver's.
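
As a back-of-the-envelope illustration, the viability condition is just an expected-cost comparison. All numbers below are made up; real actuarial pricing is far more involved:

```python
# Break-even sketch: self-driving liability is only insurable at a lower
# premium if its expected payout per mile undercuts a human driver's.
# The crash rates and claim cost here are illustrative, not real data.

def expected_liability_per_mile(crashes_per_million_miles, avg_claim_cost):
    """Expected liability cost per mile = crash rate x average claim payout."""
    return crashes_per_million_miles / 1_000_000 * avg_claim_cost

human = expected_liability_per_mile(crashes_per_million_miles=4.0,
                                    avg_claim_cost=20_000)
robot = expected_liability_per_mile(crashes_per_million_miles=1.0,
                                    avg_claim_cost=20_000)

print(f"human: ${human:.4f}/mile, robot: ${robot:.4f}/mile")
print("viable" if robot < human else "not viable")
```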

However, Mercedes' legal responsibility comes with limits: it applies only on certain (already-mapped) highways, below 40 mph, during daytime, in reasonably clear weather, and without overhead obstructions.

The first challenge that any camera-based system must overcome is the stability of the footage. You can see from this test that a car-mounted camera with improper image stabilization (circa 2016) produces a wobbly, shaky image. The wobble can be counteracted by shortening the exposure time (a faster shutter speed), but this requires good lighting, hence the reasonably-clear-weather requirement. Furthermore, when the car is traveling fast, you need a telephoto lens to look further ahead for hazards, but a longer focal length also magnifies the wobble, hence the operating speed limit. If the footage is bad, feeding it into machine learning won't help much: garbage in, garbage out. More recent cameras such as a GoPro have improved image stabilization (circa 2021) that works even on a toy car (circa 2020), which is more challenging to stabilize. These cameras can produce a clean image across a wider range of lighting conditions.
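
To make the shutter-speed and focal-length trade-off concrete, here is a rough sketch. The wobble rate, sensor size, and lens values are illustrative assumptions, not measurements:

```python
import math

def blur_pixels(wobble_deg_per_s, exposure_s, focal_mm,
                sensor_width_mm, image_width_px):
    """Approximate motion blur, in pixels, smeared across the frame during
    one exposure by camera wobble (small-angle approximation)."""
    focal_px = focal_mm / sensor_width_mm * image_width_px  # focal length in pixels
    return math.radians(wobble_deg_per_s) * exposure_s * focal_px

# Same 20 deg/s wobble, 1920 px wide frame on a 6.17 mm wide sensor:
for focal_mm, exposure in [(4.0, 1/60), (4.0, 1/500), (12.0, 1/500)]:
    px = blur_pixels(20.0, exposure, focal_mm, 6.17, 1920)
    print(f"{focal_mm:4.1f} mm lens at 1/{round(1/exposure)} s -> {px:4.1f} px of blur")
```

A faster shutter cuts the blur proportionally, but only if there is enough light; tripling the focal length triples the blur right back, which is why speed and weather limits go hand in hand.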

Car manufacturers that are serious about self-driving should be licensing camera-stabilization technology from the likes of GoPro.
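
For the curious, here is a minimal sketch of software-side stabilization in the spirit of what such cameras do (theirs is gyro-assisted and far more sophisticated). It assumes OpenCV and a hypothetical dashcam.mp4 input:

```python
# Digital stabilization sketch: estimate per-frame motion, low-pass filter
# the camera trajectory, and compute the warp that moves each frame onto
# the smoothed path.
import cv2
import numpy as np

cap = cv2.VideoCapture("dashcam.mp4")  # hypothetical input file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

transforms = []  # per-frame (dx, dy, dtheta)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=30)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = status.ravel() == 1
    m, _ = cv2.estimateAffinePartial2D(pts[good], nxt[good])
    if m is not None:
        transforms.append((m[0, 2], m[1, 2], np.arctan2(m[1, 0], m[0, 0])))
    else:
        transforms.append((0.0, 0.0, 0.0))  # estimation failed; assume no motion
    prev_gray = gray

# Smooth the cumulative trajectory with a moving average; the correction for
# each frame is (smoothed - actual). Applying each correction with
# cv2.warpAffine on a second pass yields the stabilized footage.
trajectory = np.cumsum(transforms, axis=0)
kernel = np.ones(31) / 31
smoothed = np.column_stack([np.convolve(trajectory[:, i], kernel, mode="same")
                            for i in range(3)])
corrections = np.array(transforms) + (smoothed - trajectory)
```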

Self-driving cars using LIDAR face a different challenge. LIDAR works by sending a short pulse of light and observing the reflections, so it does not depend on external lighting. But when there are multiple LIDAR-equipped cars on the road, they can pick up each other's pulses, which appear as noise. A LIDAR therefore has to encode its own unique ID into the light pulse and filter out any pulse that did not originate from itself.
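
A toy sketch of the filtering idea follows. A real LIDAR would modulate the optical pulse itself, for example with a pseudorandom code; here each pulse simply carries a unit ID for illustration:

```python
class Lidar:
    def __init__(self, unit_id):
        self.unit_id = unit_id

    def emit(self, angle_deg):
        # Tag the outgoing pulse with this unit's ID.
        return {"id": self.unit_id, "angle": angle_deg}

    def accept(self, returns):
        # Reject pulses that originated from other units (crosstalk).
        return [r for r in returns if r["id"] == self.unit_id]

ours = Lidar(unit_id=0xA1)
other = Lidar(unit_id=0xB7)

# A mixed bag of reflections as seen by our receiver:
mixed = [ours.emit(10.0), other.emit(10.5), ours.emit(11.0)]
print(ours.accept(mixed))  # only the two pulses we sent survive
```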

A third challenge is how to defend a claim in court. A self-driving system must be able to produce a rationale for why it made any given decision, and the decision has to be legally justifiable. Machine learning has traditionally been a black box that can produce surprising results (a classifier once recognized a dumbbell only when an arm was attached to it), but explainable AI is making some progress. Self-driving technology must likewise be explainable.
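
One way to picture what "explainable" might mean in practice is a decision record that carries its own rationale, which can later be replayed in a dispute. The fields and the rule below are invented for illustration:

```python
from dataclasses import dataclass, field
import time

@dataclass
class Decision:
    action: str      # e.g. "brake"
    rule: str        # the justification that fired
    evidence: dict   # sensor facts the rule consumed
    timestamp: float = field(default_factory=time.time)

def decide(scene):
    # Hypothetical rule: brake if the obstacle is inside stopping distance.
    if scene["obstacle_distance_m"] < scene["braking_distance_m"]:
        return Decision(
            action="brake",
            rule="obstacle closer than stopping distance",
            evidence={k: scene[k] for k in
                      ("obstacle_distance_m", "braking_distance_m")},
        )
    return Decision(action="maintain", rule="clear path ahead", evidence=scene)

log = [decide({"obstacle_distance_m": 18.0, "braking_distance_m": 25.0})]
print(log[0])  # the rationale a court (or a student) could inspect
```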

Explainable self-driving technology can be thought of as a driving instructor that happens to be a computer. A driving instructor not only has to make the right decision on the road; it also has to explain to a student why that is the right decision given the circumstances.

Car manufacturers that want to assume legal responsibility should aim for a computerized driving instructor, not just a computerized driver.

A musician understands that in order to perform at 100%, they must practice to 120%. This mindset applies to many engineering problems where safety is at stake: building components such as elevators are often designed for loads of roughly 200% of rated capacity or more (ASCE/SEI 7-10 1.4.1.1(b)). When it comes to self-driving, the specs are less quantifiable, but the design margins must similarly exceed expectations for the technology to be viable.
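
In code, the qualification mindset is just a margin multiplier; the factor below is illustrative, not a value from any standard:

```python
def qualification_load(rated_capacity_kg, margin_factor=2.0):
    """Load the system must demonstrably withstand during qualification,
    well beyond what it is rated to carry in service."""
    return rated_capacity_kg * margin_factor

print(qualification_load(1000))  # a 1000 kg rating is tested to 2000 kg
```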
