How A Driverless Car Sees The Road - LEKULE


24 Sept 2017


Near Future Driverless Car

In the near future, if you're late for work and don't have time for breakfast, your car will come to your rescue. You will sit back, relax and enjoy your breakfast while your car safely drives you to your destination. No, this is not a sci-fi movie scene; it is soon going to be a reality!
If technology keeps growing at its current rate, your car will soon do all the driving and concentrating for you. Automakers are currently working on new technology that will allow cars to drive themselves. They're also refining existing technologies, such as self-parking and pre-safe systems, to make driving safer.
Figure 1: Components to be used in Driverless Car
Studies show that driver error is the most common cause of traffic accidents. Chris Urmson, who is currently working on driverless car technology, goes even further and says that the least reliable part of the car is the driver. Vehicles have been made safer, smarter and stronger, but the problem of the driver still exists.
Car advertisements sell us the idea of warm sunny days and driving through the countryside with a cool wind rushing through our hair, but the reality is quite different. Driving these days mostly means sitting in traffic or rain, paying more attention to our phones than to our surroundings. With the way technology is advancing, these problems are not going away. More than a million people are killed on the world's roads every year. So if the driver is not paying attention to the road, who is?
In this article, we'll look at the technology behind cars that can operate with little or no input from the driver, how far these cars are from mass production, and when we might let the machines take over.

The Primary Technology

Google has been working on driverless cars since 2009. Remarkably, these cars have driven over half a million miles (804,672 km) without a single crash, while human drivers, on average, get into an accident about once every half million miles.
The technology is built around a system Google calls Chauffeur, whose key sensor is LIDAR (light detection and ranging). LIDAR works like radar and sonar, but more accurately: it maps points in space using 64 rotating laser beams that take more than a million measurements per second, forming a 3D model in the car's computer brain. The system also includes preloaded maps that tell it where stationary things are, such as traffic lights, crosswalks and pavements, while the LIDAR pictures the landscape with moving objects such as people and traffic.
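To make the laser-measurement idea concrete, here is a minimal Python sketch of how range readings from a rotating multi-beam sensor could be converted into a 3D point cloud. The function name, the `(azimuth, elevation, range)` tuple format and the simple spherical-to-Cartesian model are all illustrative assumptions, not Google's actual pipeline.

```python
import math

def lidar_to_points(measurements):
    """Convert (azimuth, elevation, range) laser returns into 3D points.

    Toy model of a rotating multi-beam LIDAR: each return is a distance
    measured along a known beam direction, which we convert from
    spherical coordinates (angles in radians) to Cartesian x, y, z.
    """
    points = []
    for azimuth, elevation, dist in measurements:
        x = dist * math.cos(elevation) * math.cos(azimuth)
        y = dist * math.cos(elevation) * math.sin(azimuth)
        z = dist * math.sin(elevation)
        points.append((x, y, z))
    return points
```

A real sensor would produce millions of such points per second; accumulating them over a full rotation is what yields the 3D model of the surroundings described above.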
Figure 2: Blueprint Of Driverless Car Operation
To work safely and efficiently, a self-driving car has to understand its position on a GPS map as well as its position relative to other cars and pedestrians. Pedestrians and construction add to the complexity of the algorithms and the inputs the car receives, and police cars, cyclists and school buses each need to be handled uniquely.
Google collects data on the behaviour of pedestrians, cyclists and drivers, and around 3 million miles of testing is done every day in simulators. As more data is collected, the car can predict situations better.

How It Sees the Road and Its Surroundings

The vehicle starts by understanding where it is in the world, taking input from its map and its sensor data and aligning the two. On top of that, it layers what it sees in the moment, such as other vehicles and pedestrians in its surroundings.
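The "align the map with the sensor data" step can be sketched very simply: score a handful of candidate poses by how well they make the observed landmarks line up with the preloaded map, and keep the best one. This is a toy illustration of the idea; the function, the landmark pairing and the brute-force pose search are assumptions for clarity, not the real localization algorithm.

```python
import math

def localize(map_landmarks, observed, candidate_poses):
    """Pick the candidate pose (x, y, heading) that best aligns
    observed landmarks (in the vehicle frame) with the map.

    Each observation is paired with its corresponding map landmark;
    the score is the summed squared distance after transforming
    observations into the map frame.
    """
    def alignment_error(pose):
        px, py, theta = pose
        total = 0.0
        for (ox, oy), (mx, my) in zip(observed, map_landmarks):
            # Rotate and translate the observation into the map frame.
            wx = px + ox * math.cos(theta) - oy * math.sin(theta)
            wy = py + ox * math.sin(theta) + oy * math.cos(theta)
            total += (wx - mx) ** 2 + (wy - my) ** 2
        return total

    return min(candidate_poses, key=alignment_error)
```

Real systems refine the pose continuously with far more sophisticated matching, but the principle is the same: the pose that best explains the sensor data is taken as the car's position.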
But the driverless car has to do better than just understand its surroundings; it has to be capable of predicting what is going to happen next.
For example, if a truck ahead is going to change lanes because the road in front of it is closed, the driverless car has to know this. But even knowing that is not enough.
What it really needs to know is what everybody on the road is thinking. On top of that, the car has to figure out how to respond in the moment: what trajectory to follow, and whether to slow down or speed up. Combined, all this becomes quite complicated and is handled by thousands of algorithmic checks.
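One of those "slow down or speed up" checks can be illustrated with a tiny longitudinal decision rule based on time-to-collision with the vehicle ahead. The function, the three actions and every threshold here are illustrative assumptions, nothing like the tuned planners in a real car.

```python
def choose_action(ego_speed, gap_ahead, lead_speed):
    """Toy longitudinal decision for a self-driving car.

    ego_speed, lead_speed: speeds in m/s; gap_ahead: metres to the
    vehicle in front. Slow down if we would close the gap too fast,
    otherwise keep speed, or speed up when the road ahead is clear.
    Thresholds (50 m clear gap, 3 s time-to-collision) are made up.
    """
    closing_speed = ego_speed - lead_speed
    if closing_speed <= 0:
        # Not gaining on the lead vehicle at all.
        return "speed_up" if gap_ahead > 50 else "keep_speed"
    time_to_collision = gap_ahead / closing_speed
    return "slow_down" if time_to_collision < 3.0 else "keep_speed"
```

A production planner evaluates whole candidate trajectories against predictions for every road user, but each evaluation ultimately reduces to many simple checks of this flavour.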
When this work started in 2009, the system was very simple: the car drove on a road where it only had to understand where it was and roughly where the other vehicles were. It was pretty much a geometric understanding of the world.
But this is not what we encounter in our daily lives. On city streets, the problem takes on a whole new level of difficulty: pedestrians crossing, cars going in every direction, traffic lights, and construction on the road.
On top of all this, it has to respond differently to police vehicles and school buses, and it has to understand when a police officer is signalling it to stop or to go.
Chris gives a wonderful example in which the car is waiting at a red light. An ordinary driver cannot see the cyclist approaching from the far left, outside his field of view, but the driverless car can. This is possible because of the laser data the car scans from the area in its vicinity.
Now the cyclist comes through on the road. His light has turned yellow, but he keeps moving; halfway across, his light turns red and our light turns green. Most drivers would pull forward because they never noticed the cyclist, but the driverless car, thanks to its laser data, anticipates that the cyclist is coming through and responds safely. While the other drivers start to pull forward and the cyclist narrowly avoids a collision, the driverless car waits patiently for the cyclist to cross.
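The car's patience in this story comes down to a prediction: will the tracked cyclist still be inside the intersection when our light turns green? A minimal sketch of that check, using a constant-velocity prediction, is below; the function name, the 1D geometry and the planning horizon are all simplifying assumptions made for illustration.

```python
def should_wait(cyclist_x, cyclist_speed, intersection_end, horizon=2.0):
    """Return True if the car should wait at its green light.

    cyclist_x: cyclist's current position along the crossing (m);
    cyclist_speed: speed along the crossing (m/s);
    intersection_end: position where the cyclist is clear of our path.
    Predict the cyclist's position `horizon` seconds ahead assuming
    constant velocity; wait if he is still inside the intersection.
    """
    predicted_x = cyclist_x + cyclist_speed * horizon
    return predicted_x < intersection_end
```

The real system tracks many agents with richer motion models, but the safety behaviour in the anecdote, holding back until the predicted conflict is gone, follows exactly this pattern.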

Limitations

Although the people working on this technology are confident that it will eventually come to market, it still has some serious limitations.
So far, the available sensors and artificial intelligence are not capable of seeing and understanding the vehicle's surroundings as accurately as a human being can.
For example, if a ball rolls into the road, a human might anticipate that a child could follow. Artificial intelligence cannot yet provide that level of thinking, nor can it communicate with its surroundings in real time. Achieving that level of artificial intelligence could take another 16 years.
As of 2014, the latest prototypes had not yet been tested in heavy rain or snow. This was mainly because of safety concerns: the cars are primarily pre-programmed with route data and do not obey temporary traffic lights. The vehicle also has difficulty judging whether a trash can or debris on the road is harmless.
LIDAR cannot spot some potholes or tell when a human, such as a police officer, is signalling the car to stop. Google expects these issues to be fixed by 2020.
