By Anil Gupta, Technology Advisor, Magnos Technologies LLP
Driverless cars – the concept has generated a lot of hype over the past eight to ten years, though work on the subject goes back some fifty years. Research has long explored how to make a car learn by itself and then drive on its own. Today we see many driverless-car experiments being conducted in controlled environments, under human supervision and in excellent road and environmental conditions.
Not only are the traditional car companies aiming to bring some form of driverless car to market in the next 5-10 years, but other companies are also showing a lot of interest. Google is a prominent example, and there are rumors about Apple. Companies like Uber aspire to put driverless cars on the road, which would not only improve the experience of shared riding, but also give them better control over operational costs and help reduce accidents on the road.
The whole concept of a ‘driverless’ vehicle on the road has raised curiosity across various sections of society. Many people still do not believe such a thing is possible at all, or will be possible in the near future. And why shouldn’t they be skeptical – there are so many moving parameters in the driving task that need to be handled and controlled simultaneously, and even a single failure could be highly catastrophic. Realistically speaking, there is no absolute black or absolute white in autonomous driving. Automation is broadly defined in six levels, from 0 to 5. At the basic level (Level 0), a human driver controls all the functions such as brakes, steering, throttle, power etc. A brief look at the other levels:
Level 1: Most functions are still controlled by the human driver, but some specific function (like steering or accelerating) can be performed automatically by the car.
Level 2: At least one driver-assistance system that automates both steering and acceleration/deceleration using information about the driving environment, e.g. cruise control combined with lane-centering. The driver can thus begin to disengage from physically operating the vehicle, with hands off the steering wheel and foot off the pedal at the same time. However, in this condition the driver must still remain completely alert and always be ready to take control of the vehicle.
Level 3: Human drivers are still needed in Level 3 cars, but they can completely shift safety-critical functions to the vehicle under certain traffic or environmental conditions. The driver is still present and will intervene if necessary, but is not required to monitor the situation in the same way as at the previous levels.
Level 4: Often described as “fully autonomous.” Level 4 vehicles are “designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip.” However, this is limited to the operational design domain (ODD) of the vehicle – meaning it does not cover every driving scenario.
Level 5: This level refers to a fully autonomous system whose performance is expected to equal that of a human driver in every driving scenario – including extreme environments like dirt roads that are unlikely to be navigated by driverless vehicles in the near future.
Now comes the real challenge: how do you design an autonomous/driverless vehicle system capable of matching human performance in all possible conditions? An autonomous vehicle is typically a combination of sensors and actuators, sophisticated algorithms, and powerful processors to execute software. Hundreds of such sensors and actuators are situated in various parts of the vehicle, driven by a highly sophisticated system. The sensory system can be classified into three parts:
Navigation and Guidance: the system that determines where you are, where you want to go, and how you get there. Instruments and techniques such as the compass, sextant, LORAN radiolocation, and dead reckoning are among those that have been used, with varying degrees of accuracy, consistency, and availability.
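To make the dead-reckoning idea concrete, here is a minimal sketch of a single position update from speed and compass heading. The function name and the flat east/north coordinate frame are assumptions for illustration, not anything specified in the article; a real navigation system would fuse this with GPS and correct for accumulated drift.

```python
import math

def dead_reckon(x, y, heading_deg, speed_mps, dt_s):
    """Advance an estimated position by integrating speed and heading.

    (x, y) are metres in a local east/north frame; heading is degrees
    clockwise from north, as a compass would report it.
    """
    heading_rad = math.radians(heading_deg)
    x += speed_mps * dt_s * math.sin(heading_rad)  # east component
    y += speed_mps * dt_s * math.cos(heading_rad)  # north component
    return x, y

# Drive due north at 10 m/s for 5 seconds from the origin.
print(dead_reckon(0.0, 0.0, 0.0, 10.0, 5.0))  # → (0.0, 50.0)
```

Because each update only adds to the previous estimate, small errors in speed or heading compound over time – which is why dead reckoning is used alongside, not instead of, absolute positioning.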
Driving and Safety: directing the vehicle, and making sure it acts properly under all circumstances and follows the rules of the road. The autonomous car must be able to see and interpret what is in front when going forward (and behind when in reverse, of course). It also needs to see what is on either side; in other words, it needs a 360° view. An array of video cameras is the obvious choice, with cameras to determine where the lane is and to sense objects or markers on the road.
Performance: managing the car’s internal systems. A large portion of the design of an autonomous vehicle involves mundane issues such as power management. Several application-specific, unique circuit boards and subsystems are added to a conventional vehicle to provide the functions needed for autonomous operation. Much of the system-level work involves measuring and managing the power requirements to control power, overall consumption, and thermal dissipation.
Today, after a constant research and development effort spanning more than fifty years, we see driverless cars becoming a reality. Still, there are many challenges in designing a fully autonomous system. A sneak peek at some of these challenges:
- Road conditions: Road conditions can be highly unpredictable and vary from place to place. In some cases there are smooth, broad highways with clear lane markings and open access to GPS and other signals from satellites and communication interfaces. In other cases, roads are highly deteriorated: no lane markings, lanes that are not clearly defined, potholes, and mountainous or tunnel roads where external signals for direction are weak.
- Weather conditions: Weather plays another spoilsport. There could be sunny, clear conditions; there could be night-time darkness in which optical cameras do not necessarily work well; there could be rain, storms, or thunder. An autonomous car should work in all sorts of weather conditions – there is absolutely no scope for failure or downtime.
- Traffic conditions: Autonomous cars will have to drive in all sorts of traffic. They will share the road with other autonomous cars and, at the same time, with many humans – and wherever humans are involved, emotions are involved. Traffic may be well moderated, self-regulated and smoothly moving, but often people break traffic rules, and objects may turn up unexpectedly. In dense traffic, even movement of a few centimetres per minute matters. A car cannot wait endlessly, holding to some precondition before it starts moving, for traffic to clear on its own; if many such cars on the road wait for traffic to clear, the result may be a traffic deadlock.
- Accident liability: The most important aspect of autonomous vehicles is accident liability. Who is liable for accidents caused by a self-driving car? In an autonomous car, software is the main component that drives the car and makes all the important decisions. While the initial designs have a person physically placed behind the steering wheel, newer designs showcased by Google do not have a dashboard or a steering wheel at all! In such designs, where the car has no controls like a steering wheel, a brake pedal or an accelerator pedal, how is the person in the car supposed to take control in case of an untoward incident? Additionally, owing to the nature of autonomous cars, the occupants will mostly be in a relaxed state and may not be paying close attention to traffic conditions. In situations where their attention is needed, it may be too late to avert the situation by the time they act.
- Radar interference: Autonomous cars use lasers and radar for navigation. The lasers are mounted on the roof, while the sensors are mounted on the body of the vehicle. Radar works by detecting reflections of radio waves from surrounding objects: on the road, a car continuously emits radio-frequency waves, which are reflected by surrounding cars and other objects near the road. The time taken for the reflection to return is measured in order to calculate the distance between the car and the object, and appropriate action is then taken based on the radar readings. When this technology is used by hundreds of vehicles on the road, will a car be able to distinguish between its own (reflected) signal and the signal (reflected or transmitted) from another vehicle? Even if multiple radio frequencies are available for radar, this frequency range may well be insufficient for all the vehicles manufactured.
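The time-of-flight arithmetic behind that distance measurement is simple, and a small sketch makes it concrete. The function name below is illustrative; the only physics involved is the speed of light and the factor of two for the out-and-back trip.

```python
# Speed of light in a vacuum, metres per second.
C = 299_792_458.0

def radar_distance_m(round_trip_time_s):
    """One-way distance to a reflecting object.

    The pulse travels out and back, so the distance
    is c * t / 2, not c * t.
    """
    return C * round_trip_time_s / 2.0

# An echo returning after 200 nanoseconds corresponds to
# an object roughly 30 metres away.
print(radar_distance_m(200e-9))  # → 29.9792458
```

The tiny time scales here also hint at the interference problem the article raises: with echoes measured in nanoseconds, a stray pulse from another vehicle arriving in the same window is indistinguishable from a genuine reflection unless the signals are coded or separated in frequency.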
Challenges for rolling autonomous cars out onto the road remain many, even today. But so does the determination of our scientists, engineers and problem solvers across disciplines. The collective effort of the industry will one day make the autonomous car on the road a reality, and the benefits will be huge: not only will it save fuel and encourage efficient transportation and shared services, it will also help save the many lives that are lost regularly in road accidents.