A world of autonomous vehicles may usher in an era of convenience for commuters, but it may also bring new complications.
Automation, automation, automation — it is the watchword driving innumerable industrial inventions, it is the proverbial compass that guides the hand of humanity as we seek new ways to alleviate ourselves of the burden of hard work and decision-making.
They say necessity is the mother of all invention. Well, I like to think it is our laziness, paradoxically, that pushes us to find new ways to solve our problems. In one of our previous articles, we talked about how lazy employees are a company’s biggest assets because they are “practical, creative, and intelligent people who don’t overthink things”.
And that rings true for the purveyors and innovators of automation. You see, what these lazy people are doing is surreptitiously conditioning us to entrust our lives to an elegantly constructed series of circuit boards programmed with an indecipherable cocktail of algorithms. We don’t know the science behind it — hell, you could tell me it was a hamster on a wheel powering everything all along and I would be hard pressed to disprove you — but we believe it works well anyway.
Our enterprising ways will slowly but surely bring our society closer to achieving a sort of quasi-stoner’s zen — the ability to sit around and do diddly-squat, not because we are incapacitated, but because everything is taken care of.
Now that would be okay if it were, say, an automated lasagna-making machine, or even an automated home power control system. If these devices hit a glitch, it wouldn’t exactly endanger your life — barring the rare chance of one exploding in your face — and you could call or email tech support and have it remedied while sipping a cappuccino.
But not an autonomous driverless vehicle.
Cruising the highway in a steerless car
We’ve all read how Google is pushing the envelope for driverless cars. Since its self-driving car project began about five years ago, it has successfully navigated several cities in the US, and its system has even been recognised as an official driver. The goal of the project is to eventually transform drivers into mere passengers, and let the car do all the hard work.
Google X’s latest prototype even takes out the steering wheel and brake pedal, essentially rendering it completely autonomous.
What? Not even an emergency override? Aren’t we getting ahead of ourselves?
Well, the reasoning behind this move, according to Astro Teller, the head of X (formerly Google X), is that “users didn’t stay alert in case the car had to hand control back to them.”
Google is not the only entity championing this move. The UK government is trialling an £8 million (US$11.5 million) project to put driverless pods onto the streets of London. Each pod can carry up to six passengers and comes equipped with an emergency button.
And closer to home, the Singapore government yesterday unveiled a joint venture with 2 Getthere Holding B.V to launch 2getthere’s 3rd Generation Group Rapid Transit (GRT) vehicle. Each pod purportedly holds up to 24 passengers, “and can operate as a low-cost automated transit system that can cater for up to 8,000 passengers per hour in any single direction,” according to an official press release.
This certainly spells good news for everyone who dreads being the designated driver at parties, for people who can’t seem to pass their driving tests, or for bad drivers in general.
Also, by offering more alternatives to current mass transport systems, cities can reduce congestion, especially during peak hours.
Now I’m not a born skeptic, but I think caution should be exercised anyhow.
Remember a few months ago when a Google self-driving car collided with a bus? It was the first case where the self-driving car was at least partially at fault.
But who takes the blame if the self-driving car is responsible for the accident? If Google’s system is recognised as “the official driver”, as mentioned earlier, should the victim file a claim against Google itself, since it was the system that was at fault?
And by the way, who exactly are the victims here? The passenger in the Google self-driving car, the drivers and passengers in the other vehicle, or both?
It is important for regulations and guidelines to be drafted first, before we even think of rolling out autonomous vehicles en masse.
There is also the question of over-reliance on computers. Despite the Tesla Model X’s widespread popularity and cult following, TechCrunch reported several software faults with its infotainment and proximity sensors. In one case, its autopilot went a little nuts and activated the emergency brakes instead of the entertainment system.
This might sound alarming, but put it all in perspective: autonomous vehicles are basically computers on wheels, so software glitches and the like are bound to happen from time to time, and regular updates are required. The only difference is that when your home computer crashes, you aren’t cruising down the highway at 60km/h.
It will be important for future autonomous car manufacturers to execute far more stringent tests than you would on a simple nuts-and-bolts 1995 Ford station wagon.
There is also, of course, the question of a driver’s license. Would future drivers (or passengers, in this case) be required to learn how to drive if some autonomous cars, like Google’s, don’t allow you to “take the wheel” anyway?
Or would they be trained specifically for driverless vehicles, regardless of their ability to drive a human-controlled vehicle?
And that is just for people who own personal driverless cars. What about the folks in autonomous mass rapid transports on the road? How much culpability can we expect from mass transit authorities?
When a person falls onto a train track and is killed, it is nearly always considered ‘human error’ and the fault of the victim. Does the same apply to a robotic vehicle? This question has yet to be answered.
It would be safe to say that even if no drivers are needed, transport authorities would still need a staff member manning each vehicle, at least in the first few years of use, until all the kinks and other issues have been ironed out.
It would certainly be unfair to pass premature judgement on driverless vehicles. After all, the technology is still in its infancy. But we should also exercise a healthy degree of caution and skepticism as we move forward with each step, so that we can nip potential problems in the bud before they turn into a multi-billion-dollar class action lawsuit.
The post Driverless vehicles – should we embrace it, albeit with caution? appeared first on e27.