Here's Something: Don’t trust Elon Musk to drive your car

Talk of a future filled with driverless trucks, autonomous cars and remotely controlled public transit scares me.

And it isn’t just the possibility of crashing in – or into – one of these self-driving contraptions that frightens me. What really bothers me is the mindset that says technology will always outperform humans, and the lack of pushback from media and government on this way of thinking.

We hear news about driverless vehicle technology almost daily. Tesla and Google are leading the way by creating cars featuring autonomous driving capabilities that would allow the driver to nod off, read a book or check their email while behind the wheel. Some driverless and autonomous cars are already on our roads, and mass-produced vehicles will completely drive themselves within a few years.

The latest development comes from Sacramento, California, where the government is proposing driverless public transit shuttles that can hold up to 12 people and travel between 8 and 25 mph on a pre-programmed, fixed route around the city. Each vehicle would cost about $200,000 and run on a battery that can last up to 10 hours.

The idea of computer-chauffeured cars sounds futuristic, even Jetsonian. But on second glance it’s not that big a deal. When I was growing up, flying cars were the future. No one talked about driverless cars. Have we given up on flying cars? Are today’s drivers more interested in sleeping and playing on their phones than taking off into a new dimension? Are the self-driving vehicle manufacturers just giving Americans what they want – more time to play Angry Birds?

While many are looking forward to letting a computer drive their car, many others fear a driverless future. According to a AAA survey conducted in March, “three-quarters of U.S. drivers would be afraid to ride in a self-driving vehicle, while 19 percent would trust the vehicle and 4 percent are unsure.” The same survey found that 54 percent “would feel less safe sharing the road with self-driving cars while they drive a regular car.”

The survey results aren’t surprising; I don’t know anyone who trusts technology enough to put their lives in its hands. What is surprising is that we rarely hear this kind of skepticism in news reports. We only hear that a self-driving future is a certainty. Is the media so enamored of Tesla founder Elon Musk that it forgets to ask about the potential downsides of his technological ambitions?

While the media praise each new driverless technology achievement, and our government offers little pushback or guarantee of driver safety as the manufacturing sector careens headlong into this dangerous future, all I can think of is the poor guy in Florida who sat watching a Harry Potter movie as his Tesla Model S, equipped with Autopilot software, failed to detect a truck crossing the highway ahead. The car drove itself under the 18-wheeler’s trailer, killing the man.

Although U.S. traffic safety regulators blamed the driver and not the software after investigating the 2016 crash, the story instantly made me fearful of anything driverless.

There are other aspects to this issue besides safety. Many people simply like driving. It’s a form of freedom. They want to be in control of their car. They want to enjoy the curves of a country road. Will they soon be unable to buy a car they can pilot manually? Will the government eventually outlaw human drivers?

Also, I don’t think I can give up steering and braking control to a computer. I already get freaked out when I’m a passenger in a vehicle. I don’t even like automatic transmissions or ABS brakes. I like being one with my vehicle. Will control freaks like me ever accept a computerized Jeeves?

And driverless technology is just plain boring. There’s no wow factor to it. I thought the future of motoring was going to be awesome. I thought we’d take off in our car like a Harrier jumpjet and fly to work. Guess not. Instead, they’re giving us cars that don’t need us. Ho hum.

The underlying trend that has me most worried, though, is the complete trust some have of computers to take over every aspect of our lives. Don’t they realize electronics fail? It’s one thing to trust driverless technology to navigate a well-lined, pancake flat highway; it’s another to trust the software to navigate a twisting, turning, potholed, snow-covered, limb-shrouded, poorly lined back road in rural Maine.

We need skilled humans at the helm when driving conditions turn dangerous. I trust a person to see me and my truck crossing the road much more than a circuit board built by Elon Musk.

John Balentine, a former managing editor for Sun Media Group, lives in Windham.

  • Just Sayin’

    Let’s begin with a factual correction: The case mentioned here, with the driver watching Harry Potter, involved an AI-assisted vehicle, not a fully autonomous one. The driver also failed to understand the difference and decided to let the vehicle drive all on its own. That is not how that model of car operates; the driver is supposed to keep his hands on the wheel save for short pauses.

    The vehicle in question even alerted him to put his hands back on the wheel several times in the time preceding the crash. This is not a technological failure, but a human one.

    And, realistically speaking, I think it would be far safer to put self-driving cars on the roads than to let the average driver have a car that can fly. People have enough trouble at 4-way stops, let alone having to worry about a full 360 degrees and the massively different physics of flight compared to wheeled driving.

  • Chew H Bird

    I have been working in the technology field for 30 years and have several years experience at sanctioned drag strips (and a fair amount of success at both endeavors).

    First, we must understand that the vast majority of automobile collisions and accidents are the fault of the driver, regardless of legal rulings. Blame these crashes on whatever you want, but remember that poor judgement, poor decision making, poor skills, and impairment due to a multitude of distractions, along with alcohol and other drugs, are the primary causes of tragedy on our public roads.

    Second, we must understand that today’s digital technology is infinitely more reliable than human perception. Today’s heavily hyped digital threats are often focused on getting a human to commit an error because humans are more fallible than software.

    If you go to one of our local drag strips on a street night, you will see (generally) young people trying to drive in a straight line and obtain the quickest time and fastest speed. These are often people who think they know what they are doing and usually have years’ worth of stories to convince you of how good they are at driving.

    The reality is, all but a small handful of them are not good drivers. One day a friend of mine was racing his car at the track and complaining about the weather and the car’s slow performance. A different friend of mine, a great driver, convinced this guy to let him drive his car… The difference? The great driver lowered the elapsed time by nearly half a second.

    Most drivers are, frankly, dangerous and qualify as an accident waiting to happen. As much as I am a control freak, I look forward to the predictability of self-driving vehicles because they will help get many idiots off the road.

    Most people do not understand digital technology, and fear of technology is quite common. The solution is to educate oneself about the benefits and risks and compare them with our current situation. Opinions based upon fear rather than understanding are a self-perpetuating problem in all walks of life.

  • Little crow

    I agree with my engineer friends that humans probably make more mistakes than a computer, but when a human makes a mistake, he can recognize it and make a correction, while the computer may not. Engineers should know better than anyone some version of Murphy’s Law: all machines can fail – and they are not sentient beings. What happens if some road salt gets into the car’s “brain”?

    • Just Sayin’

      Yes, all machines can fail. That’s why these systems are designed with multiple redundant layers. It’s possible for all layers to fail, but very unlikely – just as it’s possible, but very unlikely, that a contemporary car will catch fire or explode when you turn the key. It would take multiple failures within the car, and the chance is so low that society deems it an acceptable risk. Eventually, self-driving cars will get to that point too.

      • Little crow

        I also hope and expect that self-driving cars will get to that point, and I think they will be very useful in many cases (for the handicapped, visually impaired, and the drunks). But I just got back from a trip where the GPS had us going in all the wrong directions, so we got out the old technology – a map. I just like having that manual override.

    • Chew H Bird

      It is difficult for a human to make a correction if they have been drinking, using drugs, or texting while driving. It is difficult for a human to make a correction if they do not understand the physics of driving because they learned to drive in a vehicle with traction control, computerized braking, roll control, and collision warning systems. The safest drivers in the world are farmers because they understand environmental conditions and basic physics. Most people are not farmers…