Human traveling in a car vs. computer-driven vehicle: irony prevails

Humans Are Slamming Into Driverless Cars and Exposing a Key Flaw

December 17, 2015 — 7:01 PM EST Updated on December 18, 2015 — 6:30 AM EST

The self-driving car, that cutting-edge creation that’s supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of those with human drivers.

The glitch?

They obey the law all the time, as in, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well. As the accidents have piled up -- all minor scrape-ups for now -- the arguments among programmers at places like Google Inc. and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time to stay out of trouble?

“It’s a constant debate inside our group,” said Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh. “And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”

Last year, Rajkumar offered test drives to members of Congress in his lab’s self-driving Cadillac SRX sport utility vehicle. The Caddy performed perfectly, except when it had to merge onto I-395 South and swing across three lanes of traffic in 150 yards (137 meters) to head toward the Pentagon. The car’s cameras and laser sensors detected traffic in a 360-degree view but didn’t know how to trust that drivers would make room in the ceaseless flow, so the human minder had to take control to complete the maneuver.

“We end up being cautious,” Rajkumar said. “We don’t want to get into an accident because that would be front-page news. People expect more of autonomous cars.”

Not at Fault

Turns out, though, their accident rates are twice as high as for regular cars, according to a study by the University of Michigan’s Transportation Research Institute in Ann Arbor, Michigan. Driverless vehicles have never been at fault, the study found: They’re usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.

“It’s a dilemma that needs to be addressed,” Rajkumar said.

It’s similar to the thorny ethical issues driverless car creators are wrestling with over how to program them to make life-or-death decisions in an accident. For example, should an autonomous vehicle sacrifice its occupant by swerving off a cliff to avoid killing a school bus full of children?

California is urging caution in the deployment of driverless cars. It published proposed rules this week that would require a human always to be ready to take the wheel and also compel companies creating the cars to file monthly reports on their behavior. Google -- which developed a model with no steering wheel or gas pedal -- said it is “gravely disappointed” in the proposed rules, which could set the standard for autonomous-car regulations nationwide.

Fast Track

Google is on a fast track. It plans to make its self-driving-cars unit a stand-alone business next year and eventually offer a ride-for-hire service, according to a person briefed on the company’s strategy.

Google cars have been in 17 minor crashes in 2 million miles (3.2 million kilometers) of testing and account for most of the reported accidents, according to the Michigan study. That’s partly because the company is testing mainly in California, where accidents involving driverless cars must be reported.
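The figures above reduce to simple arithmetic. As a quick check (using only the numbers reported in the article, not the Michigan study's own normalized rates):

```python
# Illustrative arithmetic only, from the figures quoted above:
# 17 minor crashes over 2 million miles of testing.
crashes = 17
miles_driven = 2_000_000

# Crash rate expressed per million miles driven.
rate_per_million_miles = crashes / (miles_driven / 1_000_000)
print(rate_per_million_miles)  # 8.5 crashes per million miles
```

Any comparison with human drivers is muddied by reporting bias, since minor human fender-benders often go unreported while California requires driverless-car incidents to be filed.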

The most recent reported incident was Nov. 2 in Mountain View, California, Google’s headquarters, when a self-driving Google Lexus SUV attempted to turn right on a red light. It came to a full stop, activated its turn signal and began creeping slowly into the intersection to get a better look, according to a report the company posted online. Another car stopped behind it and also began rolling forward, rear-ending the SUV at 4 mph. There were no injuries and only minor damage to both vehicles.

Robot-Car Stop

Ten days later, a Mountain View motorcycle cop noticed traffic stacking up behind a Google car going 24 miles an hour in a busy 35 mph zone. He zoomed over and became the first officer to stop a robot car. He didn’t issue a ticket -- who would he give it to? -- but he warned the two engineers on board about creating a hazard.

“The right thing would have been for this car to pull over, let the traffic go and then pull back on the roadway,” said Sergeant Saul Jaeger, head of the police department’s traffic-enforcement unit. “I like it when people err on the side of caution. But can something be too cautious? Yeah.”

While Google rejects the notion that its careful cars cause crashes, “we err on the conservative side,” said Dmitri Dolgov, principal engineer of the program. “They’re a little bit like a cautious student driver or a grandma.”

More Aggressive

Google is working to make the vehicles more “aggressive” like humans -- law-abiding, safe humans -- so they “can naturally fit into the traffic flow, and other people understand what we’re doing and why we’re doing it,” Dolgov said. “Driving is a social game.”

Google has already programmed its cars to behave in more familiar ways, such as inching forward at a four-way stop to signal they’re going next. But autonomous models still surprise human drivers with their quick reflexes, coming to an abrupt halt, for example, when they sense a pedestrian near the edge of a sidewalk who might step into traffic.

“These vehicles are either stopping in a situation or slowing down when a human driver might not,” said Brandon Schoettle, co-author of the Michigan study. “They’re a little faster to react, taking drivers behind them off guard.”

That could account for the prevalence of slow-speed, rear-end crashes, he added.

Behave Differently

“They do behave differently,” said Egil Juliussen, senior director at consultant IHS Technology and author of a study on how Google leads development of autonomous technology. “It’s a problem that I’m sure Google is working on, but how to solve it is not clear.”

One approach is to teach the vehicles when it’s OK to break the rules, such as crossing a double yellow line to avoid a bicyclist or road workers.

“It’s a sticky area,” Schoettle said. “If you program them to not follow the law, how much do you let them break the law?”
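Schoettle's question is really about where to draw a measurable boundary around a legal exception. A purely hypothetical sketch, in the spirit of the double-yellow-line example above (none of these names or thresholds come from Google's or CMU's actual software; they are placeholders for one way such a bound might be expressed):

```python
# Hypothetical sketch: permit a rule violation (crossing a double yellow
# line) only for a specific reason and under a measured safety margin.
def may_cross_double_yellow(lane_blocked: bool,
                            oncoming_gap_s: float,
                            min_gap_s: float = 8.0) -> bool:
    """Allow crossing a double yellow line only to pass a blockage
    (e.g. a cyclist or road crew), and only when the oncoming lane
    is clear for at least min_gap_s seconds."""
    return lane_blocked and oncoming_gap_s >= min_gap_s

print(may_cross_double_yellow(True, 12.0))   # True: lane blocked, ample gap
print(may_cross_double_yellow(False, 12.0))  # False: no reason to cross
print(may_cross_double_yellow(True, 3.0))    # False: oncoming gap too short
```

The hard part is not the code but the policy: the thresholds encode exactly "how much do you let them break the law," which is the open question Schoettle describes.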

Initially, crashes may rise as more robot autos share the road, but injuries should diminish because most accidents will be minor, Schoettle said.

“There’s a learning curve for everybody,” said Jaeger, of the Mountain View Police, which interacts more with driverless cars than any other law-enforcement unit. “Computers are learning, the programmers are learning and the people are learning to get used to these things.”
