Google Maps Driving Speed
#1
Does anyone here know of a way to adjust the driving speed Google Maps assumes? GM seems to use an average driving speed... what I mean is, if the average driving speed on the 405 freeway is 85 mph, GM assumes I'm going to be driving 85 mph. I've also heard, but have not verified, that GM takes your own average driving speeds and factors those into your drive time. This makes it very frustrating for me: I used to drive my Ford Mustang GT at 90+ mph, but I've since switched to a truck and drive just a titch over the speed limit, and GM is quoting me trips an average of 25-30 minutes shorter than they actually take! Does anybody know if there is a way to adjust the speed variance up or down, or cap the assumed driving speed at the legal street/freeway speed limit? I'm fed up with being underquoted and arriving late to engagements because of it, but I don't mind at all being overquoted and early.
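For what it's worth, since GM doesn't expose any such setting, the crude workaround would be to rescale its quoted ETA by the ratio of the speed it assumes to the speed you actually drive. A minimal sketch, with every number made up for illustration (GM doesn't publish its assumed speed):

Code:
def corrected_eta(quoted_minutes, assumed_mph, actual_mph):
    # For a fixed distance, travel time scales inversely with speed.
    return quoted_minutes * (assumed_mph / actual_mph)

# Example: GM quotes 40 minutes assuming 85 mph, but I average 65 mph.
print(round(corrected_eta(40, assumed_mph=85, actual_mph=65), 1))  # 52.3 minutes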
"The true value of a human being is determined primarily by the measure and the sense in which he has attained liberation from the self." -Albert Einsetin
Reply
#2
http://www.businessinsider.com/google-ma...es-2013-12
"There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." - Hamlet (1.5.167-8), Hamlet to Horatio.


Reply
#3
On a somewhat related note, I love the discussions on the morality of driving.

As Google, Uber, Tesla, and many other companies continue to develop self-driving car technology, they're working past the technical problems of driving and moving on to the moral and ethical issues. Google said long ago that they really have no choice but to tell their automated cars to follow the speed limit. That's pretty straightforward: there's a whole host of legal problems with programming your cars to move at a "safe speed" rather than following a legally posted speed limit, so they had to go with posted limits, even when those don't make much sense.

But you get into a whole next-level issue when you consider morality. Since computers can make complicated decisions instantaneously, and act upon those decisions instantaneously, there is no more human "fudge factor" to be considered when a moral decision has to be made.

Case in point: your self-driving vehicle is moving down a roadway at the posted speed limit of 45 miles per hour. It is approaching an intersection with a concrete barrier on the left side of the road. A child is walking across the crosswalk in front of the vehicle. The car goes to apply the brakes to stop at the intersection and finds that its owner has not properly maintained those brakes, or they were sabotaged, or whatever... the point is, the brakes fail. At the car's current course and speed, it will strike the child crossing the intersection if no course correction takes place.

Traveling inside the car is a family of three - husband, wife, and child. The car must now make a decision.

1) Continue on its current course, striking and likely killing the child crossing the street. Affects one person.
2) Deviate course to the left, causing the car to strike the barrier head-on at 45 mph, possibly killing or severely injuring the family inside (but saving the child crossing the street). Affects three people.

Again, human "fudge factor" isn't an excuse here. You can't later argue in court that the driver "froze up" being unable to make a decision. You have to program the car to decide a course of action here. To make this more complex, what if the child crossing the street isn't a child? What if it's a family? What if it's a government leader? What if the passengers in the car are "important people?"

What if you program the car to always choose to drive into the barrier, thinking that there's only a possibility of death for the passengers while there's a guarantee of death for the pedestrian? What if people later learn/know of this, and take advantage of it to actually murder someone by jumping out in front of their self-driving car?
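To make the discomfort concrete: someone has to write the rule down in advance. Here is a minimal sketch of what a purely utilitarian policy could look like; every name and number in it is invented for illustration, not anyone's actual system:

Code:
from dataclasses import dataclass

@dataclass
class Option:
    name: str
    people_affected: int
    death_probability: float  # invented per-person estimate

    def expected_deaths(self) -> float:
        return self.people_affected * self.death_probability

def choose(options):
    # Pure utilitarian rule: minimize expected deaths. The point is that
    # *some* explicit rule like this must be picked before the crash.
    return min(options, key=lambda o: o.expected_deaths())

options = [
    Option("continue straight", people_affected=1, death_probability=0.95),
    Option("swerve into barrier", people_affected=3, death_probability=0.25),
]
print(choose(options).name)  # "swerve into barrier" (0.75 vs 0.95 expected deaths)

And as soon as those weights are explicit, they can be gamed - which is exactly the murder-by-crosswalk worry above.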

I can't answer these questions definitively. But I love thinking about them. Smile
Quote:Considering the mods here are generally liberals who seem to have a soft spot for fascism and white supremacy (despite them saying otherwise), me being perma-banned at some point is probably not out of the question.
Reply
#4
(06-19-2017, 07:10 PM)Bolty Wrote: I can't answer these questions definitively. But I love thinking about them. Smile

There is literally a very popular Facebook page built entirely on trolley problem memes. I find ethics fun to contemplate, and hopefully relevant, but kids these days have taken it to a whole new level.

-Jester
Reply
#5
(06-20-2017, 12:15 AM)Jester Wrote:
(06-19-2017, 07:10 PM)Bolty Wrote: I can't answer these questions definitively. But I love thinking about them. Smile

There is literally a very popular Facebook page built entirely on trolley problem memes. I find ethics fun to contemplate, and hopefully relevant, but kids these days have taken it to a whole new level.

-Jester

Interesting discussion. Re: the aforementioned 'trolley problem', something else to consider is that if you touched the lever to divert the train, you could be liable for manslaughter at best, murder at worst; but if you did nothing, a court would be hard pressed to find willful neglect when you could argue you were simply too shocked to commit to any action in the moment. So the logical conclusion would be not to touch the lever at all.
"The true value of a human being is determined primarily by the measure and the sense in which he has attained liberation from the self." -Albert Einsetin
Reply
#6
(06-20-2017, 01:38 AM)Taem Wrote: Interesting discussion. Re: the aforementioned 'trolley problem', something else to consider is that if you touched the lever to divert the train, you could be liable for manslaughter at best, murder at worst; but if you did nothing, a court would be hard pressed to find willful neglect when you could argue you were simply too shocked to commit to any action in the moment. So the logical conclusion would be not to touch the lever at all.

If you believe ethics reduces to the question "how do I stay out of jail," then yeah, that works!

-Jester
Reply
#7
Lol, of course not. But I neglected to point out that, from my perspective supporting four children, there would be more at stake than simply avoiding prison: avoiding prison is how I keep supporting my family. I do believe that with morals and ethics, humans tend to prioritize their own ilk taking center stage, themselves next, innocent children and close relatives after that, and total strangers last. That's just the way the mind works, and it makes sense from an evolutionary standpoint. As I've pointed out many times before, there is no altruism, only actions we take that help support our own survival and lineage.
"The true value of a human being is determined primarily by the measure and the sense in which he has attained liberation from the self." -Albert Einsetin
Reply
#8
Re: Bolty

(quotes don't work in mobile, sorry)

I found your moral quandary to be quite the talking point on my car ride yesterday. It seems discussions like these are all the rage at my son's college, but the conversation quickly turned toward intent. Of course, the creators of such software would eventually include a clause that, in situations like the one you proposed, forces the car to avoid *them* in particular. This is inevitable (like the original RoboCop being unable to kill the acting CEO). Once members of Congress or other very influential people got wind of it, you can bet that particular programming would creep into all cars, perhaps becoming such common practice as to be entered into law for the "protection" and longevity of our country. Maybe not now, but in 100 years, I can practically guarantee it! Anyway, interesting conversation starter, Bolty; much thanks to you!
"The true value of a human being is determined primarily by the measure and the sense in which he has attained liberation from the self." -Albert Einsetin
Reply
#9
(06-19-2017, 07:10 PM)Bolty Wrote: I can't answer these questions definitively. But I love thinking about them. Smile

This is an interesting discussion indeed.
It's important to realize that the number of car accidents will decrease by at least 90% when everybody uses self-driving cars.

We have the same ethical problems now, but we choose not to answer them.
Around 90% of accidents are caused by 'mistakes' of the driver. These can be as simple as being distracted, but they also include texting while driving, or even consciously speeding or driving while intoxicated.
Punishments in all such cases are usually very light... often because it is very difficult to prove what was done wrong, and even with proof no one will press for a murder charge... usually it is something like gross negligence (is that the correct English term?).

Self-driving cars will always use their indicators when making a turn, they will always have their lights on, they will never speed, etc.

So to get back to your point: it is indeed a valid discussion, but in the current situation we usually choose not to have the discussion at all.
Reply
#10
(06-20-2017, 07:04 PM)eppie Wrote:
(06-19-2017, 07:10 PM)Bolty Wrote: I can't answer these questions definitively. But I love thinking about them. Smile

This is an interesting discussion indeed.
It's important to realize that the number of car accidents will decrease by at least 90% when everybody uses self-driving cars.

We have the same ethical problems now, but we choose not to answer them.
Around 90% of accidents are caused by 'mistakes' of the driver. These can be as simple as being distracted, but they also include texting while driving, or even consciously speeding or driving while intoxicated.
Punishments in all such cases are usually very light... often because it is very difficult to prove what was done wrong, and even with proof no one will press for a murder charge... usually it is something like gross negligence (is that the correct English term?).

Self-driving cars will always use their indicators when making a turn, they will always have their lights on, they will never speed, etc.

So to get back to your point: it is indeed a valid discussion, but in the current situation we usually choose not to have the discussion at all.
I think self-driving will initially be disruptive and, as in this example, more dangerous, until the sensors and AI outperform human reaction and crash prevention.
"There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." - Hamlet (1.5.167-8), Hamlet to Horatio.


Reply
#11
(06-23-2017, 04:16 PM)kandrathe Wrote: I think self-driving will initially be disruptive and, as in this example, more dangerous, until the sensors and AI outperform human reaction and crash prevention.

Your example is valid but also a bit unfair. Tesla knew that their "Autopilot" was not ready to be a fully self-driving replacement for humans, and thus demanded that drivers pay attention to the road and have their hands on the wheel. As the article you linked states, the "driver" ignored 7 visual and 6 audible warnings that he was not using the car properly.

That said, the article also pointed out something I remember from when Tesla rolled out its "Autopilot" technology: the industry begged Tesla not to do it, because they knew human nature. Google does not want to release any self-driving technology until they know the cars can operate 100% without any human intervention whatsoever, for exactly the reason the Tesla driver died. Humans will pay less and less attention over time if they feel the Autopilot is "good enough," and there will be cases where it kills people. And media being media, they will trump it up and make huge headlines out of anyone dying to self-driving technology, which could sway public opinion away from it.

Never mind that 30,000+ people die every year from driving in the U.S. If one person dies in a self-driving car, it'll make national headlines every time. Humans suck at statistics.

Anyway, I guess the point I'm making is that the driver is at fault, but so is Tesla for ignoring human nature - arrogantly thinking they could put out a technology that's 80-90% complete and that humans would always be willing to make up for the last 10-20%.
Quote:Considering the mods here are generally liberals who seem to have a soft spot for fascism and white supremacy (despite them saying otherwise), me being perma-banned at some point is probably not out of the question.
Reply
#12
Bolty Wrote: Anyway, I guess the point I'm making is that the driver is at fault, but so is Tesla for ignoring human nature - arrogantly thinking they could put out a technology that's 80-90% complete and that humans would always be willing to make up for the last 10-20%.
Sure. It's true. But does it matter? Like VHS versus Betamax, it comes down to perception. You and I tend to be logical and rational, but some, like that driver, will be too trusting of the technology. Then their sensational mistakes will become fodder for the inevitable neo-Luddites to build their campaign against changing the status quo. There is typically a small group of zealous early adopters, a larger population who need convincing, and a closed-minded minority who will never change.

The big question in self-driving technology is what path the adoption narrative takes. It could take years or decades, depending on how safety is perceived, regardless of the truth.
"There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." - Hamlet (1.5.167-8), Hamlet to Horatio.


Reply
#13
(06-24-2017, 05:13 AM)kandrathe Wrote: The big question in self-driving technology is what path the adoption narrative takes. It could take years or decades, depending on how safety is perceived, regardless of the truth.

The first and main error was with the truck driver... who didn't have a self-driving truck.
Reply
#14
(06-24-2017, 05:44 AM)eppie Wrote:
(06-24-2017, 05:13 AM)kandrathe Wrote: The big question in self-driving technology is what path the adoption narrative takes. It could take years or decades, depending on how safety is perceived, regardless of the truth.

The first and main error was with the truck driver... who didn't have a self-driving truck.
Do we outlaw driving?
"There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." - Hamlet (1.5.167-8), Hamlet to Horatio.


Reply
#15
(06-24-2017, 06:08 AM)kandrathe Wrote: Do we outlaw driving?
No need. Over time, insurance will become so expensive that only the luxury-rich will be able to afford to drive manually. Then there might be such a societal backlash against anyone who drives manually (and thus causes accidents) that manual driving could disappear altogether outside of specialized tracks (think NASCAR).
Quote:Considering the mods here are generally liberals who seem to have a soft spot for fascism and white supremacy (despite them saying otherwise), me being perma-banned at some point is probably not out of the question.
Reply
#16
(06-24-2017, 05:45 PM)Bolty Wrote:
(06-24-2017, 06:08 AM)kandrathe Wrote: Do we outlaw driving?
No need. Over time, insurance will become so expensive that only the luxury-rich will be able to afford to drive manually. Then there might be such a societal backlash against anyone who drives manually (and thus causes accidents) that manual driving could disappear altogether outside of specialized tracks (think NASCAR).
Insurance may not be more expensive, since it's based on your risk of causing damage. If self-driving is ubiquitous, then the odd manual driver who is used to the orderly roadway shouldn't be an issue. But I would say the means for "signaling" intentions would need to be integrated. In theory, self-driving vehicles will announce and coordinate their interactions; a manual driver would need to announce turns and lane changes the same way, while also communicating velocity changes to surrounding vehicles. So I think that as self-driving is adopted, it will first need to deal with the majority of "mute" and insensitive vehicles currently on the roadway.
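To sketch what that integration might look like, here is a minimal, invented example of an intent broadcast; the field names are made up, and real V2V standards (such as SAE J2735's Basic Safety Message) define their own formats:

Code:
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class IntentMessage:
    vehicle_id: str
    maneuver: str            # e.g. "turn_left", "lane_change_right", "brake"
    speed_mps: float         # current speed
    target_speed_mps: float  # the announced velocity change

def announce(vehicle_id, maneuver, speed_mps, target_speed_mps):
    # Serialize an intent broadcast for surrounding vehicles.
    msg = IntentMessage(vehicle_id, maneuver, speed_mps, target_speed_mps)
    return json.dumps({**asdict(msg), "timestamp": time.time()})

# A retrofit unit on a manually driven vehicle could emit the same message
# whenever the turn signal or brake pedal is used:
print(announce("truck-42", "turn_left", speed_mps=20.0, target_speed_mps=5.0))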

My prediction, then, is that the first hurdle for generation-one self-driving will be sensors and prediction algorithms, without much reliance on inter-vehicle communication. Once a critical mass of "smart" cars is on the roadways, the rules can change to enable and eventually prefer them. But the technology would inherit the sensor capabilities of gen one. I've never seen technology removed, and in this case, with the NTSB probably calling the shots on requirements, it would be like trying to make seat belts or airbags obsolete.

All in all, as I think it through, I see manual driving remaining as an anachronistic legacy of vehicle technology, one that becomes even safer as randomness is removed from our transportation network.
"There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." - Hamlet (1.5.167-8), Hamlet to Horatio.


Reply
#17
(06-24-2017, 06:08 AM)kandrathe Wrote:
(06-24-2017, 05:44 AM)eppie Wrote:
(06-24-2017, 05:13 AM)kandrathe Wrote: The big question in self-driving technology is what path the adoption narrative takes. It could take years or decades, depending on how safety is perceived, regardless of the truth.

The first and main error was with the truck driver... who didn't have a self-driving truck.
Do we outlaw driving?
I don't get this reaction, kandrathe.
We were talking about liability in case of accidents with self-driving cars, and subsequently about the accident involving a truck and a Tesla and what was wrong with the Tesla's self-driving system. Someone remarked that the cause of the incident was the truck driver's mistake, and I made my remark because it was 'funny' to discuss the issues with self-driving cars using an incident that was caused by a human driver.

Nowhere did I say we need to outlaw driving.
Reply
#18
(06-25-2017, 08:26 AM)eppie Wrote: I don't get this reaction, kandrathe.
We were talking about liability in case of accidents with self-driving cars, and subsequently about the accident involving a truck and a Tesla and what was wrong with the Tesla's self-driving system. Someone remarked that the cause of the incident was the truck driver's mistake, and I made my remark because it was 'funny' to discuss the issues with self-driving cars using an incident that was caused by a human driver.

Nowhere did I say we need to outlaw driving.
The truck driver did that thing truckers do when they make wide turns: he crossed from the right lane to make a left turn, assuming the driver behind him would see his asshat move, brake, and be inconvenienced. Or maybe he just drove across the two-lane road...

Quote:The NTSB found it was visible for at least seven seconds before the crash, but he never tried to brake or swerve. For comparison, seven seconds is long enough to say "He should have seen the truck" four times--and certainly long enough to have braked and possibly avoided the collision.

The second problem was that Tesla's sensors failed to see the truck against the sky. But what killed the man was his carelessness. The way you said it implied, to me, that the problem was that the truck was being driven by a human.

P.S. Upon further reading of the NTSB case file, I'm unsure whether the truck driver was turning or just crossing the road. One thing I did learn, though, was that the driver had set the cruise speed at 74 mph (in a 65 mph zone) and did not swerve or brake before impacting the trailer (which killed him almost instantly); the car then careened off the road through a drainage ditch and two fences before finding and snapping a telephone pole. Had the car not passed under the middle of the semi-trailer, ripping its top off and inflicting massive head trauma, he might have survived the ditch, fences, and telephone pole.
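For scale, a quick back-of-the-envelope on that seven-second window; the 0.7 g braking deceleration is my assumption for dry pavement, not a figure from the NTSB report:

Code:
MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed = 74 * MPH_TO_MPS       # ~33.1 m/s
visible_distance = speed * 7  # distance covered while the trailer was visible

decel = 0.7 * G               # assumed dry-pavement braking deceleration
stopping_distance = speed**2 / (2 * decel)  # v^2 / (2a)

print(f"distance covered in 7 s: {visible_distance:.0f} m")   # ~232 m
print(f"rough stopping distance: {stopping_distance:.0f} m")  # ~80 m
# Roughly 232 m of warning against ~80 m needed to stop: braking alone
# could have avoided the collision, which is the NTSB's point.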
"There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy." - Hamlet (1.5.167-8), Hamlet to Horatio.


Reply

