Rocksolid Light


Re: Every Tesla Accident Resulting in Death

https://www.novabbs.com/tech/article-flat.php?id=93593&group=sci.electronics.design#93593

Newsgroups: sci.electronics.design
Date: Fri, 1 Apr 2022 09:53:05 -0700 (PDT)
In-Reply-To: <t2728v$skf$1@dont-email.me>
Injection-Info: google-groups.googlegroups.com; posting-host=24.138.223.107; posting-account=I-_H_woAAAA9zzro6crtEpUAyIvzd19b
NNTP-Posting-Host: 24.138.223.107
References: <t1upig$tmg$2@dont-email.me> <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me> <t1vm1c$9lm$1@dont-email.me>
<t202na$bvu$1@dont-email.me> <t20t89$78k$1@dont-email.me> <26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me> <b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me> <ccf76ab3-8f2b-4704-83ee-65d618b20f7bn@googlegroups.com>
<t2728v$skf$1@dont-email.me>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <bf8ec5f6-ae3f-427d-ad30-995402d3de08n@googlegroups.com>
Subject: Re: Every Tesla Accident Resulting in Death
From: gnuarm.d...@gmail.com (Ricky)
Injection-Date: Fri, 01 Apr 2022 16:53:05 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
Lines: 213

On Friday, April 1, 2022 at 10:29:58 AM UTC-4, David Brown wrote:
> On 01/04/2022 14:38, Ricky wrote:
> > On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
> >> On 01/04/2022 00:29, Ricky wrote:
> >>> On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
> >>>> On 31/03/2022 22:44, Ricky wrote:
> >>>>> On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
> >>>>>> On 30/03/2022 00:54, Tom Gardner wrote:
> >>>>>>> On 29/03/22 20:18, David Brown wrote:
> >>>> <snip>
> >>>>>> No, it is not "too strong". It is basic statistics. Bayes' theorem,
> >>>>>> and all that. If a large proportion of people use autopilot, but
> >>>>>> only a small fraction of the deaths had the autopilot on, then
> >>>>>> clearly the autopilot reduces risks and saves lives (of those that
> >>>>>> drive Teslas - we still know nothing of other car drivers).
> >>>>>
> >>>>> A simple comparison of numbers is not sufficient. Most Tesla
> >>>>> autopilot usage is on highways which are much safer per mile driven
> >>>>> than other roads. That's an inherent bias because while
> >>>>> non-autopilot driving must include all situations, autopilot simply
> >>>>> doesn't work in most environments.
> >>>>>
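
To put rough numbers on the road-mix point above, here is a toy calculation. Every figure in it is invented purely for illustration; none come from real Tesla data. It shows that autopilot can account for a much smaller share of deaths than of miles even when it is assumed to provide zero safety benefit, simply because it is only engaged on the safer roads:

```python
# Toy illustration of the exposure-bias argument: invented numbers only.

miles_highway = 40_000          # fleet miles driven on highways
miles_city    = 60_000          # fleet miles on other roads

deaths_per_mile_highway = 1e-8  # highways assumed ~5x safer per mile
deaths_per_mile_city    = 5e-8

autopilot_share_highway = 0.8   # autopilot assumed engaged mostly on highways
autopilot_share_city    = 0.0   # and (per the post) unusable off-highway

# Key assumption: autopilot neither helps nor hurts -- identical per-mile
# risk whether it is engaged or not.
deaths_ap = miles_highway * autopilot_share_highway * deaths_per_mile_highway
deaths_manual = (miles_highway * (1 - autopilot_share_highway)
                 * deaths_per_mile_highway
                 + miles_city * deaths_per_mile_city)

miles_ap    = miles_highway * autopilot_share_highway
miles_total = miles_highway + miles_city

share_of_miles  = miles_ap / miles_total
share_of_deaths = deaths_ap / (deaths_ap + deaths_manual)

print(f"autopilot share of miles:  {share_of_miles:.1%}")
print(f"autopilot share of deaths: {share_of_deaths:.1%}")
# The death share comes out well below the mileage share even though
# autopilot was given zero safety benefit: the gap is pure road-mix bias.
```

So a raw comparison of "fraction of deaths with autopilot on" against "fraction of driving done on autopilot" cannot by itself show the autopilot saves lives; the comparison has to be conditioned on road type to mean anything.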
> >>>> Yes. An apples-to-apples comparison is the aim, or at least as close as
> >>>> one can get.
> >>>>
> >>>> I suspect - without statistical justification -
> >>>
> >>> Yes, without justification, at all.
> >> Which do /you/ think is most likely? Autopilot crashes on the motorway,
> >> or autopilot crashes on smaller roads?
> >
> > Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes more often occur on highways.
> >
> I was not aware of that limitation. Thanks for providing some relevant
> information.
> > I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving, the limitations are severe enough that no one would try to use it.
> >
> >
> >>>> that the accidents
> >>>> involving autopilot use are precisely cases where you don't have a good,
> >>>> clear highway, and autopilot was used in a situation where it was not
> >>>> suitable. Getting good statistics and comparisons here could be helpful
> >>>> in making it safer - perhaps adding a feature that has the autopilot say
> >>>> "This is not a good road for me - you have to drive yourself" and switch
> >>>> itself off. (It would be more controversial, but probably statistically
> >>>> safer, if it also sometimes said "I'm better at driving on this kind of
> >>>> road than you are" and switching itself on!)
> >>>>>>>
> >>>>>>> An issue is, of course, that any single experience can be
> >>>>>>> dismissed as an unrepresentative aberration. Collation of
> >>>>>>> experiences is necessary.
> >>>>>>>
> >>>>>>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
> >>>>>>> aren't confidence inspiring. Based on one I saw, I certainly
> >>>>>>> wouldn't dare let a Tesla drive itself in an urban environment,
> >>>>>>>
> >>>>>>> I suspect there isn't sufficient experience to assess relative
> >>>>>>> dangers between "artificial intelligence" and "natural
> >>>>>>> stupidity".
> >>>>>> I don't doubt at all that the Tesla autopilot makes mistakes.
> >>>>>
> >>>>> Which depends on how you define "mistakes".
> >>>> Of course.
> >>>>> It's a bit like asking
> >>>>> if your rear view mirror makes mistakes by not showing cars in the
> >>>>> blind spot. The autopilot is not designed to drive the car. It is a
> >>>>> tool to assist the driver. The driver is required to be responsible
> >>>>> for the safe operation of the car at all times. I can point out to
> >>>>> you the many, many times the car acts like a spaz and requires me to
> >>>>> manage the situation. Early on, there was a left-turn lane on a 50
> >>>>> mph road that the car would want to turn into when I intended to drive
> >>>>> straight. Fortunately they have ironed out that level of issue. But
> >>>>> it was always my responsibility to prevent it from causing an
> >>>>> accident. So how would you say anything was the fault of the
> >>>>> autopilot?
> >>>>>
> >>>> There are a few possibilities here (though I am not trying to claim that
> >>>> any of them are "right" in some objective sense). You might say they
> >>>> had believed that the "autopilot" was like a plane autopilot -
> >>>
> >>> It is exactly like an airplane autopilot.
> >>>
> >>>
> >>>> you can
> >>>> turn it on and leave it to safely drive itself for most of the journey
> >>>> except perhaps the very beginning and very end of the trip. As you say,
> >>>> the Tesla autopilot is /not/ designed for that - that might be a mistake
> >>>> from the salesmen, advertisers, user-interface designers, or just the
> >>>> driver's mistake.
> >>>
> >>> Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.
> >>>
> >> Yes, that's the original idea of a plane autopilot. But modern ones are
> >> more sophisticated and handle course changes along the planned route, as
> >> well as being able to land automatically. And more important than what
> >> plane autopilots actually /do/, is what people /think/ they do - and
> >> remember we are talking about drivers that think their Tesla "autopilot"
> >> will drive their car while they watch a movie or nap in the back seat.
> >
> > Great! But the autopilot is not watching for other aircraft, not monitoring communications, and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like in the car, there is a pilot whose job is to fly/drive and assure safety.
> >
> I am fully aware that plane autopilots are limited. I am also aware
> that they are good enough (in planes equipped with modern systems) to
> allow pilots to let the system handle most of the flight itself, even
> including landing. The pilot is, of course, expected to be paying
> attention, watching for other aircraft, communicating with air traffic
> controllers and all the rest of it. But there have been cases of pilots
> falling asleep, or missing their destination because they were playing
> around on their laptops. What people /should/ be doing, and what they
> are /actually/ doing, is not always the same.

Exactly like the Tesla autopilot. The pilot is still in charge and responsible.

> > As to the movie idea, no, people don't think that. People might "pretend" that, but there's no level of "thinking" that says you can climb in the back seat while driving. Please don't say silly things.
> >
> You can google for "backseat Tesla drivers" as well as I can. I am
> confident that some of these are staged, and equally confident that some
> are not. There is no minimum level of "thinking" - no matter how daft
> something might be, there is always a dafter person who will think it's
> a good idea.

The fact that someone pulled a stunt doesn't mean they thought that was an ok thing to do. You know that. So why are we discussing this?

> >>>> And sometimes the autopilot does something daft - it is no longer
> >>>> assisting the driver, but working against him or her. That, I think,
> >>>> should be counted as a mistake by the autopilot.
> >>>
> >>> The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.
> >>>
> >> Well, "does something daft" is no worse than "acts like a spaz", and
> >> it's a good deal more politically correct!
> >
> > Bzzzz. Sorry, you failed.
> >
> Really? You think describing the autopilot's actions as "acts like a
> spaz" is useful and specific, while "does something daft" is not? As
> for the political correctness - find a real spastic and ask them what
> they think of your phrase.

How do you know what is meant by "spaz"? That's my point. Words like that are not well defined. I intended the word to be colorful, with no particular meaning. Your use of daft was in a statement that needed much more detail to be meaningful. Besides, if I jump off a cliff, are you going to jump as well?

--

Rick C.

++- Get 1,000 miles of free Supercharging
++- Tesla referral code - https://ts.la/richard11209

Thread: Every Tesla Accident Resulting in Death, started by Tom Gardner on Tue, 29 Mar 2022 (55 replies)