tech / sci.electronics.design / Every Tesla Accident Resulting in Death

Subject -- Author
* Every Tesla Accident Resulting in Death -- Tom Gardner
+* Re: Every Tesla Accident Resulting in Death -- Rickster
|`* Re: Every Tesla Accident Resulting in Death -- David Brown
| +* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |+- Re: Every Tesla Accident Resulting in Death -- Rickster
| |`* Re: Every Tesla Accident Resulting in Death -- David Brown
| | `* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |  +- Re: Every Tesla Accident Resulting in Death -- Don Y
| |  +- Re: Every Tesla Accident Resulting in Death -- Rickster
| |  `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |   `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |    `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |     `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |      `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |       +* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       |+* Re: Every Tesla Accident Resulting in Death -- David Brown
| |       ||+* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       |||`- Re: Every Tesla Accident Resulting in Death -- Ricky
| |       ||+* Re: Every Tesla Accident Resulting in Death -- Jeroen Belleman
| |       |||`* Re: Every Tesla Accident Resulting in Death -- Don Y
| |       ||| `* Re: Every Tesla Accident Resulting in Death -- Jeroen Belleman
| |       |||  +- Re: Every Tesla Accident Resulting in Death -- Don Y
| |       |||  `- Re: Every Tesla Accident Resulting in Death -- Ricky
| |       ||`* Re: Every Tesla Accident Resulting in Death -- Ricky
| |       || `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |       ||  `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |       ||   +* Re: Every Tesla Accident Resulting in Death -- David Brown
| |       ||   |`- Re: Every Tesla Accident Resulting in Death -- Ricky
| |       ||   +* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       ||   |`* Re: Every Tesla Accident Resulting in Death -- Ricky
| |       ||   | `* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       ||   |  `- Re: Every Tesla Accident Resulting in Death -- Ricky
| |       ||   `- Re: Every Tesla Accident Resulting in Death -- Jeroen Belleman
| |       |+* Re: Every Tesla Accident Resulting in Death -- Don Y
| |       ||`* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       || `* Re: Every Tesla Accident Resulting in Death -- Don Y
| |       ||  `* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       ||   `* Re: Every Tesla Accident Resulting in Death -- Don Y
| |       ||    `* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
| |       ||     `- Re: Every Tesla Accident Resulting in Death -- Don Y
| |       |`* Re: Every Tesla Accident Resulting in Death -- Ricky
| |       | `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |       |  `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |       |   `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |       |    +- Re: Every Tesla Accident Resulting in Death -- Ricky
| |       |    `* Re: Every Tesla Accident Resulting in Death -- Cursitor Doom
| |       |     `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |       |      `- Re: Every Tesla Accident Resulting in Death -- Flyguy
| |       `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |        `* Re: Every Tesla Accident Resulting in Death -- David Brown
| |         `* Re: Every Tesla Accident Resulting in Death -- Ricky
| |          `- Re: Every Tesla Accident Resulting in Death -- David Brown
| `* Re: Every Tesla Accident Resulting in Death -- Cursitor Doom
|  `- Re: Every Tesla Accident Resulting in Death -- Ricky
`* Re: Every Tesla Accident Resulting in Death -- Tom Gardner
 `- Re: Every Tesla Accident Resulting in Death -- Cursitor Doom

Every Tesla Accident Resulting in Death

From: spamj...@blueyonder.co.uk (Tom Gardner)
Newsgroups: sci.electronics.design
Subject: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 12:12:16 +0100
Message-ID: <t1upig$tmg$2@dont-email.me>
 by: Tom Gardner - Tue, 29 Mar 2022 11:12 UTC

From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1

The website referred to appears to be collating information in
a reasonable and unemotional way.

Every Tesla Accident Resulting in Death (Tesla Deaths)
Gabe Goldberg <gabe@gabegold.com>
Thu, 24 Mar 2022 01:53:39 -0400

We provide an updated record of Tesla fatalities and Tesla accident deaths
that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.). This sheet also tallies
claimed and confirmed Tesla autopilot crashes, i.e. instances when
Autopilot was activated during a Tesla crash that resulted in death. Read
our other sheets for additional data and analysis on vehicle miles traveled,
links and analysis comparing Musk's safety claims, and more.

Tesla Deaths Total as of 3/23/2022: 246
Tesla Autopilot Deaths Count: 12

https://www.tesladeaths.com/

Re: Every Tesla Accident Resulting in Death

From: gnuarm.d...@gmail.com (Rickster)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 06:00:51 -0700 (PDT)
Message-ID: <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
References: <t1upig$tmg$2@dont-email.me>
 by: Rickster - Tue, 29 Mar 2022 13:00 UTC

On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
> From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
>
> The website referred to appears to be collating information in
> a reasonable and unemotional way.
>
>
> Every Tesla Accident Resulting in Death (Tesla Deaths)
> Gabe Goldberg <ga...@gabegold.com>
> Thu, 24 Mar 2022 01:53:39 -0400
>
> We provide an updated record of Tesla fatalities and Tesla accident deaths
> that have been reported and as much related crash data as possible
> (e.g. location of crash, names of deceased, etc.). This sheet also tallies
> claimed and confirmed Tesla autopilot crashes, i.e. instances when
> Autopilot was activated during a Tesla crash that resulted in death. Read
> our other sheets for additional data and analysis on vehicle miles traveled,
> links and analysis comparing Musk's safety claims, and more.
>
> Tesla Deaths Total as of 3/23/2022: 246
> Tesla Autopilot Deaths Count: 12
>
> https://www.tesladeaths.com/

Yeah, it's raw data. Did you have a point?

--

Rick C.

- Get 1,000 miles of free Supercharging
- Tesla referral code - https://ts.la/richard11209

Re: Every Tesla Accident Resulting in Death

From: david.br...@hesbynett.no (David Brown)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 16:16:56 +0200
Message-ID: <t1v4co$llm$1@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
 <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
 by: David Brown - Tue, 29 Mar 2022 14:16 UTC

On 29/03/2022 15:00, Rickster wrote:
> On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
>> From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
>>
>> The website referred to appears to be collating information in
>> a reasonable and unemotional way.
>>
>>
>> Every Tesla Accident Resulting in Death (Tesla Deaths)
>> Gabe Goldberg <ga...@gabegold.com>
>> Thu, 24 Mar 2022 01:53:39 -0400
>>
>> We provide an updated record of Tesla fatalities and Tesla accident deaths
>> that have been reported and as much related crash data as possible
>> (e.g. location of crash, names of deceased, etc.). This sheet also tallies
>> claimed and confirmed Tesla autopilot crashes, i.e. instances when
>> Autopilot was activated during a Tesla crash that resulted in death. Read
>> our other sheets for additional data and analysis on vehicle miles traveled,
>> links and analysis comparing Musk's safety claims, and more.
>>
>> Tesla Deaths Total as of 3/23/2022: 246
>> Tesla Autopilot Deaths Count: 12
>>
>> https://www.tesladeaths.com/
>
> Yeah, it's raw data. Did you have a point?
>

Without comparisons to other types of car, and correlations with other
factors, such raw data is useless. You'd need to compare against other
high-end electric cars, and against petrol cars in similar price ranges
and styles. You'd want to look at statistics for "typical Tesla drivers"
(who are significantly richer than the average driver, though I don't
know what other characteristics might be relevant - age, gender, driving
experience, etc.). You'd have to compare statistics for the countries
and parts of countries where Teslas are common.
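The normalization David is asking for can be made concrete: raw death counts only become comparable once they are divided by exposure. A minimal sketch - every number below is an invented placeholder, not a real statistic:

```python
# Hedged sketch: fatality counts are only comparable per unit of
# exposure (vehicle-miles travelled). All figures here are invented
# placeholders for illustration, not real statistics.

def deaths_per_billion_miles(deaths, vehicle_miles):
    """Normalize a raw death count by exposure (vehicle-miles)."""
    return deaths / (vehicle_miles / 1e9)

# Two hypothetical fleets with made-up exposure figures:
fleet_a = {"deaths": 246, "vehicle_miles": 60e9}
fleet_b = {"deaths": 5000, "vehicle_miles": 1500e9}

rate_a = deaths_per_billion_miles(**fleet_a)
rate_b = deaths_per_billion_miles(**fleet_b)

# A fleet with far fewer total deaths can still have the higher rate.
print(f"fleet A: {rate_a:.1f} deaths per billion miles")
print(f"fleet B: {rate_b:.1f} deaths per billion miles")
```

The point of the sketch is just that the raw list on the website supplies the numerator and none of the denominators.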

And you would /definitely/ want to anonymise the data. If I had a
family member who was killed in a car crash, I would not be happy about
their name and details of their death being used for some sort of absurd
Tesla hate-site.

I'm no fan of Teslas myself. I like a car to be controlled like a car,
not a giant iPhone (and I don't like iPhones either). I don't like the
heavy tax breaks given by Norway to a luxury car, and I don't like the
environmental costs of making them (though I am glad to see improvements
on that front). I don't like some of the silly claims people make about
them - like Apple gadgets, they seem to bring out the fanboy in some of
their owners. But that's all just me and my personal preferences and
opinions - if someone else likes them, that's fine. Many Tesla owners
are very happy with their cars (and some are unhappy - just as for any
other car manufacturer). I can't see any reason for trying to paint
them as evil death-traps - you'd need very strong statistical basis for
that, not just a list of accidents.

Re: Every Tesla Accident Resulting in Death

From: spamj...@blueyonder.co.uk (Tom Gardner)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 16:17:42 +0100
Message-ID: <t1v7um$ica$2@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
 <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
 <t1v4co$llm$1@dont-email.me>
 by: Tom Gardner - Tue, 29 Mar 2022 15:17 UTC

On 29/03/22 15:16, David Brown wrote:
> On 29/03/2022 15:00, Rickster wrote:
>> On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
>>> From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
>>>
>>> The website referred to appears to be collating information in
>>> a reasonable and unemotional way.
>>>
>>>
>>> Every Tesla Accident Resulting in Death (Tesla Deaths)
>>> Gabe Goldberg <ga...@gabegold.com>
>>> Thu, 24 Mar 2022 01:53:39 -0400
>>>
>>> We provide an updated record of Tesla fatalities and Tesla accident deaths
>>> that have been reported and as much related crash data as possible
>>> (e.g. location of crash, names of deceased, etc.). This sheet also tallies
>>> claimed and confirmed Tesla autopilot crashes, i.e. instances when
>>> Autopilot was activated during a Tesla crash that resulted in death. Read
>>> our other sheets for additional data and analysis on vehicle miles traveled,
>>> links and analysis comparing Musk's safety claims, and more.
>>>
>>> Tesla Deaths Total as of 3/23/2022: 246
>>> Tesla Autopilot Deaths Count: 12
>>>
>>> https://www.tesladeaths.com/
>>
>> Yeah, it's raw data. Did you have a point?

I have no point.

I am curious about the causes of crashes when "autopilot" is engaged.

> Without comparisons to other types of car, and correlations with other
> factors, such raw data is useless. You'd need to compare to other
> high-end electric cars, other petrol cars in similar price ranges and
> styles. You'd want to look at statistics for "typical Tesla drivers"
> (who are significantly richer than the average driver, but I don't know
> what other characteristics might be relevant - age, gender, driving
> experience, etc.) You'd have to compare statistics for the countries
> and parts of countries where Teslas are common.
>
> And you would /definitely/ want to anonymise the data. If I had a
> family member who was killed in a car crash, I would not be happy about
> their name and details of their death being used for some sort of absurd
> Tesla hate-site.
>
> I'm no fan of Teslas myself. I like a car to be controlled like a car,
> not a giant iPhone (and I don't like iPhones either). I don't like the
> heavy tax breaks given by Norway to a luxury car, and I don't like the
> environmental costs of making them (though I am glad to see improvements
> on that front). I don't like some of the silly claims people make about
> them - like Apple gadgets, they seem to bring out the fanboy in some of
> their owners. But that's all just me and my personal preferences and
> opinions - if someone else likes them, that's fine. Many Tesla owners
> are very happy with their cars (and some are unhappy - just as for any
> other car manufacturer). I can't see any reason for trying to paint
> them as evil death-traps - you'd need very strong statistical basis for
> that, not just a list of accidents.

There is an attempt at comparisons, as stated in the FAQ.

Re: Every Tesla Accident Resulting in Death

From: gnuarm.d...@gmail.com (Rickster)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 10:28:11 -0700 (PDT)
Message-ID: <5234b5df-733b-4543-83e0-1037fbabec7en@googlegroups.com>
References: <t1upig$tmg$2@dont-email.me>
 <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
 <t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
 by: Rickster - Tue, 29 Mar 2022 17:28 UTC

On Tuesday, March 29, 2022 at 11:17:49 AM UTC-4, Tom Gardner wrote:
> On 29/03/22 15:16, David Brown wrote:
> > On 29/03/2022 15:00, Rickster wrote:
> >> On Tuesday, March 29, 2022 at 7:12:23 AM UTC-4, Tom Gardner wrote:
> >>> From comp.risks https://catless.ncl.ac.uk/Risks/33/11/#subj1.1
> >>>
> >>> The website referred to appears to be collating information in
> >>> a reasonable and unemotional way.
> >>>
> >>>
> >>> Every Tesla Accident Resulting in Death (Tesla Deaths)
> >>> Gabe Goldberg <ga...@gabegold.com>
> >>> Thu, 24 Mar 2022 01:53:39 -0400
> >>>
> >>> We provide an updated record of Tesla fatalities and Tesla accident deaths
> >>> that have been reported and as much related crash data as possible
> >>> (e.g. location of crash, names of deceased, etc.). This sheet also tallies
> >>> claimed and confirmed Tesla autopilot crashes, i.e. instances when
> >>> Autopilot was activated during a Tesla crash that resulted in death. Read
> >>> our other sheets for additional data and analysis on vehicle miles traveled,
> >>> links and analysis comparing Musk's safety claims, and more.
> >>>
> >>> Tesla Deaths Total as of 3/23/2022: 246
> >>> Tesla Autopilot Deaths Count: 12
> >>>
> >>> https://www.tesladeaths.com/
> >>
> >> Yeah, it's raw data. Did you have a point?
> I have no point.
>
> I am curious about the causes of crashes when "autopilot" is engaged.

What do you expect to learn by posting this here? Autopilot is not perfect, by any means. They tell you to remain alert just as if you were driving, and in fact the car monitors your grip on the wheel, alerting you if you relax too much.

The point is that when the car crashes on Autopilot, it is the driver's fault, not the car's, because the car is just a driving-assistance tool, like a blind-spot warning device. If you smack someone in your blind spot, whose fault is that? Yours, because the tool is not perfect.

I know of one accident that occurred at a highway divide where the guy had previously had the car try to go up the middle rather than left or right. He even posted that the car was trying to kill him. One day he did something wrong at that same spot, and he wrecked the car and killed himself.

Have you learned anything new yet?

--

Rick C.

+ Get 1,000 miles of free Supercharging
+ Tesla referral code - https://ts.la/richard11209

Re: Every Tesla Accident Resulting in Death

From: david.br...@hesbynett.no (David Brown)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 21:18:04 +0200
Message-ID: <t1vm1c$9lm$1@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
 <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
 <t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
 by: David Brown - Tue, 29 Mar 2022 19:18 UTC

On 29/03/2022 17:17, Tom Gardner wrote:
> On 29/03/22 15:16, David Brown wrote:
>> On 29/03/2022 15:00, Rickster wrote:
>>> Yeah, it's raw data.  Did you have a point?
>
> I have no point.
>

Fair enough, I suppose. But was there a reason for the post then?

> I am curious about the causes of crashes when "autopilot" is engaged.
>

That's a reasonable thing to wonder about. The more we (people in
general, Tesla drivers, Tesla developers, etc.) know about such crashes,
the better the possibilities for fixing weaknesses or understanding how
to mitigate them. Unfortunately, the main mitigation - "don't rely on
autopilot; stay alert and focused on the driving" - does not work. For
one thing, many people don't obey it: people have been found in the
back seat of crashed Teslas where they were having a nap. And those
who do try to follow it are likely to doze off from boredom.

However, there is no need for a list of "crashes involving Teslas",
names of victims, and a site with a clear agenda to "prove" that Teslas
are not as safe as they claim. It is counter-productive to real
investigation and real learning.

>
>
> There is an attempt at comparisons, as stated in the FAQ.

It is a pretty feeble attempt, hidden away.

Even the comparison of "autopilot" deaths to total deaths is useless
without information about autopilot use, and how many people rely on it.
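The base-rate problem here can be shown with a toy calculation. The usage share below is an invented assumption - the whole point is that nobody in the thread (or on the website) knows the real figure:

```python
# Toy base-rate calculation: whether "12 of 246 deaths involved
# Autopilot" is alarming depends entirely on what fraction of
# driving is done on Autopilot - a figure the raw list omits.
# The 30% usage share below is an invented assumption.

total_deaths = 246
autopilot_deaths = 12
autopilot_usage_share = 0.30  # assumed fraction of miles on Autopilot

death_share = autopilot_deaths / total_deaths        # ~0.049
relative_risk = death_share / autopilot_usage_share  # <1 would look safer

print(f"share of deaths on Autopilot: {death_share:.3f}")
print(f"risk relative to usage share: {relative_risk:.2f}")
```

With a different assumed usage share the same 12 deaths can look either better or worse than ordinary driving, which is exactly why the ratio alone is uninformative.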

The whole post just struck me as a bit below par for your usual high
standard. There's definitely an interesting thread possibility around
the idea of how safe or dangerous car "autopilots" can be, and how they
compare to average drivers. But your post was not a great starting
point for that.

Re: Every Tesla Accident Resulting in Death

From: spamj...@blueyonder.co.uk (Tom Gardner)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 23:54:34 +0100
Message-ID: <t202na$bvu$1@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
 <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
 <t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
 <t1vm1c$9lm$1@dont-email.me>
 by: Tom Gardner - Tue, 29 Mar 2022 22:54 UTC

On 29/03/22 20:18, David Brown wrote:
> On 29/03/2022 17:17, Tom Gardner wrote:
>> On 29/03/22 15:16, David Brown wrote:
>>> On 29/03/2022 15:00, Rickster wrote:
>>>> Yeah, it's raw data.  Did you have a point?
>>
>> I have no point.
>>
>
> Fair enough, I suppose. But was there a reason for the post then?

Primarily to provoke thought and discussion, and
secondarily to point to occurrences that Tesla fanbois
and Musk prefer to sweep under the carpet.

>> I am curious about the causes of crashes when "autopilot" is engaged.
>>
>
> That's a reasonable thing to wonder about. The more we (people in
> general, Tesla drivers, Tesla developers, etc.) know about such crashes,
> the better the possibilities for fixing weaknesses or understanding how
> to mitigate them. Unfortunately, the main mitigation is "don't rely on
> autopilot - stay alert and focused on the driving" does not work. For
> one thing, many people don't obey it - people have been found in the
> back seat of crashed Telsa's where they were having a nap. And those
> that try to follow it are likely to doze off from boredom.

Agreed.

Musk and his /very/ carefully worded advertising don't help
matters. That should be challenged by evidence.

I haven't seen such evidence collated anywhere else.

> However, there is no need for a list of "crashes involving Teslas",
> names of victims, and a site with a clear agenda to "prove" that Teslas
> are not as safe as they claim. It is counter-productive to real
> investigation and real learning.

As far as I can see the website does not name the dead.
The linked references may do.

Musk makes outlandish claims about his cars, which need
debunking in order to help prevent more unnecessary
accidents.

From https://catless.ncl.ac.uk/Risks/33/11/#subj3
"Weeks earlier, a Tesla using the company's advanced
driver-assistance system had crashed into a tractor-trailer
at about 70 mph, killing the driver. When National Highway
Traffic Safety Administration officials called Tesla
executives to say they were launching an investigation,
Musk screamed, protested and threatened to sue, said a
former safety official who spoke on the condition of
anonymity to discuss sensitive matters.

"The regulators knew Musk could be impulsive and stubborn;
they would need to show some spine to win his cooperation.
So they waited. And in a subsequent call, 'when tempers were
a little bit cool,' Musk agreed to cooperate: He was a
changed person."
https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation

>> There is an attempt at comparisons, as stated in the FAQ.
>
> It is a pretty feeble attempt, hidden away.
>
> Even the comparison of "autopilot" deaths to total deaths is useless
> without information about autopilot use, and how many people rely on it.

That's too strong, but I agree most ratios (including that one)
aren't that enlightening.

> The whole post just struck me as a bit below par for your usual high
> standard. There's definitely an interesting thread possibility around
> the idea of how safe or dangerous car "autopilots" can be, and how they
> compare to average drivers. But your post was not a great starting
> point for that.

Real world experiences aren't a bad /starting/ point, but
they do have limitations. Better starting points are to
be welcomed.

An issue is, of course, that any single experience can be
dismissed as an unrepresentative aberration. Collation of
experiences is necessary.

Some of the dashcam "Tesla's making mistakes" videos on
yootoob aren't confidence inspiring. Based on one I saw,
I certainly wouldn't dare let a Tesla drive itself in
an urban environment.

I suspect there isn't sufficient experience to assess
relative dangers between "artificial intelligence" and
"natural stupidity".

Re: Every Tesla Accident Resulting in Death

From: blockedo...@foo.invalid (Don Y)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Tue, 29 Mar 2022 18:38:08 -0700
Message-ID: <t20cah$5cs$1@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
 <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
 <t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
 <t1vm1c$9lm$1@dont-email.me> <t202na$bvu$1@dont-email.me>
 by: Don Y - Wed, 30 Mar 2022 01:38 UTC

On 3/29/2022 3:54 PM, Tom Gardner wrote:
> Some of the dashcam "Tesla's making mistakes" videos on
> yootoob aren't confidence inspiring. Based on one I saw,
> I certainly wouldn't dare let a Tesla drive itself in
> an urban environment,

+1

I'm not sure I'd rely on any of these technologies to do
more than *help* me (definitely not *replace* me!)

E.g., I like the LIDAR warning me that a vehicle is
about to pass behind my parked car as I'm backing out...
because I often can't see "a few seconds" off to each
side, given the presence of other vehicles parked on each
side of mine. But, I still look over my shoulder AND
watch the backup camera as I pull out.

> I suspect there isn't sufficient experience to assess
> relative dangers between "artificial intelligence" and
> "natural stupidity".

I'm not sure it can all be distilled to "natural stupidity".

When we last looked for a new vehicle, one of the salespersons
commented on some of this "advisory tech" with such exuberance:
"Oh, yeah! It works GREAT! I don't even bother to *look*,
anymore!"

And, to the average Joe, why should they HAVE to "look" if
the technology was (allegedly) performing that function?
("Oh, do you mean it doesn't really *work*? Then why are
you charging me for it? If I couldn't rely on the engine,
would you tell me to always wear good WALKING SHOES when
I set out on a drive???!")

And, "laziness" is often an issue.

I designed a LORAN-C-based autopilot (for a boat) in the '70s. You
typed in lat-lons of your destinations (a series) and the autopilot
would get you to them, correcting for drift ("cross-track error")
to ensure straight-line travel (a conventional autopilot just
kept the vessel pointed in the desired direction so ocean currents
would steadily push you off your desired course).
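The cross-track correction can be sketched in a few lines. This is a
planar toy model in Python with invented names and gains, nothing like
the actual '70s implementation:

```python
import math

def cross_track_error(start, end, pos):
    """Signed perpendicular distance from pos to the line through
    start and end (planar approximation, fine for short legs).
    Positive = right of track when facing the destination."""
    (x1, y1), (x2, y2), (x, y) = start, end, pos
    dx, dy = x2 - x1, y2 - y1
    # 2D cross product of the leg vector and the start->pos vector,
    # divided by the leg length, gives the perpendicular offset.
    return ((x - x1) * dy - (y - y1) * dx) / math.hypot(dx, dy)

def heading_correction(desired_heading, xte, gain=2.0, limit=30.0):
    """Bias the commanded heading back toward the track line,
    proportional to cross-track error, clamped to +/- limit degrees."""
    return desired_heading - max(-limit, min(limit, gain * xte))
```

A real unit would work in lat-lon with proper geodesy and filter the
LORAN fixes, but the correction loop is essentially this.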

There was considerable debate about how to handle the sequencing
of destinations:
- should you automatically replace the current destination with
the *next* in the series, having reached the current? (and, what
do you use as criteria for reaching that current destination)
- should you require manual intervention to advance to the next
destination, having reached the current? And, if so, how will
the skipper know what the vessel's path will be AFTER overshooting
the destination? The autopilot will keep trying to return the vessel
to that position -- no control over throttle -- but how do you
anticipate the path that it will take in doing so?
- should you be able to alert the skipper to reaching the current
destination (in case he's in the stern of the vessel prepping
lobster pots for deployment)?
- should you incorporate throttle controls (what if you cut the
throttle on reaching the destination and the vessel then drifts
away from that spot)?
- should you "tune" the instrument to the vessel's characteristics?
(helm control of a speedboat is considerably more responsive than
that of a fishing trawler!)

There's no real "right" answer -- short of taking over more control
of the vessel (which then poses different problems).
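For what it's worth, the auto-advance policy from the list above
reduces to a few lines. The arrival-circle radius and the names here
are invented purely for illustration:

```python
import math

def reached(pos, waypoint, arrival_radius=50.0):
    # One common arrival criterion: inside a circle around the
    # destination (radius in the same units as the coordinates).
    return math.dist(pos, waypoint) <= arrival_radius

def next_index(route, index, pos, auto_advance=True):
    # With auto_advance, silently start the next leg on arrival;
    # otherwise hold the current destination until the skipper
    # confirms (the manual-intervention option discussed above).
    if reached(pos, route[index]) and auto_advance:
        return min(index + 1, len(route) - 1)
    return index
```

The arrival-circle test is itself a design choice: too small and you
never "reach" the waypoint in a current, too big and you cut corners.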

So, you recognize the fact that skippers will act in whatever way
suits them -- at the moment -- and don't try to be their "nanny"
(cuz anything you do in that regard they will UNdo)

Re: Every Tesla Accident Resulting in Death

 by: Rickster - Wed, 30 Mar 2022 04:45 UTC

On Tuesday, March 29, 2022 at 6:54:41 PM UTC-4, Tom Gardner wrote:
> On 29/03/22 20:18, David Brown wrote:
> > On 29/03/2022 17:17, Tom Gardner wrote:
> >> On 29/03/22 15:16, David Brown wrote:
> >>> On 29/03/2022 15:00, Rickster wrote:
> >>>> Yeah, it's raw data. Did you have a point?
> >>
> >> I have no point.
> >>
> >
> > Fair enough, I suppose. But was there a reason for the post then?
> Primarily to provoke thought and discussion, and
> secondarily to point to occurrences that Tesla fanbois
> and Musk prefer to sweep under the carpet.

You haven't pointed out anything useful. You posted a link to what is really a pretty crappy web page. Where's the thought, where's the discussion? You said yourself you had nothing to say about it. Ok, thanks for the link. Bye.

> >> I am curious about the causes of crashes when "autopilot" is engaged.
> >>
> >
> > That's a reasonable thing to wonder about. The more we (people in
> > general, Tesla drivers, Tesla developers, etc.) know about such crashes,
> > the better the possibilities for fixing weaknesses or understanding how
> > to mitigate them. Unfortunately, the main mitigation, "don't rely on
> > autopilot - stay alert and focused on the driving", does not work. For
> > one thing, many people don't obey it - people have been found in the
> > back seat of crashed Teslas where they were having a nap. And those
> > that try to follow it are likely to doze off from boredom.
> Agreed.
>
> Musk and his /very/ carefully worded advertising don't help
> matters. That should be challenged by evidence.

Ok, where's the evidence?

> I haven't seen such evidence collated anywhere else.

I still haven't seen any evidence, although I'm not sure what it is supposed to be evidence of.

> > However, there is no need for a list of "crashes involving Teslas",
> > names of victims, and a site with a clear agenda to "prove" that Teslas
> > are not as safe as they claim. It is counter-productive to real
> > investigation and real learning.
> As far as I can see the website does not name the dead.
> The linked references may do.
>
> Musk makes outlandish claims about his cars, which need
> debunking in order to help prevent more unnecessary
> accidents.
>
> From https://catless.ncl.ac.uk/Risks/33/11/#subj3
> "Weeks earlier, a Tesla using the company's advanced
> driver-assistance system had crashed into a tractor-trailer
> at about 70 mph, killing the driver. When National Highway
> Traffic Safety Administration officials called Tesla
> executives to say they were launching an investigation,
> Musk screamed, protested and threatened to sue, said a
> former safety official who spoke on the condition of
> anonymity to discuss sensitive matters.
>
> "The regulators knew Musk could be impulsive and stubborn;
> they would need to show some spine to win his cooperation.
> So they waited. And in a subsequent call, “when tempers were
> a little bit cool, Musk agreed to cooperate: He was a
> changed person.'' "
> https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation

Ok, so???

I think we all know Musk is a jerk. He's a huge P.T. Barnum salesperson too. Who didn't know that? What's your point?

> >> There is an attempt at comparisons, as stated in the FAQ.
> >
> > It is a pretty feeble attempt, hidden away.
> >
> > Even the comparison of "autopilot" deaths to total deaths is useless
> > without information about autopilot use, and how many people rely on it.
> That's too strong, but I agree most ratios (including that one)
> aren't that enlightening.

I'm happy to see any ratios that mean anything, but I didn't see them. I saw a table of incidents which included at least one death. Where are the comparisons?

> > The whole post just struck me as a bit below par for your usual high
> > standard. There's definitely an interesting thread possibility around
> > the idea of how safe or dangerous car "autopilots" can be, and how they
> > compare to average drivers. But your post was not a great starting
> > point for that.
> Real world experiences aren't a bad /starting/ point, but
> they do have limitations. Better starting points are to
> be welcomed.
>
> An issue is, of course, that any single experience can be
> dismissed as an unrepresentative aberration. Collation of
> experiences is necessary.
>
> Some of the dashcam "Tesla's making mistakes" videos on
> yootoob aren't confidence inspiring. Based on one I saw,
> I certainly wouldn't dare let a Tesla drive itself in
> an urban environment,

You aren't supposed to let a Tesla drive itself in any environment. You are the driver. Autopilot is just a driving assistance tool. You seem to think autopilot is autonomous driving. It's not even remotely close. If that's what you are looking for, you won't find anyone from Tesla claiming autopilot is anything other than an "assist", including Musk.

> I suspect there isn't sufficient experience to assess
> relative dangers between "artificial intelligence" and
> "natural stupidity".

I'm not sure what you wish to measure. That's what a comparison does, it measures one thing vs. another in terms of some measurement. What exactly do you want to measure? Or are you just on a fishing trip looking for something damning to Musk or Tesla?

--

Rick C.

-- Get 1,000 miles of free Supercharging
-- Tesla referral code - https://ts.la/richard11209

Re: Every Tesla Accident Resulting in Death

 by: David Brown - Wed, 30 Mar 2022 06:27 UTC

On 30/03/2022 00:54, Tom Gardner wrote:
> On 29/03/22 20:18, David Brown wrote:
>> On 29/03/2022 17:17, Tom Gardner wrote:
>>> On 29/03/22 15:16, David Brown wrote:
>>>> On 29/03/2022 15:00, Rickster wrote:
>>>>> Yeah, it's raw data.  Did you have a point?
>>>
>>> I have no point.
>>>
>>
>> Fair enough, I suppose.  But was there a reason for the post then?
>
> Primarily to provoke thought and discussion, and
> secondarily to point to occurrences that Tesla fanbois
> and Musk prefer to sweep under the carpet.
>
>
>>> I am curious about the causes of crashes when "autopilot" is engaged.
>>>
>>
>> That's a reasonable thing to wonder about.  The more we (people in
>> general, Tesla drivers, Tesla developers, etc.) know about such crashes,
>> the better the possibilities for fixing weaknesses or understanding how
>> to mitigate them.  Unfortunately, the main mitigation, "don't rely on
>> autopilot - stay alert and focused on the driving", does not work.  For
>> one thing, many people don't obey it - people have been found in the
>> back seat of crashed Teslas where they were having a nap.  And those
>> that try to follow it are likely to doze off from boredom.
>
> Agreed.
>
> Musk and his /very/ carefully worded advertising don't help
> matters. That should be challenged by evidence.
>
> I haven't seen such evidence collated anywhere else.

But that site does not have evidence of anything relevant. It shows
that people sometimes die on the road, even in Teslas. Nothing more.

If the Tesla people are using false or misleading advertising, or making
safety claims that can't be verified, then I agree they should be held
accountable. Collect evidence to show that - /real/ comparisons and
/real/ statistics.

Progress was not made against tobacco companies by compiling lists of
people who smoked and then died. It was done by comparing the death
rates of people who smoked to those of people who don't smoke.

>
>
>
>> However, there is no need for a list of "crashes involving Teslas",
>> names of victims, and a site with a clear agenda to "prove" that Teslas
>> are not as safe as they claim.  It is counter-productive to real
>> investigation and real learning.
>
> As far as I can see the website does not name the dead.
> The linked references may do.

From your initial post (you read what you quoted, didn't you?):

"""
We provide an updated record of Tesla fatalities and Tesla accident deaths
that have been reported and as much related crash data as possible
(e.g. location of crash, names of deceased, etc.).
"""

>
> Musk makes outlandish claims about his cars, which need
> debunking in order to help prevent more unnecessary
> accidents.
>
> From https://catless.ncl.ac.uk/Risks/33/11/#subj3
>   "Weeks earlier, a Tesla using the company's advanced
>   driver-assistance system had crashed into a tractor-trailer
>   at about 70 mph, killing the driver. When National Highway
>   Traffic Safety Administration officials called Tesla
>   executives to say they were launching an investigation,
>   Musk screamed, protested and threatened to sue, said a
>   former safety official who spoke on the condition of
>   anonymity to discuss sensitive matters.
>
>   "The regulators knew Musk could be impulsive and stubborn;
>   they would need to show some spine to win his cooperation.
>   So they waited. And in a subsequent call, “when tempers were
>   a little bit cool, Musk agreed to cooperate: He was a
>   changed person.'' "
>  
> https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation
>

So people who know how to investigate these things are investigating
them. That's great. (It is also - in theory, at least - unbiased. The
autopilot might not have been at fault.) It's a lot better than some
amateur with a grudge, an ignorance of statistics, and a Google document
page.

>
>
>
>>> There is an attempt at comparisons, as stated in the FAQ.
>>
>> It is a pretty feeble attempt, hidden away.
>>
>> Even the comparison of "autopilot" deaths to total deaths is useless
>> without information about autopilot use, and how many people rely on it.
>
> That's too strong, but I agree most ratios (including that one)
> aren't that enlightening.

No, it is not "too strong". It is basic statistics. Bayes' theorem,
and all that. If a large proportion of people use autopilot, but only a
small fraction of the deaths had the autopilot on, then clearly the
autopilot reduces risks and saves lives (of those that drive Teslas - we
still know nothing of other car drivers).
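To make the base-rate point concrete with invented numbers (these are
NOT real Tesla figures, just arithmetic):

```python
# Invented exposure and death counts, purely to show why the
# usage split matters more than raw death counts.
total_miles = 1_000_000_000
autopilot_share = 0.40          # assumed fraction of miles on autopilot

deaths_autopilot = 8
deaths_manual = 30

ap_miles = total_miles * autopilot_share
man_miles = total_miles * (1 - autopilot_share)

rate_ap = deaths_autopilot / ap_miles    # deaths per mile, autopilot on
rate_man = deaths_manual / man_miles     # deaths per mile, autopilot off

print(f"autopilot: {rate_ap * 1e9:.0f} deaths per 1e9 miles")
print(f"manual:    {rate_man * 1e9:.0f} deaths per 1e9 miles")
```

Here the autopilot comes out far safer even though it appears in about
a fifth of the deaths; shrink the usage share enough and the conclusion
flips. Without the share, the death counts alone tell you nothing.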

>
>
>> The whole post just struck me as a bit below par for your usual high
>> standard.  There's definitely an interesting thread possibility around
>> the idea of how safe or dangerous car "autopilots" can be, and how they
>> compare to average drivers.  But your post was not a great starting
>> point for that.
>
> Real world experiences aren't a bad /starting/ point, but
> they do have limitations. Better starting points are to
> be welcomed.

Real world experiences are enough to say "this might be worth looking
at" - but no more than that.

>
> An issue is, of course, that any single experience can be
> dismissed as an unrepresentative aberration. Collation of
> experiences is necessary.
>
> Some of the dashcam "Tesla's making mistakes" videos on
> yootoob aren't confidence inspiring. Based on one I saw,
> I certainly wouldn't dare let a Tesla drive itself in
> an urban environment,
>
> I suspect there isn't sufficient experience to assess
> relative dangers between "artificial intelligence" and
> "natural stupidity".

I don't doubt at all that the Tesla autopilot makes mistakes. So do
human drivers. The interesting question is who makes fewer mistakes, or
mistakes with lower consequences - and that is a question for which no
amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
The only evidence you have so far is that people love to show that
something fancy and expensive is not always perfect, and I believe we
knew that already.

Re: Every Tesla Accident Resulting in Death

 by: Ricky - Thu, 31 Mar 2022 20:44 UTC

On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
> On 30/03/2022 00:54, Tom Gardner wrote:
> > On 29/03/22 20:18, David Brown wrote:
> >> On 29/03/2022 17:17, Tom Gardner wrote:
> >>> On 29/03/22 15:16, David Brown wrote:
> >>>> On 29/03/2022 15:00, Rickster wrote:
> >>>>> Yeah, it's raw data. Did you have a point?
> >>>
> >>> I have no point.
> >>>
> >>
> >> Fair enough, I suppose. But was there a reason for the post then?
> >
> > Primarily to provoke thought and discussion, and
> > secondarily to point to occurrences that Tesla fanbois
> > and Musk prefer to sweep under the carpet.
> >
> >
> >>> I am curious about the causes of crashes when "autopilot" is engaged.
> >>>
> >>
> >> That's a reasonable thing to wonder about. The more we (people in
> >> general, Tesla drivers, Tesla developers, etc.) know about such crashes,
> >> the better the possibilities for fixing weaknesses or understanding how
> >> to mitigate them. Unfortunately, the main mitigation, "don't rely on
> >> autopilot - stay alert and focused on the driving", does not work. For
> >> one thing, many people don't obey it - people have been found in the
> >> back seat of crashed Teslas where they were having a nap. And those
> >> that try to follow it are likely to doze off from boredom.
> >
> > Agreed.
> >
> > Musk and his /very/ carefully worded advertising don't help
> > matters. That should be challenged by evidence.
> >
> > I haven't seen such evidence collated anywhere else.
> But that site does not have evidence of anything relevant. It shows
> that people sometimes die on the road, even in Teslas. Nothing more.
>
> If the Tesla people are using false or misleading advertising, or making
> safety claims that can't be verified, then I agree they should be held
> accountable. Collect evidence to show that - /real/ comparisons and
> /real/ statistics.
>
> Progress was not made against tobacco companies by compiling lists of
> people who smoked and then died. It was done by comparing the death
> rates of people who smoked to those of people who don't smoke.
> >
> >
> >
> >> However, there is no need for a list of "crashes involving Teslas",
> >> names of victims, and a site with a clear agenda to "prove" that Teslas
> >> are not as safe as they claim. It is counter-productive to real
> >> investigation and real learning.
> >
> > As far as I can see the website does not name the dead.
> > The linked references may do.
> From your initial post (you read what you quoted, didn't you?) :
> """
> We provide an updated record of Tesla fatalities and Tesla accident deaths
> that have been reported and as much related crash data as possible
> (e.g. location of crash, names of deceased, etc.).
> """
>
> >
> > Musk makes outlandish claims about his cars, which need
> > debunking in order to help prevent more unnecessary
> > accidents.
> >
> > From https://catless.ncl.ac.uk/Risks/33/11/#subj3
> > "Weeks earlier, a Tesla using the company's advanced
> > driver-assistance system had crashed into a tractor-trailer
> > at about 70 mph, killing the driver. When National Highway
> > Traffic Safety Administration officials called Tesla
> > executives to say they were launching an investigation,
> > Musk screamed, protested and threatened to sue, said a
> > former safety official who spoke on the condition of
> > anonymity to discuss sensitive matters.
> >
> > "The regulators knew Musk could be impulsive and stubborn;
> > they would need to show some spine to win his cooperation.
> > So they waited. And in a subsequent call, “when tempers were
> > a little bit cool, Musk agreed to cooperate: He was a
> > changed person.'' "
> >
> > https://www.washingtonpost.com/technology/2022/03/27/tesla-elon-musk-regulation
> >
> So people who know how to investigate these things are investigating
> them. That's great. (It is also - in theory, at least - unbiased. The
> autopilot might not have been at fault.) It's a lot better than some
> amateur with a grudge, an ignorance of statistics and a google document
> page.
> >
> >
> >
> >>> There is an attempt at comparisons, as stated in the FAQ.
> >>
> >> It is a pretty feeble attempt, hidden away.
> >>
> >> Even the comparison of "autopilot" deaths to total deaths is useless
> >> without information about autopilot use, and how many people rely on it.
> >
> > That's too strong, but I agree most ratios (including that one)
> > aren't that enlightening.
> No, it is not "too strong". It is basic statistics. Bayes' theorem,
> and all that. If a large proportion of people use autopilot, but only a
> small fraction of the deaths had the autopilot on, then clearly the
> autopilot reduces risks and saves lives (of those that drive Teslas - we
> still know nothing of other car drivers).

A simple comparison of numbers is not sufficient. Most Tesla autopilot usage is on highways, which are much safer per mile driven than other roads. That's an inherent bias: while non-autopilot driving must include all situations, autopilot simply doesn't work in most environments.
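To see the bias, stratify by road type. All numbers below are invented;
the point is the arithmetic, not the data:

```python
# Invented miles and deaths, split by road type, to show how a
# pooled comparison can mislead when autopilot runs mostly on
# the safest roads (Simpson's-paradox-style confounding).
miles = {                       # (autopilot_miles, manual_miles)
    "highway": (350e6, 150e6),
    "urban":   (10e6, 300e6),
}
deaths = {                      # (autopilot_deaths, manual_deaths)
    "highway": (4, 3),
    "urban":   (1, 24),
}

rates = {}
for road, (ap_m, man_m) in miles.items():
    ap_d, man_d = deaths[road]
    rates[road] = (ap_d / ap_m * 1e9, man_d / man_m * 1e9)
    print(road, "autopilot %.0f vs manual %.0f deaths per 1e9 miles"
          % rates[road])
```

With these numbers the pooled rate flatters the autopilot, mostly
because its miles are highway miles, while the within-road-type
comparison tells a more mixed story.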

> >> The whole post just struck me as a bit below par for your usual high
> >> standard. There's definitely an interesting thread possibility around
> >> the idea of how safe or dangerous car "autopilots" can be, and how they
> >> compare to average drivers. But your post was not a great starting
> >> point for that.
> >
> > Real world experiences aren't a bad /starting/ point, but
> > they do have limitations. Better starting points are to
> > be welcomed.
> Real world experiences are enough to say "this might be worth looking
> at" - but no more than that.
> >
> > An issue is, of course, that any single experience can be
> > dismissed as an unrepresentative aberration. Collation of
> > experiences is necessary.
> >
> > Some of the dashcam "Tesla's making mistakes" videos on
> > yootoob aren't confidence inspiring. Based on one I saw,
> > I certainly wouldn't dare let a Tesla drive itself in
> > an urban environment,
> >
> > I suspect there isn't sufficient experience to assess
> > relative dangers between "artificial intelligence" and
> > "natural stupidity".
> I don't doubt at all that the Tesla autopilot makes mistakes.

Which depends on how you define "mistakes". It's a bit like asking if your rear view mirror makes mistakes by not showing cars in the blind spot. The autopilot is not designed to drive the car. It is a tool to assist the driver. The driver is required to be responsible for the safe operation of the car at all times. I can point out the many, many times the car acts like a spaz and requires me to manage the situation. Early on, there was a left turn lane on a 50 mph road that the car would want to turn into when I intended to drive straight. Fortunately they have ironed out that level of issue. But it was always my responsibility to prevent it from causing an accident. So how would you say anything was the fault of the autopilot?

> So do
> human drivers. The interesting question is who makes fewer mistakes, or
> mistakes with lower consequences - and that is a question for which no
> amount of anecdotal yootoob videos or Tesla/Musk hate sites will help.
> The only evidence you have so far is that people love to show that
> something fancy and expensive is not always perfect, and I believe we
> knew that already.

That's where they are headed with full self driving. But given the breadth of issues the car still has problems with, I think it will be a long, long time before we can sit back and relax while the car drives us home.

--

Rick C.

-+ Get 1,000 miles of free Supercharging
-+ Tesla referral code - https://ts.la/richard11209

Re: Every Tesla Accident Resulting in Death

 by: David Brown - Thu, 31 Mar 2022 21:48 UTC

On 31/03/2022 22:44, Ricky wrote:
> On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
>> On 30/03/2022 00:54, Tom Gardner wrote:
>>> On 29/03/22 20:18, David Brown wrote:

<snip>

>> No, it is not "too strong". It is basic statistics. Bayes' theorem,
>> and all that. If a large proportion of people use autopilot, but
>> only a small fraction of the deaths had the autopilot on, then
>> clearly the autopilot reduces risks and saves lives (of those that
>> drive Teslas - we still know nothing of other car drivers).
>
> A simple comparison of numbers is not sufficient. Most Tesla
> autopilot usage is on highways which are much safer per mile driven
> than other roads. That's an inherent bias because while
> non-autopilot driving must include all situations, autopilot simply
> doesn't work in most environments.
>

Yes. An apples-to-apples comparison is the aim, or at least as close as
one can get.

I suspect - without statistical justification - that the accidents
involving autopilot use are precisely cases where you don't have a good,
clear highway, and autopilot was used in a situation where it was not
suitable. Getting good statistics and comparisons here could be helpful
in making it safer - perhaps adding a feature that has the autopilot say
"This is not a good road for me - you have to drive yourself" and switch
itself off.  (It would be more controversial, but probably statistically
safer, if it also sometimes said "I'm better at driving on this kind of
road than you are" and switched itself on!)

>>>
>>> An issue is, of course, that any single experience can be
>>> dismissed as an unrepresentative aberration. Collation of
>>> experiences is necessary.
>>>
>>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
>>> aren't confidence inspiring. Based on one I saw, I certainly
>>> wouldn't dare let a Tesla drive itself in an urban environment,
>>>
>>> I suspect there isn't sufficient experience to assess relative
>>> dangers between "artificial intelligence" and "natural
>>> stupidity".
>> I don't doubt at all that the Tesla autopilot makes mistakes.
>
> Which depends on how you define "mistakes".

Of course.

> It's a bit like asking
> if your rear view mirror makes mistakes by not showing cars in the
> blind spot. The autopilot is not designed to drive the car. It is a
> tool to assist the driver. The driver is required to be responsible
> for the safe operation of the car at all times. I can point out to
> you the many, many times the car acts like a spaz and requires me to
> manage the situation. Early on, there was a left turn like on a 50
> mph road, the car would want to turn into when intending to drive
> straight. Fortunately they have ironed out that level of issue. But
> it was always my responsibility to prevent it from causing an
> accident. So how would you say anything was the fault of the
> autopilot?
>

There are a few possibilities here (though I am not trying to claim that
any of them are "right" in some objective sense).  You might say the
driver had believed that the "autopilot" was like a plane autopilot: you
can turn it on and leave it to safely drive itself for most of the
journey, except perhaps the very beginning and very end of the trip.  As
you say, the Tesla autopilot is /not/ designed for that - that might be
a mistake by the salesmen, advertisers, user-interface designers, or
just the driver's mistake.

And sometimes the autopilot does something daft - it is no longer
assisting the driver, but working against him or her.  That, I think,
should be counted as a mistake by the autopilot.  Tesla autopilots are
not alone in this, of course.  I have heard of several cases where
"smart" cruise controls on cars have been confused by changes to road
layouts, or by tunnels running underneath parts of a city: the car
suddenly brakes hard because of speed-limit changes on the surface
roads above that don't apply in the tunnel.

>
>> So do human drivers. The interesting question is who makes fewer
>> mistakes, or mistakes with lower consequences - and that is a
>> question for which no amount of anecdotal yootoob videos or
>> Tesla/Musk hate sites will help. The only evidence you have so far
>> is that people love to show that something fancy and expensive is
>> not always perfect, and I believe we knew that already.
>
> That's where they are headed with the full self driving. But gauging
> the breadth of issues the car has problems with, I think it will be a
> long, long time before we can sit back and relax while the car drives
> us home.
>

Yes. Automatic driving is progressing, but it has a long way to go as yet.

Re: Every Tesla Accident Resulting in Death

 by: Ricky - Thu, 31 Mar 2022 22:29 UTC

On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
> On 31/03/2022 22:44, Ricky wrote:
> > On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
> >> On 30/03/2022 00:54, Tom Gardner wrote:
> >>> On 29/03/22 20:18, David Brown wrote:
> <snip>
> >> No, it is not "too strong". It is basic statistics. Bayes' theorem,
> >> and all that. If a large proportion of people use autopilot, but
> >> only a small fraction of the deaths had the autopilot on, then
> >> clearly the autopilot reduces risks and saves lives (of those that
> >> drive Teslas - we still know nothing of other car drivers).
> >
> > A simple comparison of numbers is not sufficient. Most Tesla
> > autopilot usage is on highways which are much safer per mile driven
> > than other roads. That's an inherent bias because while
> > non-autopilot driving must include all situations, autopilot simply
> > doesn't work in most environments.
> >
> Yes. An apples-to-apples comparison is the aim, or at least as close as
> one can get.
>
> I suspect - without statistical justification -

Yes, without justification, at all.

> that the accidents
> involving autopilot use are precisely cases where you don't have a good,
> clear highway, and autopilot was used in a situation where it was not
> suitable. Getting good statistics and comparisons here could be helpful
> in making it safer - perhaps adding a feature that has the autopilot say
> "This is not a good road for me - you have to drive yourself" and switch
> itself off. (It would be more controversial, but probably statistically
> safer, if it also sometimes said "I'm better at driving on this kind of
> road than you are" and switching itself on!)
> >>>
> >>> An issue is, of course, that any single experience can be
> >>> dismissed as an unrepresentative aberration. Collation of
> >>> experiences is necessary.
> >>>
> >>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
> >>> aren't confidence inspiring. Based on one I saw, I certainly
> >>> wouldn't dare let a Tesla drive itself in an urban environment,
> >>>
> >>> I suspect there isn't sufficient experience to assess relative
> >>> dangers between "artificial intelligence" and "natural
> >>> stupidity".
> >> I don't doubt at all that the Tesla autopilot makes mistakes.
> >
> > Which depends on how you define "mistakes".
> Of course.
> > It's a bit like asking
> > if your rear view mirror makes mistakes by not showing cars in the
> > blind spot. The autopilot is not designed to drive the car. It is a
> > tool to assist the driver. The driver is required to be responsible
> > for the safe operation of the car at all times. I can point out to
> > you the many, many times the car acts like a spaz and requires me to
> > manage the situation. Early on, there was a left turn like on a 50
> > mph road, the car would want to turn into when intending to drive
> > straight. Fortunately they have ironed out that level of issue. But
> > it was always my responsibility to prevent it from causing an
> > accident. So how would you say anything was the fault of the
> > autopilot?
> >
> There are a few possibilities here (though I am not trying to claim that
> any of them are "right" in some objective sense). You might say they
> had believed that the "autopilot" was like a plane autopilot -

It is exactly like an airplane autopilot.

> you can
> turn it on and leave it to safely drive itself for most of the journey
> except perhaps the very beginning and very end of the trip. As you say,
> the Tesla autopilot is /not/ designed for that - that might be a mistake
> from the salesmen, advertisers, user-interface designers, or just the
> driver's mistake.

Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.

> And sometimes the autopilot does something daft - it is no longer
> assisting the driver, but working against him or her. That, I think,
> should be counted as a mistake by the autopilot.

The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.

--

Rick C.

+- Get 1,000 miles of free Supercharging
+- Tesla referral code - https://ts.la/richard11209
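The statistical point argued back and forth in this exchange (raw death counts vs. miles driven by road type) can be made concrete with a short numeric sketch. All the numbers below are invented purely to show the mechanism of the selection bias, not taken from any real accident data:

```python
# Made-up numbers illustrating the selection bias: autopilot miles are
# concentrated on highways, which are safer per mile, so raw death
# counts flatter the autopilot even if it drives no better than a
# human on the same road type.

MILLION = 1_000_000

ap_highway_miles = 4 * MILLION      # autopilot: almost all highway
manual_highway_miles = 2 * MILLION  # manual: mixed roads
manual_city_miles = 2 * MILLION

# Assumed per-mile fatality rates by road type, identical for autopilot
# and manual driving - i.e. the autopilot adds no safety at all here.
rate_highway = 0.5 / MILLION
rate_city = 2.0 / MILLION

deaths_ap = ap_highway_miles * rate_highway
deaths_manual = (manual_highway_miles * rate_highway
                 + manual_city_miles * rate_city)

rate_ap = deaths_ap / ap_highway_miles
rate_manual = deaths_manual / (manual_highway_miles + manual_city_miles)

# The naive comparison makes autopilot look ~2.5x safer purely because
# of where it is used, not because of how well it drives.
print(f"autopilot: {rate_ap:.2e} deaths/mile, manual: {rate_manual:.2e}")
```

Even with the autopilot assumed to be exactly as good as a human on each road type, the pooled numbers favour it, which is why an apples-to-apples comparison by road type matters.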

Re: Every Tesla Accident Resulting in Death

<t25ai8$tbg$1@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93538&group=sci.electronics.design#93538
 by: David Brown - Thu, 31 Mar 2022 22:39 UTC

On 01/04/2022 00:29, Ricky wrote:
> On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
>> On 31/03/2022 22:44, Ricky wrote:
>>> On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
>>>> On 30/03/2022 00:54, Tom Gardner wrote:
>>>>> On 29/03/22 20:18, David Brown wrote:
>> <snip>
>>>> No, it is not "too strong". It is basic statistics. Bayes' theorem,
>>>> and all that. If a large proportion of people use autopilot, but
>>>> only a small fraction of the deaths had the autopilot on, then
>>>> clearly the autopilot reduces risks and saves lives (of those that
>>>> drive Teslas - we still know nothing of other car drivers).
>>>
>>> A simple comparison of numbers is not sufficient. Most Tesla
>>> autopilot usage is on highways which are much safer per mile driven
>>> than other roads. That's an inherent bias because while
>>> non-autopilot driving must include all situations, autopilot simply
>>> doesn't work in most environments.
>>>
>> Yes. An apples-to-apples comparison is the aim, or at least as close as
>> one can get.
>>
>> I suspect - without statistical justification -
>
> Yes, without justification, at all.

Which do /you/ think is most likely? Autopilot crashes on the motorway,
or autopilot crashes on smaller roads?

>
>> that the accidents
>> involving autopilot use are precisely cases where you don't have a good,
>> clear highway, and autopilot was used in a situation where it was not
>> suitable. Getting good statistics and comparisons here could be helpful
>> in making it safer - perhaps adding a feature that has the autopilot say
>> "This is not a good road for me - you have to drive yourself" and switch
>> itself off. (It would be more controversial, but probably statistically
>> safer, if it also sometimes said "I'm better at driving on this kind of
>> road than you are" and switching itself on!)
>>>>>
>>>>> An issue is, of course, that any single experience can be
>>>>> dismissed as an unrepresentative aberration. Collation of
>>>>> experiences is necessary.
>>>>>
>>>>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
>>>>> aren't confidence inspiring. Based on one I saw, I certainly
>>>>> wouldn't dare let a Tesla drive itself in an urban environment,
>>>>>
>>>>> I suspect there isn't sufficient experience to assess relative
>>>>> dangers between "artificial intelligence" and "natural
>>>>> stupidity".
>>>> I don't doubt at all that the Tesla autopilot makes mistakes.
>>>
>>> Which depends on how you define "mistakes".
>> Of course.
>>> It's a bit like asking
>>> if your rear view mirror makes mistakes by not showing cars in the
>>> blind spot. The autopilot is not designed to drive the car. It is a
>>> tool to assist the driver. The driver is required to be responsible
>>> for the safe operation of the car at all times. I can point out to
>>> you the many, many times the car acts like a spaz and requires me to
>>> manage the situation. Early on, there was a left turn like on a 50
>>> mph road, the car would want to turn into when intending to drive
>>> straight. Fortunately they have ironed out that level of issue. But
>>> it was always my responsibility to prevent it from causing an
>>> accident. So how would you say anything was the fault of the
>>> autopilot?
>>>
>> There are a few possibilities here (though I am not trying to claim that
>> any of them are "right" in some objective sense). You might say they
>> had believed that the "autopilot" was like a plane autopilot -
>
> It is exactly like an airplane autopilot.
>
>
>> you can
>> turn it on and leave it to safely drive itself for most of the journey
>> except perhaps the very beginning and very end of the trip. As you say,
>> the Tesla autopilot is /not/ designed for that - that might be a mistake
>> from the salesmen, advertisers, user-interface designers, or just the
>> driver's mistake.
>
> Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.
>

Yes, that's the original idea of a plane autopilot. But modern ones are
more sophisticated and handle course changes along the planned route, as
well as being able to land automatically. And more important than what
plane autopilots actually /do/, is what people /think/ they do - and
remember we are talking about drivers that think their Tesla "autopilot"
will drive their car while they watch a movie or nap in the back seat.

>
>> And sometimes the autopilot does something daft - it is no longer
>> assisting the driver, but working against him or her. That, I think,
>> should be counted as a mistake by the autopilot.
>
> The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.
>

Well, "does something daft" is no worse than "acts like a spaz", and
it's a good deal more politically correct!

Re: Every Tesla Accident Resulting in Death

<t25ge5$27c$3@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93545&group=sci.electronics.design#93545
 by: Tom Gardner - Fri, 1 Apr 2022 00:19 UTC

On 31/03/22 23:39, David Brown wrote:
> On 01/04/2022 00:29, Ricky wrote:

>> Sorry, that's not how an autopilot works. It doesn't fly the plane. It
>> simply maintains a heading and altitude.

They have been doing more than that for more than 50 years.
Cat 3b landings were in operation when I was a kid.

>> Someone still has to be watching
>> for other aircraft and otherwise flying the plane. In other words, the
>> pilot is responsible for flying the plane, with or without the autopilot.
>
> Yes, that's the original idea of a plane autopilot. But modern ones are more
> sophisticated and handle course changes along the planned route, as well as
> being able to land automatically. And more important than what plane
> autopilots actually /do/, is what people /think/ they do - and remember we
> are talking about drivers that think their Tesla "autopilot" will drive their
> car while they watch a movie or nap in the back seat.

And, to put it kindly, aren't discouraged in that misapprehension
by the statements of the cars' manufacturers and salesdroids.

Now, what's the best set of techniques to get that concept
into the heads of twats who think "autopilot" means "it does
it for me"?

Re: Every Tesla Accident Resulting in Death

<t268d1$j8u$1@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93558&group=sci.electronics.design#93558
 by: David Brown - Fri, 1 Apr 2022 07:08 UTC

On 01/04/2022 02:19, Tom Gardner wrote:
> On 31/03/22 23:39, David Brown wrote:
>> On 01/04/2022 00:29, Ricky wrote:
>
>>> Someone still has to be watching
>>> for other aircraft and otherwise flying the plane.  In other words, the
>>> pilot is responsible for flying the plane, with or without the
>>> autopilot.
>>
>> Yes, that's the original idea of a plane autopilot.  But modern ones
>> are more
>> sophisticated and handle course changes along the planned route, as
>> well as
>> being able to land automatically.  And more important than what plane
>> autopilots actually /do/, is what people /think/ they do - and
>> remember we
>> are talking about drivers that think their Tesla "autopilot" will
>> drive their
>> car while they watch a movie or nap in the back seat.
>
> And, to put it kindly, aren't discouraged in that misapprehension
> by the statements of the cars' manufacturers and salesdroids.
>
> Now, what's the best set of techniques to get that concept
> into the heads of twats that think "autopilot" means "it does
> it for me".

You don't. Twats will always be twats. You fix the cars.

You start by changing the name. "Driver assistance" rather than
"autopilot".

You turn the steering wheel into a dead-man's handle - if the driver
releases it for more than, say, 2 seconds, the autopilot should first
beep violently, then pull over and stop the car if the driver does not
pay attention. (Maybe you have "motorway mode" that allows a longer
delay time, since autopilot works better there, and perhaps also a
"traffic queue" mode with even longer delays.)
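The dead-man's-handle escalation described above can be sketched as a small piece of decision logic. Everything here is hypothetical - the mode names, grace periods, and function are illustrative, not any real vendor interface:

```python
# Hands-on-wheel watchdog sketch: escalate from a warning beep to a
# controlled stop, with longer grace periods in modes where driver
# assistance is known to work better. All thresholds are invented.

GRACE_S = {"default": 2.0, "motorway": 5.0, "traffic_queue": 10.0}
WARN_EXTRA_S = 3.0  # how long to beep before pulling over

def watchdog_action(hands_off_s: float, mode: str = "default") -> str:
    """Return the assistance system's response to a hands-off interval."""
    grace = GRACE_S[mode]
    if hands_off_s <= grace:
        return "normal"          # within the allowed hands-off window
    if hands_off_s <= grace + WARN_EXTRA_S:
        return "beep"            # warn the driver, loudly
    return "pull_over_and_stop"  # driver unresponsive: stop the car
```

With these numbers, two seconds hands-off on an ordinary road is still "normal", five seconds triggers the beep, and anything past the beep window forces a stop; in "motorway" mode the same five seconds is still within the grace period.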

Re: Every Tesla Accident Resulting in Death

<t268u5$tn7$2@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93559&group=sci.electronics.design#93559
 by: Tom Gardner - Fri, 1 Apr 2022 07:17 UTC

On 01/04/22 08:08, David Brown wrote:
> On 01/04/2022 02:19, Tom Gardner wrote:
>> On 31/03/22 23:39, David Brown wrote:
>>> On 01/04/2022 00:29, Ricky wrote:
>>
>>>> Someone still has to be watching
>>>> for other aircraft and otherwise flying the plane.  In other words, the
>>>> pilot is responsible for flying the plane, with or without the
>>>> autopilot.
>>>
>>> Yes, that's the original idea of a plane autopilot.  But modern ones
>>> are more
>>> sophisticated and handle course changes along the planned route, as
>>> well as
>>> being able to land automatically.  And more important than what plane
>>> autopilots actually /do/, is what people /think/ they do - and
>>> remember we
>>> are talking about drivers that think their Tesla "autopilot" will
>>> drive their
>>> car while they watch a movie or nap in the back seat.
>>
>> And, to put it kindly, aren't discouraged in that misapprehension
>> by the statements of the cars' manufacturers and salesdroids.
>>
>> Now, what's the best set of techniques to get that concept
>> into the heads of twats that think "autopilot" means "it does
>> it for me".
>
> You don't. Twats will always be twats. You fix the cars.
>
> You start by changing the name. "Driver assistance" rather than
> "autopilot".

That's one of the things I was thinking of.

> You turn the steering wheel into a dead-man's handle - if the driver
> releases it for more than, say, 2 seconds, the autopilot should first
> beep violently, then pull over and stop the car if the driver does not
> pay attention.

I've wondered why they don't implement that, then realised
it would directly contradict their advertising.

> (Maybe you have "motorway mode" that allows a longer
> delay time, since autopilot works better there, and perhaps also a
> "traffic queue" mode with even longer delays.)

Modes are a pain[1]. Too often plane crash investigators hear
"what's it doing /now/" on the CVR.

There was also the case where wheel brakes were not to be applied
until after landing, with "landed" defined as "wheels are rotating".
Then an aquaplaning aircraft skidded off the end of the runway!

[1] Remember the early Smalltalk T-shirt drawing attention
to the novel concept of the WIMP interface with the motto
"don't mode me in"?
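The brake-interlock failure described above reduces to a proxy-condition bug, sketched here in hypothetical Python (not any real avionics logic): "wheels rotating" was used as a proxy for "on the ground", and aquaplaning breaks exactly that proxy.

```python
def brakes_allowed(wheels_rotating: bool) -> bool:
    # Original interlock: wheel rotation stands in for being on the
    # ground, to prevent the brakes being applied in flight.
    return wheels_rotating

def brakes_allowed_fixed(wheels_rotating: bool,
                         weight_on_wheels: bool) -> bool:
    # A direct weight-on-wheels (squat switch) signal avoids the proxy:
    # an aquaplaning aircraft is on the ground even when its wheels
    # never spin up.
    return weight_on_wheels or wheels_rotating

# Aquaplaning touchdown: the aircraft is on the runway, but the wheels
# hydroplane and don't rotate - the original interlock denies braking.
print(brakes_allowed(False))                           # braking denied
print(brakes_allowed_fixed(False, weight_on_wheels=True))
```

The general lesson, in line with the "modes are a pain" point, is that an indirect proxy for a safety condition fails precisely in the abnormal cases the interlock was meant to handle.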

Re: Every Tesla Accident Resulting in Death

<t26al0$uu8$1@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93560&group=sci.electronics.design#93560
 by: Don Y - Fri, 1 Apr 2022 07:46 UTC

On 3/31/2022 5:19 PM, Tom Gardner wrote:
> On 31/03/22 23:39, David Brown wrote:
>> On 01/04/2022 00:29, Ricky wrote:
>
>>> Sorry, that's not how an autopilot works. It doesn't fly the plane. It
>>> simply maintains a heading and altitude.
>
> They have been doing more than that for more than 50 years.
> Cat 3b landings were in operation when I was a kid.
>
>>> Someone still has to be watching
>>> for other aircraft and otherwise flying the plane. In other words, the
>>> pilot is responsible for flying the plane, with or without the autopilot.
>>
>> Yes, that's the original idea of a plane autopilot. But modern ones are more
>> sophisticated and handle course changes along the planned route, as well as
>> being able to land automatically. And more important than what plane
>> autopilots actually /do/, is what people /think/ they do - and remember we
>> are talking about drivers that think their Tesla "autopilot" will drive their
>> car while they watch a movie or nap in the back seat.
>
> And, to put it kindly, aren't discouraged in that misapprehension
> by the statements of the cars' manufacturers and salesdroids.
>
> Now, what's the best set of techniques to get that concept
> into the heads of twats that think "autopilot" means "it does
> it for me".

"Pilots" and "drivers" approach their efforts entirely differently
and with different mindsets.

ANYONE can drive a car. By contrast, a fair bit more understanding,
reasoning and skill is required to pilot an aircraft.

I.e., a pilot is a lot more likely to understand the function
AND LIMITATIONS of an (aircraft) autopilot than a driver is to
have similar appreciation for an (automobile) "autopilot".

Re: Every Tesla Accident Resulting in Death

<t26e57$gkc$1@gioia.aioe.org>

https://www.novabbs.com/tech/article-flat.php?id=93562&group=sci.electronics.design#93562
 by: Jeroen Belleman - Fri, 1 Apr 2022 08:46 UTC

On 2022-04-01 09:08, David Brown wrote:
> On 01/04/2022 02:19, Tom Gardner wrote:
>> On 31/03/22 23:39, David Brown wrote:
>>> On 01/04/2022 00:29, Ricky wrote:
>>
>>>> Someone still has to be watching
>>>> for other aircraft and otherwise flying the plane. In other words, the
>>>> pilot is responsible for flying the plane, with or without the
>>>> autopilot.
>>>
>>> Yes, that's the original idea of a plane autopilot. But modern ones
>>> are more
>>> sophisticated and handle course changes along the planned route, as
>>> well as
>>> being able to land automatically. And more important than what plane
>>> autopilots actually /do/, is what people /think/ they do - and
>>> remember we
>>> are talking about drivers that think their Tesla "autopilot" will
>>> drive their
>>> car while they watch a movie or nap in the back seat.
>>
>> And, to put it kindly, aren't discouraged in that misapprehension
>> by the statements of the cars' manufacturers and salesdroids.
>>
>> Now, what's the best set of techniques to get that concept
>> into the heads of twats that think "autopilot" means "it does
>> it for me".
>
> You don't. Twats will always be twats. You fix the cars.
>
> You start by changing the name. "Driver assistance" rather than
> "autopilot".
>
> You turn the steering wheel into a dead-man's handle - if the driver
> releases it for more than, say, 2 seconds, the autopilot should first
> beep violently, then pull over and stop the car if the driver does not
> pay attention. (Maybe you have "motorway mode" that allows a longer
> delay time, since autopilot works better there, and perhaps also a
> "traffic queue" mode with even longer delays.)
>

All these 'assistants' with their multiple 'modes' only make things
more complicated, and therefore unsafe. Simple is better.

I recently got a car that came standard with 'lane assist'. I
hate it. It's like having a passenger tugging on the steering wheel,
absolutely intolerable. It also can't be switched off permanently.
For the first week or two, I just blindfolded the camera it uses to
watch the road, until I found out how to switch it off with a single
button press. (There are far too many buttons, for that matter, and
all with multiple functions, too. Bad!)

That said, some automatic functions /are/ good. Climate control with
a real thermostat, auto-darkening rear view mirrors, mostly functions
that have nothing to do with driving per se. The only /good/ automatic
functions are those you don't notice until they /stop/ working. I also
like the GPS with head-up display.

Jeroen Belleman

Re: Every Tesla Accident Resulting in Death

<t26l2c$idv$2@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93566&group=sci.electronics.design#93566
 by: Tom Gardner - Fri, 1 Apr 2022 10:44 UTC

On 01/04/22 08:46, Don Y wrote:
> On 3/31/2022 5:19 PM, Tom Gardner wrote:
>> On 31/03/22 23:39, David Brown wrote:
>>> On 01/04/2022 00:29, Ricky wrote:
>>
>>>> Sorry, that's not how an autopilot works.  It doesn't fly the plane.  It
>>>> simply maintains a heading and altitude.
>>
>> They have been doing more than that for > 50 years.
>> Cat 3b landings were in operation when I was a kid.
>>
>>>> Someone still has to be watching
>>>> for other aircraft and otherwise flying the plane.  In other words, the
>>>> pilot is responsible for flying the plane, with or without the autopilot.
>>>
>>> Yes, that's the original idea of a plane autopilot.  But modern ones are more
>>> sophisticated and handle course changes along the planned route, as well as
>>> being able to land automatically.  And more important than what plane
>>> autopilots actually /do/, is what people /think/ they do - and remember we
>>> are talking about drivers that think their Tesla "autopilot" will drive their
>>> car while they watch a movie or nap in the back seat.
>>
>> And, to put it kindly, aren't discouraged in that misapprehension
>> by the statements of the cars' manufacturers and salesdroids.
>>
>> Now, what's the best set of techniques to get that concept
>> into the heads of twats that think "autopilot" means "it does
>> it for me".
>
> "Pilots" and "drivers" approach their efforts entirely differently
> and with different mindsets.

They should do in one sense (differing machine/automation)
and shouldn't in another (both are lethal instruments).

Problem starts with the marketing.

> ANYONE can drive a car.  By contrast, a fair bit more understanding,
> reasoning and skill is required to pilot an aircraft.

Not entirely sure about that. A 14yo can fly solo, and a
very few are even aerobatic pilots.

The main difference is that you can't stop and catch
your breath, or stop and have a pee.

Overall learning to fly a glider is pretty much similar
to learning to drive - in cost, time and skill.

The training
is more rigorous, though, and isn't a one-off event.

> I.e., a pilot is a lot more likely to understand the function
> AND LIMITATIONS of an (aircraft) autopilot than a driver is to
> have similar appreciation for an (automobile) "autopilot".

Pilots often don't understand what's going on; just
listen to the accident reports on the news :(

Re: Every Tesla Accident Resulting in Death

<t26nt6$d1e$1@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93567&group=sci.electronics.design#93567

Newsgroups: sci.electronics.design
Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: blockedo...@foo.invalid (Don Y)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Fri, 1 Apr 2022 04:32:32 -0700
Organization: A noiseless patient Spider
Lines: 106
Message-ID: <t26nt6$d1e$1@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
<65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
<t1vm1c$9lm$1@dont-email.me> <t202na$bvu$1@dont-email.me>
<t20t89$78k$1@dont-email.me>
<26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me>
<b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me> <t25ge5$27c$3@dont-email.me>
<t26al0$uu8$1@dont-email.me> <t26l2c$idv$2@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Fri, 1 Apr 2022 11:32:54 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="0712a263d69bbdacb042d963fe58a8e9";
logging-data="13358"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX18EvAO2wGGFZFjoByHt55sX"
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:52.0) Gecko/20100101
Thunderbird/52.1.1
Cancel-Lock: sha1:Guwx8XuN1zbvt9KbIeSy+lupdxI=
In-Reply-To: <t26l2c$idv$2@dont-email.me>
Content-Language: en-US
 by: Don Y - Fri, 1 Apr 2022 11:32 UTC

On 4/1/2022 3:44 AM, Tom Gardner wrote:
> On 01/04/22 08:46, Don Y wrote:
>> On 3/31/2022 5:19 PM, Tom Gardner wrote:
>>> On 31/03/22 23:39, David Brown wrote:
>>>> On 01/04/2022 00:29, Ricky wrote:
>>>
>>>>> Sorry, that's not how an autopilot works. It doesn't fly the plane. It
>>>>> simply maintains a heading and altitude.
>>>
>>> They have been doing more than that for > 50 years.
>>> Cat 3b landings were in operation when I was a kid.
>>>
>>>>> Someone still has to be watching
>>>>> for other aircraft and otherwise flying the plane. In other words, the
>>>>> pilot is responsible for flying the plane, with or without the autopilot.
>>>>
>>>> Yes, that's the original idea of a plane autopilot. But modern ones are more
>>>> sophisticated and handle course changes along the planned route, as well as
>>>> being able to land automatically. And more important than what plane
>>>> autopilots actually /do/, is what people /think/ they do - and remember we
>>>> are talking about drivers that think their Tesla "autopilot" will drive their
>>>> car while they watch a movie or nap in the back seat.
>>>
>>> And, to put it kindly, aren't discouraged in that misapprehension
>>> by the statements of the cars' manufacturers and salesdroids.
>>>
>>> Now, what's the best set of techniques to get that concept
>>> into the heads of twats that think "autopilot" means "it does
>>> it for me".
>>
>> "Pilots" and "drivers" approach their efforts entirely differently
>> and with different mindsets.
>
> They should do in one sense (differing machine/automation)
> and shouldn't in another (both are lethal instruments).
>
> Problem starts with the marketing.

Cars are far more ubiquitous. And, navigation is a 2-dimensional activity.

An "average joe" isn't likely to think he's gonna "hop in a Piper Cub" and
be off on a jaunt to run errands. And, navigation is a 3-dimensional
undertaking (you don't worry about vehicles "above" or "below" when driving!)

>> ANYONE can drive a car. By contrast, a fair bit more understanding,
>> reasoning and skill is required to pilot an aircraft.
>
> Not entirely sure about that. 14yo can be solo, and a
> very few are even aerobatic pilots.

And a "youngster" can drive a car (or other sort of motorized vehicle, e.g., on
a farm or other private property). The 16yo (15.5) restriction only applies to
the use on public roadways.

<https://www.abc4.com/news/local-news/underage-utah-boy-caught-driving-wrong-way-in-slc/>

<https://www.kgun9.com/news/local-news/cochise-county-four-smuggling-busts-within-five-hours-14-year-old-driver-involved>

Cars are "simple" to operate; can-your-feet-reach-the-pedals being the only
practical criterion. I'd wager *I* would have a hard time walking up to
an aircraft, "cold", and trying to sort out how to get it off the ground...

> The main difference is that you can't stop and catch
> your breath, or stop and have a pee.
>
> Overall learning to fly a glider is pretty much similar
> to learning to drive - in cost, time and skill.

But not opportunity. I'd have to spend a fair bit of effort researching
where to gain access to any sort of aircraft. OTOH, I can readily "borrow"
(with consent) any of my neighbors' vehicles and operate all of them in
a fairly consistent manner: sports cars, trucks, commercial trucks, even
motorcycles (though never having driven one before!).

> The training
> is more rigorous, though, and isn't a one-off event.

It's likely more technical, too. Most auto-driving instruction deals
with laws, not the technical "piloting" of the vehicle. The driving test
is similarly focused on whether or not you put that law knowledge into
effect (did you stop *at* the proper point? did you observe the speed
limit and other posted requirements?)

[When taking the test for my *first* DL, the DMV was notorious for
having a stop sign *in* the (tiny) parking lot -- in an unexpected
place. Folks who weren't observant -- or tipped off to this ahead
of time -- were "failed" before ever getting out on the roadway!]

Testing for a CDL (commercial) is considerably different; you are
quizzed on technical details of the vehicle that affect the safety of
you and others on the roadway -- because you are operating a much more
"lethal" vehicle (> 26,000 pounds GVW). You also have to prove yourself
medically *fit* to operate (not color blind, not an insulin user,
"controlled" blood pressure, non-epileptic, non-alcoholic, etc.)!

And, other "endorsements" have further requirements (e.g., hauling
tandem/triples, hazardous products, etc.)

>> I.e., a pilot is a lot more likely to understand the function
>> AND LIMITATIONS of an (aircraft) autopilot than a driver is to
>> have similar appreciation for an (automobile) "autopilot".
>
> Pilots often don't understand what's going on; just
> listen to the accident reports on the news :(

I think those events are caused by cognitive overload, not ignorance.

Re: Every Tesla Accident Resulting in Death

<t26q9q$vdr$2@dont-email.me>

https://www.novabbs.com/tech/article-flat.php?id=93568&group=sci.electronics.design#93568

Newsgroups: sci.electronics.design
Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: spamj...@blueyonder.co.uk (Tom Gardner)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Fri, 1 Apr 2022 13:13:46 +0100
Organization: A noiseless patient Spider
Lines: 156
Message-ID: <t26q9q$vdr$2@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
<65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
<t1vm1c$9lm$1@dont-email.me> <t202na$bvu$1@dont-email.me>
<t20t89$78k$1@dont-email.me>
<26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me>
<b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me> <t25ge5$27c$3@dont-email.me>
<t26al0$uu8$1@dont-email.me> <t26l2c$idv$2@dont-email.me>
<t26nt6$d1e$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Fri, 1 Apr 2022 12:13:46 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="f337fdce28989ba2775f5cff9e9effed";
logging-data="32187"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/BsOC5/ymOHEbhcqtkofh7"
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101
Firefox/52.0 SeaMonkey/2.49.4
Cancel-Lock: sha1:5lb8DbPM3KjH86OQSwg8WqU03h8=
In-Reply-To: <t26nt6$d1e$1@dont-email.me>
 by: Tom Gardner - Fri, 1 Apr 2022 12:13 UTC

On 01/04/22 12:32, Don Y wrote:
> On 4/1/2022 3:44 AM, Tom Gardner wrote:
>> On 01/04/22 08:46, Don Y wrote:
>>> On 3/31/2022 5:19 PM, Tom Gardner wrote:
>>>> On 31/03/22 23:39, David Brown wrote:
>>>>> On 01/04/2022 00:29, Ricky wrote:
>>>>
>>>>>> Sorry, that's not how an autopilot works.  It doesn't fly the plane.  It
>>>>>> simply maintains a heading and altitude.
>>>>
>>>> They have been doing more than that for > 50 years.
>>>> Cat 3b landings were in operation when I was a kid.
>>>>
>>>>>> Someone still has to be watching
>>>>>> for other aircraft and otherwise flying the plane.  In other words, the
>>>>>> pilot is responsible for flying the plane, with or without the autopilot.
>>>>>
>>>>> Yes, that's the original idea of a plane autopilot.  But modern ones are more
>>>>> sophisticated and handle course changes along the planned route, as well as
>>>>> being able to land automatically.  And more important than what plane
>>>>> autopilots actually /do/, is what people /think/ they do - and remember we
>>>>> are talking about drivers that think their Tesla "autopilot" will drive their
>>>>> car while they watch a movie or nap in the back seat.
>>>>
>>>> And, to put it kindly, aren't discouraged in that misapprehension
>>>> by the statements of the cars' manufacturers and salesdroids.
>>>>
>>>> Now, what's the best set of techniques to get that concept
>>>> into the heads of twats that think "autopilot" means "it does
>>>> it for me".
>>>
>>> "Pilots" and "drivers" approach their efforts entirely differently
>>> and with different mindsets.
>>
>> They should do in one sense (differing machine/automation)
>> and shouldn't in another (both are lethal instruments).
>>
>> Problem starts with the marketing.
>
> Cars are far more ubiquitous.  And, navigation is a 2-dimensional activity.
>
> An "average joe" isn't likely to think hes gonna "hop in a piper cub" and
> be off on a jaunt to run errands.  And, navigation is a 3-dimensional
> undertaking (you don't worry about vehicles "above" or "below", when driving!)

True, but it doesn't change any of my points.

>>> ANYONE can drive a car.  By contrast, a fair bit more understanding,
>>> reasoning and skill is required to pilot an aircraft.
>>
>> Not entirely sure about that. 14yo can be solo, and a
>> very few are even aerobatic pilots.
>
> And a "youngster" can drive a car (or other sort of motorized vehicle, e.g., on
> a farm or other private property).  The 16yo (15.5) restriction only applies to
> the use on public roadways.

A 12yo can fly across the country with an instructor behind;
a 14yo can do it on their own.

Daughter was driving my car and a double decker bus at 15yo,
on the runway and peritrack :)

> Cars are "simple" to operate; can-your-feet-reach-the-pedals being the only
> practical criteria.  I'd wager *I* would have a hard time walking up to
> an aircraft, "cold", and trying to sort out how to get it off the ground...

Same is true of a glider. There are only 4 controls: rudder,
stick, airbrake, cable release. Two instruments, airspeed
and barometer (i.e. height differential).

You are taught to do without them, because they all lie to
you.

>> The main difference is that you can't stop and catch
>> your breath, or stop and have a pee.
>>
>> Overall learning to fly a glider is pretty much similar
>> to learning to drive - in cost, time and skill.
>
> But not opportunity.  I'd have to spend a fair bit of effort researching
> where to gain access to any sort of aircraft.  OTOH, I can readily "borrow"
> (with consent) any of my neighbors' vehicles and operate all of them in
> a fairly consistent manner: sports cars, trucks, commercial trucks, even
> motorcycles (though never having driven one, before!).

True, but it doesn't change any of my points.

>> The training
>> is more rigorous, though, and isn't a one-off event.
>
> It's likely more technical, too.  Most auto-driving instruction deals
> with laws, not the technical "piloting" of the vehicle.  The driving test
> is similarly focused on whether or not you put that law knowledge into
> effect (did you stop *at* the proper point?  did you observe the speed
> limit and other posted requirements?)

Not much is required to go solo.

Does the glider's responsiveness indicate you are flying
fast enough? Are you at a reasonable height in the circuit?
What to do when you find you aren't, and when the cable snaps?

> [When taking the test for my *first* DL, the DMV was notorious for
> having a stop sign *in* the (tiny) parking lot -- in an unexpected
> place.  Folks who weren't observant -- or tipped off to this ahead
> of time -- were "failed" before ever getting out on the roadway!]

Pre-solo tests include the instructor putting you in a
stupid position, and saying "now get us back safely".

> Testing for a CDL (commercial) is considerably different; you are
> quizzed on technical details of the vehicle that affect the safety of
> you and others on the roadway -- because you are operating a much more
> "lethal" vehicle (< 26,000 pounds GVW).  You also have to prove yourself
> medically *fit* to operate (not color blind, not an insulin user,
> "controlled" blood pressure, nonepileptic, alchoholic, etc.!

Ditto being an instructor or having a passenger.

> And, other "endorsements" have further requirements (e.g., hauling
> tandem/triples, hazardous products, etc.)

Ditto flying cross country or in clouds.

>>> I.e., a pilot is a lot more likely to understand the function
>>> AND LIMITATIONS of an (aircraft) autopilot than a driver is to
>>> have similar appreciation for an (automobile) "autopilot".

That's true for the aircraft, but nobody has developed
an autopilot for gliders. You have to stay awake and feel
(literally, by the seat of your pants) what's happening. The nearest
thing to an autopilot is a moving map airspace display.

>> Pilots often don't understand what's going on; just
>> listen to the accident reports on the news :(
>
> I think those events are caused by cognitive overload, not ignorance.

Not always, e.g. the recent 737 crashes.

Re: Every Tesla Accident Resulting in Death

<ccf76ab3-8f2b-4704-83ee-65d618b20f7bn@googlegroups.com>

https://www.novabbs.com/tech/article-flat.php?id=93570&group=sci.electronics.design#93570

Newsgroups: sci.electronics.design
X-Received: by 2002:a05:622a:120a:b0:2e1:c9ba:e99b with SMTP id y10-20020a05622a120a00b002e1c9bae99bmr8208994qtx.685.1648816694866;
Fri, 01 Apr 2022 05:38:14 -0700 (PDT)
X-Received: by 2002:a81:1704:0:b0:2e5:d98b:e185 with SMTP id
4-20020a811704000000b002e5d98be185mr9872669ywx.354.1648816694661; Fri, 01 Apr
2022 05:38:14 -0700 (PDT)
Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!news.misty.com!border2.nntp.dca1.giganews.com!nntp.giganews.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.electronics.design
Date: Fri, 1 Apr 2022 05:38:14 -0700 (PDT)
In-Reply-To: <t25ai8$tbg$1@dont-email.me>
Injection-Info: google-groups.googlegroups.com; posting-host=24.138.223.107; posting-account=I-_H_woAAAA9zzro6crtEpUAyIvzd19b
NNTP-Posting-Host: 24.138.223.107
References: <t1upig$tmg$2@dont-email.me> <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me> <t1vm1c$9lm$1@dont-email.me>
<t202na$bvu$1@dont-email.me> <t20t89$78k$1@dont-email.me> <26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me> <b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <ccf76ab3-8f2b-4704-83ee-65d618b20f7bn@googlegroups.com>
Subject: Re: Every Tesla Accident Resulting in Death
From: gnuarm.d...@gmail.com (Ricky)
Injection-Date: Fri, 01 Apr 2022 12:38:14 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
Lines: 160
 by: Ricky - Fri, 1 Apr 2022 12:38 UTC

On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
> On 01/04/2022 00:29, Ricky wrote:
> > On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
> >> On 31/03/2022 22:44, Ricky wrote:
> >>> On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
> >>>> On 30/03/2022 00:54, Tom Gardner wrote:
> >>>>> On 29/03/22 20:18, David Brown wrote:
> >> <snip>
> >>>> No, it is not "too strong". It is basic statistics. Bayes' theorem,
> >>>> and all that. If a large proportion of people use autopilot, but
> >>>> only a small fraction of the deaths had the autopilot on, then
> >>>> clearly the autopilot reduces risks and saves lives (of those that
> >>>> drive Teslas - we still know nothing of other car drivers).
> >>>
> >>> A simple comparison of numbers is not sufficient. Most Tesla
> >>> autopilot usage is on highways which are much safer per mile driven
> >>> than other roads. That's an inherent bias because while
> >>> non-autopilot driving must include all situations, autopilot simply
> >>> doesn't work in most environments.
> >>>
> >> Yes. An apples-to-apples comparison is the aim, or at least as close as
> >> one can get.
> >>
> >> I suspect - without statistical justification -
> >
> > Yes, without justification, at all.
> Which do /you/ think is most likely? Autopilot crashes on the motorway,
> or autopilot crashes on smaller roads?

Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes more often happen on highways.

I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving, the limitations are such that no one would try to use it.
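The base-rate argument running through this subthread (autopilot is mostly engaged on highways, which are safer per mile, so raw crash counts mislead) can be made concrete with a toy calculation. A minimal sketch in Python; every number below is invented for illustration only:

```python
# Invented mileage and crash counts, split by road type and driving mode.
miles = {
    ("highway", "autopilot"): 9.0e6,
    ("highway", "manual"):    1.0e6,
    ("city",    "autopilot"): 0.1e6,
    ("city",    "manual"):    4.0e6,
}
crashes = {
    ("highway", "autopilot"): 12,
    ("highway", "manual"):    1,
    ("city",    "autopilot"): 1,
    ("city",    "manual"):    40,
}

def pooled_rate(mode):
    """Crashes per million miles for a mode, lumping all road types together."""
    m = sum(v for (road, md), v in miles.items() if md == mode)
    c = sum(v for (road, md), v in crashes.items() if md == mode)
    return c / (m / 1e6)

def rate_on(road, mode):
    """Crashes per million miles for one road type only."""
    return crashes[(road, mode)] / (miles[(road, mode)] / 1e6)

# Pooled, autopilot looks ~6x safer (1.43 vs 8.20 crashes per million miles)...
print(pooled_rate("autopilot"), pooled_rate("manual"))
# ...yet on highways alone it is worse (1.33 vs 1.00), and in the city the
# two modes are identical (10.0 each): the pooled comparison is dominated
# by *where* each mode gets used, not by how safe it is.
print(rate_on("highway", "autopilot"), rate_on("highway", "manual"))
```

This is Simpson's paradox in miniature: an apples-to-apples comparison has to condition on road type, which is exactly the objection being raised here.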

> >> that the accidents
> >> involving autopilot use are precisely cases where you don't have a good,
> >> clear highway, and autopilot was used in a situation where it was not
> >> suitable. Getting good statistics and comparisons here could be helpful
> >> in making it safer - perhaps adding a feature that has the autopilot say
> >> "This is not a good road for me - you have to drive yourself" and switch
> >> itself off. (It would be more controversial, but probably statistically
> >> safer, if it also sometimes said "I'm better at driving on this kind of
> >> road than you are" and switching itself on!)
> >>>>>
> >>>>> An issue is, of course, that any single experience can be
> >>>>> dismissed as an unrepresentative aberration. Collation of
> >>>>> experiences is necessary.
> >>>>>
> >>>>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
> >>>>> aren't confidence inspiring. Based on one I saw, I certainly
> >>>>> wouldn't dare let a Tesla drive itself in an urban environment,
> >>>>>
> >>>>> I suspect there isn't sufficient experience to assess relative
> >>>>> dangers between "artificial intelligence" and "natural
> >>>>> stupidity".
> >>>> I don't doubt at all that the Tesla autopilot makes mistakes.
> >>>
> >>> Which depends on how you define "mistakes".
> >> Of course.
> >>> It's a bit like asking
> >>> if your rear view mirror makes mistakes by not showing cars in the
> >>> blind spot. The autopilot is not designed to drive the car. It is a
> >>> tool to assist the driver. The driver is required to be responsible
> >>> for the safe operation of the car at all times. I can point out to
> >>> you the many, many times the car acts like a spaz and requires me to
> >>> manage the situation. Early on, there was a left turn lane on a 50
> >>> mph road, the car would want to turn into it when intending to drive
> >>> straight. Fortunately they have ironed out that level of issue. But
> >>> it was always my responsibility to prevent it from causing an
> >>> accident. So how would you say anything was the fault of the
> >>> autopilot?
> >>>
> >> There are a few possibilities here (though I am not trying to claim that
> >> any of them are "right" in some objective sense). You might say they
> >> had believed that the "autopilot" was like a plane autopilot -
> >
> > It is exactly like an airplane autopilot.
> >
> >
> >> you can
> >> turn it on and leave it to safely drive itself for most of the journey
> >> except perhaps the very beginning and very end of the trip. As you say,
> >> the Tesla autopilot is /not/ designed for that - that might be a mistake
> >> from the salesmen, advertisers, user-interface designers, or just the
> >> driver's mistake.
> >
> > Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.
> >
> Yes, that's the original idea of a plane autopilot. But modern ones are
> more sophisticated and handle course changes along the planned route, as
> well as being able to land automatically. And more important than what
> plane autopilots actually /do/, is what people /think/ they do - and
> remember we are talking about drivers that think their Tesla "autopilot"
> will drive their car while they watch a movie or nap in the back seat.

Great! But the autopilot is not watching for other aircraft, not monitoring communications, and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like in the car, there is a pilot whose job is to fly/drive and ensure safety.

As to the movie idea, no, people don't think that. People might "pretend" that, but there's no level of "thinking" that says you can climb in the back seat while driving. Please don't say silly things.

> >> And sometimes the autopilot does something daft - it is no longer
> >> assisting the driver, but working against him or her. That, I think,
> >> should be counted as a mistake by the autopilot.
> >
> > The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.
> >
> Well, "does something daft" is no worse than "acts like a spaz", and
> it's a good deal more politically correct!

Bzzzz. Sorry, you failed.

--

Rick C.

++ Get 1,000 miles of free Supercharging
++ Tesla referral code - https://ts.la/richard11209

Re: Every Tesla Accident Resulting in Death

<00a253d1-8498-48d4-a29d-3977bd0f2075n@googlegroups.com>

https://www.novabbs.com/tech/article-flat.php?id=93571&group=sci.electronics.design#93571

Newsgroups: sci.electronics.design
X-Received: by 2002:ad4:5dea:0:b0:441:5fdf:dd9c with SMTP id jn10-20020ad45dea000000b004415fdfdd9cmr7718034qvb.44.1648816954249;
Fri, 01 Apr 2022 05:42:34 -0700 (PDT)
X-Received: by 2002:a05:6902:1149:b0:63d:8700:d007 with SMTP id
p9-20020a056902114900b0063d8700d007mr298324ybu.344.1648816953991; Fri, 01 Apr
2022 05:42:33 -0700 (PDT)
Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!news.misty.com!border2.nntp.dca1.giganews.com!nntp.giganews.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.electronics.design
Date: Fri, 1 Apr 2022 05:42:33 -0700 (PDT)
In-Reply-To: <t25ge5$27c$3@dont-email.me>
Injection-Info: google-groups.googlegroups.com; posting-host=24.138.223.107; posting-account=I-_H_woAAAA9zzro6crtEpUAyIvzd19b
NNTP-Posting-Host: 24.138.223.107
References: <t1upig$tmg$2@dont-email.me> <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me> <t1vm1c$9lm$1@dont-email.me>
<t202na$bvu$1@dont-email.me> <t20t89$78k$1@dont-email.me> <26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me> <b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me> <t25ge5$27c$3@dont-email.me>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <00a253d1-8498-48d4-a29d-3977bd0f2075n@googlegroups.com>
Subject: Re: Every Tesla Accident Resulting in Death
From: gnuarm.d...@gmail.com (Ricky)
Injection-Date: Fri, 01 Apr 2022 12:42:34 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
Lines: 42
 by: Ricky - Fri, 1 Apr 2022 12:42 UTC

On Thursday, March 31, 2022 at 8:19:31 PM UTC-4, Tom Gardner wrote:
> On 31/03/22 23:39, David Brown wrote:
> > On 01/04/2022 00:29, Ricky wrote:
>
> >> Sorry, that's not how an autopilot works. It doesn't fly the plane. It
> >> simply maintains a heading and altitude.
> They have been doing more than that for > 50 years.
> Cat 3b landings were in operation when I was a kid.
> >> Someone still has to be watching
> >> for other aircraft and otherwise flying the plane. In other words, the
> >> pilot is responsible for flying the plane, with or without the autopilot.
> >
> > Yes, that's the original idea of a plane autopilot. But modern ones are more
> > sophisticated and handle course changes along the planned route, as well as
> > being able to land automatically. And more important than what plane
> > autopilots actually /do/, is what people /think/ they do - and remember we
> > are talking about drivers that think their Tesla "autopilot" will drive their
> > car while they watch a movie or nap in the back seat.
> And, to put it kindly, aren't discouraged in that misapprehension
> by the statements of the cars' manufacturers and salesdroids.
>
> Now, what's the best set of techniques to get that concept
> into the heads of twats that think "autopilot" means "it does
> it for me".

That's Tom Gardner level misinformation. Comments about what people think are spurious and unsubstantiated. A class of "twats" can be invented that think anything. Nothing matters other than what Tesla owners think. They are the ones driving the cars.

--

Rick C.

--- Get 1,000 miles of free Supercharging
--- Tesla referral code - https://ts.la/richard11209

Re: Every Tesla Accident Resulting in Death

<6b9184c0-16b4-416c-8617-5d73b29855e8n@googlegroups.com>

https://www.novabbs.com/tech/article-flat.php?id=93573&group=sci.electronics.design#93573

Newsgroups: sci.electronics.design
X-Received: by 2002:a05:6214:e87:b0:441:a5d:681c with SMTP id hf7-20020a0562140e8700b004410a5d681cmr7619367qvb.38.1648817079397;
Fri, 01 Apr 2022 05:44:39 -0700 (PDT)
X-Received: by 2002:a25:dec2:0:b0:61d:e09e:94d1 with SMTP id
v185-20020a25dec2000000b0061de09e94d1mr8185722ybg.287.1648817079188; Fri, 01
Apr 2022 05:44:39 -0700 (PDT)
Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!news.misty.com!border2.nntp.dca1.giganews.com!nntp.giganews.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: sci.electronics.design
Date: Fri, 1 Apr 2022 05:44:38 -0700 (PDT)
In-Reply-To: <t268d1$j8u$1@dont-email.me>
Injection-Info: google-groups.googlegroups.com; posting-host=24.138.223.107; posting-account=I-_H_woAAAA9zzro6crtEpUAyIvzd19b
NNTP-Posting-Host: 24.138.223.107
References: <t1upig$tmg$2@dont-email.me> <65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me> <t1vm1c$9lm$1@dont-email.me>
<t202na$bvu$1@dont-email.me> <t20t89$78k$1@dont-email.me> <26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me> <b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me> <t25ge5$27c$3@dont-email.me> <t268d1$j8u$1@dont-email.me>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <6b9184c0-16b4-416c-8617-5d73b29855e8n@googlegroups.com>
Subject: Re: Every Tesla Accident Resulting in Death
From: gnuarm.d...@gmail.com (Ricky)
Injection-Date: Fri, 01 Apr 2022 12:44:39 +0000
Content-Type: text/plain; charset="UTF-8"
Lines: 46
 by: Ricky - Fri, 1 Apr 2022 12:44 UTC

On Friday, April 1, 2022 at 3:08:25 AM UTC-4, David Brown wrote:
> On 01/04/2022 02:19, Tom Gardner wrote:
> > On 31/03/22 23:39, David Brown wrote:
> >> On 01/04/2022 00:29, Ricky wrote:
> >
> >>> Someone still has to be watching
> >>> for other aircraft and otherwise flying the plane. In other words, the
> >>> pilot is responsible for flying the plane, with or without the
> >>> autopilot.
> >>
> >> Yes, that's the original idea of a plane autopilot. But modern ones
> >> are more
> >> sophisticated and handle course changes along the planned route, as
> >> well as
> >> being able to land automatically. And more important than what plane
> >> autopilots actually /do/, is what people /think/ they do - and
> >> remember we
> >> are talking about drivers that think their Tesla "autopilot" will
> >> drive their
> >> car while they watch a movie or nap in the back seat.
> >
> > And, to put it kindly, aren't discouraged in that misapprehension
> > by the statements of the cars' manufacturers and salesdroids.
> >
> > Now, what's the best set of techniques to get that concept
> > into the heads of twats that think "autopilot" means "it does
> > it for me".
> You don't. Twats will always be twats. You fix the cars.
>
> You start by changing the name. "Driver assistance" rather than
> "autopilot".
>
> You turn the steering wheel into a dead-man's handle - if the driver
> releases it for more than, say, 2 seconds, the autopilot should first
> beep violently, then pull over and stop the car if the driver does not
> pay attention. (Maybe you have "motorway mode" that allows a longer
> delay time, since autopilot works better there, and perhaps also a
> "traffic queue" mode with even longer delays.)
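The escalation David Brown sketches (a hands-off timer, violent beeping, then pulling over, with longer delays in some modes) amounts to a tiny state machine. A toy model only; the thresholds, grace period, and mode names below are all invented and bear no relation to what any real car does:

```python
# Hypothetical hands-off time limits per driving mode, in seconds.
THRESHOLDS = {"normal": 2.0, "motorway": 10.0, "traffic_queue": 30.0}
GRACE = 5.0  # invented: how long the car beeps before pulling over

class HandsOffMonitor:
    def __init__(self, mode="normal"):
        self.limit = THRESHOLDS[mode]
        self.hands_off_since = None  # timestamp, or None while hands are on

    def update(self, hands_on_wheel, now):
        """Poll with the wheel-sensor state; returns the action to take."""
        if hands_on_wheel:
            self.hands_off_since = None  # any touch resets the timer
            return "drive"
        if self.hands_off_since is None:
            self.hands_off_since = now  # hands just left the wheel
        elapsed = now - self.hands_off_since
        if elapsed < self.limit:
            return "drive"
        if elapsed < self.limit + GRACE:
            return "beep"
        return "pull_over"

m = HandsOffMonitor("normal")
print(m.update(True, 0.0))   # hands on        -> drive
print(m.update(False, 1.0))  # 0 s hands-off   -> drive
print(m.update(False, 4.0))  # 3 s > 2 s limit -> beep
print(m.update(False, 9.0))  # 8 s > 2 s + 5 s -> pull_over
```

The "motorway" and "traffic queue" variants are the same skeleton with a different threshold, which is the whole of the proposal quoted above.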

Do you know anything about how the Tesla autopilot actually works? Anything at all?

--

Rick C.

--+ Get 1,000 miles of free Supercharging
--+ Tesla referral code - https://ts.la/richard11209
