Path: i2pn2.org!i2pn.org!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: david.br...@hesbynett.no (David Brown)
Newsgroups: sci.electronics.design
Subject: Re: Every Tesla Accident Resulting in Death
Date: Fri, 1 Apr 2022 16:29:50 +0200
Organization: A noiseless patient Spider
Lines: 139
Message-ID: <t2728v$skf$1@dont-email.me>
References: <t1upig$tmg$2@dont-email.me>
<65698443-2b83-425e-a0a1-282715b6331dn@googlegroups.com>
<t1v4co$llm$1@dont-email.me> <t1v7um$ica$2@dont-email.me>
<t1vm1c$9lm$1@dont-email.me> <t202na$bvu$1@dont-email.me>
<t20t89$78k$1@dont-email.me>
<26220125-cd7d-4034-885d-b7348b491723n@googlegroups.com>
<t257ir$67k$1@dont-email.me>
<b1438197-faf8-450d-be5e-84feeb5e7c5dn@googlegroups.com>
<t25ai8$tbg$1@dont-email.me>
<ccf76ab3-8f2b-4704-83ee-65d618b20f7bn@googlegroups.com>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
Injection-Date: Fri, 1 Apr 2022 14:29:51 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="d8975916bb6c8b3a71923212e910ce3c";
logging-data="29327"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19b/r3Gqu5ZdHaAvpBL5m4RsbzzW3FQuPs="
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101
Thunderbird/78.11.0
Cancel-Lock: sha1:4aSbfUs0yoc3E/kMqLtbWU9Trvk=
In-Reply-To: <ccf76ab3-8f2b-4704-83ee-65d618b20f7bn@googlegroups.com>
Content-Language: en-GB

On 01/04/2022 14:38, Ricky wrote:
> On Thursday, March 31, 2022 at 6:39:11 PM UTC-4, David Brown wrote:
>> On 01/04/2022 00:29, Ricky wrote:
>>> On Thursday, March 31, 2022 at 5:48:18 PM UTC-4, David Brown wrote:
>>>> On 31/03/2022 22:44, Ricky wrote:
>>>>> On Wednesday, March 30, 2022 at 2:27:30 AM UTC-4, David Brown wrote:
>>>>>> On 30/03/2022 00:54, Tom Gardner wrote:
>>>>>>> On 29/03/22 20:18, David Brown wrote:
>>>> <snip>
>>>>>> No, it is not "too strong". It is basic statistics. Bayes' theorem,
>>>>>> and all that. If a large proportion of people use autopilot, but
>>>>>> only a small fraction of the deaths had the autopilot on, then
>>>>>> clearly the autopilot reduces risks and saves lives (of those that
>>>>>> drive Teslas - we still know nothing of other car drivers).
>>>>>
>>>>> A simple comparison of numbers is not sufficient. Most Tesla
>>>>> autopilot usage is on highways which are much safer per mile driven
>>>>> than other roads. That's an inherent bias because while
>>>>> non-autopilot driving must include all situations, autopilot simply
>>>>> doesn't work in most environments.
>>>>>
>>>> Yes. An apples-to-apples comparison is the aim, or at least as close as
>>>> one can get.
>>>>
>>>> I suspect - without statistical justification -
>>>
>>> Yes, without justification, at all.
>> Which do /you/ think is more likely? Autopilot crashes on the motorway,
>> or autopilot crashes on smaller roads?
>
> Because autopilot doesn't work off the highway (it can't make turns, for example), autopilot-involved crashes are more often on the highways.
>

I was not aware of that limitation. Thanks for providing some relevant
information.
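
To make that apples-to-apples point concrete, here is a minimal sketch
(Python, with invented numbers chosen purely for illustration - they are
not real Tesla or NHTSA figures). It shows how a pooled comparison can
overstate the autopilot's benefit when autopilot miles are concentrated
on highways, which are safer per mile to begin with:

# Hypothetical exposure (miles driven) and fatal-crash counts,
# split by road type and driving mode. All numbers are made up.
miles = {
    ("highway", "autopilot"): 8e9,
    ("highway", "manual"):    2e9,
    ("other",   "autopilot"): 0.5e9,
    ("other",   "manual"):    4e9,
}
fatal = {
    ("highway", "autopilot"): 40,
    ("highway", "manual"):    20,
    ("other",   "autopilot"): 10,
    ("other",   "manual"):    120,
}

def rate(road, mode):
    """Fatalities per billion miles for one road/mode cell."""
    return fatal[(road, mode)] / (miles[(road, mode)] / 1e9)

# Within each road type, compare like with like.
for road in ("highway", "other"):
    for mode in ("autopilot", "manual"):
        print(f"{road:7s} {mode:9s} {rate(road, mode):6.1f} per 1e9 miles")

# Pooled comparison that ignores road type - this is the biased one.
for mode in ("autopilot", "manual"):
    f = sum(v for k, v in fatal.items() if k[1] == mode)
    m = sum(v for k, v in miles.items() if k[1] == mode) / 1e9
    print(f"pooled  {mode:9s} {f / m:6.1f} per 1e9 miles")

With these made-up numbers the pooled rates differ by roughly a factor
of four, while the within-road-type rates differ by only a factor of 1.5
to 2 - the rest of the apparent advantage comes from where the autopilot
is used, not from how well it drives.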

> I recall a news article that said experimenters were able to fool autopilot into making a left turn at an intersection by putting two or three small squares on the roadway. In city driving, the limitations are such that no one would try to use it.
>
>
>>>> that the accidents
>>>> involving autopilot use are precisely cases where you don't have a good,
>>>> clear highway, and autopilot was used in a situation where it was not
>>>> suitable. Getting good statistics and comparisons here could be helpful
>>>> in making it safer - perhaps adding a feature that has the autopilot say
>>>> "This is not a good road for me - you have to drive yourself" and switch
>>>> itself off. (It would be more controversial, but probably statistically
>>>> safer, if it also sometimes said "I'm better at driving on this kind of
>>>> road than you are" and switched itself on!)
>>>>>>>
>>>>>>> An issue is, of course, that any single experience can be
>>>>>>> dismissed as an unrepresentative aberration. Collation of
>>>>>>> experiences is necessary.
>>>>>>>
>>>>>>> Some of the dashcam "Tesla's making mistakes" videos on yootoob
>>>>>>> aren't confidence inspiring. Based on one I saw, I certainly
>>>>>>> wouldn't dare let a Tesla drive itself in an urban environment,
>>>>>>>
>>>>>>> I suspect there isn't sufficient experience to assess relative
>>>>>>> dangers between "artificial intelligence" and "natural
>>>>>>> stupidity".
>>>>>> I don't doubt at all that the Tesla autopilot makes mistakes.
>>>>>
>>>>> Which depends on how you define "mistakes".
>>>> Of course.
>>>>> It's a bit like asking
>>>>> if your rear view mirror makes mistakes by not showing cars in the
>>>>> blind spot. The autopilot is not designed to drive the car. It is a
>>>>> tool to assist the driver. The driver is required to be responsible
>>>>> for the safe operation of the car at all times. I can point out to
>>>>> you the many, many times the car acts like a spaz and requires me to
>>>>> manage the situation. Early on, there was a left turn lane on a 50
>>>>> mph road that the car would want to turn into when intending to drive
>>>>> straight. Fortunately they have ironed out that level of issue. But
>>>>> it was always my responsibility to prevent it from causing an
>>>>> accident. So how would you say anything was the fault of the
>>>>> autopilot?
>>>>>
>>>> There are a few possibilities here (though I am not trying to claim that
>>>> any of them are "right" in some objective sense). You might say they
>>>> had believed that the "autopilot" was like a plane autopilot -
>>>
>>> It is exactly like an airplane autopilot.
>>>
>>>
>>>> you can
>>>> turn it on and leave it to safely drive itself for most of the journey
>>>> except perhaps the very beginning and very end of the trip. As you say,
>>>> the Tesla autopilot is /not/ designed for that - that might be a mistake
>>>> from the salesmen, advertisers, user-interface designers, or just the
>>>> driver's mistake.
>>>
>>> Sorry, that's not how an autopilot works. It doesn't fly the plane. It simply maintains a heading and altitude. Someone still has to be watching for other aircraft and otherwise flying the plane. In other words, the pilot is responsible for flying the plane, with or without the autopilot.
>>>
>> Yes, that's the original idea of a plane autopilot. But modern ones are
>> more sophisticated and handle course changes along the planned route, as
>> well as being able to land automatically. And more important than what
>> plane autopilots actually /do/, is what people /think/ they do - and
>> remember we are talking about drivers that think their Tesla "autopilot"
>> will drive their car while they watch a movie or nap in the back seat.
>
> Great! But the autopilot is not watching for other aircraft, not monitoring communications and not able to deal with any unusual events. You keep coming back to a defective idea that autopilot means the airplane is flying itself. It's not! Just like in the car, there is a pilot whose job is to fly/drive and assure safety.
>

I am fully aware that plane autopilots are limited. I am also aware
that they are good enough (in planes equipped with modern systems) to
allow pilots to let the system handle most of the flight itself, even
including landing. The pilot is, of course, expected to be paying
attention, watching for other aircraft, communicating with air traffic
controllers and all the rest of it. But there have been cases of pilots
falling asleep, or missing their destination because they were playing
around on their laptops. What people /should/ be doing, and what they
are /actually/ doing, is not always the same.

> As to the movie idea, no, people don't think that. People might "pretend" that, but there's no level of "thinking" that says you can climb in the back seat while driving. Please don't say silly things.
>

You can google for "backseat Tesla drivers" as well as I can. I am
confident that some of these are staged, and equally confident that some
are not. There is no minimum level of "thinking" - no matter how daft
something might be, there is always a dafter person who will think it's
a good idea.

>
>>>> And sometimes the autopilot does something daft - it is no longer
>>>> assisting the driver, but working against him or her. That, I think,
>>>> should be counted as a mistake by the autopilot.
>>>
>>> The Tesla autopilot can barely manage to go 10 miles without some sort of glitch. "Daft" is not a very useful term, as it means what you want it to mean. "I know it when I see it." Hard to design to that sort of specification.
>>>
>> Well, "does something daft" is no worse than "acts like a spaz", and
>> it's a good deal more politically correct!
>
> Bzzzz. Sorry, you failed.
>

Really? You think describing the autopilot's actions as "acts like a
spaz" is useful and specific, while "does something daft" is not? As
for the political correctness - find a real spastic and ask them what
they think of your phrase.
