Rocksolid Light



computers / comp.mobile.android / Apple wants to check your iphone for child abuse images - what could possibly go wrong?

Subject / Author
* Apple wants to check your iphone for child abuse images - what couldFritz Wuehler
+* Re: Apple wants to check your iphone for child abuse images - whatcern
|+* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
||`- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|`* Re: Apple wants to check your iphone for child abuse images - whatAnonymous Remailer (austria)
| +- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| `- Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
+* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|`* Re: Apple wants to check your iphone for child abuse images - whatAnonymous Remailer (austria)
| +- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| `- Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
+* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|+- Re: Apple wants to check your iphone for child abuse images - what could possiblGronk
|`* Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
| +- Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
| +* Re: Apple wants to check your iphone for child abuse imagesbadgolferman
| |+- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| |+* Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
| ||`- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| |`* Re: Apple wants to check your iphone for child abuse images - whatsms
| | +- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| | `* Re: Apple wants to check your iphone for child abuse images - whatMark Lloyd
| |  `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| |   `* Re: Apple wants to check your iphone for child abuse images - whatLewis
| |    +* Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
| |    |+- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| |    |+* Re: Apple wants to check your iphone for child abuse images - whatAJL
| |    ||`* Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
| |    || +- Re: Apple wants to check your iphone for child abuse images - whatAJL
| |    || `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| |    ||  `- Re: Apple wants to check your iphone for child abuse images - what could possiblbadgolferman
| |    |`* Re: Apple wants to check your iphone for child abuse images - what could possiblmicky
| |    | `* Re: Apple wants to check your iphone for child abuse images - whatJörg Lorenz
| |    |  +- Re: Apple wants to check your iphone for child abuse imagesHank Rogers
| |    |  `- Re: Apple wants to check your iphone for child abuse images - whatAlan Browne
| |    `- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
| `* Re: Apple wants to check your iphone for child abuse images - whatRainer Zwerschke
|  `- Re: Apple wants to check your iphone for child abuse images - whatAllodoxaphobia
+* Re: Apple wants to check your iphone for child abuse images - whatsms
|+- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|+- Re: Apple wants to check your iphone for child abuse images - what could possiblGronk
|`* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
| `* Re: Apple wants to check your iphone for child abuse images - what could possiblchrisv
|  +* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  |`* Re: Apple wants to check your iphone for child abuse images - what could possiblchrisv
|  | +* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  | |`* Re: Apple wants to check your iphone for child abuse images - what could possiblchrisv
|  | | `- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  | `* Re: Apple wants to check your iphone for child abuse images - whatLewis
|  |  +- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  |  `* Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  |   `* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  |    +* Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  |    |+* Re: Apple wants to check your iphone for child abuse imagesbadgolferman
|  |    ||`* Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  |    || `* Re: Apple wants to check your iphone for child abuse images - whatsms
|  |    ||  `- Re: Apple wants to check your iphone for child abuse images - what could possiblbadgolferman
|  |    |+* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  |    ||+- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  |    ||`* Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  |    || `- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  |    |`- Re: Apple wants to check your iphone for child abuse images - whatsms
|  |    `- Re: Apple wants to check your iphone for child abuse images - whatLewis
|  +* Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
|  |+* Re: Apple wants to check your iphone for child abuse images - whatsms
|  ||`- Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
|  |+* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||+* Re: Apple wants to check your iphone for child abuse images - what could possiblJohn Robertson
|  |||`- Re: Apple wants to check your iphone for child abuse images - what could possiblSnit
|  ||`* Re: Apple wants to check your iphone for child abuse images - what could possiblGronk
|  || +* Re: Apple wants to check your iphone for child abuse images - whatLewis
|  || |`- Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  || `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||  +- Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  ||  `* Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
|  ||   `* Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  ||    +* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  ||    |`* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||    | `* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  ||    |  `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||    |   `* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  ||    |    `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||    |     +* Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  ||    |     |+* Re: Apple wants to check your iphone for child abuse images - whatNomen Nescio
|  ||    |     ||`- Re: Apple wants to check your iphone for child abuse images - whatNomen Nescio
|  ||    |     |`- Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  ||    |     `* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  ||    |      `- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||    `* Re: Apple wants to check your iphone for child abuse images - what could possiblMayayana
|  ||     `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||      `* Re: Apple wants to check your iphone for child abuse images - what could possiblallspam
|  ||       +- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||       `* Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  ||        +- Re: Apple wants to check your iphone for child abuse images - whatNomen Nescio
|  ||        +* Re: Apple wants to check your iphone for child abuse images - whatNomen Nescio
|  ||        |`* Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  ||        | `* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  ||        |  +* Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  ||        |  |+* Re: Apple wants to check your iphone for child abuse images - whatSilverSlimer
|  ||        |  ||+- Re: Apple wants to check your iphone for child abuse images - what could possiblchrisv
|  ||        |  ||+* Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  ||        |  ||`- Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||        |  |`* Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
|  ||        |  `* Re: Apple wants to check your iphone for child abuse images - what could possiblnospam
|  ||        `- Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
|  |`* Re: Apple wants to check your iphone for child abuse images - whatAlan Baker
|  +- Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
|  `* Re: Apple wants to check your iphone for child abuse images - what could possiblJ. P. Gilliver (John)
+- Re: Apple wants to check your iphone for child abuse images - whatJolly Roger
+- Re: Apple wants to check your iphone for child abuse images - whatAnonymous Remailer (austria)
`- Re: Apple wants to check your iphone for child abuse images - what could possiblallspam

Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>


https://www.novabbs.com/computers/article-flat.php?id=18360&group=comp.mobile.android#18360

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
From: fri...@spamexpire-202108.rodent.frell.theremailer.net (Fritz Wuehler)
Subject: Apple wants to check your iphone for child abuse images - what could
possibly go wrong?
Message-ID: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
Date: Sun, 08 Aug 2021 09:23:17 +0000
Newsgroups: alt.privacy.anon-server, misc.phone.mobile.iphone,
comp.os.linux.advocacy, comp.mobile.android, alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!sewer!news.dizum.net!not-for-mail
Organization: dizum.com - The Internet Problem Provider
X-Abuse: abuse@dizum.com
Injection-Info: sewer.dizum.com - 2001::1/128
 by: Fritz Wuehler - Sun, 8 Aug 2021 09:23 UTC

On the surface Apple's new features sound both sensible and
commendable – but they also open a Pandora's box of privacy and
surveillance issues

Privacy. That's (no longer) iPhone.

Apple, which has spent big bucks on ad campaigns boasting about
how much it values its users privacy, is about to start poking
through all your text messages and photos. Don't worry, the tech
company has assured everyone, the prying is for purely
benevolent purposes. On Thursday Apple announced a new set of
"protection for children" features that will look through US
iPhones for images of child abuse. One of these features is a
tool called neuralMatch, which will scan photo libraries to see
if they contain anything that matches a database of known child
abuse imagery. Another feature, which parents can enable or
disable, scans iMessage images sent or received by accounts that
belong to a minor. It will then notify the parents when a child
receives sexually explicit imagery.
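[Editor's note: the matching step described above amounts to looking up an image fingerprint in a set of known hashes. A minimal sketch, using plain SHA-256 digests for illustration — Apple's actual system uses a perceptual neural hash, which, unlike this sketch, tolerates resizing and re-encoding; the names and sample hash here are hypothetical:]

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images.
# A real system would hold perceptual hashes (e.g. NeuralHash or
# PhotoDNA), not cryptographic digests of the exact bytes.
KNOWN_HASHES = {
    # SHA-256 digest of the stand-in payload b"known-bad-image"
    hashlib.sha256(b"known-bad-image").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest identifying this exact image content."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Check an image's fingerprint against the known-hash set."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(is_flagged(b"known-bad-image"))  # matches the database entry
print(is_flagged(b"holiday photo"))    # unknown image, not flagged
```

Note that with exact digests, changing a single byte of the image defeats the match — which is precisely why deployed systems use perceptual hashes instead.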

On the surface Apple's new features sound both sensible and
commendable. Technology-facilitated child sexual exploitation is
an enormous problem; one that's spiralling out of control. In
1998 there were more than 3,000 reports of child sex abuse
imagery, according to a 2019 paper published in conjunction with
the National Center for Missing and Exploited Children. In 2018
there were 18.4m. These reports included more than 45m images
and videos that were flagged as child sexual abuse. Technology
companies have a duty to curb the terrible abuses their
platforms help facilitate. Apple's new features are an attempt
to do just that.

But while Apple's attempts to protect children may be valiant,
they also open a Pandora's box of privacy and surveillance
issues. Of particular concern to security researchers and
privacy activists is the fact that this new feature doesn't just
look at images stored on the cloud; it scans users' devices
without their consent. Essentially that means there's now a sort
of "backdoor" into an individual's iPhone, one which has the
potential to grow wider and wider. The Electronic Frontier
Foundation (EFF), an online civil liberties advocacy group,
warns that "all it would take to widen the narrow backdoor that
Apple is building is an expansion of the machine learning
parameters to look for additional types of content … That's not
a slippery slope; that's a fully built system just waiting for
external pressure to make the slightest change." You can
imagine, for example, how certain countries might pressure Apple
to scan for anti-government messages or LGBTQ content.

Jillian York, the author of a new book about how surveillance
capitalism affects free speech, is also concerned that Apple's
new parental controls mean images shared between two minors
could be non-consensually shared with one of their parents.
"This strikes me as assumptive of two things," she told me.
"One, That adults can be trusted with these images and two, that
every other culture has the same ideas about what constitutes
nudity and sexuality as the US does."

Edward Snowden, who knows a thing or two about abuses of
surveillance, has also voiced concerns about Apple's new
features. "No matter how well-intentioned, @Apple is rolling out
mass surveillance to the entire world with this," Snowden
tweeted. "Make no mistake: if they can scan for kiddie porn
today, they can scan for anything tomorrow. They turned a
trillion dollars of devices into iNarcs–*without asking.*"

But why would a technology company bother asking the public what
it wants? We all know that big tech knows what's best for us
plebs. While mass surveillance may sound scary, I'm sure we can
all trust Apple et al. to do the right thing. No need to worry
about hackers or Apple contractors accessing and uploading your
nudes! No need to worry about Apple employees exploiting the
technology to spy on people, in the same way that Uber employees
did with their "God View" tool! I'm sure it will all be
perfectly fine.

https://www.theguardian.com/commentisfree/2021/aug/07/week-in-
patriarchy-apple-privacy

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<seoja5$4p0$1@news.mixmin.net>


https://www.novabbs.com/computers/article-flat.php?id=18365&group=comp.mobile.android#18365

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!.POSTED!not-for-mail
From: lou...@cern.ch (cern)
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,
comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Date: Sun, 8 Aug 2021 07:39:33 -0500
Organization: Mixmin
Message-ID: <seoja5$4p0$1@news.mixmin.net>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
Injection-Date: Sun, 8 Aug 2021 12:39:34 -0000 (UTC)
Injection-Info: news.mixmin.net; posting-host="6a2ae55401cdaebb179f9909fa4a7baa17b67e93";
logging-data="4896"; mail-complaints-to="abuse@mixmin.net"
 by: cern - Sun, 8 Aug 2021 12:39 UTC

All, I say again, all companies today are pathological liars. All
they want is money. If Apple was so safe, why did they just
release a message about their intentions of going through every
iPhone to look for kiddy porn? How is it that Apple has this
capability? Because the iPhone is a spy phone. You need to get a
Pixel and load it with CalyxOS - a de-Googled operating system.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820211036008314%nospam@nospam.invalid>


https://www.novabbs.com/computers/article-flat.php?id=18370&group=comp.mobile.android#18370

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@nospam.invalid (nospam)
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?
Date: Sun, 08 Aug 2021 10:36:00 -0400
Organization: A noiseless patient Spider
Lines: 65
Message-ID: <080820211036008314%nospam@nospam.invalid>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
Mime-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: reader02.eternal-september.org; posting-host="976663b646664dc58de25ce38ba76dbf";
logging-data="23835"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/YHQtRBPKd2PvBJ2yYsox+"
User-Agent: Thoth/1.9.0 (Mac OS X)
Cancel-Lock: sha1:Mu8phgDHJw+tykepksjXgQqjF0c=
 by: nospam - Sun, 8 Aug 2021 14:36 UTC

In article
<29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>, Fritz
Wuehler <fritz@spamexpire-202108.rodent.frell.theremailer.net> wrote:

> On the surface Apple’s new features sound both sensible and
> commendable. Technology-facilitated child sexual exploitation is
> an enormous problem; one that’s spiralling out of control. In
> 1998 there were more than 3,000 reports of child sex abuse
> imagery, according to a 2019 paper published in conjunction with
> the National Center for Missing and Exploited Children. In 2018
> there were 18.4m. These reports included more than 45m images
> and videos that were flagged as child sexual abuse. Technology
> companies have a duty to curb the terrible abuses their
> platforms help facilitate. Apple’s new features are an attempt
> to do just that.

yep.

> But while Apple’s attempts to protect children may be valiant,
> they also open a Pandora’s box of privacy and surveillance
> issues. Of particular concern to security researchers and
> privacy activists is the fact that this new feature doesn’t just
> look at images stored on the cloud; it scans users’ devices
> without their consent.

absolutely false.

*only* images uploaded to icloud are checked against a known database
of child porn.

images that are never sent anywhere are never checked.
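[Editor's note: nospam's point — that the check sits only in the upload path — can be sketched as a gate on upload, with no code path that touches local-only photos. All names here are illustrative, not Apple's actual API:]

```python
def matches_known_database(image_bytes: bytes) -> bool:
    """Stand-in matcher; a real system compares perceptual hashes."""
    return image_bytes == b"known-bad-image"

def upload_to_cloud(image_bytes: bytes, cloud_store: list) -> bool:
    """The scan runs only as part of the upload path."""
    if matches_known_database(image_bytes):
        return False          # flagged for review instead of stored
    cloud_store.append(image_bytes)
    return True

cloud = []
upload_to_cloud(b"holiday photo", cloud)  # scanned on upload, stored

local_only = b"never uploaded"
# No function here ever examines local_only: under this model,
# an image that stays on the device is never checked.
```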

google, facebook, twitter, dropbox, discord and many, many other
services already check for child porn and have been doing so for years.

facebook started checking in 2011.

> Essentially that means there’s now a sort
> of “backdoor” into an individual’s iPhone, one which has the
> potential to grow wider and wider.

it's not a backdoor.

> The Electronic Frontier
> Foundation (EFF), an online civil liberties advocacy group,
> warns that “all it would take to widen the narrow backdoor that
> Apple is building is an expansion of the machine learning
> parameters to look for additional types of content … That’s not
> a slippery slope; that’s a fully built system just waiting for
> external pressure to make the slightest change.” You can
> imagine, for example, how certain countries might pressure Apple
> to scan for anti-government messages or LGBTQ content.

you can imagine all sorts of things, including that the earth is flat
and that the moon is made of cheese.

that doesn't mean any of it is true.

> Jillian York, the author of a new book about how surveillance
> capitalism affects free speech, is also concerned that Apple’s
> new parental controls mean images shared between two minors
> could be non-consensually shared with one of their parents.

she should try learning how it actually works before commenting, let
alone write a book.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<L7SPI.7275$uV3.4755@fx18.iad>


https://www.novabbs.com/computers/article-flat.php?id=18371&group=comp.mobile.android#18371

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!feeder1.feed.usenet.farm!feed.usenet.farm!peer03.ams4!peer.am4.highwinds-media.com!peer02.iad!feed-me.highwinds-media.com!news.highwinds-media.com!fx18.iad.POSTED!not-for-mail
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
From: sil...@slim.er (SilverSlimer)
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101
Thunderbird/78.12.0
MIME-Version: 1.0
In-Reply-To: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Language: en-US
Content-Transfer-Encoding: 8bit
Lines: 88
Message-ID: <L7SPI.7275$uV3.4755@fx18.iad>
X-Complaints-To: abuse@blocknews.net
NNTP-Posting-Date: Sun, 08 Aug 2021 14:47:07 UTC
Organization: blocknews - www.blocknews.net
Date: Sun, 8 Aug 2021 10:47:07 -0400
X-Received-Bytes: 5700
 by: SilverSlimer - Sun, 8 Aug 2021 14:47 UTC

On 2021-08-08 5:23 a.m., Fritz Wuehler wrote:
> On the surface Apple’s new features sound both sensible and
> commendable – but they also open a Pandora’s box of privacy and
> surveillance issues
>
> Privacy. That’s (no longer) iPhone.
>
> Apple, which has spent big bucks on ad campaigns boasting about
> how much it values its users privacy, is about to start poking
> through all your text messages and photos. Don’t worry, the tech
> company has assured everyone, the prying is for purely
> benevolent purposes. On Thursday Apple announced a new set of
> “protection for children” features that will look through US
> iPhones for images of child abuse. One of these features is a
> tool called neuralMatch, which will scan photo libraries to see
> if they contain anything that matches a database of known child
> abuse imagery. Another feature, which parents can enable or
> disable, scans iMessage images sent or received by accounts that
> belong to a minor. It will then notify the parents when a child
> receives sexually explicit imagery.
>
> On the surface Apple’s new features sound both sensible and
> commendable. Technology-facilitated child sexual exploitation is
> an enormous problem; one that’s spiralling out of control. In
> 1998 there were more than 3,000 reports of child sex abuse
> imagery, according to a 2019 paper published in conjunction with
> the National Center for Missing and Exploited Children. In 2018
> there were 18.4m. These reports included more than 45m images
> and videos that were flagged as child sexual abuse. Technology
> companies have a duty to curb the terrible abuses their
> platforms help facilitate. Apple’s new features are an attempt
> to do just that.
>
> But while Apple’s attempts to protect children may be valiant,
> they also open a Pandora’s box of privacy and surveillance
> issues. Of particular concern to security researchers and
> privacy activists is the fact that this new feature doesn’t just
> look at images stored on the cloud; it scans users’ devices
> without their consent. Essentially that means there’s now a sort
> of “backdoor” into an individual’s iPhone, one which has the
> potential to grow wider and wider. The Electronic Frontier
> Foundation (EFF), an online civil liberties advocacy group,
> warns that “all it would take to widen the narrow backdoor that
> Apple is building is an expansion of the machine learning
> parameters to look for additional types of content … That’s not
> a slippery slope; that’s a fully built system just waiting for
> external pressure to make the slightest change.” You can
> imagine, for example, how certain countries might pressure Apple
> to scan for anti-government messages or LGBTQ content.
>
> Jillian York, the author of a new book about how surveillance
> capitalism affects free speech, is also concerned that Apple’s
> new parental controls mean images shared between two minors
> could be non-consensually shared with one of their parents.
> “This strikes me as assumptive of two things,” she told me.
> “One, That adults can be trusted with these images and two, that
> every other culture has the same ideas about what constitutes
> nudity and sexuality as the US does.”
>
> Edward Snowden, who knows a thing or two about abuses of
> surveillance, has also voiced concerns about Apple’s new
> features. “No matter how well-intentioned, @Apple is rolling out
> mass surveillance to the entire world with this,” Snowden
> tweeted. “Make no mistake: if they can scan for kiddie porn
> today, they can scan for anything tomorrow. They turned a
> trillion dollars of devices into iNarcs–*without asking.*”
>
> But why would a technology company bother asking the public what
> it wants? We all know that big tech knows what’s best for us
> plebs. While mass surveillance may sound scary, I’m sure we can
> all trust Apple et al. to do the right thing. No need to worry
> about hackers or Apple contractors accessing and uploading your
> nudes! No need to worry about Apple employees exploiting the
> technology to spy on people, in the same way that Uber employees
> did with their “God View” tool! I’m sure it will all be
> perfectly fine.
>
> https://www.theguardian.com/commentisfree/2021/aug/07/week-in-
> patriarchy-apple-privacy

To me, the only solution is a de-Googled Android device using LineageOS,
CalyxOS or GrapheneOS but that also means buying a Google Pixel phone
since those systems seem to cater to that line the most.

--
SilverSlimer
@silverslimer

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<UiSPI.3043$GW7.2964@fx16.iad>


https://www.novabbs.com/computers/article-flat.php?id=18372&group=comp.mobile.android#18372

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.uzoreto.com!news-out.netnews.com!news.alt.net!fdc2.netnews.com!peer03.ams1!peer.ams1.xlned.com!news.xlned.com!peer01.iad!feed-me.highwinds-media.com!news.highwinds-media.com!fx16.iad.POSTED!not-for-mail
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<seoja5$4p0$1@news.mixmin.net>
From: sil...@slim.er (SilverSlimer)
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101
Thunderbird/78.12.0
MIME-Version: 1.0
In-Reply-To: <seoja5$4p0$1@news.mixmin.net>
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Language: en-US
Content-Transfer-Encoding: 7bit
Lines: 19
Message-ID: <UiSPI.3043$GW7.2964@fx16.iad>
X-Complaints-To: abuse@blocknews.net
NNTP-Posting-Date: Sun, 08 Aug 2021 14:59:00 UTC
Organization: blocknews - www.blocknews.net
Date: Sun, 8 Aug 2021 10:59:00 -0400
X-Received-Bytes: 1848
 by: SilverSlimer - Sun, 8 Aug 2021 14:59 UTC

On 2021-08-08 8:39 a.m., cern wrote:
> All, I say again, all companies today are pathological liars. All
> they want is money. If Apple was so safe, why did they just
> release a message about their intentions of going through every
> iPhone to look for kiddy porn? How is it that Apple has this
> capability? Because the iPhone is a spy phone. You need to get a
> Pixel and load it with CalyxOS - a de-Googled operating system.

That last suggestion is definitely resonating with me. I'm not sure
whether I would choose Lineage, Calyx or Graphene though.

If I had had an iPhone, I imagine that Apple would already be contacting
the authorities over what is on my phone since I take pictures of my boy
and, for memories, I even took some of him in the bath.

--
SilverSlimer
@silverslimer

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<seosae$75s$1@dont-email.me>


https://www.novabbs.com/computers/article-flat.php?id=18374&group=comp.mobile.android#18374

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!weretis.net!feeder8.news.weretis.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: scharf.s...@geemail.com (sms)
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Date: Sun, 8 Aug 2021 08:13:18 -0700
Organization: A noiseless patient Spider
Lines: 119
Message-ID: <seosae$75s$1@dont-email.me>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Sun, 8 Aug 2021 15:13:19 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="ef16134decf3ec51dd214a1ef59d4104";
logging-data="7356"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX181ESzO7VOr6loLdwA0213z"
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101
Thunderbird/78.12.0
Cancel-Lock: sha1:ZlaFiEHE9MAjHyQ/PUlPEdf8Cfw=
In-Reply-To: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
Content-Language: en-US
 by: sms - Sun, 8 Aug 2021 15:13 UTC

On 8/8/2021 2:23 AM, Fritz Wuehler wrote:
> On the surface Apple’s new features sound both sensible and
> commendable – but they also open a Pandora’s box of privacy and
> surveillance issues
>
> Privacy. That’s (no longer) iPhone.
>
> Apple, which has spent big bucks on ad campaigns boasting about
> how much it values its users privacy, is about to start poking
> through all your text messages and photos. Don’t worry, the tech
> company has assured everyone, the prying is for purely
> benevolent purposes. On Thursday Apple announced a new set of
> “protection for children” features that will look through US
> iPhones for images of child abuse. One of these features is a
> tool called neuralMatch, which will scan photo libraries to see
> if they contain anything that matches a database of known child
> abuse imagery. Another feature, which parents can enable or
> disable, scans iMessage images sent or received by accounts that
> belong to a minor. It will then notify the parents when a child
> receives sexually explicit imagery.
>
> On the surface Apple’s new features sound both sensible and
> commendable. Technology-facilitated child sexual exploitation is
> an enormous problem; one that’s spiralling out of control. In
> 1998 there were more than 3,000 reports of child sex abuse
> imagery, according to a 2019 paper published in conjunction with
> the National Center for Missing and Exploited Children. In 2018
> there were 18.4m. These reports included more than 45m images
> and videos that were flagged as child sexual abuse. Technology
> companies have a duty to curb the terrible abuses their
> platforms help facilitate. Apple’s new features are an attempt
> to do just that.
>
> But while Apple’s attempts to protect children may be valiant,
> they also open a Pandora’s box of privacy and surveillance
> issues. Of particular concern to security researchers and
> privacy activists is the fact that this new feature doesn’t just
> look at images stored on the cloud; it scans users’ devices
> without their consent. Essentially that means there’s now a sort
> of “backdoor” into an individual’s iPhone, one which has the
> potential to grow wider and wider. The Electronic Frontier
> Foundation (EFF), an online civil liberties advocacy group,
> warns that “all it would take to widen the narrow backdoor that
> Apple is building is an expansion of the machine learning
> parameters to look for additional types of content … That’s not
> a slippery slope; that’s a fully built system just waiting for
> external pressure to make the slightest change.” You can
> imagine, for example, how certain countries might pressure Apple
> to scan for anti-government messages or LGBTQ content.
>
> Jillian York, the author of a new book about how surveillance
> capitalism affects free speech, is also concerned that Apple’s
> new parental controls mean images shared between two minors
> could be non-consensually shared with one of their parents.
> “This strikes me as assumptive of two things,” she told me.
> “One, That adults can be trusted with these images and two, that
> every other culture has the same ideas about what constitutes
> nudity and sexuality as the US does.”
>
> Edward Snowden, who knows a thing or two about abuses of
> surveillance, has also voiced concerns about Apple’s new
> features. “No matter how well-intentioned, @Apple is rolling out
> mass surveillance to the entire world with this,” Snowden
> tweeted. “Make no mistake: if they can scan for kiddie porn
> today, they can scan for anything tomorrow. They turned a
> trillion dollars of devices into iNarcs–*without asking.*”
>
> But why would a technology company bother asking the public what
> it wants? We all know that big tech knows what’s best for us
> plebs. While mass surveillance may sound scary, I’m sure we can
> all trust Apple et al. to do the right thing. No need to worry
> about hackers or Apple contractors accessing and uploading your
> nudes! No need to worry about Apple employees exploiting the
> technology to spy on people, in the same way that Uber employees
> did with their “God View” tool! I’m sure it will all be
> perfectly fine.
>
> https://www.theguardian.com/commentisfree/2021/aug/07/week-in-
> patriarchy-apple-privacy

As the L.A. Times reported, "The company also said it can adjust the
algorithm over time." This is a huge concern to security researchers,
privacy advocates, LGBTQ+ organizations, religious freedom groups, and
political freedom groups.

The potential for abuse is enormous. The code is there; it just takes
tweaking to match anything a government wants to look for, and the
databases will be provided by whatever entity demands that Apple make
this "feature" available as a condition of continuing to do business in
that country. Apple has already done this with iCloud in China
<https://www.amnesty.org/en/latest/news/2018/03/apple-privacy-betrayal-for-chinese-icloud-users/>.
Google finally gave up on China and abandoned a project to create a
censored search engine
<https://www.technologyreview.com/2018/12/19/138307/how-google-took-on-china-and-lost/>.

In countries where homosexuality is a crime, Apple could adjust the
algorithm to look for photos depicting homosexual activity. In countries
that are oppressing minority populations, it could be adjusted to look
for photos and messages from the political opposition. India could
target Muslims, Pakistan could target Hindus, etc.

Of course this sort of scanning has been done for years on users' cloud
storage accounts, and not just iCloud. It's on-device scanning that is
unprecedented. The Electronic Frontier Foundation called it “a shocking
about-face for users who have relied on the company’s leadership in
privacy and security.” OTOH, for Apple, instead of expanding server
capacity they can now distribute the task of scanning for photos to
hundreds of millions of iOS and Mac devices.

This could actually make things worse in terms of the spread of CSAM.
Aware of what Apple is doing, the people who were using their iPhones
for nefarious purposes will no longer do so, and those images won't be
in either their iCloud/iPhoto account _or_ on their devices. Apple has
essentially driven those users underground. It also hurts the reputation
of NCMEC, which will lead to less financial support from donors.

Hopefully Apple will find a way to gracefully back down from on-device
scanning. iCloud/iPhoto scanning is sufficient.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<seou8l$unr$1@news.mixmin.net>

 by: Gronk - Sun, 8 Aug 2021 15:46 UTC

SilverSlimer wrote on 08.08.2021 14:47
> To me, the only solution is a de-Googled Android device using LineageOS,
> CalyxOS or GrapheneOS but that also means buying a Google Pixel phone
> since those systems seem to cater to that line the most.

You don't even need to be rooted to remove almost all Google from Android.

1. Don't set up Android with a Google account
2. Replace all important Google apps with their FOSS alternatives
(YouTube==>NewPipe, GooglePlay==>AuroraStore, GMail==>K-9Mail, etc.)
3. Set Aurora Store to filter out apps which require Google Services Framework

After that, almost all Google core components can then be deleted/disabled.
[https://f-droid.org/en/packages/io.github.muntashirakon.AppManager/]

A. Your phone will have _more_ capability than it had before.
B. Simultaneously almost all connections to Google will have been severed.

All without rooting.
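As a concrete sketch of steps 2-3, the script below prints the adb
commands that would disable a few Google packages for the main user;
`pm disable-user --user 0` works over adb without root, and `pm enable`
reverses it. The package names and FOSS replacements are illustrative
examples, not a complete list; verify the exact names on your own
device with `adb shell pm list packages`.

```python
# Sketch: de-Google a stock Android phone over adb, without root.
# The package list below is illustrative; confirm names on your device.

GOOGLE_PACKAGES = {
    # package name                FOSS replacement to install first
    "com.google.android.youtube": "NewPipe",
    "com.android.vending":        "Aurora Store",
    "com.google.android.gm":      "K-9 Mail",
}

def disable_cmd(package: str) -> str:
    # 'pm disable-user --user 0' disables a package for the main user
    # without root; 'adb shell pm enable <pkg>' re-enables it.
    return f"adb shell pm disable-user --user 0 {package}"

# Print the commands instead of running them, since no device may be attached.
for pkg, replacement in GOOGLE_PACKAGES.items():
    print(f"# install {replacement} first, then:")
    print(disable_cmd(pkg))
```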
--
Rooting is better though as root provides for many more FOSS alternatives.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<seoubo$lcn$1@dont-email.me>

 by: Mayayana - Sun, 8 Aug 2021 15:47 UTC

"SilverSlimer" <silver@slim.er> wrote

| To me, the only solution is a de-Googled Android device using LineageOS,
| CalyxOS or GrapheneOS but that also means buying a Google Pixel phone
| since those systems seem to cater to that line the most.
|

I have another solution. It's a phone with a wire that I
plug into the wall. The sound quality is wonderful, the
device is ergonomic, unlike slab phones. And it has a number
of great inventions that I wouldn't want to be without. For
example, it's completely immune to Uber and Lyft. It can
never be attacked by inane Twitter posts or restaurant
recommendations. Facebook can't possibly gain a foothold.
And when I go for a walk, no one can reach me via text
message. But they can leave a voice message. Best of all,
I can use my device to make phone calls. What'll they
think of next?!

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820211151038465%nospam@nospam.invalid>

 by: nospam - Sun, 8 Aug 2021 15:51 UTC

In article <UiSPI.3043$GW7.2964@fx16.iad>, SilverSlimer
<silver@slim.er> wrote:

> If I had had an iPhone, I imagine that Apple would already be contacting
> the authorities over what is on my phone since I take pictures of my boy
> and, for memories, I even took some of him in the bath.

you can imagine all sorts of things, most of which aren't true.

apple checks photos against a database of known child porn *only* if
the photos are uploaded to icloud.

photos not uploaded are never checked. full stop.

photos of your own kids, naked or not, should not be in the database of
child porn and therefore will *not* be flagged if they are uploaded to
icloud.

if photos of your kids are in the database of known child porn, you
have *much* bigger problems, as do your kids. hire a lawyer *now*.

using android with a custom rom doesn't change anything, because if you
upload photos to google, microsoft, facebook, twitter, dropbox and many
other services, regardless of device (mobile or desktop), they will be
scanned for porn.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820211151048556%nospam@nospam.invalid>

 by: nospam - Sun, 8 Aug 2021 15:51 UTC

In article <seosae$75s$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:

> As the L.A. Times reported, "The company also said it can adjust the
> algorithm over time." This is what is a huge concern to security
> researchers, privacy advocates, LGBTQ+ organization, religious freedom
> groups, and political freedom groups.

adjusting the algorithm doesn't magically create a new database of
images to be matched.

it helps to know how it actually works rather than criticize problems
that don't actually exist.

> The potential for abuse is enormous. The code is there, it just takes
> tweaking to match to anything a government wants to look for, and the
> databases will be provided by whatever entity demands that Apple make
> this "feature" available as a condition of continuing to do business in
> that country. It's already occurred with iCloud with Apple in China

nope. what happened in china is that apple is obeying china's laws, as
do other companies who wish to do business there. one example is that
icloud servers for chinese users are kept within china.

> In countries where homosexuality is a crime Apple could adjust the
> algorithm to look for photos containing images depicting homosexual
> activity. In countries that are oppressing minority populations it could
> be adjusted to look for photos and messages of the political opposition.
> India could target Muslims, Pakistan could target Hindus, etc..

no they can't. the system doesn't work that way.

> Of course this sort of scanning has been done for years on users cloud
> storage accounts, and not just iCloud. It's on-device scanning that is
> unprecedented.

it's only for photos uploaded to icloud, and yes it has been done by
other services. apple is actually late to the game.

> This could actually make things worse in terms of the spread of CSAM.

no it very definitely won't.

> Aware of what Apple is doing, the people that were using their iPhones
> for nefarious purposes will no longer do so and those images won't be
> either in their iCloud/iPhoto account _or_ on their devices. They've
> essentially driven those users underground. It also hurts the reputation
> of NMEC which will lead to less financial support from donors.

false. only what's uploaded to icloud is checked.

do you have a better solution or are you just trolling? (rhetorical
question).

> Hopefully Apple will find a way to gracefully back down from on-device
> scanning.

that's not what they're doing.

> iCloud/iPhoto scanning is sufficient.

that *is* what they're doing, plus iphoto hasn't existed for many years.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<seovuu$2vs$1@news.mixmin.net>

 by: Gronk - Sun, 8 Aug 2021 16:15 UTC

sms wrote:
> The potential for abuse is enormous.

Apple already has so many zero-day bugs in both its chips and its software that this is no different.
https://www.cyberscoop.com/ios-zero-day-zerodium-high-supply/
"Stop sending us zero-day Apple bugs - we have too many already!"

Plus the huge potential for abuse is a problem (because it WILL happen!).
https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
"Apple Opens a Backdoor to Your Private Life on Your iPhone"

Apple is already well known to sell their soul to the highest bidder.
https://www.usatoday.com/story/tech/talkingtech/2018/04/17/apple-make-simpler-download-your-privacy-data-year/521786002/
"Apple sold your privacy to the highest bidder"

Do you have any idea what Apple has caved in on for China for example?
https://www.nytimes.com/2021/05/17/technology/apple-china-censorship-data.html
"A Hard Bargain for Apple in China - where the government owns everything"

Do you know who is Apple's single largest customer in the United States?
https://www.techtimes.com/articles/262157/20210629/apple-google-cloud-storage.htm
"Apple Becomes the Largest Corporate Customer of Google"

Apple instantly caved in to both of them (for money) without even a fight.
https://www.nytimes.com/2020/10/25/technology/apple-google-search-antitrust.html
"Apple & Google Made a Deal That Controls the Internet"

Even without the numerous zero-day iOS flaws, there is no iPhone privacy.
https://www.politico.eu/article/apple-privacy-problem/
"Apple's privacy problem is that iPhone privacy is an Apple lie"

We're trusting Apple to test this when it is the worst of all smartphone
suppliers for zero-day flaws and for huge holes in its iOS operating system.

Even almost all of Apple's chips have unfixable zero-day holes built in!
https://www.wired.com/story/apple-t2-chip-unfixable-flaw-jailbreak-mac/
"Apple's T2 Security Chip Has Unfixable Flaws"

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<inaig3F4uvnU4@mid.individual.net>

 by: Jolly Roger - Sun, 8 Aug 2021 17:27 UTC

On 2021-08-08, Fritz Wuehler <fritz@spamexpire-202108.rodent.frell.theremailer.net> wrote:

[plagiarized clickbait article snipped]

Here are the actual facts the "Outrage Club" doesn't want you to know:

* only photos uploaded or transferred to Apple’s iCloud servers are
examined; they are examined by generating a hash of the photo and
comparing that hash to a list of hashes of known child sexual abuse
photos

* only those hashes that match the hashes of known child sexual abuse
photos are flagged as potential violations by generating encrypted
safety vouchers containing metadata and visual derivatives of matched
photos

* Apple employees know absolutely nothing about images that are not
uploaded or transferred to Apple servers — nor do they know anything
about photos that do not match hashes of known child sexual abuse
photos

* the risk of the system incorrectly flagging a given account is
extremely low (1 in 1 trillion)

* only accounts with safety vouchers that exceed a threshold of multiple
matches are able to be reviewed by Apple employees - until the
threshold is exceeded, the encrypted vouchers cannot be viewed by
anyone

* end users cannot access or view the database of known child sexual
abuse photos - nor can they identify which images were flagged

* only photos that were reviewed and verified to be child sexual abuse
are forwarded to authorities

With this in mind, we know that if this matching activity concerns you,
you can opt out by refraining from uploading photos to iCloud (by
disabling iCloud Photos, My Photo Stream, and iMessage). Since these are
all optional services, this is very easy to do.

Claims stating that Apple is supposedly scanning your entire device 24/7 are unfounded.

Claims that Apple is scanning every single photo on your device are also unfounded.
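As a toy illustration of the matching flow described in the bullets
above: hash each photo being uploaded, compare it against a database of
known-bad hashes, and flag the account only once a match threshold is
reached. This is a simplification with hypothetical names and threshold;
Apple's actual system uses the NeuralHash perceptual hash and encrypted
safety vouchers, not a plain SHA-256 set lookup.

```python
import hashlib

def photo_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash: a plain SHA-256 digest. A real
    # perceptual hash matches near-identical images, not just exact bytes.
    return hashlib.sha256(data).hexdigest()

def account_flagged(uploaded_photos, known_hashes, threshold=3):
    # Only photos being uploaded are checked, and the account is flagged
    # only after the number of database matches reaches the threshold.
    matches = sum(1 for photo in uploaded_photos
                  if photo_hash(photo) in known_hashes)
    return matches >= threshold

# Hypothetical database of known-bad hashes, derived here from toy bytes.
known = {photo_hash(b) for b in (b"bad-1", b"bad-2", b"bad-3")}

print(account_flagged([b"cat.jpg", b"dog.jpg"], known))        # -> False
print(account_flagged([b"bad-1", b"bad-2", b"bad-3"], known))  # -> True
```

In the real design the per-photo match results are additionally wrapped
in encrypted vouchers that can only be opened once the threshold is
exceeded, which is what keeps sub-threshold matches invisible to Apple.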

--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<soVPI.12610$Thb4.4429@fx35.iad>

 by: SilverSlimer - Sun, 8 Aug 2021 18:29 UTC

On 2021-08-08 11:13 a.m., sms wrote:
> On 8/8/2021 2:23 AM, Fritz Wuehler wrote:
>> [quoted Guardian article snipped]
>
> As the L.A. Times reported, "The company also said it can adjust the
> algorithm over time." This is what is a huge concern to security
> researchers, privacy advocates, LGBTQ+ organization, religious freedom
> groups, and political freedom groups.
>
> The potential for abuse is enormous. The code is there, it just takes
> tweaking to match to anything a government wants to look for, and the
> databases will be provided by whatever entity demands that Apple make
> this "feature" available as a condition of continuing to do business in
> that country. It's already occurred with iCloud with Apple in China
> <https://www.amnesty.org/en/latest/news/2018/03/apple-privacy-betrayal-for-chinese-icloud-users/>.
> Google finally gave up on China, and abandoned a project to create a
> censored search engine
> <https://www.technologyreview.com/2018/12/19/138307/how-google-took-on-china-and-lost/>.
>
>
> In countries where homosexuality is a crime Apple could adjust the
> algorithm to look for photos containing images depicting homosexual
> activity. In countries that are oppressing minority populations it could
> be adjusted to look for photos and messages of the political opposition.
> India could target Muslims, Pakistan could target Hindus, etc..
>
> Of course this sort of scanning has been done for years on users cloud
> storage accounts, and not just iCloud. It's on-device scanning that is
> unprecedented. The Electronic Frontier Foundation called it “a shocking
> about-face for users who have relied on the company’s leadership in
> privacy and security.” OTOH, for Apple, instead of expanding server
> capacity they can now distribute the task of scanning for photos to
> hundreds of millions of iOS and Mac devices.
>
> This could actually make things worse in terms of the spread of CSAM.
> Aware of what Apple is doing, the people that were using their iPhones
> for nefarious purposes will no longer do so and those images won't be
> either in their iCloud/iPhoto account _or_ on their devices. They've
> essentially driven those users underground. It also hurts the reputation
> of NCMEC, which will lead to less financial support from donors.
>
> Hopefully Apple will find a way to gracefully back down from on-device
> scanning. iCloud/iPhoto scanning is sufficient.

The sad part is that we've known for _years_ that not only does Apple
participate in the PRISM program, but that they have no interest in
protecting a user's privacy. Their latest attempt to change their
reputation by preventing applications from tracking a user now seems
like a bait for the dumber users to think that Apple has their back. As
always, once they've lured the users, they switch their plans around to
remind us that they serve the wolves, not the sheep. I can imagine that
users can remedy the situation by simply not sending their photos to
iCloud, and they probably have nothing to worry about since the system only
checks for known child pornography, but can we really trust a corporation
to do what they say they'll do and no more?

--
SilverSlimer
@silverslimer

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<GpVPI.12611$Thb4.4594@fx35.iad>

https://www.novabbs.com/computers/article-flat.php?id=18402&group=comp.mobile.android#18402

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!feeder5.feed.usenet.farm!feeder1.feed.usenet.farm!feed.usenet.farm!peer01.ams4!peer.am4.highwinds-media.com!peer01.iad!feed-me.highwinds-media.com!news.highwinds-media.com!fx35.iad.POSTED!not-for-mail
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me>
From: sil...@slim.er (SilverSlimer)
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101
Thunderbird/78.12.0
MIME-Version: 1.0
In-Reply-To: <seoubo$lcn$1@dont-email.me>
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Language: en-US
Content-Transfer-Encoding: 7bit
Lines: 26
Message-ID: <GpVPI.12611$Thb4.4594@fx35.iad>
X-Complaints-To: abuse@blocknews.net
NNTP-Posting-Date: Sun, 08 Aug 2021 18:31:02 UTC
Organization: blocknews - www.blocknews.net
Date: Sun, 8 Aug 2021 14:31:03 -0400
X-Received-Bytes: 2046
 by: SilverSlimer - Sun, 8 Aug 2021 18:31 UTC

On 2021-08-08 11:47 a.m., Mayayana wrote:
> "SilverSlimer" <silver@slim.er> wrote
>
> | To me, the only solution is a de-Googled Android device using LineageOS,
> | CalyxOS or GrapheneOS but that also means buying a Google Pixel phone
> | since those systems seem to cater to that line the most.
> |
>
> I have another solution. It's a phone with a wire that I
> plug into the wall. The sound quality is wonderful, the
> device is ergonomic, unlike slab phones. And it has a number
> of great inventions that I wouldn't want to be without. For
> example, it's completely immune to Uber and Lyft. It can
> never be attacked by inane Twitter posts or restaurant
> recommendations. Facebook can't possibly gain a foothold.
> And when I go for a walk, no one can reach me via text
> message. But they can leave a voice message. Best of all,
> I can use my device to make phone calls. What'll they
> think of next?!

Ha! Good post.

--
SilverSlimer
@silverslimer

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<sep8tq$5uu$1@dont-email.me>

https://www.novabbs.com/computers/article-flat.php?id=18410&group=comp.mobile.android#18410

Newsgroups: misc.phone.mobile.iphone alt.privacy.anon-server comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!weretis.net!feeder8.news.weretis.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: REMOVETH...@gmail.com (badgolferman)
Newsgroups: misc.phone.mobile.iphone,alt.privacy.anon-server,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images
- what could possibly go wrong?
Date: Sun, 8 Aug 2021 18:48:26 -0000 (UTC)
Organization: A noiseless patient Spider
Lines: 26
Message-ID: <sep8tq$5uu$1@dont-email.me>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<L7SPI.7275$uV3.4755@fx18.iad>
<seoubo$lcn$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 8bit
Injection-Date: Sun, 8 Aug 2021 18:48:26 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="871dd6277356c7eb601b4c1c9f2fd78f";
logging-data="6110"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/K4DsT8OOkEcG9jN2ks1ql2YIL4hxQjRk="
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)
Cancel-Lock: sha1:xOTlzA/6pabXgY0+hU3z3GNGRIc=
sha1:eI07/pGATsurnRCvZmDVm8O7RKA=
 by: badgolferman - Sun, 8 Aug 2021 18:48 UTC

Mayayana <mayayana@invalid.nospam> wrote:
> "SilverSlimer" <silver@slim.er> wrote
>
> | To me, the only solution is a de-Googled Android device using LineageOS,
> | CalyxOS or GrapheneOS but that also means buying a Google Pixel phone
> | since those systems seem to cater to that line the most.
> |
>
> I have another solution. It's a phone with a wire that I
> plug into the wall. The sound quality is wonderful, the
> device is ergonomic, unlike slab phones. And it has a number
> of great inventions that I wouldn't want to be without. For
> example, it's completely immune to Uber and Lyft. It can
> never be attacked by inane Twitter posts or restaurant
> recommendations. Facebook can't possibly gain a foothold.
> And when I go for a walk, no one can reach me via text
> message. But they can leave a voice message. Best of all,
> I can use my device to make phone calls. What'll they
> think of next?!
>
>
>

Constant spam calls to the point that you completely ignore the phone
forever.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820211508037711%nospam@nospam.invalid>

https://www.novabbs.com/computers/article-flat.php?id=18414&group=comp.mobile.android#18414

Newsgroups: misc.phone.mobile.iphone alt.privacy.anon-server comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@nospam.invalid (nospam)
Newsgroups: misc.phone.mobile.iphone,alt.privacy.anon-server,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?
Date: Sun, 08 Aug 2021 15:08:03 -0400
Organization: A noiseless patient Spider
Lines: 21
Message-ID: <080820211508037711%nospam@nospam.invalid>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net> <L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me> <sep8tq$5uu$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: reader02.eternal-september.org; posting-host="976663b646664dc58de25ce38ba76dbf";
logging-data="10596"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1+cFyZwAzngm45t0byRMik5"
User-Agent: Thoth/1.9.0 (Mac OS X)
Cancel-Lock: sha1:cDDuwtSzSWBgmpDj2GvaMbLJBS0=
 by: nospam - Sun, 8 Aug 2021 19:08 UTC

In article <sep8tq$5uu$1@dont-email.me>, badgolferman
<REMOVETHISbadgolferman@gmail.com> wrote:

> >
> > I have another solution. It's a phone with a wire that I
> > plug into the wall. The sound quality is wonderful, the
> > device is ergonomic, unlike slab phones. And it has a number
> > of great inventions that I wouldn't want to be without. For
> > example, it's completely immune to Uber and Lyft. It can
> > never be attacked by inane Twitter posts or restaurant
> > recommendations. Facebook can't possibly gain a foothold.
> > And when I go for a walk, no one can reach me via text
> > message. But they can leave a voice message. Best of all,
> > I can use my device to make phone calls. What'll they
> > think of next?!
> >
>
> Constant spam calls to the point that you completely ignore the phone
> forever.

get a spam call blocker.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<sepe29$97e$3@gioia.aioe.org>

https://www.novabbs.com/computers/article-flat.php?id=18434&group=comp.mobile.android#18434

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.mobile.android
Path: i2pn2.org!i2pn.org!aioe.org!QDghBgMjeGK0M3lN/spSFw.user.46.165.242.75.POSTED!not-for-mail
From: rainer.z...@web.de (Rainer Zwerschke)
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.mobile.android
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Date: Sun, 8 Aug 2021 22:16:09 +0200
Organization: Aioe.org NNTP Server
Message-ID: <sepe29$97e$3@gioia.aioe.org>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: gioia.aioe.org; logging-data="9454"; posting-host="QDghBgMjeGK0M3lN/spSFw.user.gioia.aioe.org"; mail-complaints-to="abuse@aioe.org";
User-Agent: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:78.0) Gecko/20100101
Thunderbird/78.12.0
Content-Language: de-DE
X-Notice: Filtered by postfilter v. 0.9.2
 by: Rainer Zwerschke - Sun, 8 Aug 2021 20:16 UTC

Am 08.08.2021 um 17:47 schrieb Mayayana:
> "SilverSlimer" <silver@slim.er> wrote
>
> | To me, the only solution is a de-Googled Android device using LineageOS,
> | CalyxOS or GrapheneOS but that also means buying a Google Pixel phone
> | since those systems seem to cater to that line the most.
> |
>
> I have another solution. It's a phone with a wire that I
> plug into the wall. The sound quality is wonderful, the
> device is ergonomic, unlike slab phones. And it has a number
> of great inventions that I wouldn't want to be without. For
> example, it's completely immune to Uber and Lyft. It can
> never be attacked by inane Twitter posts or restaurant
> recommendations. Facebook can't possibly gain a foothold.
> And when I go for a walk, no one can reach me via text
> message. But they can leave a voice message. Best of all,
> I can use my device to make phone calls. What'll they
> think of next?!
>
>

++1

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<sepeso$5pq$1@dont-email.me>

https://www.novabbs.com/computers/article-flat.php?id=18437&group=comp.mobile.android#18437

Newsgroups: misc.phone.mobile.iphone alt.privacy.anon-server comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: mayay...@invalid.nospam (Mayayana)
Newsgroups: misc.phone.mobile.iphone,alt.privacy.anon-server,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?
Date: Sun, 8 Aug 2021 16:29:06 -0400
Organization: A noiseless patient Spider
Lines: 11
Message-ID: <sepeso$5pq$1@dont-email.me>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net> <L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me> <sep8tq$5uu$1@dont-email.me>
Injection-Date: Sun, 8 Aug 2021 20:30:16 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="d1bde6b641e8811379f7f55e1de93776";
logging-data="5946"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/GDXb/yjSe4zKbx8cAzkbcCYJaQHQEtdg="
Cancel-Lock: sha1:5FqiTFTeNig7iCee8MjewzT4sVQ=
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2900.5512
X-Newsreader: Microsoft Outlook Express 6.00.2900.5512
X-Priority: 3
X-MSMail-Priority: Normal
 by: Mayayana - Sun, 8 Aug 2021 20:29 UTC

"badgolferman" <REMOVETHISbadgolferman@gmail.com> wrote

| Constant spam calls to the point that you completely ignore the phone
| forever.
|

There is some truth to that, but I don't get many because
I don't give out the number. And I have caller ID. So I know
whether to pick up.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820211839369202%nospam@nospam.invalid>

https://www.novabbs.com/computers/article-flat.php?id=18458&group=comp.mobile.android#18458

Newsgroups: misc.phone.mobile.iphone alt.privacy.anon-server comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@nospam.invalid (nospam)
Newsgroups: misc.phone.mobile.iphone,alt.privacy.anon-server,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?
Date: Sun, 08 Aug 2021 18:39:36 -0400
Organization: A noiseless patient Spider
Lines: 22
Message-ID: <080820211839369202%nospam@nospam.invalid>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net> <L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me> <sep8tq$5uu$1@dont-email.me> <sepeso$5pq$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: reader02.eternal-september.org; posting-host="36c4e30cf687fcf27dcd82a66673683f";
logging-data="3200"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19+gF/RBtLrRsMsx7B4VpUd"
User-Agent: Thoth/1.9.0 (Mac OS X)
Cancel-Lock: sha1:l66EYKXMJG/abblCScIv67/9hIk=
 by: nospam - Sun, 8 Aug 2021 22:39 UTC

In article <sepeso$5pq$1@dont-email.me>, Mayayana
<mayayana@invalid.nospam> wrote:

>
> | Constant spam calls to the point that you completely ignore the phone
> | forever.
> |
>
> There is some truth to that, but I don't get many because
> I don't give out the number.

that doesn't help when they autodial every number.

you also don't have to give out your number for spammers to call. it's
in many, many public databases.

> And I have caller ID. So I know
> whether to pick up.

caller id is easily spoofed. it is not reliable.

stir/shaken has helped, but it still happens.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<sepqal$j37$1@dont-email.me>

https://www.novabbs.com/computers/article-flat.php?id=18464&group=comp.mobile.android#18464

Newsgroups: misc.phone.mobile.iphone alt.privacy.anon-server comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: scharf.s...@geemail.com (sms)
Newsgroups: misc.phone.mobile.iphone,alt.privacy.anon-server,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Date: Sun, 8 Aug 2021 16:45:25 -0700
Organization: A noiseless patient Spider
Lines: 15
Message-ID: <sepqal$j37$1@dont-email.me>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me>
<sep8tq$5uu$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Sun, 8 Aug 2021 23:45:25 -0000 (UTC)
Injection-Info: reader02.eternal-september.org; posting-host="4bd3068037462510fc11cd03a23546e8";
logging-data="19559"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1/cxxba2O1hhrFoo5SVogsj"
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:78.0) Gecko/20100101
Thunderbird/78.12.0
Cancel-Lock: sha1:A9sZFHLYkj0Tm86zu8eQYotJ/lA=
In-Reply-To: <sep8tq$5uu$1@dont-email.me>
Content-Language: en-US
 by: sms - Sun, 8 Aug 2021 23:45 UTC

On 8/8/2021 11:48 AM, badgolferman wrote:

<snip>

> Constant spam calls to the point that you completely ignore the phone
> forever.

Yes, that's an issue with a simple POTS line. But I think that most
former POTS users have switched to some sort of VOIP service that uses
their analog phones with an ATA, be it from Comcast, Sonic, AT&T, Ooma,
or Obi. All of these have spam filtering. Obi with Google Voice is very
good at filtering spam calls, especially with call screening turned on.
I heard that Ooma is even better, avoiding that annoying single initial
ring.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820211953405892%nospam@nospam.invalid>

https://www.novabbs.com/computers/article-flat.php?id=18468&group=comp.mobile.android#18468

Newsgroups: misc.phone.mobile.iphone alt.privacy.anon-server comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@nospam.invalid (nospam)
Newsgroups: misc.phone.mobile.iphone,alt.privacy.anon-server,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?
Date: Sun, 08 Aug 2021 19:53:40 -0400
Organization: A noiseless patient Spider
Lines: 11
Message-ID: <080820211953405892%nospam@nospam.invalid>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net> <L7SPI.7275$uV3.4755@fx18.iad> <seoubo$lcn$1@dont-email.me> <sep8tq$5uu$1@dont-email.me> <sepqal$j37$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: reader02.eternal-september.org; posting-host="36c4e30cf687fcf27dcd82a66673683f";
logging-data="17277"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX1+EPW17+oQbUGholVYYPjAR"
User-Agent: Thoth/1.9.0 (Mac OS X)
Cancel-Lock: sha1:wIC17oBCwGsfTDjSftOUfHqV8Wg=
 by: nospam - Sun, 8 Aug 2021 23:53 UTC

In article <sepqal$j37$1@dont-email.me>, sms
<scharf.steven@geemail.com> wrote:

> > Constant spam calls to the point that you completely ignore the phone
> > forever.
>
> Yes, that's an issue with a simple POTS line. But I think that most
> former POTS users have switched to some sort of VOIP service that uses
> their analog phones with an ATA,

they've overwhelmingly switched to cellular.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<4ff6a8f49692a9ae300b963c739a5c34@remailer.privacy.at>

https://www.novabbs.com/computers/article-flat.php?id=18483&group=comp.mobile.android#18483

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
From: mixmas...@remailer.privacy.at (Anonymous Remailer (austria))
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<seoja5$4p0$1@news.mixmin.net>
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Message-ID: <4ff6a8f49692a9ae300b963c739a5c34@remailer.privacy.at>
Date: Mon, 9 Aug 2021 03:46:17 +0200 (CEST)
Newsgroups: alt.privacy.anon-server, misc.phone.mobile.iphone,
comp.os.linux.advocacy, comp.mobile.android, alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!sewer!news.dizum.net!not-for-mail
Organization: dizum.com - The Internet Problem Provider
X-Abuse: abuse@dizum.com
Injection-Info: sewer.dizum.com - 2001::1/128
 by: Anonymous Remailer ( - Mon, 9 Aug 2021 01:46 UTC

In article <seoja5$4p0$1@news.mixmin.net>
"cern" <louis@cern.ch> wrote:
>
> All, I say again, all companies today are pathological liars. All
> they want is money. If Apple was so safe, why did they just
> release a message about their intentions of going through every
> IPhone to look for kiddy porn? How is it that Apple has this
> capability? Because the IPhone is a spy phone. You need to get a
> Pixel and load it with CalyxOS - a de-Googled operating system.

Apple has been untrustworthy from day one.

In the BBS days, Mac software products were the most infected
and Apple actively encouraged developers to upload infected
software.

They said it was to combat piracy.

Right, sure it was. Shit code is more like it.

Remember when Apple refused to admit the existence of the back
door access when it involved terrorists shooting Americans in
San Bernardino?

Everybody who owns an iPhone is funding these corporate
criminals.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<080820212158325409%nospam@nospam.invalid>

https://www.novabbs.com/computers/article-flat.php?id=18489&group=comp.mobile.android#18489

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!eternal-september.org!reader02.eternal-september.org!.POSTED!not-for-mail
From: nos...@nospam.invalid (nospam)
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?
Date: Sun, 08 Aug 2021 21:58:32 -0400
Organization: A noiseless patient Spider
Lines: 16
Message-ID: <080820212158325409%nospam@nospam.invalid>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net> <seoja5$4p0$1@news.mixmin.net> <4ff6a8f49692a9ae300b963c739a5c34@remailer.privacy.at>
Mime-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-1
Content-Transfer-Encoding: 8bit
Injection-Info: reader02.eternal-september.org; posting-host="36c4e30cf687fcf27dcd82a66673683f";
logging-data="13727"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX18j+lzF3CWb+YvBzKu8TUdi"
User-Agent: Thoth/1.9.0 (Mac OS X)
Cancel-Lock: sha1:vyQzfc+HCQNe5OfTax+e6DJ6m6U=
 by: nospam - Mon, 9 Aug 2021 01:58 UTC

In article <4ff6a8f49692a9ae300b963c739a5c34@remailer.privacy.at>,
Anonymous Remailer (austria) <mixmaster@remailer.privacy.at> wrote:

> In the BBS days, Mac software products were the most infected
> and Apple actively encouraged developers to upload infected
> software.

bullshit.

> Remember when Apple refused to admit the existence of the back
> door access when it involved terrorists shooting Americans in
> San Bernardino?

bullshit to that too. there was no back door and apple refused to add
one.

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<inbh3nFaocbU7@mid.individual.net>

https://www.novabbs.com/computers/article-flat.php?id=18492&group=comp.mobile.android#18492

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: jollyro...@pobox.com (Jolly Roger)
Newsgroups: alt.privacy.anon-server,misc.phone.mobile.iphone,comp.os.linux.advocacy,comp.mobile.android,alt.comp.os.windows-10
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Date: 9 Aug 2021 02:09:59 GMT
Organization: People for the Ethical Treatment of Pirates
Lines: 28
Message-ID: <inbh3nFaocbU7@mid.individual.net>
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<seoja5$4p0$1@news.mixmin.net>
<4ff6a8f49692a9ae300b963c739a5c34@remailer.privacy.at>
X-Trace: individual.net 9py4KJi4Gg8pw3CPaEnrBwet9gWSEMl4rThZnhv1s/ezWlvwVK
Cancel-Lock: sha1:g3iXpTwuu1rn9AuB33WvbWTnHsA=
Mail-Copies-To: nobody
X-Face: _.g>n!a$f3/H3jA]>9pN55*5<`}Tud57>1<n@LQ!aZ7vLO_nWbK~@T'XIS0,oAJcU.qLM
dk/j8Udo?O"o9B9Jyx+ez2:B<nx(k3EdHnTvB]'eoVaR495,Rv~/vPa[e^JI+^h5Zk*i`Q;ezqDW<
ZFs6kmAJWZjOH\8[$$7jm,Ogw3C_%QM'|H6nygNGhhl+@}n30Nz(^vWo@h>Y%b|b-Y~()~\t,LZ3e
up1/bO{=-)
User-Agent: slrn/1.0.3 (Darwin)
 by: Jolly Roger - Mon, 9 Aug 2021 02:09 UTC

On 2021-08-09, Anonymous Remailer (austria) <mixmaster@remailer.privacy.at> wrote:
>
> In article <seoja5$4p0$1@news.mixmin.net>
> "cern" <louis@cern.ch> wrote:
>>
>> All, I say again, all companies today are pathological liars. All
>> they want is money. If Apple was so safe, why did they just
>> release a message about their intentions of going through every
>> IPhone to look for kiddy porn? How is it that Apple has this
>> capability? Because the IPhone is a spy phone. You need to get a
>> Pixel and load it with CalyxOS - a de-Googled operating system.
>
> In the BBS days, Mac software products were the most infected

LOL! The fuck they were. I ran my own BBS, child, and there were *way*
more malware detections in Windows software than Mac software. You're
fucking clueless.

> and Apple actively encouraged developers to upload infected
> software.

More bullshit from a complete know-nothing chode.

--
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR

Re: Apple wants to check your iphone for child abuse images - what could possibly go wrong?

<14c7b1272bf1cabaea4a2bdc74e4317c@remailer.privacy.at>

https://www.novabbs.com/computers/article-flat.php?id=18493&group=comp.mobile.android#18493

Newsgroups: alt.privacy.anon-server misc.phone.mobile.iphone comp.os.linux.advocacy comp.mobile.android alt.comp.os.windows-10
From: mixmas...@remailer.privacy.at (Anonymous Remailer (austria))
References: <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>
<080820211036008314%nospam@nospam.invalid>
Subject: Re: Apple wants to check your iphone for child abuse images - what
could possibly go wrong?
Message-ID: <14c7b1272bf1cabaea4a2bdc74e4317c@remailer.privacy.at>
Date: Mon, 9 Aug 2021 04:13:45 +0200 (CEST)
Newsgroups: alt.privacy.anon-server, misc.phone.mobile.iphone,
comp.os.linux.advocacy, comp.mobile.android, alt.comp.os.windows-10
Path: i2pn2.org!rocksolid2!news.neodome.net!news.mixmin.net!sewer!news.dizum.net!not-for-mail
Organization: dizum.com - The Internet Problem Provider
X-Abuse: abuse@dizum.com
Injection-Info: sewer.dizum.com - 2001::1/128
 by: Anonymous Remailer ( - Mon, 9 Aug 2021 02:13 UTC

In article <080820211036008314%nospam@nospam.invalid>
nospam <nospam@nospam.invalid> wrote:
>
> In article
> <29f6feb93945ef171df247722dfc6bf1@msgid.frell.theremailer.net>, Fritz
> Wuehler <fritz@spamexpire-202108.rodent.frell.theremailer.net> wrote:
>
> > On the surface Apple’s new features sound both sensible and
> > commendable. Technology-facilitated child sexual exploitation is
> > an enormous problem; one that’s spiralling out of control. In
> > 1998 there were more than 3,000 reports of child sex abuse
> > imagery, according to a 2019 paper published in conjunction with
> > the National Center for Missing and Exploited Children. In 2018
> > there were 18.4m. These reports included more than 45m images
> > and videos that were flagged as child sexual abuse. Technology
> > companies have a duty to curb the terrible abuses their
> > platforms help facilitate. Apple’s new features are an attempt
> > to do just that.
>
> yep.

But they won't do anything when it involves terrorists shooting
innocent people on American soil.

> > But while Apple’s attempts to protect children may be valiant,
> > they also open a Pandora’s box of privacy and surveillance
> > issues. Of particular concern to security researchers and
> > privacy activists is the fact that this new feature doesn’t just
> > look at images stored on the cloud; it scans users’ devices
> > without their consent.
>
> absolutely false.

You don't know that for a fact.

> *only* images uploaded to icloud are checked against a known database
> of child porn.

I might be a data security analyst for a fortune 200 company. I
might be able to modify any file to deceive file checkers. I
might be able to make a picture of your 80-year-old grandma fit
the profile of any of a thousand kiddie porn pics or millions of
random loan docs.

> images that are never sent anywhere are never checked.
>
> google, facebook, twitter, dropbox, discord and many, many other
> services already check for child porn and have been doing so for years.

Oh the irony of that statement. Twitter actively promotes child
porn.

> facebook started checking in 2011.

Facebook didn't have the technology to do anything like that in
2011. They were still unfucking their SQL mess.

> > Essentially that means there’s now a sort
> > of “backdoor” into an individual’s iPhone, one which has the
> > potential to grow wider and wider.
>
> it's not a backdoor.

Lol! Yeah it's a wide open door.

There's a poster in this group who openly admits to abusing
iPhones in airports.

It can be done.

> > The Electronic Frontier
> > Foundation (EFF), an online civil liberties advocacy group,
> > warns that “all it would take to widen the narrow backdoor that
> > Apple is building is an expansion of the machine learning
> > parameters to look for additional types of content … That’s not
> > a slippery slope; that’s a fully built system just waiting for
> > external pressure to make the slightest change.” You can
> > imagine, for example, how certain countries might pressure Apple
> > to scan for anti-government messages or LGBTQ content.
>
> you can imagine all sorts of things, including that the earth is flat
> and that the moon is made of cheese.
>
> that doesn't mean any of it is true.

Apple is untrustworthy. They became even more untrustworthy
after the terrorist incident where they refused to help the
government access the phones of the shooters.

> > Jillian York, the author of a new book about how surveillance
> > capitalism affects free speech, is also concerned that Apple’s
> > new parental controls mean images shared between two minors
> > could be non-consensually shared with one of their parents.
>
> she should try learning how it actually works before commenting, let
> alone write a book.

You go right on living in that bizarro world of yours.
