
https://www.novabbs.com/tech/article-flat.php?id=10278&group=rec.photo.digital#10278

X-Received: by 2002:ad4:4a04:: with SMTP id m4mr11893328qvz.42.1630930992252;
Mon, 06 Sep 2021 05:23:12 -0700 (PDT)
X-Received: by 2002:a05:6902:1106:: with SMTP id o6mr15734912ybu.534.1630930991960;
Mon, 06 Sep 2021 05:23:11 -0700 (PDT)
Path: i2pn2.org!i2pn.org!weretis.net!feeder6.news.weretis.net!news.misty.com!border2.nntp.dca1.giganews.com!nntp.giganews.com!news-out.google.com!nntp.google.com!postnews.google.com!google-groups.googlegroups.com!not-for-mail
Newsgroups: rec.photo.digital
Date: Mon, 6 Sep 2021 05:23:11 -0700 (PDT)
In-Reply-To: <lnsAD9A77CCCFD316F089P2473@0.0.0.1>
Injection-Info: google-groups.googlegroups.com; posting-host=138.37.177.46; posting-account=Fal3rgoAAABua4brvRuRwdmPfigIDi6x
NNTP-Posting-Host: 138.37.177.46
References: <lnsAD9A77CCCFD316F089P2473@0.0.0.1>
User-Agent: G2/1.0
MIME-Version: 1.0
Message-ID: <91500471-648a-45e5-9665-e0b23a1d2e29n@googlegroups.com>
Subject: Re: Critics Say Apple Built a 'Backdoor' Into Your iPhone With Its
New Child Abuse Detection Tools
From: whisky.d...@gmail.com (Whisky-dave)
Injection-Date: Mon, 06 Sep 2021 12:23:12 +0000
Content-Type: text/plain; charset="UTF-8"
Content-Transfer-Encoding: quoted-printable
Lines: 306

On Friday, 3 September 2021 at 19:46:42 UTC+1, Leroy N. Soetoro wrote:
> Privacy advocates worry the new features could be a slippery slope.
>
> https://gizmodo.com/critics-say-apple-built-a-backdoor-into-your-iphone-wit-1847438624
>
> Apple’s plans to roll out new features aimed at combating Child Sexual
> Abuse Material (CSAM) on its platforms have caused no small amount of
> controversy.
>
> The company is basically trying to pioneer a solution to a problem that,
> in recent years, has stymied law enforcement officials and technology
> companies alike: the large, ongoing crisis of CSAM proliferation on major
> internet platforms. As recently as 2018, tech firms reported the existence
> of as many as 45 million photos and videos that constituted child sex
> abuse material—a terrifyingly high number.
>
> Yet while this crisis is very real, critics fear that Apple’s new
> features—which involve algorithmic scanning of users’ devices and
> messages—constitute a privacy violation and, more worryingly, could one
> day be repurposed to search for different kinds of material other than
> CSAM. Such a shift could open the door to new forms of widespread
> surveillance and serve as a potential workaround for encrypted
> communications—one of privacy’s last, best hopes.
>
> To understand these concerns, we should take a quick look at the specifics
> of the proposed changes. First, the company will be rolling out a new tool
> to scan photos uploaded to iCloud from Apple devices in an effort to
> search for signs of child sex abuse material. According to a technical
> paper published by Apple, the new feature uses a “neural matching
> function,” called NeuralHash, to assess whether images on a user’s iPhone
> match known “hashes,” or unique digital fingerprints, of CSAM. It does
> this by comparing the images shared with iCloud to a large database of
> CSAM imagery that has been compiled by the National Center for Missing and
> Exploited Children (NCMEC). If enough matches are found, the images are
> flagged for review by human operators, who alert NCMEC (which then
> presumably tips off the FBI).
>
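The matching scheme described above can be sketched in a few lines. This is only an illustration: `perceptual_hash` below is a stand-in (Apple's NeuralHash is a proprietary neural perceptual hash, not a cryptographic digest), and the threshold value is hypothetical, since the real one is not public.

```python
import hashlib

def perceptual_hash(image_bytes):
    # Stand-in for a perceptual hash such as NeuralHash. A real perceptual
    # hash is robust to resizing and re-encoding; sha256 is not, and is
    # used here only to make the sketch runnable.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_upload(images, known_hashes, threshold=30):
    # Count how many uploaded images match the known-CSAM hash database;
    # flag the account for human review only once the count reaches the
    # threshold (the actual threshold is hypothetical here).
    matches = sum(1 for img in images if perceptual_hash(img) in known_hashes)
    return matches >= threshold
```

Note that only membership in the known-hash set matters: non-matching images contribute nothing to the count, which is the property behind Apple's claim below that it "learns nothing" about images outside the database.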
> Some people have expressed concerns that their phones may contain pictures
> of their own children in a bathtub or running naked through a sprinkler or
> something like that. But, according to Apple, you don’t have to worry
> about that. The company has stressed that it does not “learn anything
> about images that do not match [those in] the known CSAM database”—so it’s
> not just rifling through your photo albums, looking at whatever it wants.
>
> Meanwhile, Apple will also be rolling out a new iMessage feature designed
> to “warn children and their parents when [a child is] receiving or sending
> sexually explicit photos.” Specifically, the feature is built to caution
> children when they are about to send or receive an image that the
> company’s algorithm has deemed sexually explicit. The child gets a
> notification, explaining to them that they are about to look at a sexual
> image and assuring them that it is OK not to look at the photo (the
> incoming image remains blurred until the user consents to viewing it). If
> a child under 13 breezes past that notification to send or receive the
> image, a notification will subsequently be sent to the child’s parent
> alerting them about the incident.
>
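The warning flow just described amounts to a small piece of client-side decision logic. The sketch below is hypothetical: the function name, return fields, and exact rules are mine, reconstructed from the description in this article, with 13 as the parental-notification cutoff and `is_explicit` standing in for the on-device classifier's verdict.

```python
def handle_explicit_image(is_explicit, child_age, chose_to_view):
    # is_explicit stands in for the on-device classifier's verdict.
    if not is_explicit:
        return {"blurred": False, "warned": False, "parent_notified": False}
    if not chose_to_view:
        # The image stays blurred until the child consents to view it.
        return {"blurred": True, "warned": True, "parent_notified": False}
    # The child chose to view it anyway; per the article, parents are
    # notified only when the child is under 13.
    return {"blurred": False, "warned": True, "parent_notified": child_age < 13}
```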
> Suffice it to say, news of both of these updates—which will arrive later
> this year with the release of iOS 15 and iPadOS 15—has not been
> met kindly by civil liberties advocates. The concerns may vary, but in
> essence, critics worry the deployment of such powerful new technology
> presents a number of privacy hazards.
>
> In terms of the iMessage update, concerns are based around how encryption
> works, the protection it is supposed to provide, and what the update does
> to basically circumvent that protection. Encryption protects the contents
> of a user’s message by scrambling it into unreadable ciphertext before it
> is sent, essentially nullifying the point of
> intercepting the message because it’s unreadable. However, because of the
> way Apple’s new feature is set up, communications with child accounts will
> be scanned to look for sexually explicit material before a message is
> encrypted. Again, this doesn’t mean that Apple has free rein to read a
> child’s text messages—it’s just looking for what its algorithm considers
> to be inappropriate images.
>
> However, the precedent set by such a shift is potentially worrying. In a
> statement published Thursday, the Center for Democracy and Technology took
> aim at the iMessage update, calling it an erosion of the privacy provided
> by Apple’s end-to-end encryption: “The mechanism that will enable Apple to
> scan images in iMessages is not an alternative to a backdoor—it is a
> backdoor,” the Center said. “Client-side scanning on one ‘end’ of the
> communication breaks the security of the transmission, and informing a
> third-party (the parent) about the content of the communication undermines
> its privacy.”
>
> The plan to scan iCloud uploads has similarly riled privacy advocates.
> Jennifer Granick, surveillance and cybersecurity counsel for the ACLU’s
> Speech, Privacy, and Technology Project, told Gizmodo via email that she
> is concerned about the potential implications of the photo scans: “However
> altruistic its motives, Apple has built an infrastructure that could be
> subverted for widespread surveillance of the conversations and information
> we keep on our phones,” she said. “The CSAM scanning capability could be
> repurposed for censorship or for identification and reporting of content
> that is not illegal depending on what hashes the company decides to, or is
> forced to, include in the matching database. For this and other reasons,
> it is also susceptible to abuse by autocrats abroad, by overzealous
> government officials at home, or even by the company itself.”
>
> Even Edward Snowden chimed in:
>
>
> Edward Snowden
> @Snowden
> No matter how well-intentioned, @Apple is rolling out mass surveillance to
> the entire world with this. Make no mistake: if they can scan for kiddie
> porn today, they can scan for anything tomorrow.
>
> They turned a trillion dollars of devices into iNarcs—*without asking.*
> Edward Snowden
> @Snowden
> Apple says to "protect children," they're updating every iPhone to
> continuously compare your photos and cloud storage against a secret
> blacklist. If it finds a hit, they call the cops.
>
> iOS will also tell your parents if you view a nude in iMessage.
>
> https://eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life
> 7:23 PM · Aug 5, 2021
>
> The concern here obviously isn’t Apple’s mission to fight CSAM, it’s the
> tools that it’s using to do so—which critics fear represent a slippery
> slope. In an article published Thursday, the privacy-focused Electronic
> Frontier Foundation noted that scanning capabilities similar to Apple’s
> tools could eventually be repurposed to make its algorithms hunt for other
> kinds of images or text—which would basically mean a workaround for
> encrypted communications, one designed to police private interactions and
> personal content. According to the EFF:
>
> All it would take to widen the narrow backdoor that Apple is building is
> an expansion of the machine learning parameters to look for additional
> types of content, or a tweak of the configuration flags to scan, not just
> children’s, but anyone’s accounts. That’s not a slippery slope; that’s a
> fully built system just waiting for external pressure to make the
> slightest change.
>
> Such concerns become especially germane when it comes to the features’
> rollout in other countries—with some critics warning that Apple’s tools
> could be abused and subverted by corrupt foreign governments. In response
> to these concerns, Apple confirmed to MacRumors on Friday that it plans to
> expand the features on a country-by-country basis. When it does consider
> distribution in a given country, it will do a legal evaluation beforehand,
> the outlet reported.
>
> In a phone call with Gizmodo Friday, India McKinney, director of federal
> affairs for EFF, raised another concern: the fact that both tools are
> unauditable means that it’s impossible to independently verify that they are
> working the way they’re supposed to be working.
>
> “There is no way for outside groups like ours or anybody
> else—researchers—to look under the hood to see how well it’s working, is
> it accurate, is this doing what it’s supposed to be doing, how many false
> positives are there,” she said. “Once they roll this system out and start
> pushing it onto the phones, who’s to say they’re not going to respond to
> government pressure to start including other things—terrorism content,
> memes that depict political leaders in unflattering ways, all sorts of
> other stuff.” Relevantly, in its article on Thursday, EFF noted that one
> of the technologies “originally built to scan and hash child sexual abuse
> imagery” was recently retooled to create a database run by the Global
> Internet Forum to Counter Terrorism (GIFCT)—a database that now helps
> online platforms search for and moderate or ban “terrorist” content
> centered on violence and extremism.
>
> Because of all these concerns, a cadre of privacy advocates and security
> experts have written an open letter to Apple, asking that the company
> reconsider its new features. As of Sunday, the letter had over 5,000
> signatures.
>
> However, it’s unclear whether any of this will have an impact on the tech
> giant’s plans. In an internal company memo leaked Friday, Apple’s software
> VP Sebastien Marineau-Mes acknowledged that “some people have
> misunderstandings and more than a few are worried about the implications”
> of the new rollout, but that the company will “continue to explain and
> detail the features so people understand what we’ve built.” Meanwhile,
> NCMEC sent an internal letter to Apple staff in which it referred to
> the program’s critics as “the screeching voices of the minority” and
> championed Apple for its efforts.

