Apple’s move against child pornography is shifting battle lines for law enforcement and technologists

From the Washington Post: Apple’s latest move to fight the digital sharing of child pornography is opening up some fissures in a seven-year standoff between technologists and law enforcement over fighting the spread of criminal activity online.

That fight has centered primarily on FBI and Justice Department demands for special police access to encrypted communications that would otherwise be shielded from everyone, including the platform where the conversation is happening.

Justice Department and FBI officials say that access — with a warrant — is vital to stop terrorists, purveyors of child pornography and other criminals from “going dark” and acting with impunity online. Technologists nearly uniformly say creating such an encryption backdoor will make everyone more hackable and that the trade-offs aren’t worth it. The fight’s flared periodically since 2014 with no significant give or take from either side.

READ MORE

18 thoughts on “Apple’s move against child pornography is shifting battle lines for law enforcement and technologists”

  • August 10, 2021 at 12:59 pm
    Permalink

    So it wouldn’t be that difficult for someone to plant material matching known CP and trafficking hashes on a target’s device, as a way of swatting them.

    Reply
  • August 10, 2021 at 1:28 pm
    Permalink

    While I am firmly against ANY exploitation of children, and have even alerted parents to possible predators grooming their children to gain access, these kinds of laws are always open to abuse by authorities. The so-called warrants are only shown to suspects AFTER they have been arrested and charged. Judges never refuse to grant the warrants. AND, most of all, police will use the technology to thoroughly spy on everyone until they find even the smallest hint of evidence justifying a warrant. I personally don’t put anything out over the internet that would be embarrassing, but I am also always alert to the possibility that someone could send that kind of trash to me without my prior knowledge.

    Reply
    • August 10, 2021 at 8:15 pm
      Permalink

      Gerald….in Flori-DUH, for example, it is called Strict Liability….So you ARE fcked either way!

      Make it a Great Day!

      Reply
  • August 10, 2021 at 6:16 pm
    Permalink

    A child sends a nude photo to a CP producer masquerading as another child. Said CP producer gets arrested and the new material gets hashed. Original child gets flagged as CP producer and spends the rest of his or her life publicly shamed and horrendously punished.

    Reply
  • August 11, 2021 at 1:27 am
    Permalink

    High tech is compromised, period. Those who distrust our government already know this, and if there ever came a time for open revolt, asymmetrical resistance would be the tactic. Use your iPhones and iPads at your own risk, and please, at the very least, follow some online security measures that will prevent most if not all (excepting government three-letter agencies) from gaining access to your system and/or communications.

    Reply
  • August 11, 2021 at 11:25 am
    Permalink

    I have trolled some of the media groups and even had disagreements with people regarding this. Some of the typical responses are still “well, I got nothing to hide” or “as long as they are going after them.” Many people I spoke to fail to understand the gravity of this situation.

    Reply
    • August 11, 2021 at 2:12 pm
      Permalink

      They might have nothing to hide; however, this program won’t protect them if a photo gets a hit. Programs like this are a slippery slope and could cost a person their privacy and freedom.

      Reply
      • August 11, 2021 at 2:52 pm
        Permalink

        I’ve read little about this, but I surmise that Apple will be scanning all of your photos that are stored (or backed up) to the cloud. Will it also scan photos that you only take with your phone and do not send to the cloud? Doesn’t really matter. It certainly could do that. And might at any time.

        But anyway, say you and your wife take some interesting photos of yourselves doing whatever. Or perhaps you just take one to send to your wife. How is Apple’s algorithm going to work? Is it only looking to match a specific set of photos, say a set of 10,000,000 known images? If that is the case, probably no one will ever see your photo of your junk. But even that depends on whether they are trying to match the photos exactly or are also watching out for people modifying such photos enough to throw their software off.

        If instead, Apple is looking for any photo which might be child porn, then your photo could easily enough get flagged and then I assume it would be sent off to a human somewhere to look at the photo to determine if it is actually illegal or not. And since we know that any person who has a naked photo of themselves viewed non-consensually is harmed by every view, congratulations, you are now a victim of a heinous sex crime.

        I don’t personally care if people see me naked, but I’m not going to be using a phone that scans my photos for the nefarious purposes of big government.

        Reply
        • August 11, 2021 at 4:35 pm
          Permalink

          I’d bet Apple and Google already scan everyone’s photos and messages in order to target advertising, and this thing about looking for child porn is just a means of getting people to stop complaining about privacy.

          Reply
  • August 11, 2021 at 4:39 pm
    Permalink

    All,

    Let’s not overlook the likelihood that LE will send CP to a target’s phone so they can pop people for having it. Much easier than having to physically plant it on the person or in his house during a search or “compliance check.”

    Reply
  • August 11, 2021 at 6:25 pm
    Permalink

    I saw a news show on YouTube talking about this. Their view was spot on and sensible. Photos will be scanned by an AI program that marks photos to be checked by a person. Then a person will look at them, and anything deemed inappropriate will be forwarded to the police.
    The problem is, who wants people looking at all your pictures? If they say they only do a little, how long until they look at everything? As I try to point out to people online: if they can make a registry for sex offenders and spend years perfecting the process, how long until everyone has a label and there are 20 registries? How much power is too much to give before everyone is labeled or segregated?

    Reply
    • August 12, 2021 at 11:36 am
      Permalink

      I can’t imagine how many websites that specialize in porn using models who appear younger than they are will end up being harassed. CP is defined as sexual images of people under 18 years of age. Nobody can tell the difference between a nude 17-year-old and a nude 18-year-old. So if an 18-year-old woman sends nude photos to her 18-year-old boyfriend, we should expect that search warrants will be issued to examine the files of BOTH of them, without their knowledge or consent. Imagine how that young woman will feel about numerous people ogling her personal photos in the name of public safety. And will they require polygraphs for those employed to look at those images, to ensure that they aren’t doing it for their own sexual gratification? I imagine they would object to that on the grounds that it violates their personal privacy.

      Reply
  • August 12, 2021 at 11:21 pm
    Permalink

    Years ago, Walmart workers turned in folks who took innocent pics of their babies in the bathtub and had those photos processed (just Google “A.J. and Lisa Demaree” for the story).

    This was an isolated case, but the Apple policy could repeat this nightmare scenario on a grand scale.

    Reply
    • August 13, 2021 at 10:12 am
      Permalink

      When my son was about 8 months old, I had a Christmas card photo created of him from behind, sitting naked, looking over his shoulder and wearing a Santa hat. I thought it was the most adorable picture, and still do today. A friend mentioned I could get in trouble for such a picture. I did not know at the time where such censorious ideas came from; I do now. That way of thinking saddens me to the core.

      Reply
    • August 13, 2021 at 10:49 am
      Permalink

      I do remember one of those stories, where a school teacher’s life was torn apart over pictures that any reasonable person would recognize as the same kind of cute baby photos that are in the family albums of many thousands of people. The old Coppertone billboards of the ’60s, depicting a little girl whose dog pulled her suit down to reveal her bare bottom and tan lines, would surely be classified as child porn by today’s witch hunters.

      Reply
  • August 13, 2021 at 5:23 pm
    Permalink

    Time to get rid of your iPhone!

    https://www.natlawreview.com/article/will-neuralhash-make-hash-privacy

    “Apple’s strategy includes a tool called “NeuralHash” that scans images prior to uploading the images to the cloud. If NeuralHash finds that a picture meets its criteria for child sexual abuse, then the user’s account will be disabled and Apple will notify law enforcement and the Center for Missing and Exploited Children.”
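
    For anyone wondering what “scans images prior to uploading” can look like in code, here is a minimal, purely illustrative sketch in Python of that kind of flow: a crude perceptual hash computed on the device, compared against a set of known hashes, with a match count that must pass a threshold before anything is escalated for review. None of this is Apple’s actual NeuralHash; every name, hash value, and number below is hypothetical.

    # Hypothetical sketch only -- not Apple's NeuralHash or its real parameters.
    KNOWN_BAD_HASHES = {0b1010_1100_0010_0101}   # stand-in for a database of known hashes
    REVIEW_THRESHOLD = 3                         # made-up number of matches before escalation

    def average_hash(pixels):
        """Tiny perceptual hash: one bit per pixel, set if brighter than the mean.

        `pixels` is a flat list of grayscale values for an already-downscaled
        image (here 4x4 = 16 values). Real systems use far more robust hashes
        that survive resizing, cropping, and recompression.
        """
        mean = sum(pixels) / len(pixels)
        bits = 0
        for value in pixels:
            bits = (bits << 1) | (1 if value > mean else 0)
        return bits

    def scan_before_upload(photos):
        """Return indices of photos whose hash matches, and whether to escalate."""
        matches = [i for i, px in enumerate(photos) if average_hash(px) in KNOWN_BAD_HASHES]
        return matches, len(matches) >= REVIEW_THRESHOLD

    if __name__ == "__main__":
        # Three fake 16-pixel "photos"; the second one happens to match the list.
        photos = [
            [10, 10, 10, 10, 200, 200, 200, 200, 10, 10, 10, 10, 200, 200, 200, 200],
            [200, 10, 200, 10, 200, 200, 10, 10, 10, 10, 200, 10, 10, 200, 10, 200],
            [50, 60, 70, 80, 90, 100, 110, 120, 130, 140, 150, 160, 170, 180, 190, 200],
        ]
        print(scan_before_upload(photos))   # -> ([1], False): one match, below the threshold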

    Reply
  • August 14, 2021 at 8:52 am
    Permalink

    How many people have innocent pictures of their children, grandchildren, and great-grandchildren naked in the tub or sitting on the toilet? How in the hell are those images child sex abuse? What about the sexually suggestive images on TV that children see year after year? This country’s attitude on sex is disgusting. I guess we’re all offenders now!!

    Reply
  • August 15, 2021 at 6:27 am
    Permalink

    The technical term is “hashing,” and it is a wonderful tool for alphanumeric and numeric sequences, such as protecting your name, Social Security number, and other important information from hackers.

    HOWEVER, when this technology is used on photos, images, art, etc., matching algorithms are employed and some error rate, even if under 1%, occurs. That means more attorneys and computer forensic specialists will be hired to defend the innocent, and county prosecutors and sheriffs like “Elmer Fudd” will have a field day!

    As one Geek-A-Zoid colleague of mine told me, “This will open up another Pandora’s box.”
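
    To put the error-rate concern above in perspective, here is a quick back-of-the-envelope calculation in Python; the photo count and per-photo false-positive rate are invented solely to show the arithmetic, not figures from Apple or any study.

    photos_scanned = 5_000_000_000   # hypothetical number of photos scanned across a user base
    false_positive_rate = 0.001      # hypothetical "well under 1%" per-photo error rate

    false_flags = photos_scanned * false_positive_rate
    print(f"{false_flags:,.0f} innocent photos flagged for review")
    # Even a tiny per-photo error rate, multiplied across billions of photos,
    # produces millions of false flags that someone has to review and defend against.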

    Reply
