You’ve got nothing to hide so you’ve got nothing to fear, right?

“A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged Him as a Criminal.”

https://inkl.com/a/EbYLJlcYkxY

(Remember that Apple was, and may still be, planning to implement this sort of scanning by default on all your devices.)

#BigTech #privacy #surveillance
in reply to Aral Balkan

Wow that's scary.

I've taken photos of my son to send to the docs when he's had a medical problem and a visit wasn't possible. Nothing that included genitals, thankfully, but an infected and swollen penis is not uncommon in young boys. That happened to my son twice, IIRC, both times requiring medical help to clear up.

Given that doctors are seeing patients in person less these days, sending pics to complement a phone call is quite normal, so more parents are going to get caught out by this.
in reply to Mackaj

Also, given that Google wants people to abandon passwords in favour of using their mobile phone as the key for passwordless sign-ins across the web, the consequences of having that key unceremoniously removed with no warning could be devastating.

That's not a feature I'm ever going to use personally. I've had too many issues with phone problems over the years. @bitwarden ticks all my boxes and I'm sticking with it.
in reply to Aral Balkan

Shocking! 😳 We were all forced to take photos to send to doctors during lockdown! How many more people have been caught out like this?!
in reply to Aral Balkan

It's not implausible that voice analysis (Google Home, Alexa, et al.) will be used similarly, looking for supposed criminal intent in what is spoken.

Aral Balkan reshared this.

in reply to Aral Balkan

As a Google Photos user and the father of a toddler, I've had this on my mind practically the entire time, especially now that we're vacationing on the Mediterranean coast and my kid sometimes swims naked. You're not paranoid if you're right, or however that saying goes. Sadly, I can't switch to some other, self-hosted solution due to the high technological barrier to entry.
in reply to Aral Balkan

I've got some mixed feelings about these things:

1. On one hand, making sure that child pornography doesn't proliferate is a battle worth fighting - while being well aware that automatic detection of suspicious content will necessarily violate people's privacy, and that finding a trade-off may prove impossible.

2. On the other hand, *real* pedophiles are probably well aware that they could end up in jail for the content they store and exchange, so they are likely to already employ strategies to bypass these checks - from end-to-end encryption for storing and exchanging content, to simply not using Google or Apple photo and camera apps on their phones.

It'd be nice for Google and Apple to disclose how many real pedophiles they have managed to catch thanks to their media content checks, and how many were false positives like this poor father sharing pictures of his son with his doctor. If the number of false positives outweighs the number of bad guys caught, there should be no doubt that such checks should be ditched. Of course, Google and Apple will never disclose such data - like many others, they care more about having a handy excuse to justify their invasive surveillance strategies than about actually protecting the weakest. Had they really cared about fighting child pornography, they would have directed their resources and efforts at the copious amount of it shared on the dark web.

And even if such checks were actually effective and legitimate, Google ought to apologize to those caught in the net through no fault of their own. Imagine being a father with a sick child who suddenly has to explain to the police why he sent a picture of his kid to his doctor. Imagine what an ugly and embarrassing moment that must be. What really enraged me was Google's reaction, after the father had *already* been interrogated and cleared of all charges. They responded with their usual impersonal, faceless and dull corporate message - "the safety of our customers and fighting online crime are our priorities, bla bla bla". Not a single word of apology to a father who had to spend hours in a police station after being unjustly reported for something as ugly as child pornography.
in reply to Aral Balkan

I believe Apple was implementing detection of previously known images by signature, not some sort of automatic discovery of new, offending images.
in reply to Dan

@dfraser Indeed, version 1 was to use hash-based detection (which has already been broken via demonstrated false positives, i.e. hash collisions). Another difference was that it was going to take place on your device. But all of these are slippery slopes. And Apple hasn’t abandoned its plans either (at least it hasn’t committed to doing so publicly), so they can resurface at any point (likely with better public relations this time).
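For anyone wondering what "matching known images by signature" looks like in practice, here's a minimal sketch. It is not Apple's or Google's actual pipeline: the hash list, directory layout and match threshold are illustrative assumptions, and an exact SHA-256 file hash stands in where the real systems use a perceptual hash (NeuralHash in Apple's case) so that resized or re-encoded copies still match - the very property that also makes deliberately crafted collisions, and hence false positives, possible.

```python
# Illustrative sketch of "known image" signature matching - not a real
# CSAM-scanning implementation. Names, the hash list and the threshold
# below are assumptions for the sake of the example.

import hashlib
from pathlib import Path

# Signatures of previously known offending images, supplied by a third
# party (NCMEC in the real deployments). The device only ever sees the
# hashes, never the images. The entry below is a placeholder.
KNOWN_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def image_signature(path: Path) -> str:
    """Signature for one file (exact hash here; perceptual hash in reality)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path, threshold: int = 30) -> bool:
    """Count matches against the known-image list and flag the account only
    once a threshold is crossed (Apple's proposal reportedly required around
    30 matches before human review), limiting the impact of one collision."""
    matches = sum(
        1
        for path in photo_dir.rglob("*")
        if path.is_file() and image_signature(path) in KNOWN_HASHES
    )
    return matches >= threshold

if __name__ == "__main__":
    flagged = scan_library(Path.home() / "Pictures")
    print("account flagged for review" if flagged else "no matches")
```

The relevant distinction for this thread follows from that design: a signature list can only catch copies of images already in the database, whereas Google's scanning also appears to run classifiers over new content, which is how a brand-new photo of the man's own child could be flagged.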
in reply to Aral Balkan

@thatkruegergirl Apple is matching known images, not content.

So your photo isn't a problem unless you share it and it ends up as a known image - and at that point you're distributing child pornography by definition.
Unknown parent

Aral Balkan
@pim They backed down temporarily. They never said they scrapped it.