Wednesday, 11 January 2017

Microsoft Anti-Porn Workers Sue Over PTSD

quote [ Ex-employees of the company’s online safety team say they had to watch horrific online videos of child abuse, bestiality, and murders—and that Microsoft ignored their PTSD. ]
[SFW] [health] [+4 Sad]
[by XregnaR@9:30pmGMT]


eidolon said @ 5:25am GMT on 12th Jan [Score:1 Underrated]
I thought there was a personal-experiences article on Cracked about this exact monitoring job. It even mentioned the PTSD. But I can't find it. And now I've explicitly searched for phrases that have probably put me on a new watchlist.

Does anyone else remember an article, pre-lawsuit, about this issue from the last few years?

The whole thing has led me to wonder how many of us have cached thumbnails on our PCs that are of underage people. Probably all of us. How can I tell whether a person is 16 or 18? What if an image search loads some bad images? A tiny thumbnail of each one is stored on my PC, whether or not I meant to find it or ever looked at it, unless I make sure to delete all that data.

Every person who uses the internet probably has files that include underage persons even if they never intended to and are not aware of it. Right now, people being convicted for this crime have large amounts of this pornography and frequent sites intended for its trade, making them unquestionably consumers of child pornography.

But what about the people with incidental images who never meant to find that content or had no idea they were consuming it? What's to stop it from being a catch-all to convict anyone? For instance, for political speech?


I often think we should all be much more afraid of the internet than we are. There are days I wonder if I should put all my hard drives in the microwave, move to another country, change my name there, and never touch the internet again. It's not bad yet... but it easily could be. This very comment might be the thing I hang myself on someday.
cb361 said @ 9:08am GMT on 12th Jan [Score:1 Informative]
I think I mentioned on SE that I was once clicking from porn ad to porn ad, and arrived at a real goodness-to-fuck child porn site. I hit CTRL-ALT-DELETE and killed Firefox, forgetting that when I re-launched it, it would helpfully "recover" my lost browser windows...
arrowhen said @ 9:20am GMT on 12th Jan [Score:1 Informative]
I definitely remember an article about PTSD in people monitoring stuff like this being posted either on Old SE or the early days of the new site.

I've also definitely had child porn on a PC: back in the pre-bittorrent era of P2P filesharing (eDonkey2000 was especially bad about this) trolls would sometimes post what were ostensibly popular movies or cracked games or whatever but would actually turn out to just be a rar file full of fucked up images. Usually it was just gore and bestiality, but occasionally you'd get some kiddie stuff in there too.
mechanical contrivance said @ 2:48pm GMT on 12th Jan
Was the game you were trying to download Tekken 3?
zarathustra said @ 1:52am GMT on 13th Jan [Score:1 Interesting]
I don't recall the details of it and it isn't something I have ever dealt with, but my ex prosecuted a case once and mentioned that they had to prove that the person looked below something like 14 (using expert witnesses) to prove intent and not just mistake. Above that, they couldn't establish mens rea beyond a reasonable doubt.
eidolon said @ 6:27pm GMT on 13th Jan
Fair point. I see that as a tough case to sell without corroborating evidence. How else do you prove someone intended to search for 14-year-olds rather than for adults who happen to look young? The first is a crime even if the resulting search yields only people over 18, since attempting a crime matters regardless of whether it succeeds.

Unfortunately, mens rea doesn't generally work in the reverse, does it? If I succeed at committing a crime without meaning to, I am still prosecutable, it would simply be a lesser offense than if it were my intention.

I'm not saying that's bad. "It was an accident" is little consolation to, for instance, the family who lost a loved one to negligent homicide. However, I am not sure we've considered how easy it is not only to commit a crime without meaning to, but to do so without being aware one was committed, particularly under circumstances where the crime has no direct material effect. In the case of accidental child porn, the primary damage was in the creation and wide viewing of the material. While an accidental viewing still adds to this damage, is it substantial enough to justify punishing someone who is, to a lesser extent, also a victim (in this case of sexual harassment, since they were exposed to sexual images they did not consent to see)?

If you'll excuse me, I'm going to go add myself to more FBI watchlists by attempting to research cases that might clarify how we evaluate these situations.
zarathustra said @ 7:10pm GMT on 13th Jan [Score:1 Informative]
Whether you are guilty of a crime you commit without meaning to depends on the crime, but most crimes require intent (either specific intent or general intent, depending on the crime). The only exception I can think of off the top of my head is statutory rape - no intent, and even a mistaken belief, will not protect you there. The general rule is that to be guilty of a crime you need both the guilty act and a guilty mind.

However, doing an act that one does not know is illegal is still a crime if the person did the act and intended to do the act. Mens rea does not mean intending to break the law; it means intending to do the act the law says is illegal.
milkman666 said @ 6:49pm GMT on 12th Jan
Lupe Fuentes is of age, but looks really freaking young. Some dude who bought a DVD with her on it got charged with possession of cp.
lilmookieesquire said @ 11:31pm GMT on 11th Jan
Ain't no job pay enough money to watch that.
satanspenis666 said @ 12:03am GMT on 12th Jan
I know right... I'll do it for free.
cb361 said[1] @ 12:16am GMT on 12th Jan
Don't joke. This is a serious, serious matter.

Clearly they should hire only psychopaths and child abusers for the job. Not only would Microsoft not need to worry about the role damaging their employees any further, they would probably volunteer for unpaid overtime.
arrowhen said @ 9:22am GMT on 12th Jan
I think the obvious candidates would be edgy 14 year olds, but there's probably some kind of bullshit child-labor laws standing in the way.
cb361 said @ 10:00am GMT on 12th Jan
A hundred years ago, there was full employment for children. Now, they have (almost) all been put out of work. Won't somebody think of the children?
Kama-Kiri said @ 11:49pm GMT on 11th Jan
I thought that stuff was handled by law enforcement directly or screened by affiliated companies, people who are specifically trained to be able to handle the stress as much as that's humanly possible anyway. It's pretty shocking that regular MS drones were made to do that work.
kylemcbitch said @ 11:52pm GMT on 11th Jan
One of my clients where I work is the Post Office. They apparently run a bunch of honeypot sites for child porn.

I found this out because I was doing maintenance on the server one of their sites was hosted on and found the most disturbing shit I've ever seen. I reported it, and was told why it was there.

I can't imagine doing that every day. 30 minutes for one day was enough to scar me for life.
papango said @ 1:11am GMT on 12th Jan
A lot of it is farmed out to companies in the Philippines, just like call centre work, to untrained people under massive amounts of stress.

The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed
5th Earth said @ 1:02am GMT on 12th Jan
IIRC the Google teams that do this stuff only work 3 months at a time and get in-house counseling.
hellboy said[1] @ 8:51am GMT on 12th Jan
I don't understand a few things about this story. For instance, why it's necessary to watch the videos for more than a few seconds. As soon as it's apparent what's happening, wouldn't you stop the video and report it? Why is it necessary to watch the entire video?

Also, while I'm sure looking at this stuff all day long - wait, why are you looking at this stuff all day long? How often do child porn and snuff films actually show up in random user feeds? Once you discover an example of something illegal, wouldn't the rest of your day be spent archiving *all* that person's files or communications and filling out paperwork to report them to authorities? And wouldn't most of your days consist of looking at boring innocuous stuff that's not illegal? How many pedophiles do you catch in a day, or a week, or a year? How frequent are actual infractions?

That's certainly a disturbing thought, but this section was also disturbing:

They “could literally view any customer’s communications at any time.” Specifically, they were asked to screen Microsoft users’ communications for child pornography and evidence of other crimes.

Uh... how much private information is getting violated while hunting for pedophiles? Are these people actually sifting through everyone's communications, or just targeted users'? How are the users being targeted? Do Microsoft's terms of service include consent to random wiretaps? What about bank numbers, personal data, medical information, and so forth? Stuff that isn't illegal to possess, but is normally illegal, or at least unethical, to compromise?

What about legal-but-non-vanilla stuff like BDSM videos, low-budget horror films, and so forth? Are these workers trained to distinguish between that stuff and actual illegal material? Are they disturbed by hardcore kink between consenting adults, like rape fantasy porn? Are they watching that stuff too?

Is Microsoft running honeypot sites like the Post Office (per kylemcbitch)? Isn't that in itself a violation of child pornography laws?
cb361 said[1] @ 10:22am GMT on 12th Jan
Dunno. Perhaps there are lots of false positives to trawl through. Perhaps they all have to be trawled through all the way, in case that cute couple with a taste for making home-made porn decide to introduce their young son in the last five minutes. Unlikely, but it could happen, and who would get into trouble if it was missed? And perhaps when there is a true positive, they have to write up a detailed report for compliance or legal reasons ("1:34 - 1:50 man in military uniform gouges out second prisoner's right eyeball. 2:15 eats eyeball"). Perhaps when one dodgy file is found, they have to write up all of that person's files, so there could be a lot of bad stuff in a row.

When I first read the article, I wondered why pictures/videos couldn't be pre-processed to make them blurry or otherwise altered, letting the viewer quickly see what is going on while protecting them a little from the HD details. But I guess the company can't risk cutting off customers, or exposing them to legal action, unless it has fully covered itself first; otherwise it would be exposing itself to legal action.

But, ... I dunno. We would need more information about procedures, to answer these questions.

I heard that the UK police have a server to which they can upload seized pictures/videos; it automatically checks them against a massive store of known child pornography and reports any matches. But crucially, it doesn't allow the client direct access to the stored media.
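A matching service like the one described above could be sketched roughly as follows. This is a simplified illustration, not the actual police system: real services (such as Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding, whereas the exact SHA-256 hashes used here only catch bit-identical copies. The sample hashes and the `check_file` helper are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known illegal files.
# Only hashes are stored server-side, so clients never get
# direct access to the underlying media.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-sample-1").hexdigest(),
    hashlib.sha256(b"known-bad-sample-2").hexdigest(),
}

def check_file(data: bytes) -> bool:
    """Return True if this file's hash matches a known entry.

    The comparison happens entirely on hashes: the caller
    learns only match/no-match, not what the stored files are.
    """
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(check_file(b"known-bad-sample-1"))  # True: exact copy matches
print(check_file(b"holiday-photos.zip"))  # False: unknown file
```

The appeal of this design is exactly what the comment notes: the reviewer (or an automated scanner) can flag known material without anyone having to view it, which is also why exact hashing alone is insufficient in practice - a one-pixel change defeats it, hence perceptual hashing in deployed systems.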
