(soft music) (Sarah speaking in foreign language) – [Narrator] Wake up. It's 2027. (Sarah speaking in foreign language) In 2027, Sarah takes care of everything. (Sarah speaking in foreign language) (Lionel speaking in foreign language) (Sarah speaking in foreign language) (shower water pouring)
(soft music) Sarah is a virtual assistant who knows exactly what's best for you. (Sarah speaking in foreign language) (Lionel speaking in foreign language) (Sarah speaking in foreign language) (Lionel speaking in foreign language) Everywhere you go artificial
intelligence, like Sarah, predicts your needs and
does the work for you. (male AI speaking in foreign language) With all of these
machines working for you, isn't life wonderful in 2027? (Sandrine speaking in foreign language) But let's not get carried away. Before Sarah changes your life forever, there's another story to tell, one with fewer special effects.
(eerie music) This story takes place behind the scenes of those businesses who are
working to invent our future. (man speaking in foreign language) For now, it's hardly this wonderful world where machines are working
entirely for mankind. In fact, you could say
it's exactly the opposite. – Humans are involved in
every step of the process when you're using anything online. But we're sold as this
miracle of automation. – [Narrator] Google,
Facebook, Amazon, Uber, these digital giants are using a completely invisible workforce to keep their applications running. – [Jared] There we are. – With technology, you
can actually find them, pay them a tiny amount of money and then get rid of them when
you don't need them anymore. – [Narrator] A workforce
that is disposable and underpaid. – On a very good day,
I could do $5 an hour. On a really bad day, I could do 10 cents an hour or 20. I mean, it's… – [Sandrine] Is it possible
for you to pay less than the American minimum wage? – I'm not sure we want to
go in this direction.
Yeah. – [Narrator] Whilst
millions of men and women are training artificial
intelligence for next to nothing, others are being hired
and hidden out of sight to clean up social networks. – [HR Representative]
You must have been told by the recruiting team
that you cannot mention that you are working
for this project, okay? – [Narrator] We went undercover as one of these web cleaners, working as a content
moderator for Facebook. (Gregoire speaking in foreign language) – [Pedro] There's a few things that I saw. Those things are going to stay with me because I remember them
as if it was yesterday. – [Narrator] To meet the workers
hiding behind your screen we're taking you to the
factory of the future.
The digital economy's best kept secret. – You know, it's just
like a sausage factory. They don't want people to come in to see how the sausage is made. I mean, I think it's just that simple. (waves crashing)
(boat horn blaring) – [Narrator] To delve into the mysteries of artificial intelligence, we're heading to the
West Coast of the U.S. Here in San Francisco and Silicon Valley, the world of tomorrow is being developed. It's the high-tech hub of giants like Apple, Facebook, YouTube, Uber, Netflix, and Google. (upbeat rock music) We have a meeting at Figure Eight, a business specializing
in artificial intelligence that primarily works with Google. The founder, Lukas Biewald, agreed to spend the morning with us.
– Lukas.
– Hello. Hi, nice to meet you.
– Hello Lukas. Nice to meet you. Thank you
very much for your time. – Of course. – I know you have a busy schedule. – Yeah.
– Thank you. – [Narrator] At 38 years old, this Stanford graduate has already worked for the
likes of Microsoft and Yahoo before founding his own company. (Lukas laughing) Once his microphone is on, he gives us a quick tour of their startup-style Californian office space. – This is our best dressed employee. – Oh, geez.
(Lukas laughing) – [Narrator] Cool and relaxed. – This is probably our
worst dressed employee. (Lukas and employee laughing) (bright upbeat music) – [Sandrine] Do you play table football? – I think I'm pretty
good.
I don't know maybe. (Lukas laughing) This is kind of our eating area. This is actually where I like to work. My coffee got cold.
(Lukas laughing) – [Narrator] And in the reception area, an impressive display. – So these are some of our, some of our customers. And the different things that
they did with our products. Here's Twitter. We helped them remove a lot of people that were kind of
bullying on their website. You know, American
Express. Is that in France? I don't know, yeah?
– Yeah.
– I feel especially proud of
something like Tesco, right, is able to use us to
improve their online website to show better search results, so people can find the items
that they're looking for. – [Sandrine] And I don't see Google. – No, I don't know. Do you know why some of these get up here? – We, frankly need to stop. (Brown chuckling) Because it was getting out of hand. – [Narrator] This is
Mr. Brown, head of PR. After our visit, the founder explains the
enigmatic name, Figure Eight.
– We call our company Figure Eight because we think of it as a loop. And the loop really has
these two parts, right? There's the humans that do the labeling and then the machine learning
that learns from the humans. And then it goes back to the humans for more labeling, right? So we think of this kind of
like beautiful loop, right, where humans do the best
things that humans can do. And the algorithms, the
artificial intelligence does the best things that
the algorithms can do. And we put that together and that's why we call it Figure Eight. (soft music) (Lukas laughing) – [Narrator] To get a better understanding of why AI needs humans to function, we stop joking around
and get out the computer.
– So here's an example. You know, a lot of people these days are trying to build cars
that automatically drive. Like for example Tesla has a system where you
can drive around in a car but of course it's incredibly important that these cars don't
run into pedestrians. So the car camera just
sees something like this. So it's really important that
they build reliable systems that can identify people. And the way that they
learn to identify people is looking at lots of pictures of what the car is seeing from the camera. And then actually literally
labeling where the people are. So, in this… – [Narrator] Here is a real
example of how it works. If you want to teach a self-driving car to recognize a pedestrian,
a human like you or I, at first has to identify
pedestrians from photos and then feed this information to the AI. And this process has to
be done over a thousand, even a million times over which
can be very time consuming. (upbeat music) This is where Figure Eight gets involved. Using real people who
are paid to do this work.
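The labeling workflow the narrator describes, where a human draws boxes around pedestrians and those labels become training data for the car's vision system, can be sketched as a minimal data structure. Everything here is illustrative, not Figure Eight's actual schema or API; the 10-cents-per-box rate is the rough figure Biewald mentions below:

```python
from dataclasses import dataclass

# Hypothetical record for one human-drawn bounding box.
# Field names are illustrative, not Figure Eight's real schema.
@dataclass
class Annotation:
    image_id: str
    x: int          # top-left corner of the box
    y: int
    width: int
    height: int
    label: str      # e.g. "pedestrian"

def label_image(image_id, boxes, pay_per_box=0.10):
    """A worker draws boxes around people in one image.

    Returns the annotations (future training data) and the
    worker's pay, at the rough 10 cents per box quoted below.
    """
    annotations = [
        Annotation(image_id, x, y, w, h, "pedestrian")
        for (x, y, w, h) in boxes
    ]
    pay = round(len(boxes) * pay_per_box, 2)
    return annotations, pay

# One image with two pedestrians: the labels go off to train the
# model, and the worker earns 20 cents.
labels, pay = label_image("frame_0001.jpg", [(34, 60, 40, 110), (200, 55, 38, 105)])
print(len(labels), pay)  # 2 0.2
```

Done a thousand or a million times over, records like these are what the "human side of the loop" actually produces.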
– So the task here is
to look at this picture and then label where the people are. – [Sandrine] And so you get paid for this? – You get paid to draw
boxes around the people. – [Sandrine] How much? – I'm not sure this task, but maybe it would be like, maybe 10 cents per person
that you draw a box around. – [Sandrine] Who does this job? Do you have employees doing
these jobs and labeling people? – Yeah, so it's contractors in our network that log in and do these jobs. – [Sandrine] What do you mean by contractors on your network? What kind of people? – So it's like people that
log into this and then, and then want to work on these tasks. – [Sandrine] How many people
work for Figure Eight? – In this capacity as labelers? – [Sandrine] Yeah. – So again, it's, people can kind of come and go if they want to. So there's maybe around
a hundred thousand people that kind of consistently
work every day for, for certain use cases that we have. But then there's also millions of people that log in from time to
time and work on tasks.
– [Sandrine] And where
do those people live? – They live all over the world, actually. So they live all over America and then they live all over the world. – [Narrator] So who are
these millions of people who are being paid to train AI technology? (upbeat music) In order to meet these contractors, as Figure Eight calls them,
we leave Silicon Valley and head 500 miles North
of San Francisco, to Oregon. (upbeat music) – There we are. Aha. Success. – [Narrator] Jared Mansfield signed up to Figure Eight three years ago. He now spends several hours
a week working for them. Every day, the company
offers a list of tasks that he can complete for money. For example, training search engines. (upbeat music) – For this first one, it's
showing examples of how to do it. The query is mac and cheese pierogies. And the two results are Annie's Homegrown Organic Mac and Cheese. And Annie's Real Aged Cheddar Microwavable Macaroni and Cheese, which are, neither of them are pierogies. So it's saying that both would be equally bad matches. – [Sandrine] What's the use of doing that? – A lot of it, I think is to train search algorithms.
So like when someone
sits at their computer and types a product, the algorithm will be able to
determine with more accuracy what product it is that
that person is looking for. (upbeat music) – [Narrator] For every 10 answers, Jared earns less than 1 cent. To get an idea of how
much money he can make, we leave him to work for 30 minutes.
He's answered 180 questions
over the course of half an hour. – [Sandrine] How much have you earned? – 15 cents. – [Sandrine] For how long? – A half hour. – [Sandrine] Which would
be 30 cents an hour. – Yeah. Which is pretty, definitely
not a livable wage, that's for sure. – [Sandrine] Do they have
the right to do this? – I mean, they have a right
to do whatever they want. I'm the one coming to them for little tiny bits of
coins on this website. And, there's no contract between me and them. (upbeat music) – [Narrator] No contract, no salary, no guaranteed minimum wage. These ghost workers are paid
to train software and robots using only one rule, supply and demand. (upbeat music) – It definitely feels like I'm part of this invisible workforce that is kind of made up
of just random people throughout the world. And, together we're kind of training what's going to replace the
workforce as a whole eventually. – [Narrator] Jared is very
philosophical about the idea. Still, he can afford to be. To earn a real living, he has another job selling
chicken in this supermarket for a little more than $1,500 a month.
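The piece rates in the scene above reduce to an hourly figure with simple arithmetic. A quick sketch, using the numbers quoted on camera (the 160 working hours per month used for the supermarket comparison is my assumption, not a figure from the film):

```python
# Figures quoted in the scene: Jared answered 180 questions in
# 30 minutes and earned 15 cents.
answers = 180
earned_cents = 15
minutes = 30

cents_per_answer = earned_cents / answers     # well under a tenth of a cent
hourly_cents = earned_cents * (60 / minutes)  # 30 cents an hour

# His supermarket job: ~$1,500/month at an assumed ~160 h/month.
supermarket_hourly = 1500 / 160

print(round(cents_per_answer, 3), hourly_cents, round(supermarket_hourly, 2))
# 0.083 30.0 9.38
```

On those assumptions, the micro tasks pay roughly one thirtieth of what the chicken counter does.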
Figure Eight is just
what he does on the side to earn a little extra cash. (upbeat music) After leaving Oregon, we decided to take advantage
of what we'd learned in America and sign ourselves up to Figure Eight to train artificial intelligence. On the site's welcome page, small tasks are proposed
at one, two or 12 cents. We chose this as our first task. Drawing boxes around objects in images. Following the instructions, it took us several minutes
to draw around 10 objects and earn 2 cents.
On the list of tasks, Figure
Eight also offers evaluations of search engine answers, Jared's task of choice. We could also listen to conversations and confirm if the recording features a man or a woman's voice. And if they are speaking English. – [Woman In Recording]
Hi, is James there please? (upbeat music) – [Narrator] We work for hours without ever earning more
than 30 cents an hour. (upbeat music) It's difficult to imagine
that there are people who work on these tasks
on a full-time basis. (upbeat music) We're in Maine, on the East Coast of the United States, close to the Canadian border.
We've arranged to meet with
one of the net's ghost workers, the human side of the Figure Eight loop. (upbeat music) Her name is Dawn Carbone. She is 46 years old. (car door banging)
– Bonjour. – Hello.
– Hello. Hello, hello, hello. Nice to meet you. – Thank you so much for your welcome. It's beautiful.
– Yes it is. – I've never seen so much snow. – Oh, we had a blizzard not that long ago. And then we got more snow.
And it's also, I think
negative seven out there. – [Narrator] Dawn is a single mother. She lives here with three of her children. (plastic bag crackling) – This is what subsidized
housing looks like up here. I mean, it's not bad for public housing. – [Narrator] She lives and works here, working on the Figure Eight site all day. – I'll turn it on like I said, right before seven o'clock, get the initial stuff done. I'll turn this off at three
o'clock in the afternoon and then turn it back on
at nine o'clock at night. So,
(Dawn sighing) I'll say eight hours minimum. So, I bust my butt though. Like this would be the dashboard and you can see I've done 6,445 tasks. – [Sandrine] Since when? – [Dawn] Three years. See these different badges?
– Yeah. – You start off, you have no badge. And you have to do so many
questions and get so many right. And then you get your first level badge and then now when you get to level three, you have access to virtually
all the tasks that are put up.
– [Sandrine] What is your
level right now? – Right now? Oh, I'm on level three. I've been level three for quite a while. – [Narrator] Dawn is considered
a high performing worker. Figure Eight therefore offers
her more work than a beginner but it isn't necessarily more interesting. (upbeat music) – I have to put bounding
boxes around people. I'm not really keen on this job. The biggest problem is
trying to find jobs that are viable. And right now I don't have many. – [Narrator] And it's
definitely not better paid. – On a very good day,
I could do $5 an hour. On a really bad day, I could do 10 cents an hour or 20. I mean, it's, I mean, I have had some
really, really good days until February.
Yeah. – [Sandrine] Do you think this is a fair payment
for what you're doing? – No, no, no, no. Not at all. But I live in Northern Maine. We get a lot of snow. There's a very low job market and it helps me as a stay at home mom. It helps with added income. Yeah. Yeah.
(Dawn chuckling) – [Narrator] Dawn
prefers to work from home because her youngest
daughter, Jane, has autism. – Here you go. What happened? – [Narrator] Dawn wants to
be there to take care of her when she gets home from school at 3:00 PM.
– [Dawn] So how was school?
Good day or bad day? – Good day. – Really, a good day? With her autism, I always have to be ready to jump in my car and
go get her from school. I mean, it could happen one day out of the week or not at all or three days out of the week. And the school is very understanding. So, I mean, I have to
take out the whole week, if I was working out of the home. – [Narrator] Dawn receives $750 in government aid every month, which isn't enough to
cover all of her bills. This is why she signed up to Figure Eight. By working eight hours a
day and five days a week, she says she earns on average, $250 a month on the site. (wind whistling) (soft music) On Figure Eight, the
pay is non-negotiable. If you refuse the work, there will always be
someone else to take it.
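Dawn's own numbers give a sense of the hourly rate behind that supply-and-demand pricing. A back-of-the-envelope sketch (the 4.33 weeks per month is my assumption; the hours and the $250 are hers):

```python
# Dawn's figures: eight hours a day, five days a week,
# roughly $250 a month earned on the site.
hours_per_month = 8 * 5 * 4.33   # ~4.33 weeks per month (assumption)
implied_hourly = 250 / hours_per_month

# Roughly $1.44 an hour, about a fifth of the $7.25 U.S.
# federal minimum wage.
print(round(implied_hourly, 2), round(implied_hourly / 7.25, 2))
# 1.44 0.2
```

With no contract in place, no rule obliges the platform to close that gap.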
There is an unlimited supply
of these ghost workers coming from all over the world. It's probably why Lukas
Biewald is so happy. (Lukas laughing) But he isn't the only one to take advantage of this phenomenon. Various other businesses propose these sorts of repetitive and underpaid online tasks. The biggest amongst
them being Clickworker and Amazon Mechanical Turk, a platform provided by Amazon and its boss, Jeff Bezos, who invented the concept in 2005. – Think of it as micro work. – [Narrator] Micro working is
a growing concern for the ILO, the International Labor Organization, a UN agency in charge of
protecting workers' rights across the globe. (upbeat music) – [Sandrine] Hello Janine.
Thanks for your time.
– [Narrator] Janine Berg
is the resident expert on this subject at the ILO,
who speaks to us through Skype. – With globalization,
you can see the emergence of kind of a global labor force. Here, it's the next step. It's really the service
industry that can break up work into kind of very short
little succinct tasks. And then, divulge it to
workers all over the world who compete for the jobs, do the jobs, then the price of the
wages are driven down because of this global labor supply. And the technology has facilitated this and it's cheap. That's the other, the main advantage. (dramatic music) – [Narrator] Janine Berg wrote a report calculating that micro
workers earn on average $3.31 an hour, without any rights in return. Workers' extreme vulnerability is the key to Lukas
Biewald's business model. After months of investigations, we found this video from 2010, that sums up his view of the labor force. (dramatic music) – Before the internet, it would be really
difficult to find someone, sit them down for 10 minutes
and get them to work for you and then fire them after those 10 minutes.
But with technology, you
can actually find them, pay them a tiny amount of money and then get rid of them when
you don't need them anymore. – [Narrator] While we
were interviewing him, we wanted to ask him if he
still shared the same opinion. But when we started talking
about work conditions, the Figure Eight founder seemed
to lose his sense of humor. – [Sandrine] Do you have an idea of the average revenue per
hour of your contributor? – I'm not sure. It's totally dependent on
the task that someone puts in and it's hard to track
time on the internet because people can walk
away from their computer and come back. So I don't know how much
people doing it really make. – [Sandrine] There was a report from the ILO saying that on average the
people working on crowdsourcing were paid $3.31 an hour. Would that be consistent
with what you pay? – Again, I'm not sure. – [Sandrine] Is it possible
for you to pay less than the American minimum wage? – It could be possible. – [Sandrine] So this is legal? – I'm not sure we want to
go in this direction.
Yeah. – You know what? Can we take
this a different direction? I mean, I'd rather this one
with some more AI than anything. – [Sandrine] Yeah. But
this is the whole thing. I mean, this is about
crowdsourcing as well. So I have to ask questions
on the crowdsourcing. – Oh.
– Because— – I thought it was more, I just, I prepped him for more
of an AI conversation than a crowdsource conversation. – Now I don't, I think we should, we should, I don't really want to do this. Yeah, we can find you someone
else to talk about this stuff. – [Sandrine] Okay, so
you're not comfortable with this part of the discussion? – No, no.
– Okay. – You're right, it is an
important part of the conversation but I think it's just, it's not the AI conversation
I prepped him for.
(upbeat music) – [Narrator] We don't have
time to pull up the video. Lukas Biewald makes a hasty exit, without saying goodbye and leaves us alone with his head of PR. One last chance to ask how the business treats these contractors, as they call them here. – [Sandrine] When I was working on this, I found many people
complaining, being disconnected. – And I understand that.
– And they— – And, I actually have to go now too.
So it's 11 o'clock, so—
– Okay. So you don't want to speak
about the human in the loop. – That's not my role so… All right. I think we're done. – [Sandrine] So only artificial
intelligence, no human. – Well, that's what we were prepared for. So, sorry.
– Okay. It's a pity. – [Narrator] To get some
answers to our questions about Lukas Biewald and
his views on his workers, we thought we'd try a different tack. (dramatic music) (soft music) On the day the Figure Eight founder made his statement on disposable workers, there were other
entrepreneurs alongside him, as well as a researcher, Lilly
Irani, just on the right. (dramatic music) (waves crashing) (upbeat music) 10 years after the conference we find Lilly living South
of Los Angeles, California. (upbeat music) Lilly Irani teaches at the
University of California, San Diego.
And one of her specialist subjects is the working culture
of high tech business. We're lucky she has a good memory. – [Sandrine] Do you
remember if somebody reacted after this sentence, which is
very brutal in a certain way? – To be honest, the reaction was nothing. I remember that panel, everyone went up to him to talk to him. And two or three people came up to me to talk about the ethics
of this form of labor. This is a room full of
highly educated people in San Francisco, and
nobody batted an eyelash. – [Sandrine] How do you explain that? – The kinds of people who
have access to these spaces are the kinds of people who've
never worked in a situation where they wondered if
they could make rent, or they never worked in a
situation where somebody gets sick and they can't pay someone
to go and take care of them. So they have to kind of take
a really bad job at home.
And they have no connection
to the kinds of situations of the people that are
willing to do this work. It's what happens when you
go to schools like Stanford and Harvard and Princeton that tell you you're the smartest person and you're going to be a future leader. And you've been chosen
because you're special. And that you have the
power to change the world.
– [Narrator] A Silicon Valley elite who is out of touch with
the rest of the world. This is the key to understanding
Lukas Biewald's logic. Although it's not the only part. – These workers are invisible by design. They can write code and send your work out and never talk to anyone. It's designed so you can get
the work back on a spreadsheet if you need to. You just see these letters and numbers identifying the worker,
you don't see a name, you don't see where they live, you don't see what their situation is, you don't see unless you
keep track of it yourself, have they worked for you before or not? – [Narrator] Do these ghost workers really know who they work for? Have they ever heard of Lukas Biewald? We showed them the footage
of the Figure Eight founder talking about their work.
(soft music) (Lukas Biewald talking
indistinctly on video) – With technology, you
can actually find them, pay them a tiny amount of money, and then get rid of them when
you don't need them anymore. – He's giggling over paying
people pennies and yeah! Bye bye. Okay. Now I'm going to start arguing
like I do about the AI, is when they get me agitated. (Dawn chuckling) – It's kind of surprising,
I guess a little bit to see they're so openly, openly talking about that view that they have of the workforce. It's, I guess it doesn't
really surprise me that much but yeah, it definitely kind of sucks, I guess, when they could be paying them a lot more or at least showing some appreciation or maybe even some discretion.
– Basically he's saying
in person, you know, you hire somebody for 10
minutes and fire them. This way, you don't have
to look at the person and you just, goodbye. So that's kind of just, it is kind of… The fact that the head of the company is, people are that disposable, that really isn't right. I don't like that. So I like what I do when
I have something to say and I will say it. So I'm not disposable. (soft music) – [Narrator] Amongst
this invisible workforce hiding behind your screen, there are those who feed
algorithms for next to nothing. It's the people in charge
of tidying up the web.
The social media cleaners who work on sites like
Facebook or Instagram. These workers are never mentioned in the slick presentations
of the Silicon Valley CEOs. – I started building a service to do that. To put people first and at the center of our
experience with technology because our relationships
are what matters most to us. And that's how we find meaning and how we make sense of
our place in the world. – [Narrator] Today with 2 billion users, Facebook no longer has anything to do with Mark Zuckerberg's
initial vision of the site.
(upbeat music) With violent videos, hate
speech and pornographic images, more and more content has to be deleted. And it isn't always robots doing this job. There are, once again, humans
hidden behind the screen. – Determining if something is hate speech is very linguistically nuanced. I am optimistic that over
a five to 10 year period we will have AI tools that can
get into some of the nuances, the linguistic nuances of
different types of content to be more accurate in flagging
things for our systems, but today we're just not there on that. So a lot of this is still
reactive. People flag it to us. We have people look at it. – [Narrator] These people are in charge of sorting and managing
content on the network. Facebook call them content reviewers. According to their site, Facebook has 15,000 workers
doing this job across the world. In Ireland, Portugal, the
Philippines and the U.S. (upbeat music) We contacted Facebook
request for an interview. (upbeat music) So in order to meet these moderators and understand their role, we identified Facebook's
main subcontractors. Multinationals, such
as Majorel, Cognizant, or Accenture.
(upbeat music) We found this job offer
for a content reviewer for the French market, based in Portugal. (suitcase zipping) Gregoire is one of the
journalists in our team. He responded to the ad
and was offered the job. (upbeat playful music) Before taking off, he
received his contract, which included his
monthly salary, 800 euros. A little over the
minimum wage in Portugal, with a food allowance
of €7.63 a day. (upbeat music) Facebook isn't mentioned
once in the document. Even when directly asked, Accenture refuse to
give the client's name. – I was just wondering,
now that I took the job, I'm going there, I'm doing it, I was just wondering, if I could know the name of the company I'm going to work for. – [Accenture Lady] No, we
cannot reveal the name yet. It's for one of our customers but we cannot, we are not allowed to say the name.
(upbeat music) (plane engine roaring) – [Narrator] This is where
Gregoire will be working, at the Accenture offices in Lisbon. (soft music) Before getting started, our journalist was sent
to a welcome meeting. The footage is a little shaky, as Gregoire is filming
with a hidden camera. – [Gregoire] Hello. Hello. I'm having a meeting with Accenture at 9:30.
– Just wait a moment. – [Narrator] Gregoire isn't
the only new employee. 12 other people are starting
the role at the same time. Another French person, along with some Italians and Spaniards. An HR representative is
running the welcome meeting. – [HR Representative] Welcome you all. My job as career advisor is to help you in all the
relationship with Accenture, okay? – [Narrator] After the vacation documents and social security paperwork, the small group finally find out which company they are working for.
But it's top secret. – [HR Representative]
You must have been told by the recruiting team
that you cannot mention that you are working
for this project, okay? The client is really very demanding. You cannot mention to anyone that you are working for Facebook, okay? If someone asks you where you work, you work for Accenture, okay? We still, we have this court mandate that is sealed. So if I'm talking to some
colleagues from Accenture not on this project and he
asks me where do I work, I cannot tell that I work for Facebook.
Okay? This is not allowed. It's completely like
confidential that work is, that Facebook is working
here at this facility, okay? – [Narrator] Code names,
confidentiality clauses and a complete ban on cell phones. Facebook gives you the
life of a secret agent for €800 a month. And if you're the chatty type, the following arguments should
shut you up pretty quickly. – You have like an agreement and you cannot break that agreement because by law, we can do,
we can punish you by law. You know, it's confidential. (soft music) – [Narrator] Cleaning up social media is a bit like doing your
family's dirty laundry. It has to be done, but
nobody talks about it. (soft music) Why so careful? What does the job involve? (soft music) We continue discreetly with Gregoire. (tap water running) – [Gregoire] Hi. – [Narrator] Before becoming a moderator, Gregoire has to follow a
three-week training program.
Moderating Facebook's content doesn't only involve
deleting violent videos or racist jokes. It's a lot more complicated. At the moment, the algorithms
can't handle everything. Every decision must be justified
using very strict rules. This is what we learned
during the training. Every day is dedicated
to a different theme during the program. For example, nudity, violent
images or hate speech. On the agenda today, dark
humor and jokes in bad taste. – [Tutor] We will remove it as a violation if the person that you see in the image (we need to have a real person) is visibly threatened.
If you are making fun of the event, then it's going to be marked as cruel. What do we do when there's
a mocking of the event? – [Students] Mark as cruel. – [Narrator] Here's an example
of an inappropriate joke about 9/11. (playful music) It may seem over the top but there are dozens of rules
like this for each category, which can be difficult
to get your head around. Take nudity for example. Depending on what part
of the body you see, or their position, the moderator can't always
make the same decision. Here's an example from the
exercises to better explain. Gregoire decided to delete
this particular photo, but according to Facebook's rules, he was wrong to do so. In the feedback session, the trainer offers this explanation. – [Trainer 1] If we cannot see, and his head is not here, then it's ignore. It's in between her boobs, so if I don't see directly
the contact with the nipple, it's nothing.
– [Gregoire] You know, that's exactly why I am having so much
trouble to understand things. You have an artistic
picture of a photograph, of a woman and you show
a tiny nipple on it. And so on one hand, this is a delete because we have a hundred
percent uncovered nipple. On the other hand, you have this almost pornographic picture, and you don't delete because
it doesn't fit the rule. That's exactly why— – [Trainer 1] Yes but
you have also a problem because you're still
going from what you think and your decisions and, we're in school to learn rules. – [Narrator] Applying Facebook's rules without questioning them
is the number one rule. A principle that will be drilled into you all day, every day. – [Trainer 2] There has to be a line. And they drew it around that, okay? We just need to respect it. And we just need to
apply it to do our jobs. – [Trainer 3] Sometimes,
we'll find disagreements, but I mean, this is still the good job because this is not my
social network, it's theirs.
– [Narrator] A training program with the end goal of
turning you into a machine. – [Gregoire] See you later. (dramatic music) – [Narrator] Pedro worked for six months as a content reviewer for
Facebook at Accenture. He agreed to respond to our questions but only if he remained anonymous. Two years after leaving the company, he still remembers the
numbing side of the role. – [Pedro] You have to play by their game or else you won't have a
job at the end of the month. And it got to a point where
I just felt I was a robot and just doing as many pictures and videos as much as possible, just because I was just, that's the only thing I can do.
You're just there with numbers and clicking enter. Numbers, enter, numbers, enter. – [Narrator] The hardest thing for Pedro is trying to forget everything
that he saw on that screen over six months. (tense music) – [Pedro] You're not prepared for it. We're not mentally prepared for it. All this stuff, they don't
really give us the inputs before. And it just comes to you as a shock. It just comes through like a wave. Here, have this in front of you. And you can't really say yes or no to it. If you give me a million
euros or billion euros, I wouldn't go. It's not for me.
I don't know. (soft music)
(people speaking indistinctly) – [Narrator] What Pedro described to us, the wave of shock that
washes over you unexpectedly is exactly what happened to Gregoire. It started around the
fifth day of training during the practical exercises. (Gregoire speaking in foreign language) A stream of horrific images and unbearable videos that
must be watched closely in order to make the right decision, according to Facebook's criteria. (Gregoire speaking in foreign language) (Gregoire's colleague
speaking in foreign language) The same horrific scenes are unfolding on his
neighbor's screen too.
(Gregoire speaking in foreign language) – [Gregoire] Excuse me. May
I take a glass of water? (dramatic music) (Gregoire speaking in foreign language) (upbeat music) – [Narrator] It's like
this on a daily basis for Gregoire and his group. Luckily, they can always
rely on the useful advice of the trainers to feel better. – [Trainer 2] If you feel
uncomfortable with the content, please do warn me and we'll do a little
pause, a little break. We'll go outside, do the Macarena, okay? And then we'll come back. – [Narrator] If the Macarena isn't quite enough to cheer you up, the company also has
psychologists available for the most traumatized moderators. (Gregoire speaking in foreign language) On this day, a video
lasting several minutes brought the violence to
another level for Gregoire. (Gregoire speaking in foreign language) (people speaking indistinctly) (door closing) During the break, everyone
tries to shake off the shock by discussing the grim video
they've just witnessed. – [Moderator 1] A girl was with two guys, and they were like playing with a gun, and suddenly the girl shoots the guy but he was like,
(Moderator 1 gasping) then he was like…
(Moderator 1 gasping) (Moderator 1 speaking in foreign language) At the moment I feel very bad, like, I don't know, but I think that the film didn't last a lot of time, you know? At the moment I feel very,
very sad, I don't know, but then I can continue.
– [Narrator] Gregoire realizes
the extent of the damage this job can cause when
talking with a former moderator who is now a trainer. – [Trainer 4] I have
trouble like on the street because I just see people being hit. In my brain, I see so many accidents that like, I cannot process. Like I just saw today, fuck off everybody. He was running across the
street, like I cannot anymore. – Oh yeah. You can't take it?
– Yeah, so it's— Yeah. It's like, kind of a mini PTSD. – [Gregoire] You've got that? – [Trainer 4] Yeah, I mean
I don't take medication, but I have to be like this. I can't watch people running
across the street anymore. – [Gregoire] You're still
doing this while you have PTSD? – [Trainer 4] There is a
purpose. I do feel every day like I'm cleaning the trash
from the internet, you know? Okay, I will watch it but at least I know that
I'm going to watch it. Someone who's 14 years old is
going to get that and not know. (soft music) – [Narrator] Even two years
after quitting the post Pedro still has very vivid
memories of certain videos.
– [Pedro] There's a few things that I saw. Those things are going to stay with me
because I remember them as if it was yesterday. It's very emotional sometimes. I remember sometimes people used to like, they were working, being productive and suddenly they just stand
up and run out of the room. That's okay, because sometimes there's… – [Narrator] The trauma built up and, for Pedro, left him feeling helpless.
– [Pedro] If you see
someone getting murdered, the only action you take
is delete, for example. You just erase it out of the platform. You don't really go into depth of like calling the police for example. It's like, you never really feel content with what you're doing. You're just going round
and round in circles and just, like bombarded
with all this stuff. So, it's a good mixture of
emotions that you go through in one day. Eight hours of work. – [Sandrine] How many of you
were there when you started? – [Pedro] We were 30 when we started. 30. From that 30, that started
just decreasing month by month until now there's only like three people. (soft music) – [Narrator] Pedro claims
that a lot of people struggle to deal with the
role and end up quitting. To understand what Pedro went
through and what Gregoire and his colleagues are
currently experiencing, we met up with a psychiatrist.
Professor Thierry Baubet is a specialist in post-traumatic stress disorder. For example, he works with police officers who have been involved
in terrorist attacks. We show him the footage we filmed. (soft music) (Thierry speaking in foreign language) (Sandrine speaking in foreign language) We also talk to him about the famous confidentiality clauses imposed by Facebook. (Thierry speaking in foreign language) Anxiety, trauma, stress: cleaning up social media
comes at a great cost. Gregoire decides to quit
only two weeks later, still in his training period. (upbeat music) He received his paycheck
just before leaving. His hourly pay is written
at the top: €4.62 gross. This is a tough pill to
swallow for his colleague. – [Moderator 2] I was earning more in the ice-cream shop
that I was working at.
– [Gregoire] The ice cream
shop? Damn, that's bad, right? – [Narrator] After our experience there, we contacted Accenture. Their response was a brief email that didn't once reference Facebook. It did, however, contain this phrase: "the well-being of our
employees is our priority." (upbeat music) To finish our tour of the
Internet's trash cleaners, the invisible workforce behind your Facebook or Instagram feed, we had one last meeting. Sarah Roberts is the leading researcher specializing in those
who work as moderators. She is a key figure in this field. We met her at the
university where she teaches in California. She presented us with an analysis of the rise and development
of content moderation over the past year. – We are talking about
a scope and a scale of magnitude that has
not been seen before. Billions of things shared
per day on Facebook, hundreds of hours of
video uploaded to YouTube per minute per day and so on.
The response has continued to be, we'll put more content moderators on it, which means that that continues
to exponentially grow. And it has gone from a next to nothing kind of
line item in the budget to being a massive, massive cost center. Meaning it doesn't
actually return revenue. It's not like a new product. It's just seen as an economic drain. And the way we manage that problem is by pushing it onto
some low-wage workers to do it as cheaply as possible. Because again, that stacks up
when you double your workforce in two years, that does not come for free.
– [Narrator] This is why
companies like Facebook use subcontractors. But according to this researcher, this isn't the only reason. – It's about labor costs. But it's also about creating layers of lessening responsibility between those who solicit
this kind of work and need it, and those who do it and where they do it. They remove themselves. They put themselves at a distance from the workers and their conditions and it's not just a geographic distance but sort of a moral distance. So when that content moderator
some years later alleges harm or is having trouble
psychologically or emotionally because of the work that they did, then it may be possible for that company to disclaim responsibility for that, even though ultimately
they really are responsible because they ask them to do
that work in the first place.
– [Narrator] Despite these precautions, three former moderators filed
lawsuits against Facebook in the U.S a few months ago. All three were working
under sub-contractors. All claim to be victims of
post-traumatic stress disorder. The American company refused
every request we made for an interview. They did, however, send us an email explaining that Facebook, together with its partners, pays great attention to the well-being of the content moderators working on its platform, which is an absolute priority. (soft music) To finish off, here's
some of the latest news from the sector. While these ghost workers
are left in the shadows, it's business as usual for the companies working in this new sector. A few weeks after filming,
Figure Eight's founder sold his company for $300 million. (Lukas laughing) Well, at least now he has
good reason to be happy.
(soft music)