Updated: 30 Oct 2019
In the exhibition "Praying for my haters" at the Centre Culturel Suisse in Paris, artist Lauren Huret looked behind the curtains of the internet industry. Huret travelled to Manila in the Philippines to meet the people who work for big tech companies like Facebook or Instagram, screening the flux of images on these platforms for harmful content. An integral part of her artistic practice, alongside her on-site research, is the accompanying publication, in which she interviews experts on the ethical and psychological implications of content moderation and the effects the screening process has on the workers.
We sat down with the artist and talked about her trip, cursed images and heteromation.
First of all, what led you to Manila? What were you looking to find?
Back in 2014, I found an article by Adrian Chen published in Wired about content moderation in the Philippines. Reading it was a shock. Although the majority of the tech companies are based in Western countries, most of the dirty work is done elsewhere – meaning in countries where it's possible to pay very low wages for an atrocious type of work, and obviously without any consequences.
These tech companies have built an online environment in which we feel safe sharing personal bits of our lives. But this content is seen, analyzed and used by the companies. Sharing things online is the beginning of a long process of lucrative steps for companies – and harmful ones for content moderators, who are seriously harmed by having to look at problematic and violent content all day long. That's why I went to Manila: to confront myself with the reality of this work and to see the effects of the online sharing economy.
Which companies make use of this service, and how is this image selection organized?
Basically all companies whose platforms are based on user-generated content (UGC). The process is very complex and different for each company, as are the guidelines, which are subject to country laws and moral taboos.
Could you explain to us what kind of visual content gets filtered?
It's very hard to explain, since every company has its own rules and the process is very complex. For instance, it's allowed to share videos of two people fighting in the streets as long as the caption mentions that it's wrong or should not be reproduced. But if the caption encourages violence or spreads hate, it will be removed from the platform. In some cases it is obvious that a video is hard to watch, but in other cases it is difficult to define or to understand the real implications of a piece of content.
Lauren Huret, Manila stories (chasing ghosts on social media), 2018, © Lauren Huret
In many of the artworks you show in "Praying for my haters", you return to the images generated by social media, taking on, for example, the format of Instagram stories. Can you tell us a little bit more about these works and what they show?
After learning about all this back in Manila, it was definitely hard for me to share anything online. But I decided to still use the tools and apps to produce images and video related to my trip – like a diary, but only 'shared' in the art space. I used the format of Instagram stories: loops of my daily life and weird visuals that together create a narration. And I am still creating these short videos that are bits of my life; I call them "luxurious gifs".
In your theoretical work you often speak about cursed images. What do you mean by the term cursed images and in which way are they cursed?
'Cursed image' is an 'esoteric' term I came up with while trying to understand the effects of an image from a different perspective – a more spiritual one. I tend to think that only recently have we come to see so many images, so diverse in their meaning, reception and use. As an artist, it was important for me to reflect on the 'performative power' of an image. Images are 'shaping' us much more than we tend to think.
"Depression hits fast, suicide rates are huge in this work field."
How do the content moderators cope with seeing such harmful pictures every day?
If you're looking at videos of rape all day long, this will become your reality and you will suffer from it, consciously or not. It's also called torture… Depression hits fast, suicide rates are huge in this work field. Some people cope, some can't… I heard stories of people completely changing their lifestyle after this job. I heard of people developing serious OCD and having hallucinatory experiences. One guy who worked for Microsoft, and who sued the company over his depression, hears voices of children being raped.
There are also a lot of Christian figures on display in your exhibition. How do these ancient iconic images play into the world of digital images?
Even if the Catholic imagination and imagery is not exactly non-violent, I would say it was some sort of relief to use this imagery because it's so ancient and somehow oddly calming. The figure of Christ represents suffering, and some content moderators identify (in a very sane way, I think) with this religious figure. It's their way of coping with the situation and their reason to keep going – thinking that they are martyrs for a 'better internet'. I found this really inspiring and courageous. You have to put images, stories and words to your suffering, which religious belief can give you.
Depending on their cultural background, everyone probably judges differently what they find harmful. Some might find pictures of gay men kissing, or artworks that show naked women, harmful, while others don't. What rules generally apply, and isn't this also – in certain cases – a kind of censorship?
The most obvious and annoying one is the nipple. When the nipple belongs to a shirtless man, it's all fine. But when it belongs to a woman, it has to be censored. As a woman I am tired of this, because it still reflects the male gaze and how we are supposed to act and show ourselves. The guidelines are written by people who tend to assume that what they think should be the norm.
Regarding censorship, it's a complex question because social media relies on problematic content to keep going. We know now that as humans we tend to react when we're outraged, and when something is fine, we do not react as strongly. It's in the companies' interest to make us feel that we should share, react, be mad, etc. If the network were filled only with pictures of cute dogs, it would be boring. Social media companies want engagement from us, so you need that outrageous content from time to time… but not to the point that it's impossible to watch. Fill the beast with porn and people will just leave the website: no people, no ads, no money.
The most obvious and annoying one is the nipple.
On one side we have the content moderators, who have to look at these violent images every day; on the other, we too consume a flood of images daily, though they might not be as violent. Does this constant stream of images affect us as well?
I do think so. I see myself change; I observe myself as an experiment, and I did some experiments on myself. I've seen my friends and family changing, modified by what they saw or read. It's the oldest question about the power of all media: how it affects us. Political elections are a striking example nowadays (even if media changing the course of an election is not new at all). But I consider media in a large sense, in the definition that Yves Citton gives: words are media. They are a way to transmit information. Somebody will read this interview and form an opinion; maybe it will modify the way they see the internet and social media, maybe they'll think I sound pretentious! We need images to transmit emotions – that's why we're using emojis so much. We need to transmit our emotions with our messages; otherwise, they can be interpreted in so many different ways.
Another important aspect of your artistic practice is the accompanying publication, in which you interview experts on the subject of content moderation. In your interview with Yves Citton you talk about heteromation. Could you explain to us what this term means?
Heteromation is a term coined by two researchers to describe the following process: work that is actually done by humans but presented as if it were done by machines – automation or computation-based work. Somehow, heteromation is an illusion. When you are on Facebook you don't think of the thousands of people endlessly reviewing the content, the teamwork behind the code, the energy and money needed to make that possible. You are seeing a facade. And you yourself are putting in work by sharing stuff, commenting, looking… Someone also coined the term Potemkin AI, which I really like. It means that the supposed automation is all done by humans behind the machines.
When we think of content moderation, why do we think machines or algorithms are at work, and not humans?
I would say it's a long process of being persuaded to use a tool without really knowing how it works, its core implications and effects. We've been fed stories about machine learning, automation, neural networks. These words work like smoke and mirrors, and we tend to believe in these concepts so easily. It's a mix of a lot of very smart PR, ads and confidential information. It's simply a matter of playing with our ignorance on the subject.
Are we all now merely an integral part of a computation system? What does that do to our social, economic and ethical standing?
What you're referring to is very much linked to cybernetics – the idea that we are part of a network and that everything is mathematical and programmable. I don't think like that; I don't want to. Things are way more chaotic and unpredictable than we can imagine. As for us, it is up to us to change the way we create and use things: by making, criticizing, never stopping thinking and always challenging what's there.
Lauren Huret, Praying for my Haters, installation view, 2019, Centre Culturel Suisse, photo: Margot Montigny
In your exhibition you installed an architectural model of one of those anonymous buildings that you find all over Manila, where the content moderators are at work. Viewers can step inside and find themselves in a panopticon, one that lets them see only multiplied, mirrored images of themselves. Are these the sweatshops of the modern digital world, a place no one can get out of?
I conceived this sculpture with a friend, Benjamin Elliott, and we took some time to really understand what was at stake. We sat in the studio and thought about how to represent something that is very tricky to represent: the secrecy and horror of this job. We read Dante's Inferno and tried to draw on the architecture Dante describes in his book, those circles of hell. From that, we tried to imagine another circle, where people have to look at horrible stuff all the time. So we ended up making a secret space, a server room that is both where the images are stored and some kind of mortuary, a crypt with candles burning. The whole sculpture is a very high table, and you have to go inside to discover that specific space. On top of the table there is a skeletal architecture, empty and dispossessed of its function. The entire sculpture describes the literal idea that you have to step into something, to be within, to see for yourself from inside. Do the research, do the work, never stop thinking.