
The agony of a life without privacy. Or with too much of it. An interview with Mark Farid

Many artists and activists have worked on projects that denounce the loss of privacy online. Very few, however, have put their social, financial and mental well-being at risk in order to expose the damage that a carefree attitude towards our own digital footprints can cause.


Mark Farid, Seeing I, ongoing


Mark Farid, Data Shadow, 2015

But Mark Farid did just that! Deeply convinced that “online privacy is the only right we have left”, the artist decided to give away his entire digital identity to anyone who’d want it. Back in 2015, he participated in a panel discussion entitled “Data Shadow: Anonymity is our only right, and that is why it must be destroyed” in Cambridge. After a presentation of his practice, Farid surprised the audience by displaying a document listing the login details of all his online accounts and inviting everyone to use them as they pleased. Within minutes, people he didn’t know had changed most of his passwords, from his online banking account to his Apple ID. The accounts were no longer his.

From that moment on, he embarked on a painful adventure: he lived with no digital footprint for 6 months, using multiple pay-as-you-go phones, paying for everything in cash, scrambling his IP addresses, etc. The experience was not only costly, it also made his social life ridiculously complicated.

After an experiment that suggested that Farid had something to hide, the artist went back to normal modern life, the one in which most of the exciting things you do happen online and are promptly turned into sets of data that both governments and corporations can snoop on.

Then came September 2016, when the artist decided the time had come to further destroy his online privacy and demonstrate that he had nothing to hide. For a full month, he live streamed his digital footprint. The performance, titled Poisonous Antidote, involved exposing online and in a gallery in London all of his personal and professional emails, all his text messages, phone calls, Facebook Messenger, web browsing, Skype conversations, locations, Twitter and Instagram posts, as well as any photographs and videos. The work disclosed his interactions and daily life but it also questioned the assumption that you can fully comprehend a person through only their digital footprint.

Mark Farid, Data Shadow, 2015

I discovered Farid’s work at the Strasbourg Biennale of Contemporary Art, the first edition of an event that invites us to reflect on what it means to be a citizen in the age of hyper-connectivity. The biennale remains open until 3 March. I’d recommend you swing by the charming city to see the exhibition if you’re in the neighbourhood. If not, here’s a transcript of a conversation I had with Farid after I met him in Strasbourg. We talked about the density of our digital footprints, the mental pain of spending months off the digital grid, and his plan to spend 28 days wearing a VR headset to experience the life of another person.


Mark Farid, Poisonous Antidote exhibited at Gazelli Art House in 2016

Hi Mark! Recently, while I was visiting the Strasbourg Biennale, I saw the premiere of Poisonous Antidote, a film that you made in collaboration with film-maker Sophie le Roux to reflect on the whole performance. The press kit of the biennale mentions that “there was no restriction on the content publicised”. But surely you did censor yourself sometimes, didn’t you?
So were there any self-restrictions?

There was absolutely a degree of self-censorship and, equally, there was absolutely an – arguably greater – degree of performing at the same time.

Poisonous Antidote had two sides to it for me. The first was for the audience: just how intimately you are able to comprehend a person – their humour, temperament and rationale – through only their digital footprint. When you listen to my phone calls with my dad, read my text messages with a girl I was seeing at the time, see what I was searching and where I have been going – very quickly, you really do start to get a very good idea of who I am.

The other side of it was more personal: it was for me to see to what degree knowing that everything I was doing was digitally documented forced me to see, and judge, myself objectively. This of course resulted in self-censorship – limiting what I would search, what I would say to people, and most of all, how I would interact with people. I wouldn’t lose my temper, for example, except for once, at my dad, which is in the film.

It also resulted in ‘performing’ – I went on holiday with my dad for the first time in seven years during that month. I visited Leicester, where my parents live, three times that month, when normally I go back home once every three months. It also meant that I was going to lots of museums and galleries, going on walks and reading a broad range of things online instead of predominantly reading about football; I was trying to look like an interesting cool artist who does what a cultured artist should do.

Contrary to what I was expecting to do – which was to limit what I was doing – the performance actually opened things up, and essentially forced me to do these activities, which I otherwise probably wouldn’t have done to the same degree, or at all.


Mark Farid, Poisonous Antidote exhibited at Gazelli Art House in 2016


Mark Farid, Poisonous Antidote exhibited at Gazelli Art House in 2016

Do you have any idea about the type of people who were watching the performance? Were they mostly friends who checked the website to support you? Or were they mostly complete strangers?

There were just over 32,000 views on the website during September 2016. The views came from around the world but predominantly from Russia, the USA and the UK. And obviously there were also visitors at Gazelli Art House, London, the gallery which funded and exhibited the project.

To date, there have been 47,000 views on the website.

How about your friends and family? Did you make them aware that any interaction with you was being streamed online, for anyone to hear or read?

On the first day Poisonous Antidote started, I sent a message to all my contacts (on my phone) to inform them that everything for the next month would be broadcast: emails, text messages, Facebook Messenger, WhatsApp, and phone calls would be published online, in real-time, on www.poisonous-antidote.com. Pretty much every time I was on the phone I would inform people at the beginning that the conversation was being broadcast, and if they started getting slightly too intimate with me, then I would also remind them of the situation.

There were three people who had an issue with this and refused to talk to me for that month. But to my surprise, only three people! Many of my friends liked it and some had a bit of fun with it and started trolling me, which annoyed me a little bit at the beginning, but I would have done exactly the same thing!

I’m curious about the people who didn’t want to engage with you at all during that month. Were they a bit older, from a generation generally viewed as being more protective of their privacy?

No, they were in their 20s and 30s. There was also the Director of arebyte Gallery in London, who wasn’t too happy with our conversations being broadcast. Funnily enough, his is one of the first phone calls in the film. The last one too. He didn’t like it but ultimately he had no choice but to get in contact with me as they’re funding my next project, Seeing I. But it nicely highlights that even if you want privacy, and you were to do everything within your means to ensure your privacy (he does not), other people’s indifference to data privacy will ultimately be your demise.

But really, most people didn’t care. And that’s something I see elsewhere in my work, which until very recently I found very surprising.


Mark Farid, Data Shadow, 2015


Mark Farid, Data Shadow, 2015

Poisonous Antidote was the third and final part of a series of projects, wasn’t it? Can you tell me more about the other parts?

To coincide with the first draft of the Investigatory Powers Bill (Nov 2015), the first part of this three-part series was Data Shadow (2015), an interactive art installation commissioned by Collusion, in partnership with Arts Council England, the University of Cambridge and The Technology Partnership. Located in All Saints Gardens, Cambridge, the installation required all visitors to interact with it independently, entering one at a time.

On entering the 8 x 2m shipping container, the participant was greeted by a woman holding a contract of consent. Until it was signed, the woman remained silent, staring the individual directly in the eyes (a physical manifestation of Terms & Conditions). Once it was signed, participants proceeded to join the WiFi.

Participants progressed to the second half of the container, divided by a partitioning door. With sensors tracking the participant’s movements, their real-time (digital) shadow was cast by a projection on two facing walls – one filled with 1000 characters from their most recent text or WhatsApp messages, the other a collage of 64 images from the participant’s mobile phone.

On opening the exit door of the shipping container, all information Data Shadow had on the individual was automatically deleted, ready for the next participant to proceed.

To my surprise, the overwhelming majority of people were not annoyed, scared, or bewildered by it at all?! There was an 18-year-old girl who told me it was “so cool” that I had managed to expose the naked pictures of her and her boyfriend that were on her phone.

The second part of this series was the accompanying talk to Data Shadow at St John’s College, University of Cambridge, in which I shared the login details to my entire digital life with the audience, inviting them to take, use, share and change the passwords to my accounts as they pleased – accounts which ranged from my Facebook account and Apple ID to my online banking – whilst I attempted to live without a digital footprint for 6 months. If you want to know more about both of these projects, please watch my TEDx talk.

And then obviously Poisonous Antidote – the exhibition and online stream in 2016, and then the film in 2018 – formed the final part of this series.

Mark Farid, Data Privacy: Good or Bad? at TEDxWarwick

There are many discussions about the need to value privacy as a protected right. Yet, I often feel that most people do not really care. Do you think that this is due to people not realising what a loss of privacy entails or is it simply that we have new definitions and forms of privacy?

Personally I think it’s because no one is giving a satisfactory response to the statement, “I have nothing to hide”. I put a lot of blame on this statement, as it really angers me. Mainly because it’s not true – everyone has something to hide – but more so because I doubt that everyone, on their own, universally came up with this statement, so they’re simply regurgitating an incorrect statement that shuts down the conversation. Why it shuts down the conversation is because the phrase “I have nothing to hide” is based on a presumption that we are guilty until proven innocent. The statement flips the subject, quietly but firmly putting the onus on you to explain why you have nothing to hide. It suggests that you are guilty until proven innocent, and this fundamentally goes against innocent until proven guilty. But as I say, I don’t necessarily blame the people making this statement, I blame others – myself included – for not being able to give a good, snappy one-liner response to it.

To come back a bit more to your question about people not caring about data privacy, I had always ultimately assumed it would take a huge public breach of data privacy before people changed their view. And then Cambridge Analytica happened and I thought that would play the role of offering a counter-response, by showing what happens when private data is used politically and in the wrong way. However, it still hasn’t brought any real results – legally, socially or culturally – in fact, it is Mark Zuckerberg and Facebook who are suggesting the regulation that should be imposed on them! Truly crazy!

I feel that where work like mine has missed the point is that I think people need to be confronted with their data in front of loved ones. But then this becomes too unethical, and that is the problem: data misuse is hugely unethical, and to really highlight it, I think you must be equally, if not more, unethical.

Still, I feel that we’ve reached a point I find a bit unpleasant. I was recently at the Chaos Communication Congress in Leipzig and that’s probably the only conference I attend nowadays where they tell you specifically that if you want to take a photo of the audience you have to ask people for their permission to be photographed first. Everywhere else, people do as they please and you end up finding photos of yourself drinking wine and making stupid faces on strangers’ Instagram accounts…

Speaking off the top of my head, this is touching upon two truths, I think.

First there’s the idea that in a free, neo-liberal world, the individual is at the centre: the individual is free to do as they please; asking for permission to take a photo of someone in a public space can be seen as a restriction of the individual’s (the photographer’s) liberty (to do whatever they want wherever they are). Obviously this is a slightly warped take on neo-liberalism as we’re talking about taking photos of people, but I think this is one of the roots of what normalised people taking photos of whatever they want. Of course then there is the uploading of the photos to the world wide web, but extremely rarely do people stumble across photos of themselves taken by strangers. But I think this has a bigger foundation in what the world wide web used to be, something free, open and for the people. Back in the day, pre-Facebook, when Google was simply a search engine, when Amazon was a bookshop, ideas of the internet, and even of filming and taking photos in public spaces, were completely different. But now, as the world wide web becomes more and more centralised, monopolised, and a capitalist utopia – totally privately owned, essentially unregulated by governments, with a facade of being uncensored, free and democratic – we are the commodity and our data is the currency. This changes everything.

This leads us nicely into the second truth, which I think is a development of the centralisation of the world wide web. Facebook, Google and other privately owned online companies want your data, and want to know everything about you. Not you specifically, but you within the collective. Governments are quietly happy with this, for obvious reasons, along with the monopolisation (of the world wide web by Google and Facebook), as it’s hugely expensive and time consuming to get warrants to demand that lots of different companies hand over data. Getting two warrants, and forcing those two companies to do so, is much easier and quicker than getting 20, for example. And then once they do get the data, getting it from one or two companies means they don’t have to process, organise or, most importantly, analyse your data themselves to nearly the same degree. If your data is coming from 20 different companies that all use different formats and (code) languages, then you have to change languages and formats, then aggregate and process all of this, and so on, which is very expensive and time consuming.

It’s significantly cheaper and more efficient for governments, advertisers, or anyone for that matter, to have one or two companies holding all of your data.

And then of course, when Google, Facebook and the Government are ultimately in favour of a very similar, particular model, the subtle and continuous messaging from these companies and institutions will push the same philosophy – that data privacy doesn’t matter, or, ‘I have nothing to hide’. Over time, this subtle message becomes ingrained, and when you combine these incredibly powerful and influential companies (and governments) you start to realise just how hard it is to argue, and succeed, in the fight for data privacy.

Coming back to your performance, I must say I found it very shocking, even though there is a long tradition of artists exposing their private life in the most open way (people like Tehching Hsieh, Marina Abramovic, Yoko Ono, etc.) I should be immune to a work like Poisonous Antidote but I’m not.

How has this one month performance changed the way you view social media and digital technology in general? Did it change the way you are using it?

Interestingly, Poisonous Antidote was actually the thing that got me to start using social media again. It really was my Poisonous Antidote. I’m not too sure about the title of the project, but it is a fitting one for my personal experience. As I mentioned earlier, when Data Shadow was being exhibited, and I gave the accompanying talk titled Anonymity is our only right, and that is why it must be destroyed, I gave away the passwords to my entire online life, from my Facebook account, to my Apple ID, to my online banking account login details, and everything in between. This included my social media accounts: Facebook, Twitter and Instagram. That was in October 2015. Poisonous Antidote was September 2016. So for that period I didn’t have any accounts.

It was only halfway through September 2016 that I started to use (new) social media accounts, and this was to see the difference between being without social media and with it, and how that would change the whole experience for me…

What!??! You didn’t have any social media account for such a long time?!

No, when I gave away my login details, people pretty much immediately changed the passwords to my accounts, and that was that. They were gone and I had no way of getting them back. Someone had been using my Facebook account for quite a while, with some friends getting messages from whoever had the account. Someone is still using my old Twitter account (@markfarid). They set it to private and occasionally tweet at me, which is both frustrating and fun. It’s worth noting that I didn’t have any kind of account for this period – from social media to an Apple ID to a smartphone. In any case (and ironically) Poisonous Antidote – a fight for data privacy – made me go back to social media.

But yes, it changed the way I use social media, a bit, but social media – and most websites – are quite limited in what you’re really able to do. And as was the case with the gallery director of arebyte Gallery (in the Poisonous Antidote film), other people significantly reduce all of your efforts.

Do you do anything specifically different?

I’m very clear about the way I interact with everything and treat it as if it were a public conversation. So for example, the way the UK is moving at the moment, regrettably I won’t be surprised if we, by necessity, have a form of privatised healthcare in the future, and social media data, amongst other things, will almost certainly be one of the things used to decide how expensive or cheap your medical insurance is. So I’m extremely cautious about the kind of things I’m saying to friends and family on Messenger, along with any photos of me, and what I’m doing in them.

I do as much as I can, within reason, the ‘headline’ thing, I guess, being that for every service that I use online (Facebook, Twitter, Skype, etc.) I have a different email address linked to it. Each email address is linked to a pay-as-you-go SIM card that’s paid for in cash. Facebook is linked to one email address. Twitter is linked to another one, as is every sign-up I do. Nothing is linked together. I also use fake information, such as fake names and wrong birth dates. Ultimately, if someone wanted to link these accounts together, they could, it’ll just involve a bit more legwork for whoever is doing it.

How many email addresses do you have?!

Haha, I’m not sure. Many many dozens.

But then it makes you look suspicious.

I guess so, but the idea of ‘Anonymity is our only right, and that is why it must be destroyed’ was that with various strangers using my old, but real, Facebook account, Twitter account, JustEat, Quora, etc. there shouldn’t appear to be anything suspicious, as the accounts are being used. That’s the reason why I gave away my passwords: because you’re not able to delete your accounts, giving away my passwords was the only way I could think of to make the accounts redundant and make the data useless, and at the same time, become invisible, for want of a better word.

How did ‘Anonymity is our only right, and that is why it must be destroyed’ go? How long were you doing it?

‘Anonymity is our only right, and that is why it must be destroyed’ was from October 2015 to April 2016.

When I gave away my passwords in October, the plan was to live for six months without a phone or computer. About 3 weeks after the start of the project, the Paris attacks happened, which changed everything. For me, the Paris attacks were the 9/11 of Europe. That was the big moment when terrorism became really present in Europe – innocent people being shot in restaurants, at concerts and on the streets. The political and social landscape changed overnight. The argument for data privacy fell flat on its face. The argument for data privacy is built upon individuality and the option to not conform, which is hugely outweighed when confronted with life and death.

During this 6 month period, without a phone or computer, my financial life plummeted beyond all belief. Along with my social life. To talk to me, people had to travel across London and knock on my door and hope I was in.

At one point, my dad hadn’t heard from me for a few months and was incredibly worried. He drove three hours to London to see if I was alive. I returned home to my flat to see a hugely worried, angry, but relieved parent.

At this point, I ‘decided’ I would get a different phone each month and would give the number of that phone to 5 people each month, and it would be a different 5 people every month – to make it harder to link it to me, i.e. if you had my phone number one month, you wouldn’t get my new one the next month, and this included my parents. Due to a plummeting financial situation, and a lack of work coming in, I also got 3 different computers, each of which I would only use for very specific purposes, at very specific locations, at specific times, whilst using a VPN and Tor.

What I found most traumatic in this experience wasn’t the horrific financial struggles, nor the lack of a social life. Not even the fact that someone committed fraud against me during this time and totally destroyed my credit rating – to the point that I was rejected for a new phone contract last year. All of these things were bad but manageable.

The hardest thing, however, was the cultural vacuum that I lived in during this period. Each day, I was getting more and more disconnected from the ‘real’ world, and what I found was that, slowly, this was affecting how I was interacting with people, what we would interact about, and my overall confidence. For example, it took me over 24 hours to find out about the Paris attacks. If you were on Facebook – which I had no access to – things were going crazy and you couldn’t not know about it, but in the physical world, people were barely, if at all, talking about it.

I absolutely became depressed and developed quite a serious dependency on drugs a few months into this, and I became a hermit in many ways. In a very simplistic way, one drug replaced another. It wasn’t sustainable – although I did do the 6 months, it was truly awful. Having privacy to allow my sense of self to be free is not what I thought it would be.

I’d recommend watching my TEDx talk, if you want to know more about ‘Anonymity is our only right and that is why it must be destroyed’ and Poisonous Antidote.

So Poisonous Antidote followed this in September 2016?

Poisonous Antidote was my flip back, because I found that the exact opposite happened: in totally giving up privacy, by having external and internal expectations placed upon myself, I was infinitely happier. But this is the problem – my projected self, a constructed image of who I want to be, had a dedicated place to exist, and in doing so, it placed my virtual self at the centre of my real-world experiences.

For the first fifteen days, I found my usage of my phone and laptop was normal. If I consciously acknowledged a change, it was, if anything, that I gained in confidence – using the experience opportunely to send things I might otherwise not have. But on September 15, halfway through the project, I made a Facebook and a Twitter account, and almost immediately I became aware, anxious even, about some of my prior arrogance.

Progressively more changes occurred. When I woke up, my phone and computer were not the first things I looked at, as I didn’t want people to know what time I woke up. When I did eventually go on the Internet, check my emails or reply to messages, the first thing I looked at was the news, not the Leicester Mercury to read about football. When I was working, I didn’t procrastinate – digitally anyway – because everyone could see. I became very aware of my locations, so I made sure that I was going out, that I was being sociable, and that people could see I was being productive with my time. I would speak to my family more, I was on the Internet less, I was more productive, and surprisingly to me, I was enjoying myself more.

Mark Farid, Poisonous Antidote, 2016. Film by Sophie le Roux


Mark Farid, Poisonous Antidote, 2016

This is very different from ‘Anonymity is our only right, and that is why it must be destroyed’…

Poisonous Antidote certainly highlighted my own online editorial habits, the density of our digital footprints and, for me, the necessity of privacy. Unlike ‘Anonymity is our only right, and that is why it must be destroyed’, where I gave up online privacy to gain personal privacy, only to realise social media is indispensable to contemporary life, Poisonous Antidote embraced the publicity of social media. Subsequently, I found that I was consciously and subconsciously changing my actions and behaviour to ensure I conformed to my insecurities, rooted in society’s ideologies – that I was doing what I was “meant” to be doing and feeling validated by the knowledge that people could see this.

My narcissistic thirst for approval led me to willingly relinquish privacy in exchange for a perceived social stardom, where I was constantly judging my actions and options through the potential reception of my newsfeed, assessing my and others’ adherence to a standardised code of conduct that allowed a form of acknowledgment confirming my actions and behaviours. The validation Poisonous Antidote created could only be fulfilled by further consumption, and as I used it more, each post, action and interaction meant less, for I became more reliant on it to fill the growing void it created. It became a self-feeding machine.

Now of course, publishing every part of my online activity might appear to represent some dystopian future. But the truth is that most of us are doing exactly this right now – albeit in a more limited way. We are constantly self-publishing the details of our lives to technology companies, to governments, and to our networks on social media. The difference between you and me is of degree, not kind.

I also recommend going on www.poisonous-antidote.com where you can see all of my data for the month. It has been hacked multiple times, but all of the data should be back on there. You can also see the Poisonous Antidote film which premiered at the Strasbourg Biennale, and was made in collaboration with the very talented film-maker Sophie le Roux!


Mark Farid, Seeing I, ongoing


Mark Farid, Seeing I, ongoing

And of course, I need to ask you about Seeing I, which is scheduled to premiere in September 2019. For the performance you plan to experience the life of another person 24 hours a day, for 28 days, only seeing and hearing what that one person sees and hears, using VR. You ran a 24-hour test. Could you tell us how it went?

The 24-hour test run was in February 2014, using a DK1 (the first Oculus development kit). It went well. Well enough to pursue the project and put it on Kickstarter in November 2014, which was ultimately unsuccessful. Since then, however, I have gained funding from various places and institutions and done many, many, many more trials, the longest being 92 continuous hours.

Unsuccessful? But I saw LOTS of articles about the project and they all suggested that you already had the money!

Indeed, this was very frustrating. The Kickstarter campaign got lots of press, but the press made no mention of the Kickstarter.

In the four and a bit years that have followed, however, the project has been commissioned by arebyte Gallery, is in partnership with the Sundance Institute, the National Theatre, UK, and Imagine Science Films, and looks set to take place at Ars Electronica in September 2019.

We have just finished developing the headband that records a 260-degree field of view left to right and 165 degrees up and down, along with audio. This is what the other person will wear, and it has been the main thing holding us back from doing Seeing I, as it didn’t exist until now! It has a battery life of 36 hours along with storage for 36 hours. We will be open-sourcing the code, along with the .STL file for people to 3D print, after the project has taken place.

The last three sections on the pre-production side to finalise are getting ethical approval for the project and finalising the funders of the documentary (directed by Petri Luukkainen), whom I can’t publicly name at this time. I hope to have these two parts finalised in the next two months.

The third section is who the Other person will be. If you’re interested in applying, please visit www.seeing-i.com and apply online! We favour applicants living a life aligned with the ordinary rather than the extraordinary. This means a real-life person who thinks their life is “too boring”!


Mark Farid, Seeing I, ongoing

How do you prepare for this performance?

Since January 2018 I have been, and will be until the exhibition in September 2019, spending prolonged periods of time in virtual reality – a minimum of 45 hours per week – to train my eyes to function in close proximity to an LCD display (with no exposure to natural light). This is also to overcome any potential motion sickness I might experience.

The longest I’ve spent in continuous virtual reality is 92 hours, as I mentioned earlier. This was the longest trial I’ve completed without taking the headset off, watching footage from a single person’s life, from a first person point of view. I’ve also spent four successive days in virtual reality on three separate occasions, but those were spent watching stuff on Netflix and YouTube and playing games.

In June 2018, I averaged 16 hours per day in virtual reality for 23 consecutive days. This was a specific test, focusing on my eyesight and any potential short- or long-term damage that might occur. The results found no long-term damage to my eyesight, with the only short-term damage being that I was short-sighted for the two following days.

And as another key part of my preparation for the isolation that I will experience during the project, where I will have no human interaction, no stimulation, and no eye contact, I have attended two 10-day silent meditation retreats and I’ll be doing another three in the build-up to the 28-day exhibition.

Thanks Mark!

Touch Me, the 1st edition of the Strasbourg Biennale of Contemporary Art, was curated by Yasmina Khouaidjia. The event remains open until 3 March 2019 at Hôtel des Postes in Strasbourg, France.

Previously: Strasbourg Biennale. Being a citizen in the age of hyper-connectivity.
Poisonous Antidote was commissioned by Gazelli Art House, London, in partnership with CPH:LAB.

Global Control And Censorship

After last week’s Notes from the RIXC Open Fields conference, it’s time to have a quick look at the accompanying exhibition of this year’s edition of the RIXC Art Science Festival.

The theme of the exhibition, curated by Lívia Nolasco-Rózsás and Bernhard Serexhe, is encapsulated in its title: Global Control And Censorship.


Ruben Pater, Drone Survival Guide, 2013. Photo: RIXC

The curators wrote in their introductory text to the exhibition:

Surveillance and censorship are mutually dependent; they cannot be viewed separately. It has always been well known that the surveillance of citizens, institutions, and companies, indeed, including the monitoring of democratically elected politicians and parliaments or of journalists and lawyers, is a secret task of government agencies. Recently, however, this tradition of government-legitimized spying on all citizens has expanded to include additional spying by powerful service providers and business enterprises. At the same time, courageous journalists, who disclose information that carries enormous importance to society such as illegal surveillance activities, censorship and torture by governmental institutions, are prosecuted and punished. Even in our day, journalists, artists and writers critical of the system and whistle-blowers are branded as traitors.

The exhibition is not ground-breaking* but it is solid, coherent and thought-provoking. I was particularly impressed by the way the curators take us from one location to another, showing how surveillance encroaches on freedom of movement, communication and action no matter where we are on the planet. Sometimes the means of surveillance and their impact seem to be site-specific. Often though, they replay the same patterns of scrutiny and blackout that have been adopted everywhere else.

Here are some of the works I found most interesting:


Osman Bozkurt, Post Resistance, 2013


Osman Bozkurt, Post Resistance, 2013

Photographer Osman Bozkurt documented the remains of the slogans, drawings and other signs that were painted onto the surfaces of public spaces in Istanbul at the time of the Gezi Park demonstrations in 2013.

That Summer, thousands of citizens occupied the park to oppose its proposed demolition as part of an urban development plan. The police’s violent response to the unrest provoked strikes and further protests across the country, with citizens expressing their disapproval of the large-scale urban and economic changes proposed by the government, the attacks on freedom of the press and of expression, the encroachment on Turkey’s secularism and Erdogan’s authoritarian measures. The movement was eventually dispersed by the brutal governmental riposte, leaving many people injured or imprisoned.

Authorities made sure that the protest slogans and signs on the walls were swiftly painted over. Bozkurt documented the grey patches that haunt the areas surrounding the unrest. They remain as ghosts of attempts to defend the right to a fair society.


aaajiao, GFWlist, 2010. Photo: RIXC

The Great Wall of China, a series of fortifications over eight thousand kilometres long, was built to protect the Chinese states and empires against raids and incursions by nomadic peoples. Its information age equivalent, the Great Firewall of China, was engineered to regulate the Internet domestically and keep unwanted information, ideas and images out of the Chinese Internet. Both Chinese and foreign websites and news stories are censored by the GFW mechanisms.

GFWlist, by artist and activist Xu Wenkai aka aaajiao, is an installation that relentlessly prints the URLs of the websites that are banned on the Chinese Internet. A printer spits out the list on a long scroll of paper that falls down and forms a heap on the floor. The printer is perched on a black monolith similar to the one that puzzles prehistoric humans in Stanley Kubrick’s 1968 movie 2001: A Space Odyssey. The meaning of the monolith remains a mystery for most film critics. Some like to interpret the structure as a trigger of self-awareness in early humans and thus the beginning of civilization.

Because China prohibits even publishing the list of blocked web addresses, aaajiao’s installation stands as a poetic but explicit act of civil disobedience.
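
The mechanics of the piece are easy to picture in a few lines of code. Below is a minimal sketch, not the artist’s actual implementation: it assumes a local copy of the community-maintained, base64-encoded gfwlist blocklist saved as gfwlist.txt (a hypothetical file name), decodes it and emits one banned address pattern at a time, the way the installation feeds its endless paper scroll.

import base64
import time

# Hypothetical local copy of the community-maintained, base64-encoded blocklist.
with open("gfwlist.txt", "rb") as f:
    rules = base64.b64decode(f.read()).decode("utf-8", errors="ignore").splitlines()

for rule in rules:
    rule = rule.strip()
    # Skip blank lines, comments ("!") and section headers ("[...]") in the adblock-style syntax.
    if not rule or rule.startswith(("!", "[")):
        continue
    print(rule)      # in a gallery setting, this output would be piped to a receipt printer
    time.sleep(0.5)  # pace the printout so the paper scroll grows slowly and relentlessly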


Hamra Abbas, Text Edit, 2011

Hamra Abbas’s video is simple and incredibly moving. The screen shows an email in the process of being written by a woman who is announcing her pregnancy to a friend. While composing her message, the writer keeps erasing and correcting her words, self-censoring for fear that they might be monitored and misinterpreted. Her joke about how people “terrorized” her into having a child is amended so that the word “terrorized” becomes “coaxed”. Similarly, words like ‘blast’ or ‘chaos’ suddenly take on an ominous meaning and she quickly erases and replaces them.

Such is her fear of the possibility of being under surveillance, that the final version of her message is brief but bland and devoid of any of the joy you would expect in such circumstances.


Daniel G. Andújar, Let’s Democratise Democracy, 2011-ongoing


Daniel G. Andújar, Let’s Democratise Democracy, 2011-ongoing. Photo via think commons

During the celebration of Labor Day and then again the day before Spain’s general election in 2011, Daniel G. Andújar rented a small plane and flew a banner that said Democraticemos la democracia (Let’s Democratise Democracy) from Murcia to Alicante. His yellow banner reappeared several times in Spain that year. The slogan was translated and brandished in places as diverse as the Ministry of Defense in Belgrade, the nuclear shelter of Tito in Bosnia Herzegovina or a refugee camp in Western Sahara. Whether his slogan takes the form of stickers, posters, graffiti, flags or installations, it always adapts and takes on a new meaning and target with each location. Depending on the context, the Let’s Democratise Democracy slogan is interpreted as a challenge to corruption, inflation, expulsions, surveillance, etc. The motto works no matter the type of attack on democracy.

Because the artist believes that public space belongs to everyone and that it must be continuously reclaimed from hegemonic attempts to control it, he encourages passersby who stumble upon his project to document it with their phones and spread the message further.


Marc Lee, Security First, 2015. Photo: RIXC


Marc Lee, Security First, 2015. Photo: RIXC

Marc Lee, Security First, 2015

Marc Lee displays “the wonderful world of surveillance technology.” The array of surveillance cameras he lines up on shelves is completed by a monitor showing the website insecam.org. While the CCTV apparatus is sold as the gateway to protection and peace of mind, the directory of online surveillance cameras reminds us of the threat these cameras present to our privacy.

More works and images from the Global Control and Censorship exhibition:


Dan Perjovschi, Drawings, 1995–2015


Dan Perjovschi, Drawings, 1995–2015. Photo: RIXC


View of the exhibition space. Photo: RIXC


Erik Mátrai, Turul, 2012. Photo: RIXC


Ma Qiusha, Twilight Is the Ashes of Dusk, 2011


View of the exhibition space. Photo: RIXC

Also part of the exhibition: Peters Riekstins, Back to the Light.

The RIXC Open Fields conference, organized by RIXC, the center for new media culture, is over but if you’re in Riga, don’t miss the accompanying exhibition: Global Control and Censorship. It’s at the National Library of Latvia until 21 October 2018.

More images of the exhibition opening in RIXC’s flickr album.

* I think I will always miss the extraordinary bite and vision that Armin Medosch brought to the RIXC festival.

Faceless. Re-inventing Privacy Through Subversive Media Strategies

Faceless. Re-inventing Privacy Through Subversive Media Strategies, edited by artist and researcher Bogomir Doringer in collaboration with curator and cultural studies scholar Brigitte Felderer.

On amazon USA and UK.

Publisher De Gruyter writes: The contributions to this book explore a phenomenon that appears to be a contradiction in itself – we, the users of computers, can be tracked in digital space for all eternity. Although, on the one hand, one wants to be noticed and noticeable, on the other hand one does not necessarily want to be recognized at the first instance, being prey to an unfathomable public, or – even less so – to lose face.

The book documents artistic and other strategies that point out options for appearing in the infinite book of faces whilst nevertheless avoiding being included in any records. The desire not to become a mere object of facial sell-out does not just remain an aesthetic endeavor. The contributions also contain combative and sarcastic statements against a digital dynamic that has already penetrated our everyday lives.


REBEL YUTHS, Masks, 2011-2013


Teresa Dillon, Under New Moons We Stand Strong, 2016. Photo: Fraser Denholm and Yvi Philipp

I love exhibition catalogues. Most of them give you a colourful overview of a show you’ve had the bad idea to miss. Others, however, do far more than that. They take print as an opportunity to bring different voices together on the page to dissect and discuss a particular field of research, expanding on the exhibition itself and becoming a work of reference in the process. Faceless. Re-inventing Privacy Through Subversive Media Strategies is of the latter breed.

Faceless started as a duo of exhibitions that opened at Q21 in Vienna in 2013. The shows investigated the hiding, distorting and masking of the face in post-9/11 visual culture. The practice, set against the backdrop of a massive production of images and a political frenzy to supervise movements, responds to various motivations: a need to regain some control over one’s identity, to protest against control and surveillance, to challenge mainstream ideas of acceptable bodies, etc.

As the book demonstrates, the strategies adopted to morph and conceal a face are as diverse as they are creative. It’s quite interesting to contrast some of them with the now normalized practice of publishing selfies in which the face has gone through so much (physical) makeup and (digital) filtering that the individual is barely recognizable. Everyone knows you don’t look like that at all in real life but we stopped batting an eyelid a long time ago.

The essays and artistic contributions featured in the book are consistently excellent. Thomas Macho, for example, charts the strategies of facelessness through art history. Matthias Tarasiewicz discusses the zero trust society and the necessity to literally play hide and seek with surveillance infrastructures in order to obtain personal privacy online. Hille Koskela explains how exhibitionism, aided by digital media, has become “the new normal”. Teresa Dillon comments on the violent and material role that CCTV cameras play in urban life. Adam Harvey presents an e-commerce platform entirely dedicated to accessories and tricks for countersurveillance. Rosa Menkman takes an eye-opening look at the use and abuse of the faces of (Caucasian) women in the history of image processing.

The best surprise for me, little Margiela maniac that I am, was to find excerpts from the interviews that mint film office had done with members of the Martin Margiela team for their WE MARGIELA documentary. Margiela was an iconic fashion designer famous for the way he shrouded himself in invisibility. He shunned public appearances, refused to release any official portrait and accepted only a few interviews, and even those had to be carried out via fax. He was also a genius at disrupting all the fashion codes.


KNOWBOTIC RESEARCH, The MacGhillie Saga

My recommendation to you would be to get this book if you’re interested in how questions of control & surveillance, identity & the politics of the body are explored critically across a wide range of cultural manifestations. Not just in contemporary art but also in cinema, fashion, street culture, sexual fetishism, etc. Faceless manages to put a new, brave and thought-provoking spin on crucial topics that dominate our culture but still deserve to be discussed with intellectual rigour. And a bit of humour here and there.

Just a couple of the many creative works I discovered in the book:


Martin Backes, Pixelhead limited edition, 2010

Pixelhead is a full face mask that acts as media camouflage, completely shielding the head to ensure that your face is not recognizable in photographs taken in public places without your permission. The piece is inspired by Google Street View and therefore bridges the gap between the real and virtual worlds. This simple piece of fabric preserves individuals’ anonymity for the Internet age.


Sofie Groot Dengerink, © Google Privacy, 2011 © Google Maps and Sofie Groot Dengerink

Window curtains in The Netherlands are often left wide open, as a Protestant statement that there is nothing to hide. Sofie Groot Dengerink‘s series of snapshots from Google Street View lays bare the digital invasion of our (physical) privacy.


Jan Stradtmann, Garden of Eden, 2008

Shot furtively in Canary Wharf (London’s financial district) in September and October 2008, Jan Stradtmann’s photos reflect the tense atmosphere of the early days of the economic crisis. Everyday situations and gestures – cigarette breaks, phone calls or casual meetings between colleagues – get interpreted and framed as if they had a direct link to the crash.


Vermibus, In Absentia


Ben DeHaan, Uncured

Ben DeHaan’s melting portraits were created with run-of-the-mill inkjet printers that use ultraviolet light to dry the ink printed on a page, which happens to be UV-sensitive. The ink dries – or cures – almost instantaneously. Unless you disable the UV light, which is exactly what the artist did. He then photographed the prints as the ink was slowly dripping down the faces of his subjects.


Simone C. Niquille, Here Be Faces, 2013

Pablo Garcia and Addie Wagenknecht, Webcam Venus, 2013


Caron Geary aka FERAL is KINKY, Frontal View No. 2 of White British Female, UK born-‘Feral’, London – Self Portrait, 2007

Handbook of Tyranny: a guide to everyday cruelties

Handbook of Tyranny, by Theo Deutinger, an architect, writer, lecturer, illustrator and designer of socio-cultural maps.

On amazon UK and USA.

Publisher Lars Müller writes: Handbook of Tyranny portrays the routine cruelties of the twenty-first century through a series of detailed non-fictional graphic illustrations. None of these cruelties represent extraordinary violence – they reflect day-to-day implementation of laws and regulations around the globe.

Every page of the book questions our current world of walls and fences, police tactics and prison cells, crowd control and refugee camps. The dry and factual style of storytelling through technical drawings is the graphic equivalent to bureaucratic rigidity born of laws and regulations. The level of detail depicted in the illustrations of the book mirror the repressive efforts taken by authorities around the globe.

The twenty-first century shows a general striving for an ever more regulated and protective society. Yet the scale of authoritarian intervention and their stealth design adds to the growing difficulty of linking cause and effect. Handbook of Tyranny gives a profound insight into the relationship between political power, territoriality and systematic cruelties.


Animals slaughtered per second worldwide and slaughterhouse floor plan


Animals slaughtered per second worldwide

The Handbook of Tyranny‘s infographics and texts bring to light the nonhuman entities that restrict, govern and guide our daily existence. They lay bare a vast ecosystem of coercion that is (often insidiously) interwoven into the fabric of cities, of society, of everyday life.

Some of these ‘small cruelties’ are engineering innovations, others are small design tweaks. Some are massive and overwhelming, others are subtle, their unpleasantness concealed behind a veneer of propriety, comfort or security. Some affect the existence of only a limited part of humanity (refugees or prisoners, for example), others target each and every one of us as we walk around the neighbourhood, go on holiday or look for a place to sit in the park.


Bunker Buster


Prison cells

We might resent some of these objects and strategies of control but that doesn’t mean that we will automatically condemn them. At least not if we are told that they have been designed to ensure our safety and protect us from undesirable behaviour.

Handbook of Tyranny is a sharp, enlightening and beautifully designed book. It told me about anti-injection blue lights, urine deflectors that ‘pee back‘ at you and bunker busters that delay their explosion until after they have penetrated layers of earth or concrete. It also made me think about the responsibility for the authoritarian features of modern life: it does not reside entirely in the hands of ‘the powers that be’ but also in those of architects, designers, engineers and, to a certain extent, the rest of us.

Theo Deutinger & Lars Müller Publishers present Handbook of Tyranny at Pakhuis de Zwijger


Refugee Camps


Crowd Control


Crowd Control


Walls & Fences

Related story: Book review – Unpleasant Design and Design and Violence. Part 2: violence where you wouldn’t expect it.

Forensic Fantasies, online scams and the fragilities of IoT. An interview with KairUs

Many of you have probably heard of Agbogbloshie, the biggest and most infamous e-waste dump in the world. That’s where most of the “Western” world’s electronics is (illegally) sent to rest and be dismantled by young people who ruin their health breathing toxic fumes and trying to salvage the precious metals our trash contains.

But our old bits and pieces of hardware don’t just contain copper and gold, they also hold personal, corporate and military information that can be retrieved and used by cyber criminals.


KairUs art collective Linda Kronman and Andreas Zingerle

The duo KairUs (artists/researchers Linda Kronman and Andreas Zingerle) traveled to Agbogbloshie in Ghana to investigate the issue of data breaches of private information.

The result of their research is Forensic Fantasies, a trilogy of artworks that use data recovered from hard-drives dumped in Agbogbloshie to answer the question: what happens to our data when we send a computer, a hard disk or any other kind of storage device to the garbage?


Forensic Fantasies trilogy: #2 Identity Theft, exhibited at Aksioma – Institute for Contemporary Art, Ljubljana, 2018. Photo: Janez Janša / Aksioma

The first chapter in the series, Not a Blackmail, examines the possibility of identifying the prior owner of a hard-drive and extorting money from them (with emphasis on the word “possibility” – they didn’t actually try to ransom the owner!) The second work, Identity Theft, focuses on the fraudulent online profiles created for romance scams. Finally, Found Footage Stalkers uses images retrieved from one of the hard-drives to create photo albums, as a direct reference to the traditional practice of using found footage to create new artworks.

There’s something very disturbing in Forensic Fantasies. The trilogy not only connects us with the afterlife of our electronics, it also makes palpable a series of dangers that would otherwise appear far-fetched and abstract to most of us.

KairUs‘s work focuses on human-computer interaction and computer-mediated human-human interaction. Since 2010 they have investigated the issue of Internet fraud and online scams. Both currently hold Assistant Professor positions at Woosong University in South Korea, where they are also doing research on the vulnerabilities of the Internet of Things and smart cities.

KairUs have an exhibition right now at Aksioma, everyone’s favourite cultural venue in Ljubljana, Slovenia. The show focuses on the Forensic Fantasies trilogy but I’d recommend you check out the fascinating talk the duo gave at Aksioma a couple of weeks ago because it not only sums up and comments on the trilogy but also presents the artists’ ongoing research into the weaknesses and pitfalls of the much-hyped Internet of Things.

Behind the Smart World: Artist talk by KairUs (Linda Kronman and Andreas Zingerle) at the Aksioma Project Space in Ljubljana on 14 February 2018

Hi Andreas and Linda! Your work Forensic Fantasies – #1: Not a Blackmail examines the possibility of blackmailing the prior owner of a hard-drive. Why did you not send the hard drive back to its owner? What does the letter to the owner say?

A primary motivation to visit Agbogbloshie in the first place was to answer the question: is it possible to use or abuse the data on a hard-drive recovered from an e-waste dump? As we had read about cases of US senators being blackmailed, company secrets being exposed and hard-drives from US military contractors being recovered amongst e-waste in West Africa, we were curious whether our e-waste really represents the kind of data breach these reports were conveying. For us, the artwork ‘Not a Blackmail’ from the ‘Forensic Fantasies’ trilogy is a proof of concept that it is possible to recover data from a hard-drive and, with the help of social media profiles, track down the current contact information of the former owner, so that this person can be contacted and then potentially blackmailed. Of course our intention was not to blackmail this person, which is made clear in the title of the artwork (‘Not a Blackmail’).

The whole Forensic Fantasies series is also about the idea of finding something sensitive or valuable on the hard-drives, and until one recovers the data there is always a chance, a fantasy, of recollecting something important or of value, even scandalous. Much of the data we recovered and processed would be more or less boring for most of us in another context; on the other hand, the content of a hard-drive might still feel very personal and exposing for its former owner, so how important is it to expose this person? The name of the former owner is exposed through the artwork, but it is still common enough to avoid a direct link to an individual. Keeping this in mind, we have been thinking of ways to deliver the data back to the former owner in one way or another. Just sending the package might provoke the owner to ignore us, so we are still waiting for an opportunity to do it in a more personal way. As the artwork is still exhibited in this speculative format, we also have to think about how it will be affected – how the artwork changes – if we actually manage to deliver the data to the owner.

The letter to the owner basically covers the story of how we got our hands on his data, explains that we found personal and sensitive data on it that a criminal might use against him, and says that we decided to return the hard-drive to him.


Forensic Fantasies trilogy: #1 Not a Blackmail, exhibited at Aksioma – Institute for Contemporary Art, Ljubljana, 2018. Photo: Janez Janša / Aksioma


Forensic Fantasies trilogy: #1 Not a Blackmail, exhibited at Aksioma – Institute for Contemporary Art, Ljubljana, 2018. Photo: Janez Janša / Aksioma


Forensic Fantasies trilogy: #1 Not a Blackmail, exhibited at Aksioma – Institute for Contemporary Art, Ljubljana, 2018. Photo: Jure Goršič / Aksioma

One of the issues the trilogy revealed is the peril of not cleaning up or destroying a hard drive before getting rid of it. How easy or difficult is it to do so exactly?

Physically destroying a hard-drive is the most secure way of getting rid of the data. There are hard-drive shredders, and simply drilling holes in the hard-drive is a common practice among companies that are more aware of leaking data and want to prevent data breaches. One can also open the hard-drive and scratch the disc that contains the data.

Of course, if you ever saved anything in the cloud, your data will be stored on hard-drives somewhere else, often copied across several locations. You will never have access to destroy these hard-drives, so we can only trust that companies have proper workflows for re-using and destroying hard-drives (this aspect also made us more aware of the materiality of the cloud).

Deleting data and emptying the virtual trash bin still allows data recovery. As long as data has not been overwritten by new data at least once, it is quite easy to recover, though recovered data is not organized, which makes it more difficult to process. If a hard-drive is meant for re-use, experts recommend overwriting the data several times.
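
To make the “overwrite several times” advice concrete, here is a minimal sketch in Python of the general idea, not the duo’s own workflow: it scrubs a single file on a conventional hard-drive by repeatedly filling it with random bytes before deleting it. The file name is hypothetical, and SSD wear levelling, filesystem journals and cloud copies are exactly the complications mentioned above, which is why whole-drive tools or physical destruction remain safer.

import os
import secrets

def overwrite_file(path, passes=3, chunk_size=1024 * 1024):
    # Overwrite the file's contents with random bytes several times, then delete it.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            written = 0
            while written < size:
                block = min(chunk_size, size - written)
                f.write(secrets.token_bytes(block))
                written += block
            f.flush()
            os.fsync(f.fileno())  # force each pass onto the disk, not just the OS cache
    os.remove(path)

overwrite_file("old_tax_records.pdf")  # hypothetical file name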

Data forensics experts have been able to recover data or parts of data in cases that seemed impossible, such as broken hard-drives or discs destroyed by water. Yet this type of data rescue is time-consuming, needs special equipment and is expensive, whereas we were more interested in how easy it is to recover data and whether data mining with very simple tools is possible at an e-waste dump.


Acquiring hard drives on the e-waste dump. Photo: Kairus.org

Could you tell us about people you met in Ghana who are also very concerned about the topics you are investigating?

What took us to Ghana in the first place is that we had been investigating internet fraud in several of our artworks for quite some time. West Africa is known for certain types of scams, but internet fraud, internet crime and scams in general are a global phenomenon. People in Ghana generally worry about how trustworthy they are perceived to be online. Because of the bad reputation created by a few scammers, service providers take the easiest way out and block the IP range of a whole country from accessing their website. The general population is thus punished by a rather crude measure, because with a bit of advanced knowledge it will not stop a scammer. We talked about this with several people we met, and internet scam issues are also discussed through popular culture, mainly so-called Nollywood films, mostly Nigerian and Ghanaian low-budget productions.
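
As a rough illustration of how blunt that kind of blocking is: a service only needs to match each visitor's IP address against a list of address ranges attributed to a country. A minimal sketch in Python, with placeholder ranges rather than a real allocation, might look like this.

from ipaddress import ip_address, ip_network

# Placeholder CIDR ranges standing in for "all addresses allocated to country X".
# Real services rely on commercial GeoIP databases, but the principle is the same.
BLOCKED_RANGES = [ip_network(c) for c in ("41.66.0.0/17", "154.160.0.0/12")]

def is_blocked(client_ip):
    # Refuse every visitor whose address falls inside the blocked ranges,
    # scammer or not - which is exactly why the measure punishes everyone.
    addr = ip_address(client_ip)
    return any(addr in net for net in BLOCKED_RANGES)

print(is_blocked("41.66.1.23"))     # True: inside a blocked range
print(is_blocked("93.184.216.34"))  # False: outside the ranges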

We try to bring this perspective forward in the second part of the trilogy, ‘ID Theft’, by compiling a found-footage film from several Nollywood films dealing with the issue. These films are distributed on DVD everywhere in Ghana and are considered a very important channel for West Africans to reflect upon their own culture. The films also made it easier for us to discuss these issues with the people we met, although in Ghana the scams were often blamed on Nigerians living in the country. At the e-waste dump, nobody asked what we wanted to do with the hard-drives. From what the workers told us, they salvage and sort valuable metals: copper cables, computer parts containing gold and other metals, processors, hard-drives, etc. These parts are then sold in bulk. Hard-drives are most probably bought in bulk by data rescue companies for their spare parts. Data mining at the e-waste dump is probably very marginal and unknown to the general public in Ghana. A bigger concern is the illegal trade of e-waste from the US and Europe that ends up in West Africa.


Map: Global illegal waste traffic

The work involved discussions with other artists about the ethics of using this type of ‘stolen’ material. On the one hand, people have thrown it away so it’s fair game. On the other, personal data is very sensitive. So i’m wondering what these conversations concluded about the ethics of using this data in artworks? Is it just a big no-no or are there conditions that make it acceptable?

Gathering and bringing back the hard-drives from Ghana was one thing, but what do you do with them and how do you share them with other artists? Together with the Linz-based net culture hub servus.at, we organized an artlab to which we invited individuals from a trusted network of EU-based artists. We met in Linz for the “Behind the Smart World” research lab, spending several days together looking at the data and giving people time to discuss it and find ways to work with it.

A central issue was the privacy of the former owners of the hard-drives. Together we identified aspects of the hard-drives that interested us and developed strategies to abstract the data through artistic processes: experiments to sonify folder structures, recordings of the disks’ booting attempts, collages of browser cache, ASCII renderings of videos and images. Some participants in the artlab decided not to work with the material at all, feeling it would compromise their working ethics, or worried that it would undermine the trust placed in them to handle sensitive material in the future. We highly respected these individual decisions.

Having worked with the material the longest, we decided to take a more provocative approach. In our third artwork, Found Footage Stalkers, we unveil the photos from one of the hard-drives, giving very personal insights into the life and habits of its former owners. Flipping through photos of parties with friends, trips to amusement parks and Christmas celebrations with the family evokes a feeling similar to stalking a stranger on social media. Despite the rather uninteresting photo material, one starts to construct a story and attach a personality to these fragmented digital representations. By presenting the photos in albums, we approach the material as ‘found footage’, the practice of gathering material from thrift shops, yard sales and flea markets to remix into new artworks, something artists have done for generations. The artwork thus confronts earlier ‘found footage’ practices with the digital materials now found among our trash. In the end, everyone has to decide for themselves how to deal with the data and what to do with it. Artworks using the data from the Agbogbloshie hard-drives were shown for the first time in the ‘Behind the Smart World’ exhibition at the Art Meets Radical Openness (AMRO) festival.


Artlab at servus.at in Linz, Austria. Photo: Kairus.org


Artlab at servus.at in Linz, Austria. Photo: Kairus.org


Forensic Fantasies trilogy: #3 Found footage stalker, exhibited at Aksioma Project Space. Photo: Janez Janša / Aksioma

Your ongoing research in South Korea, titled The Internet of Other People’s Things is, i think, a very relevant and important one because it lays bare the pitfalls of the internet of things, especially when deployed on the scale of a whole city like Songdo. How much awareness is there of that problem in Europe, among city developers, members of the public, tech companies, etc?

South Korea is one of the rapidly developing, tech-driven Asian states, with the second-fastest and most connected society in the world when you look at average internet connection speed and active social media penetration. At the same time, it’s a very young democracy where everything is driven by the government and the market in a top-down approach, not focusing on the people who will actually live in smart cities or use IoT devices on a daily basis. Cities like Songdo are built from scratch, supported by big tech companies such as IBM, Cisco and LG. These hyper-connected urban environments pave the way for technocratic governance and city development, the corporatization of city governance, technological lock-ins and hackable ‘panoptic’ cities.

In Europe, on the other hand, we see a much more inclusive development where citizens and communities become a vital part of city planning. Citizens are the ultimate actuators of a city. How are citizens involved in co-design collaborations with private corporations and the public sector to build better cities? Around these topics we are working on a publication for which we seek submissions from researchers, artists, hackers, makers, activists, developers and designers who explore vulnerabilities in IoT devices and other embedded systems, e.g. in smart cities. We aim to bring together artworks, projects and essays to create new critical perspectives on ubiquitous technologies. The full open call can be found on our website.


A view over the ‘central park’ of Songdo. Photo: Kairus.org


Non-functioning automated vacuum waste collection system in Songdo. Photo: Kairus.org

Could you give a few examples of the dangers you see in this massive investment in building ‘smart’ cities from scratch and implementing IoT in our cities and homes? Do you think it is doomed to fail and be hacked or are there better ways to implement IoT?

Well, there are a couple of things we were able to observe.

One is what we call the ‘ruins of a smart city’. By this we mean designs, scenarios and technologies that are hyped when a smart city is planned, yet already obsolete or even dysfunctional by the time the city is built. For example, Songdo has a central pneumatic waste disposal system: citizens activate a smart trash bin with their ID cards and deposit their garbage, which is then transported at high speed through underground pneumatic tubes to a collection station where it is separated and recycled. The city wants to eliminate the need for garbage pick-up. During field research with activists from the Seoul-based Unmakelab, we observed that the system is not working and that piles of trash have become part of the urban landscape. Meanwhile, the residents of the buildings that invested in this infrastructure now pay for a dysfunctional system.

Because of examples like this, Songdo has been criticized as a prototype city, a test bed for technologies. For us, this shows that from a citizen’s perspective the important questions to ask concern the maintenance, openness and sustainability of the technology one is expected to live with. Technological development progresses so rapidly, especially from a city-planning perspective, that 5-10 years later many of the ideas envisioned for Songdo are impractical or simply not used by the citizens.

One system praised in Songdo’s advertising material is that each home is equipped with multiple screens allowing telepresence with other homes or with institutions such as hospitals and schools. This is an idea developed before the era of mobile internet; which city really needs a stationary video telepresence infrastructure now? We also observed non-functioning sensors paved into the streets and lamp posts. We documented these failed implementations partly to point out the materiality of the “smart technologies” in a city.

Another thing we became aware of is that through these massive smart city projects, the city is increasingly being corporatized. Songdo is owned by three companies: Gale International, Posco and Morgan Stanley. Furthermore, cities naturally sign contracts with technology companies, who also end up owning vast amounts of citizen data.

We want to understand what kinds of challenges emerge when technotopian cities are not populated with their imagined tech-savvy international citizens. Who is included, who is excluded when we talk about smart cities? On the other hand, how do actual residents reshape, redesign, misuse or opt out of technological lock-ins?

Until now we have mostly concentrated on Songdo, which is categorized as a first-generation smart city; later generations gradually start to consider the citizen in planning as well. The term ‘smart city’ itself has been widely contested, as it represents diverse values, solutions and implementations depending on context.

More broadly, when we investigate ‘smart cities’ we are looking at current and future scenarios in which our things are wirelessly connected, in other words the Internet of Things. We do not think IoT is doomed, but we see that the designs are far from sustainable, privacy-respecting or even reasonably secure. We are not against the development of IoT per se, but we are not convinced that the technologies branded ‘smart’ are the best way of solving the problems they claim to tackle.


Panopticity: ‘Seoul’ video screenshot. Photo: Kairus.org


How an attacker runs DDoS attacks on a victim’s IP camera

Could you explain the first work you did in that research, the city portrait of Seoul through insecure public CCTV and private IP cameras?

Cities, companies and private individuals use networked security cameras, often including tracking software, for their surveillance. Various brands offer products with an integrated web server that allows remote processing and streaming of the video footage, adding these devices to the growing number of connected things. These web servers are often ‘insecure by design’, meaning they are not protected by a password or have hard-coded login credentials stored as plain text. By default, the servers stream unencrypted on publicly accessible network ports, creating the risk of interception and giving unknown third parties unintended access to the cameras’ setup functions. Some manufacturers use the same vulnerable settings across their entire camera lineup.
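
To make ‘insecure by design’ concrete, here is a minimal sketch, in Python and with a hypothetical camera address and stream path, of the kind of check an owner could run against their own device to see whether its web interface answers without asking for any credentials.

import urllib.request
from urllib.error import HTTPError

def requires_auth(url, timeout=5.0):
    # True if the endpoint refuses unauthenticated requests,
    # False if it serves content to anyone who asks.
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return False  # the camera answered without asking for credentials
    except HTTPError as err:
        return err.code in (401, 403)  # refused: authentication is enforced
    except OSError:
        return True  # unreachable from here, so nothing is exposed to us

# Hypothetical local camera address and stream path; check your model's manual.
if not requires_auth("http://192.168.1.50/video.mjpg"):
    print("Warning: this camera streams without asking for a password.")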

“By default, the Network Camera is not password-protected”, “the default user name is admin”, “the password is 12345”: these are lines you can read in the camera manuals. We recorded video footage from these web streams and assembled it into a portrait of the city seen through the lenses of unsecured video cameras. The experimental video is currently touring film festivals, and we are finishing a video installation that portrays mega-cities around the world through their unsecured cameras. Paradoxically, the security camera becomes a security risk. We are also fascinated to see what is surveilled around the world in the name of security.


KairUs, Forensic Fantasies. Artists talk at Aksioma – Institute for Contemporary Art, Ljubljana, 2018. Photo: Jure Goršič / Aksioma

This is probably a stupid question but i have to ask it: why are you called KairUs?

Back in 2010, when we started our artistic collaboration, we were clear that we wanted to work with time-based interactive installation art. We adapted the ancient Greek word ‘kairos’ (καιρός), meaning the right, critical or opportune moment. Whereas the Western notion of time, ‘chronos’, is purely quantitative, kairos has a qualitative nature. As artists we look for these opportune moments for our artistic expression; it encourages creativity and pushes us to adapt to unforeseen obstacles and opinions that can alter the opportune or appropriate moment to produce art.

Thanks Linda and Andreas!

KairUs (Linda Kronman and Andreas Zingerle)’s solo exhibition Forensic Fantasies is at Aksioma | Project Space in Ljubljana until 16 March 2018.

And should you be a researcher, artist, hacker, maker, activist, developer or designer whose work and/or writing explores the vulnerabilities in IoT devices and other embedded systems, then have a look at this open call for submissions (deadline: 30 April 2018).

Related stories: Permanent Error, e-waste, porn, ecology & warfare. An interview with Dani Ploeger, When erased data come back to haunt you and Harvesting the Rare Earth.

Watching You Watching Me. A Photographic Response to Surveillance

If you happen to be in Belgium this week, don’t miss Watching You Watching Me. A Photographic Response to Surveillance, a show at BOZAR which makes it clear that technology has left us with nowhere to hide. We knew that already of course. Edward Snowden’s NSA revelations have pulverized any dream of the internet as a space for free and uninhibited exchange.

Watching You Watching Me explores how artists are responding to the world’s transformation into a vast tech-mediated panopticon. Some of the artists reveal the efforts deployed by governments and corporations to monitor our online thoughts and ideas, with no concern for our privacy and freedom of expression. Others make visible the new forms of digital self-surveillance and ‘virtual vigilantism’ facilitated by social media and by access to webcams across the world.

I feel like i’ve blogged about surveillance/sousveillance hundreds of times already, but i was impressed with this show: it is solid, enlightening and should appeal to the wise and the uninformed alike. It closes on Sunday so be quick and visit it if you’re in the area. Here’s a quick overview of the works on show (i only skipped the ones i wrote about in the past):


Julian Roeder, Thermal Imaging Camera, 2012. A portable, long-distance infrared thermal imaging surveillance system used by a Bulgarian Frontex unit


Julian Roeder, Monitoring Zeppelin, 2013. Near Toulon, southern France.
A Wescam MX 15 surveillance camera operator inside a monitoring zeppelin. This photograph was taken during an initial testing phase of a EUROSUR research project aimed to improve control of illegal immigration in the Mediterranean


Julian Roeder, Frequency-Modulated Continuous-Wave Radar and High-Performance Wescam MX 15, near Toulon, southern France. A frequency-modulated continuous-wave radar—used for the detection of small wooden boats—and a high-performance Wescam MX 15 surveillance camera are mounted on a dirigible


Julian Roeder, Polish Frontex Officer, 2012. A Polish Frontex border patrol officer stands with an ICS30 thermal imaging reconnaissance camera near the border between Greece and Turkey. Evros region, northern Greece


Julian Roeder, Border Situation, 2012. Border patrol police monitor the external border of Greece

Julian Roeder’s Mission and Task series exposes the high-tech surveillance apparatus deployed by Europe along its external borders. Thermo-cameras, surveillance drones, satellite technology, radar equipment for hunting down fleeing refugees and migrants add a digital and unforgiving layer to the old barbed wire, walls and fences.

These deterrents and the way they function elude visual representation.

“With my work, I intend to portray a border security system consisting of surveillance infrastructure that ensures the relative affluence of life in Europe,” Roeder wrote. “I know of many works dedicated to representing the fate of migrants. I wanted, however, to create works that do not focus on ‘the other’ itself, but on the systems and mechanisms used to construct and control ‘the other’.

“In making these images, I was particularly dedicated to showing how technologization turns the handling of migrants into an abstraction. The focal point is a technology that records humans as data, currents, points of light, and as signals—not as individuals. Through an excessive enhancement of the photographic aesthetic, this technology can become a tool and symbol for alienation instead of a responsible means of dealing with people.”


Edu Bayer, Former Gadhafi Intelligence Facilities in Tripoli, Libya. Interior of the main center of Internet Surveillance and Internal Security of the former Gadaffi regime. Computers, files, and electronic devices abandoned in a 6 floor building. August, 2011


Edu Bayer, A room in Libya’s internet surveillance center, Tripoli, Libya, August 30, 2011

Edu Bayer‘s images depict the physical remains of Colonel Muammar al-Qaddafi’s surveillance apparatus.

The internet surveillance center in Tripoli was a six-story building where the government monitored citizens’ movements and correspondence. After the 2011 civil war, the repressive machine was left empty, documents were shredded and the internet traffic monitoring and filtering equipment was abandoned.


Simon Menner, From a Disguise Seminar


Simon Menner, From a Disguise Seminar


Simon Menner, Transmitting Secret Signs

Simon Menner spent two years exploring the archives of the Stasi (East Germany’s Ministry for State Security.) Almost 300,000 people worked for the secret police, per capita far more than were employed by the CIA or the KGB.

After the downfall of the GDR, the Stasi’s operations were laid open to the public and reviewed by the Stasi Records Agency, an office set up specially for this purpose.

The archive images that Menner selected are exhibited un-retouched. They often look funny to the modern eye but the reality they depict is dark: these are real photos of real people who were trained to systematically pry and report on their neighbors and family members. What makes these images unique is that, as Menner explained “the public has very limited access to pictures showing the act of surveillance from the perspective of the surveillant. We rarely get to see what Big Brother sees”.


Josh Begley, Information of Note (detail), 2014. Composite image and text-based installation featuring photographs and observational notes culled from New York City Police Department (NYPD) Demographics Unit documents

Josh Begley‘s Information of Note is another work that brings the Big Brother perspective to light. The installation consists of text and photographs extracted from the New York City Police Department’s secret Demographics Unit (later the Zone Assessment Unit). The operation systematically spied on the daily lives of Muslims, mapping and monitoring their communities: where they go to pray, buy vegetables or have a coffee.

The unit was disbanded in 2014 after public outcry. Or maybe because snooping on Muslims led to no leads.

By re-contextualizing this material in floor-to-ceiling collages, Begley paints a disturbing picture of the mundane nature of the “evidence” collected.


Mari Bastashevski, It’s Nothing Personal (detail), 2014. Hacking facility at “CyberGym”, an Israeli cyber defense role-playing training facility that provides IT security training to enterprises and government officials


Mock-up illustration of It’s Nothing Personal, 2014. Photographs: Mari Bastashevski. Design: LUST

One of the most interesting works in the show is It’s Nothing Personal, an installation that takes a close look at the booming industry that caters to governments’ demand for surveillance of mass communications. These electronic surveillance companies might operate in a covert world and design products that are meant to be undetectable, but that didn’t prevent them from developing a strong corporate image and language.

Working with the NGO Privacy International, Mari Bastashevski combines her own photographs with trade fair brochures and corporate documentation from the industry. Her installation brings side by side this sanitized corporate aesthetic with testimony from an Uzbek human rights advocate whose life has been affected by the kinds of scrutiny that private surveillance companies enable.

All the advertising catalogues featured in the It’s Nothing Personal project are also easy to find by searching for text files on the companies’ own servers. All of this information is openly accessible.


Andrew Hammerand, The New Town (detail), 2013


Andrew Hammerand, The New Town (detail), 2013

Andrew Hammerand’s series The New Town was shot via a web-controllable CCTV installed by the property developer of a community in the American Midwest to monitor and publicize construction progress. Hammerand managed to get online access to the device’s entire control panel, allowing him to remotely operate the camera and subvert its intended purpose in order to make photographs.

The pixelated, blurry photos immediately call to mind the visual language of surveillance footage and the “evidentiary” images often used to stir suspicion. Are these people honest citizens, or are they criminals, missing persons and murder suspects?

The work points to the dangers that the increasing (and often careless) use of domestic surveillance pose to privacy and personal freedom.


Tomas van Houtryve, Baseball practice in Montgomery County, Maryland. According to records obtained from the FAA, which issued 1,428 domestic drone permits between 2007 and early 2013, the National Institute of Standards and Technology and the U.S. Navy have applied for drone authorization in Montgomery County


Tomas van Houtryve, Students are seen in a schoolyard in El Dorado County, California. In 2006, a drone strike on a religious school in the village of Chenegai reportedly killed up to 69 Pakistani children

In October 2012, a drone strike in Pakistan killed a 67-year-old woman picking okra in her garden. At a U.S. Congressional hearing held a year later in Washington, D.C., the woman’s 13-year-old grandson, Zubair Rehman, spoke to lawmakers. “I no longer love blue skies,” said Rehman. “In fact, I now prefer gray skies. The drones do not fly when the skies are gray.”

According to strike reports compiled by Human Rights Watch and Amnesty International, Zubair Rehman’s grandmother is one of several thousand people killed by covert U.S. drone strikes since 2004. Although we live in the most media-connected age in history, the public has little visual record of the drone war and its casualties.

Tomas van Houtryve used a drone to shoot photos of weddings, afternoons in public parks, birthday parties and other relaxing moments across America. The photos mirror the ways US drones are being used for targeted killings at Yemeni, Afghan and Pakistani gatherings.

Watching You, Watching Me was curated by Stuart Alexander, Susan Meiselas and Yukiko Yamagata. The show remains open until 18 February 2018 at BOZAR in Brussels.
Organized by: Open Society Foundations, in cooperation with BOZAR and Privacy Salon

Entrance is free.

Related stories: Exploitation Forensics. Interview with Vladan Joler, Black Diamond. The internet is full of loopholes and leaks, The System of Systems: technology and bureaucracy in the asylum seeking process in Europe, Book review: Top Secret. Images from the Stasi Archives, Big Eye Kabul. Surveillance blimps over Afghanistan, Identity squatting and spy training. A conversation with Simon Farid, The Influencers: Former MI5 spy Annie Machon on why we live in a dystopia that even Orwell couldn’t have envisioned, Politics and Practices of Secrecy (part 1) and (part 2), Confessions of a Data Broker and other tales of a quantified society, Unauthorized photos of U.S. intelligence officials stencilled on the walls of your city, etc.

Exploitation Forensics. Interview with Vladan Joler


Vladan Joler and Kate Crawford, Anatomy of an AI system (detail)

If you find yourself in Ljubljana this week, don’t miss SHARE Lab. Exploitation Forensics at Aksioma.

The exhibition presents maps and documents that SHARE Lab, a research and data investigation lab based in Serbia, has created over the last few years in order to prize open, analyze and make sense of the black boxes that hide behind our most used platforms and devices.

The research presented at Aksioma focuses on two technologies that modern life increasingly relies on: Facebook and Artificial Intelligence.

The map dissecting the most famous social media ‘service’ might be sober and elegant, but the reality it uncovers is anything but pretty. A close look at the elaborate graph reveals the exploitation of material and immaterial labour and the generation of enormous amounts of wealth with very little redistribution (to say the least). As for the map exploring the deep materiality of AI, it dissects the whole supply chain behind the technology deployed by Alexa and other ‘smart’ devices, from mining to transport, with more exploitation of data, labour and resources along the way.

Should you not find yourself in Ljubljana, then you can still discover the impulses, findings and challenges behind the maps in this video recording of the talk that the leader of the SHARE Lab, Prof. Vladan Joler, gave at Aksioma two weeks ago:

Talk by Vladan Joler at the Aksioma Project Space in Ljubljana on 29 November 2017

In the presentation, Joler talks about the superpowers of social media and the invisible infrastructures of AI, but he also makes fascinating forays into the quantification of nature, the language of neural networks, accelerating inequality gaps, troll-hunting and issues of surveillance capitalism.

I also took the Aksioma exhibition as an excuse to ask Vladan Joler a few questions:


SHARE Lab (Vladan Joler and Andrej Petrovski), Facebook Algorithmic Factory (detail)

Hi Vladan! The Facebook maps and the information that accompanies them on the Share lab website are wonderful but also a bit overwhelming. Is there anything we can do to resist the way our data are used? Is there any way we can still use Facebook while maintaining a bit of privacy, and without being too exploited or targeted by potentially unethical methods? Would you advise us to just cancel our Facebook account? Or is there a kind of medium way?

I have my personal opinion on that, but the issue is that in order to make such a decision each user should be able to understand what happens to their private data, to the data generated by their activity and behaviour, and to the many other types of data collected by such platforms. However, the main problem, and the core reasoning behind our investigations, is that what happens within Facebook, for example, the way it works, is something we can call a black box. The darkness of these boxes is shaped by many layers of opacity: from different forms of invisible infrastructure, through ecosystems of algorithms, to many forms of hidden exploitation of human labour, all those dark places are not meant to be seen by us. The only things we are allowed to see are the minimalist interfaces and the shiny offices where play and leisure meet work. Our investigations are exercises in testing our own capacities as independent researchers to sneak in and shed some light on those hidden processes. So the idea is to give the users of those platforms more facts, so that they are able to decide whether the price they are paying might in the end be too high. After all, this is a decision that each person should make individually.

Another issue is that the deeper we went into those black boxes, the more conscious we became of how extremely limited our capacity to understand and investigate those systems is. Back to your question: personally I don’t believe there is a middle way, but unfortunately I also don’t believe there is a simple way out of this. We should probably try to think about alternative business models and platforms that are not based on surveillance capitalism. We keep repeating this mantra about open-source, decentralised, community-run platforms, to no real effect.


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Jure Goršič / Aksioma


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Janez Janša

The other depressing thing is that for many people, Facebook IS the internet. They just don’t care about privacy: privacy belongs in the past and being targeted is great because it means that Facebook is extra fun and useful. Do you think that fighting for privacy is a futile battle? That we should just go with the flow and adapt to this ‘new normal’?

It is interesting to think that privacy already belongs to the past, since, historically speaking, privacy as we understand it today is not an old concept. It is questionable whether we ever had a moment in time when we had properly defined our right to privacy and were able to defend it. So, from my point of view, it is more a process of exploration, an urge to define at each moment what privacy means in the present. We should accept a decentralised view of the term and acknowledge that the word has a different meaning for different cultures, rather than simply assuming, for example, the European view of privacy. Currently, with such fast technological development, a lack of transparency-related tools and methodologies, outdated laws and ineffective bureaucracies, we are left behind in understanding what is really going on behind the walls of the leading corporations whose business models are based on surveillance capitalism. Without understanding what goes on behind the walls of the five biggest technology firms (Alphabet, Amazon, Apple, Facebook and Microsoft), we cannot rethink and define what privacy in fact means nowadays.

The dynamics of power on the web have dramatically changed: Google and Facebook now have direct influence over 70% of internet traffic. Our previous investigations found that 90% of the websites we examined have some Google cookies embedded. So they are the internet today, and more than that, their power is spilling out of the web into many other segments of our lives: our bedrooms, cars, cities, even our bodies.
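
As a rough illustration of how such a figure could be approximated (this is not SHARE Lab’s methodology, just a sketch with a placeholder URL), one can fetch a page and list the third-party domains its scripts are loaded from:

import re
import urllib.request
from urllib.parse import urlparse

def third_party_script_hosts(page_url):
    # List the domains, other than the page's own, that its <script> tags load from.
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "ignore")
    own_host = urlparse(page_url).netloc
    hosts = set()
    for src in re.findall(r'<script[^>]+src=["\']([^"\']+)', html, flags=re.I):
        host = urlparse(src).netloc
        if host and host != own_host:
            hosts.add(host)
    return hosts

print(third_party_script_hosts("https://example.org/"))  # placeholder URL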


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Janez Janša


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Janez Janša

Could you explain the title of the show, “Exploitation Forensics”, to us?

The Oxford dictionary gives two main uses of the word exploitation: (1) the action or fact of treating someone unfairly in order to benefit from their work, and (2) the action of making use of and benefiting from resources. Both senses relate to the two maps featured in the exhibition. We can read our maps as visualisations of a process of exploitation, whether we speak about the exploitation of our immaterial labour (content production and user behaviour as labour) or go deeper and say that we are not even workers in that algorithmic factory but pure, raw material, i.e. a resource (user behavioural data as a resource). Each day, Facebook users provide 300,000,000 hours of unpaid immaterial labour, and this labour is transformed into 18 billion US dollars of revenue each year. We can argue about whether we can call that exploitation, for the simple reason that users use those platforms voluntarily, but for me the key question is: do we really have the option of staying out of those systems anymore? For example, our Facebook accounts are checked during visa applications, and not having a profile can be treated as an anomaly, as something suspicious.
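
Taking the two figures in that answer at face value, a quick back-of-the-envelope calculation shows how little that labour is ‘worth’ per hour:

hours_per_day = 300_000_000                 # unpaid user activity, per the figure above
annual_hours = hours_per_day * 365          # roughly 1.1e11 hours of labour per year
annual_revenue_usd = 18_000_000_000
print(annual_revenue_usd / annual_hours)    # about 0.16 dollars of revenue per user-hour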

Not having a profile will place you in a different basket, and maybe a different pricing model, if you want life insurance; and not having a LinkedIn account when you are applying for a job will certainly lower your chances of getting the job you want. Our options for staying out are more limited each day, and the social price we pay for staying out keeps rising.

If our Facebook map is somehow trying to visualise one form of exploitation, the other map, which had the unofficial title “networks of metal, sweat and neurons”, visualises three crucial forms of exploitation during the birth, life and death of our networked devices. Here we are drawing the shapes of exploitation related to different forms of human labour, the exploitation of natural resources and the exploitation of personal data, quantified nature and human-made products.

The word forensics usually refers to scientific tests or techniques used in connection with the detection of crime, and we used many different forensic methods in our investigations, since my colleague Andrej Petrovski has a degree in cyber forensics. But in this case the word can also be treated as a metaphor. I like to think of black boxes such as Facebook, or of complex supply chains and hidden exploitation, as crime scenes: crime scenes where different sorts of crimes against personal privacy, against nature or, in some broad sense, against humanity take place.


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Jure Goršič / Aksioma


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Jure Goršič / Aksioma

The maps are incredibly sophisticated and detailed. Surely you can’t have assimilated and processed all this data without the help of data-crunching algorithms? Could you briefly describe your methodology? How did you manage to absorb all this information and turn it into a visualisation that is both clear and visually appealing?

In our previous investigations (e.g. Metadata Investigation: Inside Hacking Team or Mapping and Quantifying Political Information Warfare), we relied mostly on data collection and data analysis, trying to apply methods of metadata analysis similar to the ones that organisations such as the NSA or Facebook probably use to analyse our personal data. For that we used different data collection methods and publicly available analysis tools (e.g. Gephi, Tableau, Raw Graphs). However, the two maps featured in the exhibition are mostly the product of a long process of diving and digging into publicly available documentation, such as the 8,000 publicly available patents registered by Facebook, their terms-of-service documentation and some available reports from regulatory bodies. At the beginning we wanted to use some data analysis methods, but we very quickly realised that the complexity of Facebook’s data collection operations and the number of data points they use is so big that any kind of quantitative analysis would be almost impossible. That says a lot about our limited capacity to investigate such complex systems. By reading through hundreds of patents we were able to find some pieces of the vast mosaic of the data exploitation map we were making.
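
For readers curious about what such metadata analysis can look like in practice, here is a minimal, generic sketch, not SHARE Lab’s actual pipeline, that builds a who-contacts-whom graph from a CSV of message metadata and exports it in GEXF, the format Gephi opens; the file name and column names are assumptions.

import csv
import networkx as nx  # pip install networkx

def build_contact_graph(csv_path):
    # Expects a CSV with 'sender' and 'recipient' columns (assumed names).
    graph = nx.DiGraph()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            sender, recipient = row["sender"], row["recipient"]
            if graph.has_edge(sender, recipient):
                graph[sender][recipient]["weight"] += 1
            else:
                graph.add_edge(sender, recipient, weight=1)
    return graph

g = build_contact_graph("message_metadata.csv")  # hypothetical file name
nx.write_gexf(g, "contact_graph.gexf")           # open the result in Gephi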

So those maps, even though they look somehow generative, made by algorithms, are basically drawn almost entirely by hand. Sometimes it takes months to draw such a complex map, but I have to say that I really appreciate the slowness of the process. Doing it manually gives you the time to think about each detail. They are cognitive maps based on collected information rather than data visualisations.


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Jure Goršič / Aksioma

In a BBC article, you are quoted as saying “If Facebook were a country, it would be bigger than China.” This reminded me of several news stories claiming that the way the Chinese use the internet is ‘a guide to the future’ (cf. How China is changing the internet). Would you agree with that? Do you think that Facebook might eventually eat up so many apps that we’ll find ourselves in a similar situation, with our lives almost entirely mediated by Facebook?

The unprecedented concentration of wealth within the top five technology companies allows them to treat the whole world of innovation as their outsourced research and development. The anarcho-capitalist ecosystem of startups is based on the dream that at some point one of those top five mega-companies will acquire them for millions of dollars.

If you take a look at one of the graphs from our research on “The Human Fabric of the Facebook Pyramid”, which maps the connections within Facebook’s top management, you will probably realise that through their board of directors they have a foot in most of the important segments of technological development, in combination with circles of political power. This new hyper-aristocracy has the power to eat up any innovation, any attempt that could potentially endanger their monopoly.

The other work in the Aksioma show is Anatomy of an AI system, a map that guides “visitors through the birth, life and death of one networked device, based on a centralized artificial intelligence system, Amazon Alexa, exposing its extended anatomy and various forms of exploitation.” Could you tell us a few words about this map? Does it continue the Facebook research or is it investigating different issues?

Barcelona-based artist Joana Moll infected me with this obsession with the materiality of technology. For years we had been investigating networks and data flows, trying to visualise and explore the processes within those invisible infrastructures. But after working with Joana I realised that each of the individual devices we were investigating has, let’s say, another dimension of existence, related to the process of its birth, life and death.

We started to investigate what Jussi Parikka describes as the geology of media. In parallel, our previous investigations had a lot to do with issues of digital labour, beautifully explained in the works of Christian Fuchs and other authors, and this led us to investigate complex supply chains and the labour exploitation within them.

Finally, together with Kate Crawford from the AI Now Institute, we started to develop a map that combines all those aspects in one story. The result is a map of the extended anatomy of one AI-based device, in this case the Amazon Echo. This anatomy goes really deep: from the exploitation of the metals embedded in the device, through the different layers of the production process, hidden labour, fractal supply chains, internet infrastructures and the black boxes of neural networks, to the process of data exploitation and the death of those devices. The map basically combines and visualises three forms of exploitation: the exploitation of human labour, the exploitation of material resources and the exploitation of quantified nature, or we could say of data sets. It is still in a beta version and a first step towards something we are currently calling the AI Atlas, which should be developed together with the AI Now Institute over the next year.

Do you plan to build up an atlas with more maps over time? By looking at other social media giants? Do you have new targets in view? Other tech companies you’d like to dissect in the way you did Facebook?

The idea of an atlas as a form has been there from the beginning of our investigations, when we explored different forms of networks and invisible infrastructures. The problem is that the deeper our investigations went, the more complex the maps became and the more they grew in size. For example, the maps exhibited at Aksioma are 4×3 m and there are still parts of them that are on the edge of readability. The complexity, scale and materiality of those maps has become something of a burden in itself. For the moment there are two main materialisations of our research: the main form is the stories on our website, and recently the big printed maps have started to have a life of their own in various gallery spaces. It is only recently that our work has been exhibited in an art context, and I have to say I rather enjoy that new turn.


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Janez Janša


SHARE Lab. Exploitation Forensics at Aksioma Project Space. Photo: Janez Janša

How are you exhibiting the maps in the gallery? Do you accompany the prints with information texts or videos that contextualize the maps?

Yes. As you mentioned before, those maps are somewhat overwhelming, complex and not so easy to understand. On the website we have stories, narratives that guide readers through the complexities of those black boxes. But at exhibitions we need different methods to help viewers navigate and understand these complex issues. Katarzyna Szymielewicz from the Panoptykon Foundation created a video narrative that accompanies our Facebook map, and we usually exhibit a pile of printed Facebook patents so that visitors can explore them for themselves.

Thanks Vladan!

SHARE Lab. Exploitation Forensics is at Aksioma | Project Space in Ljubljana until 15 December 2017.

Previously: Critical investigation into the politics of the interface. An interview with Joana Moll and Uberworked and Underpaid: How Workers Are Disrupting the Digital Economy.

Smart guide for connected objects, activism on the dance floor, cooking with phones, a human Alexa. Just another edition of the DocLab conference

The DocLab Interactive Conference closed at De Brakke Grond in Amsterdam on Sunday 19th of November. An integral part of the International Documentary Film Festival Amsterdam (IDFA), DocLab looks at how contemporary artists, designers, filmmakers and other creators use technology to devise and pioneer new forms of documentary storytelling. There’s an exhibition, an immersive network summit, screenings, performances and a conference. The conference is my favourite. Digital pioneers share with the audience their latest experiments and boldest visions of the future. Each year, the same thing happens: the talks start at 10 in the morning, i blink and it’s already 6pm. Their picnic bag is a monstrosity for anyone who’s into eating healthy (more about food later) but that’s just about the only negative thing i can say about the event.

Here are the notes i wrote down during the talks. They are not exhaustive, they only aim to highlight a few ideas and projects i found particularly thought-provoking:


Brett Gaylor at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Superflux and Mozilla, Our Friends Electric

Brett Gaylor is the project lead for Mozilla’s Web Made Movies project. Before working with Mozilla, Brett directed the award-winning documentary Rip! A Remix Manifesto (2008), an open source documentary investigating remix culture and copyright in the digital age. Two years ago, he authored Do Not Track, an online, interactive documentary series about who’s watching you and who’s profiting from your private data.

In his brief and lively presentation, Gaylor talked about the Internet of Shit and the connected salt shakers, forks and other ‘smart objects’ that are actually stupid, insecure and easily hackable. He also showed us an extract of Our Friends Electric, a short film by Superflux and Mozilla which imagines life in the company of an AI virtual assistant that has its own personality.

But if there’s one link you should click on in this pre-Christmas silly period, it’s this one: privacy not included, a guide to shopping for connected gadgets that respect your online privacy and security.


Memo Akten at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Human Nature – Supernormal Stimuli

Memo Akten is a computational artist interested in tools that enable people to express themselves. His work also looks at the tensions between nature, science, technology, ethics, ritual, tradition and religion.

Akten will i’m sure go down in media art history as a brilliant researcher into AI, but also as the guy who told us about the mating habits of Australian beetles. The males of the Julodimorpha bakewelli species like to couple with big, brown-orangey conquests covered in dimples. Which leads them to copulate with discarded brown beer bottles. The behaviour nearly wiped out the whole species until the beer company decided to change the bottle design and save them from extinction.

Humans are not necessarily always more perceptive than beetles. We also project meaning into what we see and Akten’s work explores how this translates when it comes to algorithms and how the way we use AI actually uncovers our own human biases.

He also made a couple of valid points about the rise of AI and the explosion of big data. The artist believes that AI has been around for years but we need it now more than before to make sense of big data. A second reason for this new interest in AI is that the very organizations that push it are the ones that rely the most on big data to make profits.


Bogomir Doringer at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Short documentary which sums up the concept and background of FACELESS

The stand-out talk for me was Bogomir Doringer‘s. The artist and curator investigates contemporary collective and individual dynamics. He introduced us to two of his ongoing areas of research.

The first one is FACELESS, which started in 2005, turned into an exhibition at the q21 MuseumsQuartier Vienna in 2012 and has now taken the form of a book that should be published next year. FACELESS looks at the topic of hidden faces in society in relation to the surveillance technologies deployed by governments after 9/11. He showed us dozens of examples that demonstrate how widespread hidden and masked faces have become in the media, in the creative arts and in pop culture. From a David Bowie album cover to Raf Simons putting balaclavas on the runway in 2012. From Adam Harvey to public protests where people wear masks to express that they are one body. From burqa fetish to Jill Magid’s performance with CCTV in Liverpool.

The second research project Doringer presented is I Dance Alone, a work in progress anchored in the artist’s experience of partying in the club Industria in Belgrade during the 1999 NATO bombings. To mock death, or simply to try and forget about it. I Dance Alone places cameras above and on the dance floor in order to understand the rituals taking place when people dance. One of these rituals is driven by urgency; the other is all about entertainment. I found the ‘urgency’ side of the research fascinating. Aside from the Industria nightclub, the artist also mentioned Bassiani, a nightclub in Tbilisi, Georgia. One important dimension of Bassiani is its social activism, in particular the way it encourages clubbers to join street protests and to influence drug policy and tolerance towards the LGBTQ community in the country.


Lauren McCarthy at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


Lauren McCarthy performance as part of Doclab. Photo Nichon Glerum

Lauren McCarthy discussed her attempts to become a human version of Alexa. The performance involves giving her access to all the software that runs your house and installing all kinds of gadgets in your home. Once everything’s in place, you can ask her anything: from the weather forecast to an honest opinion about your new outfit.


Conference host Ove Rishøj Jensen introducing Jonathan Harris at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


Jonathan Harris at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

This edition of IDFA was particularly satisfying for anyone interested in digital creativity because for the first time ever, the guest of honour of the festival was a digital pioneer. Jonathan Harris is an artist, a computer scientist and someone who’s generally concerned with bringing more compassion and human warmth to digital technology.

Harris has gained fame for works that include the interactive I Love Your Work (a portrait of nine women who make lesbian porn), We Feel Fine (a touching visualization of human feelings), The Whale Hunt (a 9-day journey with the Inupiat Eskimos), etc. For the DocLab conference, Harris focused on Cowbird, a website he conceived as a library of life experiences “filled with small moments of human connection.” A kind of Instagram, but more thoughtful, less complacent and launched years before Instagram.


Yasmin Elayat at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

With her visually powerful and socially engaged work at Scatter, Yasmin Elayat is trying to open up the storytelling and production process to the audience. One way to do that is by sharing the tools that Scatter builds. Their Depthkit, for example, aims to put volumetric filmmaking into everyone’s hands.

She also briefly presented the very promising Racial Terror Project (working title), which uses VR to time-travel to the sites where Claude Neal was dragged and lynched by a mob of white men in Florida in 1934. The project aims to be a ‘magical realist documentary’ that reclaims the sites where the violence took place and contextualizes them.


Micha Wertheim at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Micha Wertheim is a stand-up comedian, writer and satirist. Last year, Wertheim performed an experiment that made theatre history: he never appeared on stage during his performance. Instead, he used a robot, a printer, a stereo and a set of headphones to coax an unaware audience to perform the whole show in his absence. I hope the video of his presentation will be published at some point. It was hilarious and frankly genius.


Jonathan Puckey at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Interactive designer Jonathan Puckey presented one of his ‘older’ projects and i adore it! The Radio Garden website allows you to spin a globe and listen to live radio from anywhere around the world.

He was actually on stage to present the interactive VR music video Dance Tonite which is rather impressive but i’m a fan of radio so i’ll stay with that good old media today.


Panel with Francesca Panetta, the Guardian executive editor for VR, Oscar Raby, Creative Director at virtual Reality studio VRTOV, and filmmaker Zhao Qi

The panel with Francesca Panetta, the Guardian executive editor for VR, Oscar Raby, Creative Director at virtual Reality studio VRTOV, and filmmaker Zhao Qi managed to pack many ideas and reflections in 20 minutes or so. I learnt that:
– The Guardian has its own VR studio and they’ve made 8 VR pieces so far. They recently gave out 100,000 Google Cardboard goggles.
– Women in VR face a lot of gender discrimination.
– There are some 5,000 VR cinemas and arcades in China, making it easier for VR creators to reach audiences.
– Research has shown that if someone reads an article, there is a 1% chance that that person will look for more information about the topic. If they experience the same topic via VR, the chance rises to 25%.
– In case you’re wondering about the graphic screened behind the panelists in the photo above, it’s the hype cycle, which represents “the maturity, adoption and social application of specific technologies.”


Emilie Baltz at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Emilie Baltz creates musical licking performances and other food design extravaganzas. For DocLab, she collaborated with chef Matthias Van Der Nagel and Klasien V.D. Zandschulp to make us literally cook using a mobile phone, a box containing ingredients with strange textures and spiritual encouragement from Futurist Filippo Tommaso Marinetti. The result was called Amuse Telebouche.


The public enjoying Amuse Telebouche at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

As you can see in this very flattering photo, I’m not too keen on experimenting with food:


Killjoy me refusing to taste Amuse Telebouche at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


The public cooking Amuse Telebouche at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


The public cooking Amuse Telebouche at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


The public cooking Amuse Telebouche at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


Jessica Brillhart and Jason Kottke at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Jessica Brillhart and Jason Kottke sat down on stage to ponder the question: the internet is on fire, what would you save? The selection included two of my favourites: Wikipedia and David OReilly’s Octocat.


Erwin Verbruggen at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

Erwin Verbruggen from the Netherlands Institute for Sound and Vision presented FREEZE! A manifesto for safeguarding and preserving born-digital heritage.


W/O/R/K performance by Anagram at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


W/O/R/K performance by Anagram at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


W/O/R/K performance by Anagram at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum


Caspar Sonnen and the team of veryveryshort.com at Doclab interactive conference in de Brakke Grond, part of the IDFA. Photo Nichon Glerum

This event was part of the DocLab: Uncharted Rituals program, made possible by the Netherlands Film Fund, Mondriaan Fund, De Brakke Grond and Diversion.

Previously: DocLab exhibition asks “Are robots imitating us or are we imitating robots?”
Image on the homepage: Emilie Baltz.

Creditworthy. A History of Consumer Surveillance and Financial Identity in America

Creditworthy. A History of Consumer Surveillance and Financial Identity in America, by Josh Lauer.

On amazon USA and UK.

Publisher Columbia University Press writes: The first consumer credit bureaus appeared in the 1870s and quickly amassed huge archives of deeply personal information about millions of Americans. Today, the three leading credit bureaus are among the most powerful institutions in modern life–yet we know almost nothing about them. Experian, Equifax, and TransUnion are multi-billion-dollar corporations that track our movements, spending behavior, and financial status. This data is used to predict our riskiness as borrowers and to judge our trustworthiness and value in a broad array of contexts, from insurance and marketing to employment and housing.

In Creditworthy, the first comprehensive history of this crucial American institution, Josh Lauer explores the evolution of credit reporting from its nineteenth-century origins to the rise of the modern consumer data industry. By revealing the sophistication of early credit reporting networks, Creditworthy highlights the leading role that commercial surveillance has played—ahead of state surveillance systems—in monitoring the economic lives of Americans. Lauer charts how credit reporting grew from an industry that relied on personal knowledge of consumers to one that employs sophisticated algorithms to determine a person’s trustworthiness. Ultimately, Lauer argues that by converting individual reputations into brief written reports—and, later, credit ratings and credit scores—credit bureaus did something more profound: they invented the modern concept of financial identity. Creditworthy reminds us that creditworthiness is never just about economic “facts.” It is fundamentally concerned with—and determines—our social standing as an honest, reliable, profit-generating person.

Creditworthy opens in 1913, when oil magnate John D. Rockefeller is denied credit in a Cleveland department store. The clerk, who didn’t trust the appearance of his customer, insisted on calling the credit department before authorizing Rockefeller’s purchases. The anecdote shows that, even then, one of the richest men in the world could not escape the gaze of a surveillance apparatus that would remain under-studied for decades to come.

The book ends 100 years later, when his great-grandson, Senator John D. (Jay) Rockefeller IV, initiates a Senate investigation into the business practices of the leading U.S. data brokers. The results were made public a few months after Snowden’s NSA revelations. Talking about privacy, the senator said:

What has been missing from this conversation so far is the role that private companies play in collecting and analyzing our personal information. A group of companies known collectively as ‘data brokers’ are gathering massive amounts of data about our personal lives and selling this information to marketers. We don’t hear a lot about the private-sector data broker industry, but it is playing a large and growing role in our lives.

Let me provide a little perspective. In the year 2012, which you will recall was last year, the data broker industry generated $156 billion in revenues–that is more than twice the size of the entire intelligence budget of the United States Government–all generated by the effort to learn about and sell the details about our private lives. Whether we know it or like it or not, makes no difference.

In this book, professor of media studies Josh Lauer describes how U.S. citizens became objects of intensive surveillance. He investigates how financial identity became a key marker of personal trustworthiness and how increasingly centralised and invasive systems for monitoring an individual’s behaviour and credit enabled the ascent of consumer capitalism in the U.S.


Photograph of the Vegas Credit Bureau parade entry, Las Vegas, circa late 1920s to early 1930s

Many of us think that modern surveillance appeared after 9/11, but its history actually started in the late 19th century, when a disembodied doppelganger of the American consumer began materializing inside the files of retail credit departments and local credit bureaus.

The credit reporting industry was an omnivorous collector of personal data. It cultivated trusted informants, connected with hospitals and utility companies, and placed phone calls to employers, landlords and neighbours in order to amass as much information as possible about American individuals. Many credit departments and bureaus even maintained separate ‘watchdog’ cabinets where they stored all sorts of information that might affect an individual’s ability to pay: divorces, lawsuits, bankruptcies, accounts of immoral behaviour gleaned from court papers and newspaper clippings, etc. The data gathered was so extensive that in the early 1960s, FBI agents, treasury men and the NYPD visited their offices when they needed to fill in gaps in their dossiers.

The credit surveillance industry not only quantified the value of citizens, it also functioned as a disciplinary machine, attempting to control their behaviour, shaming them into paying back what they owed and enforcing the doctrine that a person who abused his or her credit should be shunned by business and society. Over time, an individual’s financial identity thus became an integral dimension of their personal identity.


The telautograph. Image: redorbit

Creditworthy explores a very American phenomenon. We do have credit surveillance systems in Europe too but they are probably not as sophisticated as the ones described in the book (note to self: please investigate the European situation.) I’d definitely recommend this book to U.S. readers. It is impeccably researched and makes for a compelling read. I particularly enjoyed the parts describing the array of human and mechanical techniques employed to extract and manage credit information: from the personal interviews that subjected consumers to intrusive scrutiny to the new technologies that enabled the collection and archiving of data. That’s where i learned about the existence of the telautograph, a precursor to the modern fax machine that was developed to transmit drawings to a stationary sheet of paper. It was used by credit bureaus, banks and doctors to send signatures over long distances.


The Rockdale Reporter and Messenger (Rockdale, Tex.), Vol. 82, No. 34, Ed. 1 Thursday, September 9, 1954 Page: 6 of 20

Credit surveillance systems placed individuals at the center of an invasive information and communication network. Their complexity, their reach and their impact on society were (and still are) alarming. Yet, most American consumers have long remained unaware of the private surveillance system that facilitated their credit purchases. This lack of knowledge and control is something that most of us, U.S. citizens or not, have often deplored since Edward Snowden revealed the extent of the NSA mass surveillance infrastructure.

Critical investigation into the politics of the interface. An interview with Joana Moll


Joana Moll, AZ: move and get shot, 2012-2014

Joana Moll is a young artist and researcher whose work critically explores the way post-capitalist narratives affect the alphabetization of machines, humans and ecosystems. Her main research topics include Internet materiality, surveillance, online tracking, critical interfaces and language.

I first encountered Joana’s work a couple of years ago when i read about her online works such as Texas Border, AZ: Move and Get Shot and Virtual Watchers, which look into the crowdsourcing of the surveillance of the US/Mexico border by civilians.


Joana Moll, in collaboration with anthropologist Cédric Parizot, The Virtual Watchers


Joana Moll, AZ: move and get shot, 2012-2014

These projects expose two rising features of contemporary culture: the insidious militarization of civil society and the dilution of individual responsibility enabled by technology. I would really recommend that you check out the talk Surveillance through social networks along the US-Mexico Border that she gave a couple of years ago at the AntiAtlas of Borders conference, because today’s interview is not going to focus specifically on these works.

The reason why i got in touch with Joana is that she is the co-founder of the Critical Interface Politics Research Group at HANGAR, a centre for arts production and research in Barcelona.

This ongoing research project investigates the complex physical structure of the Internet and in particular the many actors, (infra)structures, systems and materials that have a direct but often covert impact on every aspect of our daily lives: submarine and underground cables that perpetuate colonialist heritage, companies and countries that have access to our data, ecological costs of online habits, commodification of data, cultural biases within user interface design, etc.

Joana Moll not only probes into these questions in her own artistic works, she has also started to develop a series of workshops, strategies and tools that enable other people, no matter how tech savvy they are, to delve into these issues and to subvert the material and computational architectures of the internet.


Poetic Destruction of the Interface, a workshop on Critical Interface Politics at HANGAR, Barcelona, 2016


Performing PageRank physically, from Poetic Destruction of the Interface, a workshop on Critical Interface Politics at HANGAR, Barcelona, 2016

Joana will be giving online classes about the power of interfaces and the way we can learn to democratize them in May with the School of Machines, Making & Make-Believe. In the meantime, i had a Skype chat with Joana. Here’s what it sounded like:

Hi Joana! The Tracking Forensics workshops, which you organised together with Andrea Noni and Vladan Joler, looked at the material impact of so-called digital immateriality on ecosystems. The word ‘forensics’ suggests the collection of criminal evidence. Why did you choose this title for the workshops?

Maybe i should start with the background of the workshop?

You know what? That’s a good idea!

Hangar in Barcelona was at the origin of this workshop. They invited me over a year ago to lead an investigation for IMAGIT, a European project that deals with criticism of interfaces. They asked me to develop some actions that would flesh out some of the more abstract concepts that they explored in the Manifesto for a critical approach to the user interface.

I ended up developing 3 workshops, each lasting 12 hours. Over the course of these workshops, we explored topics such as the materiality of the internet, code, cognition, power, and then interface intervention, governance, bias in the interfaces, etc. We were trying to cover everything that goes beyond the interface.

And then, while working with another colleague at Hangar, we started to talk a lot about forensics, tracking forensics, online tracking and surveillance. I have been exploring these topics for many years. So we came up with this idea of doing the same workshops that we had done already, the difference being that we’d focus much more on tracking.

We invited Vladan to give a talk in the workshop because he was already at Hangar doing a residency i had curated on the topic of tracking forensics and ethical uses of collected data.

The term “forensics” refers to cyber forensics (or computer forensics), the official term used when you follow the trail of a crime whose evidence is stored digitally. You thus approach the online traces as if you were in front of a crime scene.

As for “tracking”, it refers to the action of monitoring people’s activity on the internet. Basically the workshop was about showing how you can understand the dynamics, the mechanisms that corporations, agencies and governments use to collect your data. Share Lab in Serbia has done massive research on that topic.
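
To give a concrete, hands-on idea of what this kind of tracking forensics can look like, here is a minimal sketch of my own (it is not a tool from the workshop): export the network log of a single page load as a HAR file from your browser’s developer tools, then count the third-party hosts the page talked to while loading.

```python
# Hypothetical sketch, not a workshop tool: list the third-party hosts a single
# page load contacts, using a HAR file exported from the browser's dev tools.
import json
import sys
from collections import Counter
from urllib.parse import urlparse

def third_party_hosts(har_path, first_party_domain):
    with open(har_path, encoding="utf-8") as f:
        har = json.load(f)
    hosts = Counter()
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or ""
        # Anything not ending in the first-party domain is a candidate tracker or CDN.
        if host and not host.endswith(first_party_domain):
            hosts[host] += 1
    return hosts

if __name__ == "__main__":
    # e.g. python har_hosts.py somenewssite.har somenewssite.com
    for host, n in third_party_hosts(sys.argv[1], sys.argv[2]).most_common():
        print(f"{n:4d}  {host}")
```

Even this crude count tends to reveal a surprisingly long list of advertising and analytics domains behind an apparently simple page, which is exactly the kind of evidence the crime-scene metaphor points to.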


Interface Hack, from Poetic Destruction of the Interface, a workshop on Critical Interface Politics at HANGAR, Barcelona, 2016


Poetic Destruction of the Interface, a workshop on Critical Interface Politics at HANGAR, Barcelona, 2016


Tracking Forensics Atlas. Map #2 Tracerouting Top 100 domains

How did you proceed to uncover the physical paths of information? What kind of methodology and strategies did you use?

Archaeology! We made a big archive at Hangar with a group called the Critical Interface Politics Research Group. If you have a look at the website, you will find tools, encryption, visualisation, research, activism, etc. But there’s still so much more information we should add.

During the workshops, we used various software, but the most important thing lies in the tangible approach to these digital infrastructures and issues, because the way you perceive things is totally different depending on whether you just work with screens or experience them physically. For example, we used maps to draw out a forensic analysis of the paths of information.
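
To make those “paths of information” a little more tangible for readers, here is a minimal sketch of the kind of raw material a map like Tracerouting Top 100 domains relies on. It is my own illustration rather than the workshop’s tooling; it assumes a Unix machine with traceroute installed and network access, and the domain list is just a stand-in.

```python
# Hypothetical sketch: traceroute a handful of domains and collect the router
# hops their traffic passes through, i.e. the raw data behind a routing map.
import subprocess

DOMAINS = ["wikipedia.org", "google.com", "hangar.org"]  # stand-ins for a top-100 list

def trace(domain, max_hops=20):
    result = subprocess.run(
        ["traceroute", "-n", "-m", str(max_hops), domain],
        capture_output=True, text=True, timeout=120,
    )
    hops = []
    for line in result.stdout.splitlines()[1:]:   # skip the header line
        parts = line.split()
        if len(parts) > 1 and parts[1] != "*":    # '*' means the hop didn't answer
            hops.append(parts[1])                 # the hop's IP address
    return hops

if __name__ == "__main__":
    for domain in DOMAINS:
        print(domain, "->", " -> ".join(trace(domain)))
```

Geolocating those hop IPs and plotting them is roughly how such an atlas starts to take shape on paper.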


Poetic Destruction of the Interface, a workshop on Critical Interface Politics at HANGAR, Barcelona, 2016

As an individual who didn’t get the chance to participate in the workshop, how can i become better informed about the infrastructures hidden behind our dependency on the digital?

Together with the Share Lab, we are doing some Do It Yourself Tracking Forensics that we hope to publish soon. It’s basically what his residency at Hangar was about. It’s a project that Andrea and I proposed to do and Hangar is helping us develop it with a group of cyber forensics people. This DIY is going to be for everyone, because it has been very important for me right from the start to engage in a critical pedagogic strategy. I want not only to help people with no technical skills understand all these things that are actually responsible for sculpting our reality, but also to help them intervene autonomously in these systems.

Aren’t there other groups working on the same issues and putting resources out there just like what you’re trying to do? Or do you have to do all that research from scratch?

There are other people working on similar issues but because we do things in a different way, we still have to do all the research. For example, the Share Lab in Serbia is looking at similar issues but they only cover a part of it. Also Tactical Tech Collective, with whom I’ve collaborated on two projects, has developed many pedagogical manuals on the issue. And then of course there are Julian Oliver and Danja Vasiliev, but they don’t cover the physical part as in depth as we do; they are mostly looking at the architecture of information, and that’s something that we, on the other hand, only cover very briefly. Our focus is on internet infrastructure and tracking. The pedagogy aspect is also very important for us. I also discovered a group in Austria that did massive research on tracking. The output was a great paper that’s almost a book, actually. There are other people in Amsterdam too but again, it’s different.


Joana Moll, DEFOOOOOOOOOOOOOOOOOOOOOREST

Your work DEFOOOOOOOOOOOOOOOOOOOOOREST explores the tangible and devastating impact of the most mundane habit: the use of google.com. The project visualises the number of trees needed to absorb the CO2 generated by global visits to the search engine every second. The website is very simple yet so powerful that it makes me very anxious. I close DEFOOOOOOOOOOOOOOOOOOOOOREST almost as soon as i’ve opened it. It makes me feel helpless. Once we are more aware of the consequences of our daily internet gestures, is there anything we can do apart from despair?

It’s an ongoing debate. Because of course it’s easy to put all the weight on the shoulders of the end user and make them feel guilty for everything. However, i think it’s very important that we visualize the physical and ecological impact of our online actions. It needs to be embedded in the social imagination because it is quite unbelievable. Data generates CO2, it pollutes. There are a few things we can do to help with the problem but they are very minimal. If you are a web designer, for example, you can try to use fewer images or just work in a more efficient way. Companies bear an even larger share of responsibility.

And in this case, policies have to be enforced from above. Change has to come from a political level and we need to take responsibility collectively if we want things to change dramatically.
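
For readers who want a feel for the arithmetic behind a project like DEFOOOOOOOOOOOOOOOOOOOOOREST, here is a rough sketch of how searches per second can be converted into trees. Every figure in it is an assumption i’ve plugged in for illustration; these are not the numbers the artwork itself uses.

```python
# Back-of-envelope sketch in the spirit of DEFOOOOOOOOOOOOOOOOOOOOOREST.
# All figures below are ASSUMED placeholders, not the artwork's own data.
SEARCHES_PER_SECOND = 40_000        # assumed global rate of google.com searches
GRAMS_CO2_PER_SEARCH = 0.2          # assumed emissions per search, in grams
KG_CO2_PER_TREE_PER_YEAR = 21.0     # assumed absorption of one mature tree per year

SECONDS_PER_YEAR = 365 * 24 * 3600

def trees_needed():
    # CO2 emitted every second by global searching, in kilograms
    kg_emitted_per_second = SEARCHES_PER_SECOND * GRAMS_CO2_PER_SEARCH / 1000
    # One tree absorbs its yearly quota spread across the whole year,
    # so per second it only soaks up a tiny fraction of it.
    kg_absorbed_per_tree_per_second = KG_CO2_PER_TREE_PER_YEAR / SECONDS_PER_YEAR
    # Trees needed to absorb CO2 at the same rate it is being emitted
    return kg_emitted_per_second / kg_absorbed_per_tree_per_second

if __name__ == "__main__":
    print(f"{trees_needed():,.0f} trees")
```

With these placeholder figures the script lands on something in the order of ten million trees needed just to keep pace with one second’s worth of searching, which gives a sense of why the website feels so overwhelming.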


The Institute for the Advancement of Popular Automatisms, Embrace Stupidity

You are a co-founder of The Institute for the Advancement of Popular Automatisms. I read the about page, clicked around, but i must confess that the more i thought i understood, the less i understood. So could you explain to me, in layman’s terms, the activities of IAPA?

A lot of people tell me exactly that! They are not sure whether there are artists behind the project or if it’s just an algorithm doing all the work.

It’s actually very simple. I did this project together with Mexican artist Eugenio Tisselli. The Institute for the Advancement of Popular Automatisms is a platform that enables us to experiment in a very fast way with code, with language, with algorithms, to talk about poetry and the absurd and how machines communicate with humans. With this project we can do all that in a very unorthodox way, relying more on instinct and irony. The projects that Eugenio and I do aside from this one are research-based and involve long processes. So IAPA allows us to play a bit. It’s still serious but the approach is more laid-back, simpler. It allows us to play with our ideas and implement things that are important to our work. It’s a kind of escape bubble too!

Any upcoming project, field of research or event you could share with us?

I’m working on another project that talks about how different agents exploit data. I call that ‘data slavery’: there are a lot of dating sites that sell profiles to each other in a crazy way. You can buy a thousand or even a million profiles for a hundred dollars…

You mean real profiles?

Some of them are real, some are fake. But that doesn’t even matter because the pictures they use are pictures of real people.

I’m about to buy massive amounts of profiles and then try to understand where else these profiles, these pictures, these names or emails can be found. And from there, i want to explore these data slavery markets. In previous research I did on the topic, I saw that one single profile was being exploited by more than 50 online services.

Together with Vladan, we are writing a text that explores and exposes the ecological footprint of surveillance capitalism, and we hope to release it before the summer.

Besides, and that’s very recent news, the next phase of the Critical Interface Politics Research Group will focus on deeply analysing the environmental impacts of internet infrastructures, data flows and interfaces through different interdisciplinary initiatives. The plan is to gather a transdisciplinary research team and design several interventions that will both expose the tremendous material impact of communication technologies and create mechanisms and tools to reduce that footprint, making them available to the general public. We are in the process of writing the project and looking for partners right now.

Thanks Joana!

Joana Moll will be running a Tracking Forensics workshop at the Resonate Festival in Belgrade on 21 and 22 April. And if you can’t make it to Serbia, Joana will be giving online classes about the power of interfaces and the way we can learn to democratize them. The online program is organized by the School of Machines, Making & Make-Believe in May. I’ll also be giving online classes, but on the topic of socially engaged creative practices, same month, only Joana gets the Tuesdays and i get the Mondays.