Category Archives: privacy

Obfuscate your feelings on Facebook and defeat its algorithms in the process


Ben Grosser, Go Rando, 2017

At a time when truth is treated as a polymorphous and malleable commodity, when half of your country seems to live in another dimension, when polls are repeatedly proved wrong, when we feel more betrayed and crushed than ever by the result of a referendum or a political election, it is easy to feel disoriented and suspicious about what people around us are really thinking.

Blinding Pleasures, a group show curated by Filippo Lorenzin at arebyte Gallery in London, invites us to ponder our cognitive biases and the mechanisms behind the False Consensus effect and the so-called Filter Bubble. The artworks in the exhibition explore how we can subvert, comprehend and become more mindful of the many biases and subtle manipulations that govern the way we relate to news and, ultimately, to our fellow human beings.

Ben Grosser, Go Rando (Demonstration video), 2017

One of the pieces premiering at arebyte is Go Rando, a brand new browser extension by Ben Grosser that allows you to muddle your feelings every time you “like” a photo, link or status on Facebook. Go Rando randomly picks one of the six Facebook “reactions”, thus leaving your feelings and the way you are perceived by your contacts at the mercy of a seemingly absurd plug-in.

The impetus behind the work is far more astute and pertinent than it might seem, though. Every “like”, every sad or laughing icon is seen by your friends, but it is also processed by algorithms used for surveillance, government profiling, targeted advertising and content suggestion. By obfuscating your use of the limited set of emotions Facebook offers you, the plug-in allows you to fool the platform’s algorithms, perturb its data collection practices and appear as someone whose feelings are emotionally “balanced”.
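The mechanism is easy to sketch. Here is a toy Python illustration (my own, not the extension’s actual browser code; `REACTIONS` and `go_rando` are names I invented for the sketch) of why uniform random selection produces an emotionally “balanced” profile over time:

```python
import random
from collections import Counter

# The six Facebook "reactions" available at the time of writing.
REACTIONS = ["Like", "Love", "Haha", "Wow", "Sad", "Angry"]

def go_rando() -> str:
    """Pick one reaction uniformly at random, as the plug-in does."""
    return random.choice(REACTIONS)

# Simulate a few thousand clicks: every reaction ends up with a
# roughly equal share, so no single feeling dominates the profile.
counts = Counter(go_rando() for _ in range(6000))
```

Over thousands of simulated clicks each reaction lands close to the same count, and it is exactly that statistical flatness that makes the resulting data useless for emotional profiling.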

If you want to have a go, installing Go Rando on your browser is a fairly straightforward task. And don’t worry, the extension also allows you to choose a specific reaction if you want to.

Grosser has been critically exploring, dissecting and perverting Facebook’s mechanisms for a number of years now. His witty artworks become strangely more relevant with each passing year, as Facebook keeps gaining in popularity, influence and importance.

I caught up with Ben right after the opening of the Blinding Pleasures show:

Hi Ben! Your latest work, Go Rando, randomly assigns Facebook users an ‘emotion’ when you click “Like”. I hate to admit it but I’m not brave enough to use Go Rando. I’d feel too vulnerable and at the mercy of an algorithm. Also, I’d be too worried about the way my contacts would judge the assigned reactions. “Would I offend or shock anyone?” Are you expecting that many people will be as cowardly as I am? And more generally, what are you expecting people to discover or reflect upon with this work?

As users of Facebook I’d say we are always—as you put it—“at the mercy of an algorithm.” With Go Rando I aim to give users some agency over which algorithm they’re at the mercy of. Are they fully subject to the designs Facebook made available, or are they free to deviate from such a prescribed path? Because Go Rando’s purpose is to obfuscate one’s feelings on Facebook by randomly choosing “reactions” for them, I do expect some (many?) will share your concerns about using it.

However, whether one uses Go Rando or not, my intention for this work is to provoke individual consideration of the methods and effects of emotional surveillance. How is our Facebook activity being “read,” not only by our friends, but also by systems? Where does this data go? Whom does it benefit? Who is made most vulnerable?

With this work and others, I’m focused on the cultural, social, and political effects of software. In the case of Facebook, how are its designs changing what we say, how we say it, and to whom we speak? With “reactions” in particular, I hope Go Rando gets people thinking about how the way they feel is being used to alter their view of the world. It changes what they see on the News Feed. Their “reactions” enable more targeted advertising and emotional manipulation. And, as we’ve seen with recent events in Britain and America, our social media data can be used to further the agendas of corporate political machines intent on steering public opinion to their own ends.

Go Rando also gives users the possibility to select a specific reaction when they want to. That’s quite magnanimous. Why not be more radical and prevent users from intervening on the choice of emotion/reaction?

I would argue that allowing the user occasional choice is the more radical path. A Go Rando with no flexibility would be more pure, but would have fewer users. And an intervention with fewer users would be less effective, especially given the scale of Facebook’s 1.7 billion member user base. Instead, I aim for the sweet spot that is still disruptive yet broadly used, thus creating the strongest overall effect. This means I need to keep in mind a user like you, someone who is afraid to offend or shock in a tricky situation. The fact is that there are going to be moments when going rando just isn’t appropriate (e.g. when Go Rando blindly selects “Haha” for a sad announcement). But as long as the user makes specific choices irregularly, then those “real” reactions will get lost in the rando noise. And once a user adopts Go Rando, one of its benefits is that they can safely get into a flow of going rando first and asking questions later. They can let the system make “reaction” decisions for them, only self-correcting when necessary. This encourages mindfulness of their own feelings on/about/within Facebook while simultaneously reminding them of the emotional surveillance going on behind the scenes.


Opening of Blinding Pleasures. Photo: arebyte gallery


Opening of Blinding Pleasures. Photo: arebyte gallery

Go Rando is part of arebyte Gallery’s new show Blinding Pleasures. Could you tell us your own view on the theme of the exhibition, the False Consensus effect? How does Go Rando engage in it?

With the recent Brexit vote and the US Presidential election, I think we’ve just seen the most consequential impacts one could imagine of the false consensus effect. And even though I’m someone who was fully aware of the role of algorithms in social media feeds, I (and nearly everyone else I know) was still stunned this past November. In other words, we thought we knew what the country was thinking. We presumed that what we saw on our feeds was an accurate enough reflection of the world that our traditional modes of prediction would continue to hold.

We were wrong. So why? In hindsight, some of it was undoubtedly wishful thinking, hoping that my fellow citizens wouldn’t elect a racist, sexist, reality television star as President. But some of it was also the result of trusting the mechanisms of democracy (e.g. information access and visibility) to algorithms designed primarily to keep us engaged rather than informed. Facebook’s motivation is profit, not democracy.

It’s easy to think that what we see on Facebook is an accurate reflection of the world, but it’s really just an accurate reflection of what the News Feed algorithm thinks we want the world to look like. If I “Love” anti-Trump articles and “Angry” pro-Trump articles, then Facebook gleans that I want a world without Trump and gives me the appearance of a world where that sentiment is the dominant feeling.

Go Rando is focused on these feelings. By producing (often) inaccurate emotional reactions, Go Rando changes what the user sees on their feed and thus disrupts some of the filter bubble effects produced by Facebook. The work also resists other corporate attempts to analyze our “needs and fears,” like those practiced by the Trump campaign. They used such analyses to divide citizens into 32 personality types and then crafted custom messages for each one. Go Rando could help thwart this kind of manipulation in the future.

The idea of a False Consensus effect is overwhelming and it makes me feel a bit powerless. Are there ways that artists and citizens could acknowledge and address the impact it has on politics and society?

It is not unreasonable to feel powerless given the situation. So much infrastructure has been built to give us what (they think) we want, that it’s hard to push back against. Some advocate complete disengagement from social media and other technological systems. For most that’s not an option, even if it was desirable. Others develop non-corporate distributed alternatives such as Diaspora or Ello. This is important work, but it’s unlikely to replace global behemoths like Facebook anytime soon.

So, given the importance of imagining alternative social media designs, what might we do? I’ve come to refer to my process for this as “software recomposition,” treating sites like Facebook, Google, and others not as fixed spaces of consumption and interaction but as fluid spaces of manipulation and experimentation. In doing so I’m drawing on a lineage of net artists and hacktivists who use existing systems as their primary material. In my case, such recomposition manifests as functional code-based artworks that allow people to see and use their digital tools in new ways. But anyone—artist or citizen—can engage in this critical practice. All it takes is some imagination, a white board, and perhaps some writing to develop ideas about how the sites we use every day are working now, and how small or big changes might alter the balance of power between system and user in the future.


Ben Grosser, Go Rando, 2017


Ben Grosser, Go Rando, 2017

I befriended you on Facebook today not just because you look like a friendly guy but also because I was curious to see how someone whose work engages so critically and so often with Facebook was using the platform. You seem to be rather quiet on fb. Very much unlike some of my other contacts who carry their professional and private business very openly on their page. Can you tell us about your own relationship to Facebook? How do you use it? How does it feed your artistic practice? And vice-versa, how maybe some projects you developed have had an impact on the way you use Fb and other social platforms?

When it comes to Facebook I’d say I’m about half Facebook user and half artist using Facebook.

I start with the user part, as many of my ideas come from this role. I use Facebook to keep up with friends, meet new people, follow issues—pretty normal stuff. But I also try to stay hyper-aware of how I’m affected by the site when using it. Why do I care how many “likes” I got? What causes some people to respond but others to (seemingly) ignore my latest post?

When these roles intersect, Facebook becomes a site of experimentation for me. I’m constantly watching for changes in the interface, and when I find them I try to imagine how they came to be. What is the “problem” someone thought this change is solving? I also often post about these changes, and/or craft tests to see how others might be perceiving them.

A favorite example of mine is a post I made last year:

“Please help me make this status a future Facebook memory.”

Nothing else beyond that, no explanation, no instruction. What followed was an onslaught of comments that included quotes such as: “I knew you could do it!!” or “great news!” or “Awesome! Congrats!” or “You will always remember where you were when this happened.” In other words, without discussion, people had an instinct about what kinds of content might trigger Facebook in the future to recall this post from the past. These kinds of experiments not only help me think about what those signals might be, but also illustrate how many of us are (unconsciously) building internal models about how Facebook works at the algorithmic level. Because of this, much of my work has a collaborative nature to it, even if those collaborations aren’t formal ones.

Ben Grosser, Facebook Demetricator (demonstration video), 2012-present

Do you know if some people have started using or at least perceiving Facebook and social media practices in general differently after having encountered one of your works?

Yes, definitely. Because some of my works—like Facebook Demetricator, which removes all quantifications from the Facebook interface—have been in use for years by thousands of people, I regularly get direct feedback from them. They tell me stories about what they’ve learned of their own interactions with Facebook as a result, and, in many cases, how my works have changed their relationship with the site.

Some of the common themes with Demetricator are that its removal of the numbers on Facebook blunts feelings of competition (e.g. between themselves and their friends), or removes compulsive behaviors (e.g. stops them from constantly checking to see how many new notifications they’ve received). But perhaps most interestingly, Demetricator has also helped users to realize that they craft rules for themselves about how to act (and not act) within Facebook based on what the numbers say.

For example, multiple people have shared with me that it turns out they have a rule for themselves about not liking a post if it has too many likes already (say, 25 or 50). But they weren’t aware of this rule until Demetricator removed the like counts. All of the sudden, they felt frozen, unable to react to a post without toggling Demetricator off to check! If your readers are interested in more stories like this, I have detailed many of them in a paper about Demetricator and Facebook metrics called “What Do Metrics Want: How Quantification Prescribes Social Interaction on Facebook,” published in the journal Computational Culture.
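The demetrication Grosser describes can be imagined as simple pattern substitution. Below is a toy Python approximation of the idea (my own sketch, not the extension’s code; the real Demetricator rewrites Facebook’s live page rather than plain strings, and `demetricate` is a name I made up):

```python
import re

# Patterns matching Facebook-style metric phrases such as
# "1.2K likes" or "87 comments": strip the count, keep the word.
PATTERNS = [
    (re.compile(r"\b\d[\d,.]*[KkMm]?\s+(likes?)\b"), r"\1"),
    (re.compile(r"\b\d[\d,.]*[KkMm]?\s+(comments?)\b"), r"\1"),
    (re.compile(r"\b\d[\d,.]*[KkMm]?\s+(shares?)\b"), r"\1"),
    (re.compile(r"\b\d[\d,.]*[KkMm]?\s+(friends?)\b"), r"\1"),
]

def demetricate(text: str) -> str:
    """Remove the numbers from metric phrases, leaving the words."""
    for pattern, repl in PATTERNS:
        text = pattern.sub(repl, text)
    return text
```

Fed a line like “1.2K likes · 87 comments · 14 shares”, the function returns just “likes · comments · shares”: the social signal survives, the competitive score does not.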


Ben Grosser, Facebook Demetricator, 2012-present

Ok, sorry but I have another question regarding Facebook. I actually dislike that platform and tend to avoid thinking about it. But since you’re someone who’s been exploring it for years, it would be foolish of me to dismiss your wise opinion! A work like Facebook Demetricator was launched in 2012. 5 years is a long time on the internet. How do you feel about the way this project has aged? Do you think that the way Facebook uses data and the way users experience data has evolved over time?

I have mixed feelings about spending much of my last five years with Demetricator. I’m certainly fortunate to have a work that continues to attract attention the way this one does. But there have been times—usually when Facebook makes some major code change—when I’ve wished I could put it away for good! Because Facebook is constantly changing, I have to regularly revise Demetricator in response or it will stop functioning. In this way, I’ve come to think of Demetricator as a long-term coding performance project.

Perhaps the best indicator of how well Demetricator has aged is that it keeps resurfacing for new audiences. Someone who had never heard about it before will find the work and write about its relationship to current events, and this will create a surge of new users and attention. The latest example is Demetricator getting discussed as a way of dealing with post-election social media anxiety in the age of Trump.

In terms of Facebook’s uses of and user experiences with data over time, there’s no question this has evolved. People have a lot more awareness about the implications of big data and overall surveillance post-Snowden. The recent Brexit and US election results have helped expand popular understandings of concepts like the filter bubble. And I would say that, while perhaps not that many people are aware of it, many more now understand that what we see on Facebook is the result of an algorithm making content decisions for us. At the same time, Facebook continues to roll out new methods of quantifying its users (e.g. “reactions”), and these are not always discussed critically on arrival, so there’s plenty of room for growth.

I find the way you explore and engage with algorithms and data infrastructures fascinating, smart but also easy to approach for anyone who might not be very familiar with issues related to algorithms, data gathering, surveillance, etc. There always seem to be several layers to your works. One that is easy to understand with a couple of sentences and one that goes deeper and really questions our relationship to technology. How do you ensure that people will see past the amusing side of your work and immerse themselves in the more critical aspect of each new project (if that’s something that concerns you)?

It’s important to me that my work has these different layers of accessibility. My intention is to entice people to dig deeper into the questions I’m thinking about. But as long as some people go deep, I’m not worried when others don’t.

In fact, one of the reasons I often use humor as a strategic starting point is to encourage different types of uses for each work I make. This is because I not only enjoy the varied lives my projects live but also learn something new from each of them. As you might expect, sometimes it’s a user’s in-depth consideration that reveals new insights. But other times it’s the casual interaction that helps me better understand what I’ve made and how people think about my topic of interest.

Ultimately, in a world where so many of us engage with software all day long, I want us to think critically about what software is. How is it made? What does it do? Who does it serve? In other words, what are software’s cultural, social, and political effects? Because software is a layer of the world that is hard to see, I hope my work brings a bit more of it into focus for us all.


Ben Grosser, Music Obfuscator


Ben Grosser, Tracing You (screenshot), 2015

Any upcoming work, field of research or events coming up after the exhibition at arebyte Gallery?

I have several works and papers in various stages of research or completion. I’ll mention three. Music Obfuscator is a web service that manipulates the signal of uploaded music tracks so that they can evade content identification algorithms on sites like YouTube and Vimeo. With this piece, I’m interested in the mechanisms and limits of computational listening. I have a lot done on this (I showed a preview at Piksel in Norway), but hope to finally release it to the public this spring or summer at the latest. I’m in the middle of research for a new robotics project called Autonomous Video Artist. This will be a self-propelled video capture robot that seeks out, records, edits, and uploads its own video art to the web as a way of understanding how our own ways of seeing are culturally developed. Finally, I have an article soon to be published in the journal Big Data & Society that will discuss reactions to my computational surveillance work Tracing You, illustrating how transparent surveillance reveals a desire for increased visibility online.

Thanks Ben!


Image by Filippo Lorenzin

Go Rando is part of Blinding Pleasures, a group show that explores the dangers and potential of a conscious use of the mechanisms behind the False Consensus effect and its marketing-driven offspring, the so-called “Filter Bubble”. The show was curated by Filippo Lorenzin and is at arebyte Gallery in London until 18th March 2017.

Book review: Privacy. A Short History

Privacy. A Short History, by David Vincent, Professor of Social History at the Open University.

On amazon USA and UK.

Publisher Polity writes: Privacy: A Short History provides a vital historical account of an increasingly stressed sphere of human interaction. At a time when the death of privacy is widely proclaimed, distinguished historian David Vincent describes the evolution of the concept and practice of privacy from the Middle Ages to the present controversy over digital communication and state surveillance provoked by the revelations of Edward Snowden.

Deploying a range of vivid primary material, he discusses the management of private information in the context of housing, outdoor spaces, religious observance, reading, diaries and autobiographies, correspondence, neighbours, gossip, surveillance, the public sphere and the state. Key developments, such as the nineteenth-century celebration of the enclosed and intimate middle-class household, are placed in the context of long-term development. The book surveys and challenges the main currents in the extensive secondary literature on the subject. It seeks to strike a new balance between the built environment and world beyond the threshold, between written and face-to-face communication, between anonymity and familiarity in towns and cities, between religion and secular meditation, between the state and the private sphere and, above all, between intimacy and individualism.


The Archive of Mass Observation at the University of Sussex, England

People have always aspired to privacy and this book recounts the many forms that threats to privacy have taken through time: a 14th-century litigious widow repeatedly taking her neighbours to court because they could watch her through their windows, Bentham’s Panopticon dreams, the UK’s Mass-Observation records, the NSA’s global surveillance programs, etc.

Our notions of privacy have constantly shifted and grown more complex over time. But one thing that seemed to be a constant in the past was that privacy was accessible to those who could afford it. If you had enough money, you could enjoy furniture to store letters, a private room or garden to talk freely with your guests, separate quarters for servants, individual rooms for children, thicker walls, etc. And of course domestic plumbing in your lavatory and bathroom so that even your hygiene business could be conducted in peace. Not everyone could afford an indoor toilet in the late 19th and early 20th century.


Women working at the U.S. Capitol switchboard, 1959. Photo: vintage everyday

I’m not sure money would be enough to ensure you total privacy these days. One reason for that is of course the widespread adoption of digital communication. Each new communication technology has allowed for both greater privacy and greater intrusion. Letters facilitated illicit or unsanctioned intimacy, but this freedom coexisted with ‘epistolary anxiety’ (the fear of private correspondence falling into the wrong hands). Telephones meant that callers no longer had to be seen at your front door, but until 1958 operators were a latent threat to this new sense of confidentiality.

The digital revolution marks a rupture with the past because it redefines the network in need of protection. The home and family used to be the vulnerable unit; nowadays privacy is less a household aspiration and more an individual human rights issue. The other discontinuity lies in the scale and complexity of the data available on the internet, but also in the willingness of some security agencies to bypass formal agreements with internet companies and tap directly into their fibre-optic cables, ‘for security purposes’ and without public debate.


Harry Caul (Gene Hackman) attempts to listen in on a private conversation in Francis Ford Coppola’s movie The Conversation. Photo via Cinematic

Vincent believes that we put too much faith in the idea of an all-seeing internet at the center of the web of surveillance. He states that privacy is not dead but merely distorted and that we are constantly devising new ways to regain what we’ve lost of it.

States and corporations are not constantly spying on our every gesture. But they can still know a lot about us if they want to, and this power to intrude might lead to mistakes, misunderstandings and misinterpretations that could end up being costly for citizens. Especially if you couple this power with our well-documented social media exhibitionism.

A lot of the research in the publication relies on UK material but I’d still recommend that you have a look at the book if you have a chance, because the story of privacy is a fascinating one. It is made of small victories and defeats, domestic improvements and invisible inks, state surveillance and gossip press scrutiny.

Design My Privacy. 8 Principles for Better Privacy Design

Design My Privacy. 8 Principles for Better Privacy Design, by Tijmen Schep.
With foreword by Mieke Gerritzen, Director of MOTI, the Museum of the Image in Breda.

On amazon usa and uk.

BIS Publishers write: How can we protect our privacy in this digital era? Because of the emerging of ‘The internet of things’ this question becomes more and more relevant to designers. Technology becomes part of our daily used products. Watches, clothing, cars, houses are becoming ‘smart’, all being connected to the ‘cloud’. This book gives you guidance on how to design for privacy.

This book is written to encourage designers to think about and to design for privacy issues. The technology behind the smart products and systems are so complex, that for the consumer it is difficult to understand what the consequences are for everyday life. Designers have to start thinking about transparency and accessibility in the design of privacy-sensitive products and services. This book offers the designer guidance, in the form of eight design principles, that help with designing products and services.

Owen Mundy, I Know Where Your Cat Lives

Privacy concerns are taking a beating these days: China is implementing a social credit system meant to rate each citizen’s trustworthiness, the UK has just legalized a series of extreme surveillance measures and let’s not even think about what the orange fascist is going to inflict on American citizens worried about data collection by intelligence agencies.

The Design My Privacy booklet invites designers to engage with privacy issues instead of leaving the whole discussion in the hands of IT experts.

SETUP medialab, The National Birthday Calendar (teaser)

The author of the book, technology critic Tijmen Schep, lists 8 design principles that designers should keep in mind while working on products and services in the age of the Internet of Things.

These principles require the designers to be practical (by including privacy features right from the start of the project in order to avoid costly updates later), humble (allowing users to customize according to their own needs and culture), brave (standing up to a client who would like to collect as much data as possible), malicious (thinking like a hacker and forecasting all the ways the technology can be abused), critical (realizing that designers imbue designs with codes of law, cultural norms and prejudice), etc.

James Bridle, Citizen-Ex


James Bridle, Citizen-Ex

Aside from the 8 principles, the book also contains plenty of case studies, examples of artistic projects contributing to the privacy discussion, a crash course on the value of privacy, a glossary of important terms and concepts, a reading list, a series of interviews with experts such as Marcel Schouwenaar and Jaap Stronks and a contribution by Frank Koppejan about prospective European privacy legislation. There’s even a little privacy pop quiz about the very blurred boundaries between reality and science fiction.

Lauren McCarthy and Kyle McDonald, pplkpr

Design My Privacy is a witty, practical and thoughtful little book. It constitutes a useful tool for designers who want to create products and environments that balance efficiency, user-friendliness and privacy. But it can also serve as a sensible companion for customers who might want to know what to look for and what to be cautious about next time they plan to buy car insurance, a smart watch or an energy-saving outlet.


Image on the homepage stolen from HUH magazine.


When erased data come back to haunt you

Peters Riekstins, Back to the Light, 2016. Exhibition opening at RIXC Gallery in Riga. Photo: Kristine Madjare for RIXC

Everyone knows about cybercrime and how owning networked computers and mobile devices makes you a potential victim of bank fraud, identity theft, extortion, theft of confidential information, etc. Data stored on your computer is never safe, and its ghosts can come back to haunt you long after you’ve discarded your electronic device, even long after you’ve erased the data it contained.

Unless you take every possible step to make sure that your device is being recycled responsibly and data is erased thoroughly from the hard-drive, your credit card numbers, the classified information of the company you worked for or the records of online transactions you had forgotten about can end up being sold on black markets or used for identity theft, blackmail and credit card fraud.

Artist Peters Riekstins has been investigating data security over the past few years: from the way people trade privacy for convenience, sharing their private data on various platforms, to the way they neglect to properly wipe the sensitive data they release into the wild when they discard their computers.

To illustrate this never-ending life of data after computer death, Riekstins first looked for ways to obtain and use private data legally. He found it in pawnshops, where he bought discarded hard drives for 20 to 40 euros. “The content on most of the hard drives has been deleted by the original owner,” he said. “Unfortunately, not everyone is aware that it’s pretty difficult to delete data permanently. When you simply delete a file on your computer, the system only marks that space as empty and available. The physical data is still on the hard drive until the computer writes new information over the free space.”
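The mechanism Riekstins describes can be sketched with a toy file system (purely illustrative; the `ToyDisk` class and its methods are invented for this example): “deleting” a file removes only the directory entry, while the raw bytes survive on the disk until something overwrites them.

```python
# Toy model of why "deleting" a file doesn't erase it: the directory
# entry goes away, but the bytes stay on disk until overwritten.
class ToyDisk:
    def __init__(self, size=64):
        self.blocks = bytearray(size)   # the raw "platter"
        self.directory = {}             # name -> (offset, length)

    def write(self, name, data: bytes):
        offset = max((o + l for o, l in self.directory.values()), default=0)
        self.blocks[offset:offset + len(data)] = data
        self.directory[name] = (offset, len(data))

    def delete(self, name):
        # Only the directory entry is removed; the data is untouched.
        del self.directory[name]

    def undelete_scan(self):
        # A "recovery tool": read the raw blocks directly.
        return bytes(self.blocks)

disk = ToyDisk()
disk.write("passport.jpg", b"SECRET")
disk.delete("passport.jpg")
print("passport.jpg" in disk.directory)   # False: the file looks gone
print(b"SECRET" in disk.undelete_scan())  # True: the bytes are still there
```

Real recovery tools do essentially the last step: they scan the raw disk surface instead of trusting the file system’s bookkeeping.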

Peters Riekstins, Porn Roulette (screenshot from video documentation)

“It’s actually not hard at all to get data from hard drives,” the artist continued. “Unless you want to get something specific, as you would do if you lost something. There are ways to get everything that is stored on magnetic disks. It takes skills, but for now I have plenty of information to work with.”

Influenced by the content he found on external hard drives (mostly pirated movies, TV series, private images and pornography), he made Porn Roulette, an installation inviting visitors to spin a hard drive. When it stops rotating, images from the hard drive are shown on a small display. There is a one in six chance (as in Russian roulette) that the image will be a pornographic picture.

Ghost Call [expired]

Another work using found material was Ghost Call, an 8-hour video performance on YouTube. People could call in and summon an image from a hard drive. Just like you would summon a ghost. The images floated on the screen for a few moments and then they disintegrated back to the mysterious place they came from. The soundtrack of the work used music retrieved from hard drives but modified with granular synthesis.

Peters Riekstins, Back to the Light, 2016. Exhibition opening at RIXC Gallery in Riga. Photo: Kristine Madjare for RIXC

The latest series Riekstiņš worked on is called “Back to the Light.” The project consists of ‘reawakening’ pixels that have been quietly sleeping on hard drives, forgotten by all, including their original owners. The artist brought them back to light not exactly as they had been saved (which would be ethically questionable) but artistically modified. The images, videos and sound material have been processed by programs developed by the artist in order to create a certain aesthetic. The work also aims to convey the message that the data could have been used in a very different way had it ended up in malicious hands.

All the artworks were made using images that the artist found on just one disk. It contained many things, including hundreds of copies of passports, which Riekstins remixed so that they visually look like a single passport. Video of the work by Raitis Šmits.

Peters Riekstins, Back to the Light, 2016. Exhibition opening at RIXC Gallery in Riga. Photo: Kristine Madjare for RIXC

In a second artwork he mixed together the faces of 1059 people who’ve never met in real life, but who get the opportunity of a kind of post-mortem encounter through Riekstiņš’ work. All of the pictures are shown simultaneously, resulting in a forever changing face, one that looks both familiar and strangely alien. Video of the work by Raitis Šmits.

By restoring deleted files and using them in art practice, Riekstiņš hopes to stimulate people to feel responsible for their digital property. Now and after they’ve discarded it. “I want people to understand that it is important to take care of their data security,” he concludes. “If you really want to delete data from your hard drive, the hardware has to be physically destroyed (destroying only the monitor is not enough, unlike what Hollywood would like you to believe.)”

Back to the Light was recently part of the RAM (Random Access Memories) exhibition at the RIXC Gallery in Riga. RAM showcased the work of Trihars (aka Rihards Vitols, Peters Riekstins and Kristaps Biters), an artist collective interested in the interconnections between computer and environment. The show closed on 4 September 2016.
Also part of the same show: The Woodpecker: Could fake birds save our forests?

Extra Fantômes. The real, the fake, the uncertain

While in Paris a few weeks ago, i visited Extra Fantômes. The real, the fake, the uncertain, an exhibition at La Gaîté Lyrique that explores the interweaving of the technological and the uncanny.

Extra Fantômes, exhibition at Gaîté Lyrique, 7 April to 31 July 2016. © Vinciane Verguethen/voyez-vous
Extra Fantômes. View of the exhibition space at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

Karolina Sobecka, All the Universe is Full of the Lives of Perfect Creatures. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

Nils Völker, Seventeen, 2016. Extra Fantômes, exhibition view. Photo: © Vinciane Verguethen/voyez-vous

I thought it would be a light and amusing way to fill a rainy afternoon. And amusing it certainly was, at least at the start of the exhibition, when you find yourself plunged into dark spaces and a Lynch-inspired red room dedicated to the occult. There is a Oui Ja table, a mirror haunted by animals, a phone that puts you in contact with ghosts, a clique of translucent cushions that breathe over your head. But the exhibition goes way beyond the mystical and the supernatural…

In a world where scientific rationalism rules, interest is on the rise for alternative forms of relating to the world and to others.

The exponential development of technology has, paradoxically, been accompanied by a surge in attention and demand for magical, unexplained and mythological phenomena.

After the first two rooms of fun and phantasms, the ride gets darker and the paranormal gets worryingly normal. The specters, spirits and impersonators become pervasive and intrusive; you can ignore them if you so wish but you can’t hide from them. They are made of the data we generate. They are our disembodied doppelgängers, our digital shadows, and they relentlessly shed information about our opinions, routines, sexual preferences and working habits. Unsurprisingly, these last few rooms were the ones where i spent the longest time.

Extra Fantômes. View of the exhibition space at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

The first one presents itself as a Control Room that enables visitors to discover the immaterial energies and invisible forces that inhabit the same spaces as us. These forces are not esoteric anymore. They are real; they are the ones that inevitably accompany our technology-mediated existence.

onformative, Google Faces, 2013

onformative, Google Faces – Google Earth Flight Animation

Google Faces was my favourite piece in the room because of the way it ties the uncanny atmosphere of the previous rooms to the reality of the current technological world.

Google Faces tirelessly travels through Google Maps’ satellite images and uses a face detection algorithm to spot portraits hidden in the topography of our planet. The images would look nothing like faces were it not for pareidolia, a psychological phenomenon wherein the mind perceives a familiar pattern (a face, animal, object or message) where none actually exists. “Unprejudiced” technology meets human subjectivity.
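The process behind Google Faces can be sketched as a loop that walks a grid of satellite tiles and records every location where a detector fires. Everything below is a hypothetical illustration: `looks_like_face` stands in for a real computer vision detector (onformative used an actual face detection library), and the tile format is invented.

```python
# Sketch of a Google Faces-style scanning loop (assumed structure,
# not the artists' actual code): walk map tiles, run a face detector
# on each, and record the coordinates of every hit.
def looks_like_face(tile):
    # Stand-in for a real detector (e.g. a Haar cascade); here a tile
    # "contains a face" only if it carries the flag below.
    return tile.get("face_like", False)

def scan_tiles(tiles):
    hits = []
    for tile in tiles:
        if looks_like_face(tile):
            hits.append((tile["lat"], tile["lon"]))
    return hits

tiles = [
    {"lat": 50.0, "lon": 8.0},
    {"lat": 50.0, "lon": 8.1, "face_like": True},  # pareidolia strikes here
]
print(scan_tiles(tiles))  # [(50.0, 8.1)]
```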

Tobias Zimmer and David Ebner, Database, 2014. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

The cameras of the Database installation record the faces of visitors as they enter the room; a recognition algorithm analyzes them and the resulting data is sent to a printer, which automatically prints the little portraits along with the time of the visitor’s passage through the gallery. The process is super fast. Every hour though, the intrusive work acknowledges the right to privacy by blending all the faces into a composite portrait and displaying it on the installation’s website, while all other digital records are deleted. As for the ridiculously voluminous prints, they get shredded.
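The hourly blending step comes down to averaging pixel values across the stored portraits. Here is a minimal sketch of that idea, assuming same-sized grayscale images stored as nested lists; it illustrates compositing in general, not Tobias Zimmer and David Ebner’s actual code.

```python
# Average several grayscale "portraits" (same-sized 2D pixel grids)
# into one composite image, then discard the individual records --
# the anonymization step the Database installation performs hourly.
def composite(portraits):
    h, w = len(portraits[0]), len(portraits[0][0])
    return [
        [sum(p[y][x] for p in portraits) // len(portraits) for x in range(w)]
        for y in range(h)
    ]

faces = [
    [[0, 100], [200, 255]],
    [[100, 100], [0, 55]],
]
print(composite(faces))  # [[50, 100], [100, 155]]
faces.clear()            # "all other digital records are deleted"
```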

Database publicly documents the nuts and bolts of facial recognition—which governments and large corporations keep behind closed doors—and also refuses to catalog or monetize the information accumulation, in stark contrast with other entities that collect big data.

Semiconductor, Magnetic Movie, 2007

In Semiconductor’s Magnetic Movie, physicists from NASA’s Space Sciences Laboratory at UC Berkeley describe their experiments with magnetic fields while animations visualize this invisible phenomenon in the form of hectic, ever-changing geometries.

Extra Fantômes. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

The last room in the exhibition bears the inauspicious title of ‘the Bunker.’ There’s nothing oppressive about it though. The space is filled with ideas and strategies deployed by artists to fight back against data collecting, machine scrutiny and other forms of control. They make us disappear and even turn us into ghosts in the eyes of the machines.

There’s a very straightforward way to make yourself untraceable. Head over to the website of LessEMF and get a maternity camisole, sleeping bag or poncho that will protect you from electro-magnetic fields. My personal choice would be this fetching upper body shield which might come in handy next time i fancy a bit of jousting.

Extra Fantômes. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

Adam Harvey, Stealth Wear

Adam Harvey designed a range of fashionable thermal-evasion garments that shield their wearers from the eyes of drones and other heat-sensing machines.

Extra Fantômes. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

The artist and researcher is also famous for CV Dazzle, a sly makeup and hair styling technique that covers the face with bold patterns. By breaking apart the expected features targeted by computer vision algorithms, CV Dazzle makes your face much harder for CCTV face detection to spot.

Heather Dewey-Hagborg, Invisible

Finally, Heather Dewey-Hagborg has been exploring the next frontier in surveillance: biological surveillance. Her Invisible kit ensures your genetic privacy by obliterating any DNA trace you leave behind.

Catalogue Extra Fantômes

The catalogue of the exhibition is published by Gaîté Éditions and Lienart. It contains plenty of great essays by the likes of James Bridle, Finn Brunton, Vinciane Despret, Marie Lechner, Elliot Woods (Kimchi and Chips), Mushon Zer-Aviv, etc. Only available in French, i’m afraid.

More images from the exhibition:

Karolina Sobecka, All the Universe is Full of the Lives of Perfect Creatures. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous


Mathieu Schmitt, Oui Ja, 2013

Mathieu Schmitt, Oui Ja, 2013. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

Malte Martin, Spectres, 2014

Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

Extra Fantômes. Exhibition view at Gaîté Lyrique. Photo: © Vinciane Verguethen/voyez-vous

Extra Fantômes. The real, the fake, the uncertain was curated by Daily tous les jours. The show remains open at Gaîté Lyrique in Paris until 31 July 2016.

Confessions of a Data Broker and other tales of a quantified society

The White Room, Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

!Mediengruppe Bitnik, Reconstruction of Julian Assange’s study room at the Ecuadorian Embassy in London. Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

While in Berlin for the Anthropocene Campus, i visited the one show you shouldn’t miss if you happen to be in town this week and next: Nervous Systems. Quantified Life and the Social Question.

The exhibition smartly enrolled artists, media historians and writers to chart the history and current rise of data technologies and the world they bring about, exploring and exposing our quantified society and its processes of self-quantification. The food for thought that this show provides is overwhelming. Almost as much as this (partial) review of it!

Nervous Systems was co-curated by Anselm Franke and by Stephanie Hankey and Marek Tuszynski from the Tactical Technology Collective. But because pretty much every single artwork and historical artifact in the show deserves to be mentioned, i thought it would be better for everyone’s patience and sanity if i focused on one segment of the exhibition only.

The White Room, Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

I picked the one called The White Room, for the very arbitrary reason that it was curated by Tactical Technology Collective, whose brilliant twitter feed i’ve been stalking for months. The other strength of The White Room is its combative, encouraging and engaged attitude towards rampant quantification, loss of autonomy and the demise of privacy. It gives visitors the means to understand their data and devices, but it also provides them with the tools necessary to gain more control over their digital life.

The White Room opens up the black box of our daily technological environment, brings to light the links between Silicon Valley’s most successful start-ups and the military-industrial complex, and even uncovers the Big Brother that hides behind the benevolent masks of some philanthropic initiatives.

Perhaps the best introduction to The White Room is actually this video, which sums up the research that Tactical Technology Collective has conducted into information brokering services:


Tactical Technology Collective, Confessions of a Data Broker

Inspired by David Ogilvy’s book Confessions of an Advertising Man, Confessions of a Data Broker presents results from interviews with and research into data brokers in Europe, North America and Asia, providing insights into how the industry works, who is buying/selling data and what it means for users.

What is worrying is that data brokering is not only unreliable and invasive of your privacy, it is also opaque. It is often very difficult for individuals to find out what data a broker holds on them, how it is used and how long it is stored.

James Bridle, Citizen Ex

Citizen Ex is a browser plugin that helps us better understand data gathering. Once installed, Citizen Ex shows where the websites you are visiting are located geographically. Over time, it builds up your ‘algorithmic citizenship’ based on your browsing habits.
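At its core, an ‘algorithmic citizenship’ of this kind is a running tally: the share of your browsing that resolves to servers in each country. Here is a toy version; the site-to-country table is a made-up stub, whereas the actual plugin geolocates the IP address behind every page you load.

```python
from collections import Counter

# Toy "algorithmic citizenship": the share of visited sites hosted in
# each country. A real version would geolocate the IP behind each URL;
# here the mapping is a hypothetical stub.
SITE_COUNTRY = {"bbc.co.uk": "GB", "nytimes.com": "US", "lemonde.fr": "FR"}

def algorithmic_citizenship(visits):
    counts = Counter(SITE_COUNTRY.get(site, "??") for site in visits)
    total = len(visits)
    return {country: n / total for country, n in counts.items()}

history = ["bbc.co.uk", "nytimes.com", "nytimes.com", "lemonde.fr"]
print(algorithmic_citizenship(history))
# {'GB': 0.25, 'US': 0.5, 'FR': 0.25}
```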

Whether or not you download Bridle’s software, you already have an algorithmic citizenship. Every time you click on a link, every time you visit a website, you leave traces behind. Companies collect this data in order to deliver content and ads better targeted to each individual. But that’s not all! Data gathering is also used for credit rating, insurance, ID verification, health care and fraud detection. And of course, government surveillance agencies like the NSA and GCHQ monitor your data to decide whether to spy on you.


Intelligence Community Watch puts data gathering into the hands of citizens. ICWatch has mined LinkedIn for résumés posted by people who state that they have worked for the NSA, the intelligence community or related contractors and programs. ICWatch then compiled these findings into a searchable database of the US intelligence community. Transparency Toolkit, the group that developed it, says the aim of the site is to “watch the watchers” and better understand surveillance programmes and any human rights abuses associated with them.

Aram Bartholl, Forgot Your Password, 2013

In 2012, LinkedIn.com got hacked and passwords for nearly 6.5 million user accounts were stolen. A few months later, parts of the decrypted password list appeared on the Internet. Aram Bartholl printed 8 books that list 4.7 million of the leaked passwords in alphabetical order. The work reminds us that the safety of our data can never be guaranteed.

Some of the artistic projects selected for the show use everyday objects and tech devices to demonstrate that the “I have nothing to hide” dismissal of surveillance is unwise now that we are part of a quantified society: Ai Weiwei and Jacob Appelbaum’s stuffed panda (see SAMIZDATA: Evidence of Conspiracy. Talking secrets and pandas with Jacob Appelbaum), Sascha Pohflepp’s Button camera, Danja Vasiliev and Julian Oliver’s sneaky Newstweek… And Unfit Bits:


Tega Brain and Surya Mattu, Unfit Bits

Tega Brain and Surya Mattu, Unfit Bits. GIF via bionymous

Unfit Bits enables you to obfuscate your data traces by generating fake data, while giving you the ability to control and understand your real data. All you have to do is clip the Fitbit bracelet to a metronome, dog, drill, bicycle or pendulum and it will get fit and active for you.
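The same spoofing can also be done purely in software. The artists’ method is physical (the tracker rides a metronome or a drill), but as an illustration, here is a sketch that fabricates a plausible day of step counts to feed to a data-hungry service; the function and its thresholds are invented for this example.

```python
import random

# Generate a day of plausible fake step counts, one value per hour --
# a software analogue of strapping a Fitbit to a metronome.
def fake_steps(seed=42, active_hours=range(8, 22)):
    rng = random.Random(seed)   # seeded so the fake day is reproducible
    day = []
    for hour in range(24):
        if hour in active_hours:
            day.append(rng.randint(200, 1200))  # plausible waking activity
        else:
            day.append(0)                       # "asleep"
    return day

steps = fake_steps()
print(sum(steps), "fake steps today")
```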

The artists became interested in Fitbit after noticing that insurance companies were giving the trackers away to their customers. Wearing the device and walking a certain number of steps would earn customers discounts. How do companies benefit from your healthy lifestyle? Can your data be considered ‘yours’ if it can be used against you?

The White Room also presented a series of projects that are decidedly at the most dystopian end of the quantification spectrum:

Sesame Credit. Photo via The Independent

Sesame Credit is a credit-rating system that scores Chinese citizens based on both online and offline data: their spending behaviour, habits, minor traffic violations, fiscal and government information, interests and affiliations. A high score results in a better chance to find a job, get a date, rent a car without paying a deposit and be deemed ‘trustworthy’ by the government. The project was approved by the Chinese government as a pilot for a future ‘social credit’ system, an individual citizen rating planned for nationwide rollout by 2020.

Some projects were labelled “Big Mama” by the curators. Dressed up as care, initiatives such as eye-scanning for refugee aid and facial recognition to monitor attendance in churches look more like Big Mama (“It’s for your own good”) than Big Brother.


Jordan: Iris Scanning Program In Action

The United Nations High Commissioner for Refugees has introduced iris-scanning technology to verify the identity of Syrian refugees in Jordan. The pilot program allows refugees to withdraw their benefits from ATMs and even to buy groceries by looking into an iris scanner.

The project is implemented by tech company IrisGuard, which sells the same iris-scanning technology for border control, prisons and national ID. IrisGuard’s advisory board includes the CEO of a global merchant bank, the former head of MI6 and the former Homeland Security Advisor to the President of the US.

Electronic databases of personal information raise privacy but also security concerns. Databases are hacked all the time, and that is a huge threat to both. Hacked biometric data is particularly problematic because, unlike credit cards or even social security numbers, it cannot be changed.

Churchix compares CCTV camera footage of people to a database of congregants of the church. Photo: Face-Six

Churchix is a facial recognition-based ‘event attendance tracking’ software designed to help churches easily identify members of their congregation and record their attendance at church and church-related events. Churchix identifies individuals in ‘probe’ photos or videos and then matches them with previously uploaded reference photos. Face-Six, the company behind it, uses similar software in products for casinos, airports, shopping malls and border control posts. Churches in Indonesia, the US, Portugal, Africa and India have already adopted the system.

The Google Empire (information graphic / wood and acrylic.) Photo La Loma

A table exposed the presence of marketing departments, Washington D.C. expats, lobbyists and Wall Street analysts behind the sleek facade of some of Silicon Valley’s most successful startups. Think of how Google went from friendly search engine to Alphabet, owner and developer of self-driving cars, DNA databases, AI and robotics. What used to be a bunch of bespectacled geeks is now a group of powerful companies that have accumulated vast amounts of power, knowledge and wealth.

The Fertility Chip (simulation / laser cut and engraving.) Photo La Loma

In 2012, the Bill and Melinda Gates Foundation gave a grant of 11,316,324 US dollars to MicroCHIPS Biotech to develop a contraceptive chip that can be embedded in a woman’s body for up to 16 years. The technology would enable remote control of a woman’s hormones, activating her ability to either conceive or prevent fertilization.

MicroCHIPS hopes to introduce the product in 2018. Note that the technology is intended for women and girls in poorer countries.

Inside Palantir offices. Credit Peter DaSilva for The New York Times

The Shire (model of an office room of Palantir). Photo La Loma

Data-analysis company Palantir Technologies may keep a lower public profile than Airbnb and Uber, but it is one of Silicon Valley’s most powerful start-ups. It has contracts with government groups, including the CIA, NSA, the FBI, the Marine Corps and the Air Force. We know that its software processes huge amounts of disparate data to produce predictions and conclusions, enabling fraud detection, data security, consumer behavior study, rapid health care delivery, etc. Rumour has it that it was Palantir that provided the data-analysis skills that located Bin Laden. But little else is known publicly about the company.

The exhibition reproduced a model of Palantir’s head office, the Shire, based on photographs from a 2014 New York Times article. The world map is based on the strategy board game Risk: The Game of Global Domination.

Patches that can be purchased online, along with t-shirts, calendars and coffee mugs, from the apparel store of Lockheed Martin, America’s largest defense contractor, which makes fighter planes, cluster bombs and combat ships and designs nuclear weapons. It is also the largest private intelligence contractor in the world, having worked on surveillance programs for the Pentagon, CIA and NSA and built biometric identification systems for the FBI.

The White Room, Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

The White Room, Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

On Saturdays, Sundays and Mondays, workshops, demos and discussions help visitors understand the devices and interfaces we use every day. White Room workers are also on hand to help visitors navigate an alternative “App Center” that offers tools to regain control over their data and their tech gadgets.

More views of the exhibition Nervous Systems:

Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

Opening of Nervous Systems. Photo: © Laura Fiorio/HKW

Nervous Systems. Quantified Life and the Social Question was co-curated by Stephanie Hankey and Marek Tuszynski from the Tactical Technology Collective and Anselm Franke. The show is at Haus der Kulturen der Welt in Berlin until 9 May 2016.

Related stories: Obfuscation. A User’s Guide for Privacy and Protest, Sheriff Software: the games that allow you to play traffic cop for real, The Influencers: Former MI5 spy Annie Machon on why we live in a dystopia that even Orwell couldn’t have envisioned, SAMIZDATA: Evidence of Conspiracy. Talking secrets and pandas with Jacob Appelbaum.

Obfuscation. A User’s Guide for Privacy and Protest

Obfuscation. A User’s Guide for Privacy and Protest, by Finn Brunton, Assistant Professor of Media, Culture, and Communication at New York University, and Helen Nissenbaum, Professor of Media, Culture, and Communication and Computer Science at NYU and developer of TrackMeNot.

(available on amazon USA and UK)


Publisher MIT Press writes: With Obfuscation, Finn Brunton and Helen Nissenbaum mean to start a revolution. They are calling us not to the barricades but to our computers, offering us ways to fight today’s pervasive digital surveillance—the collection of our data by governments, corporations, advertisers, and hackers. To the toolkit of privacy protecting techniques and projects, they propose adding obfuscation: the deliberate use of ambiguous, confusing, or misleading information to interfere with surveillance and data collection projects. Brunton and Nissenbaum provide tools and a rationale for evasion, noncompliance, refusal, even sabotage—especially for average users, those of us not in a position to opt out or exert control over data about ourselves. Obfuscation will teach users to push back, software developers to keep their user data safe, and policy makers to gather data without misusing it.

Every day, we produce gigantic volumes of data, and that data stays around indefinitely even when we’ve moved on. We might want to keep personal data as private as possible, but that often means opting out of many forms of credit and insurance, social media, efficient search engines, cheaper prices at the shop, etc. It’s perfectly doable of course, but it can often be inconvenient and/or expensive.

So Nissenbaum and Brunton see in obfuscation the means to mitigate or even defeat digital surveillance and they provide us with a brief description of it:

Obfuscation is the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection.

The authors also call obfuscation the ‘weapon of the weak’ because this method and strategy of resistance is available to everyone in their everyday life. You don’t need to be rich or tech-savvy to disobey, waste time, protest and confound.

Now that we know what obfuscation is, we might want to understand how it works. At this point, the authors provide the reader with a series of historical and contemporary cases that illustrate various obfuscation techniques. Some of them can immediately be applied to your daily life (speaking in deliberately vague language, using false tells in poker or swapping loyalty cards with other people to interfere with the analysis of shopping patterns.) Others not so much, but all are inspiring. Here’s a quick selection of obfuscating actions:

Radar Jamming. Image via Steve Blank

– Chaff is a radar countermeasure in which aircraft or other targets spread a cloud of small pieces of aluminium, metallized glass fibre or plastic, which either appears as a cluster of primary targets on radar screens or swamps the screen with multiple returns. The method was used during WWII to jam German military radar: all the operator would see was noise, rather than airplanes.

– Twitterbots can fill the conversation on a channel with noise, for example by using the same hashtags as protesters and rendering them unusable.

– “Babble tapes” are digital files played in the background of a conversation in order to defeat audio surveillance.

– AdNauseam clicks on every ad on an online page, creating the impression that someone is interested in everything. The plugin confuses the system and thus protects people from surveillance and online tracking.

– Bayesian Flooding, an idea of Kevin Ludlow, consists of overwhelming Facebook with too much information (most of it false) in order to confuse the advertisers trying to profile the user and the algorithmic machines that are trying to make predictions about his/her interests.
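The flooding idea is simple enough to sketch: bury the one genuine signal among randomized decoys, so that any profile inferred from the stream is mostly noise. The interest lists below are invented for illustration.

```python
import random

# Sketch of Bayesian Flooding: bury one real interest among many
# randomized decoys so a profiler can't tell signal from noise.
DECOYS = ["skydiving", "accordion", "beekeeping", "curling", "falconry",
          "origami", "spelunking", "taxidermy", "yodeling", "zorbing"]

def flood(real_interest, n_decoys=9, seed=1):
    rng = random.Random(seed)
    posts = [real_interest] + rng.sample(DECOYS, n_decoys)
    rng.shuffle(posts)
    return posts

stream = flood("photography")
print(stream)
# The true interest is now 1 item among 10, indistinguishable
# without outside knowledge.
```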


Spartacus (Dir. Stanley Kubrick) excerpt featuring the “I’m Spartacus!” scene, a classic obfuscation moment

The other half of the book attempts to help us understand obfuscation, its role, purposes, limits and possible impact. The authors also spend a few pages exploring whether and when obfuscation is justified and compatible with the political values of society.

Obfuscation: A User’s Guide for Privacy and Protest is an important and straight-to-the-point book that reminds us that, ultimately, we’re up against intimidating asymmetries of power and knowledge. Stronger actors (whether they are corporations, governmental bodies or influential people) have better tools at their disposal if they want to hide something. What we have is obfuscation. It might require time, money, effort and attention, but it gives us some leverage as well as a measure of resistance and dignity.

The book offers 98 pages of dense, informative and never tedious text. I’m glad a publisher as respected and as widely distributed as MIT Press chose to print it.

Sheriff Software: the games that allow you to play traffic cop for real

Over the past few years, artist Dries Depoorter has been exploring issues of privacy in ways that are at times thought-provoking and at times almost comical (often both.) He started by looking into his own privacy, either through a program that took one screenshot a day of his computer screen at a random time and shared it online, or through a website that used Google Streetview to disclose in real time the artist’s exact location and direction.

His recent works explore how other people are willing to surrender their privacy for the sake of entertainment, safety or just the prospect of a one-night stand. One of the outcomes of this approach is the recently released and much-discussed Tinder In, which puts side by side, often to surprising effect, the profile pictures that an individual selects to represent himself or herself on two platforms at opposite ends of the social spectrum: LinkedIn & Tinder.

Amsterdam, 19-11-2015, IDFA International Documentary Filmfestival Amsterdam. Photo Nichon Glerum
Installation view of Sheriff Software (JayWalking) at the DocLab: Seamless Reality exhibition, part of IDFA International Documentary Filmfestival Amsterdam. Photo credit: Nichon Glerum for IDFA DoLab

Depoorter’s new investigation into privacy will premiere this week at the DocLab: Seamless Reality exhibition in Amsterdam (more practical info below.) The set of works, grouped under the name Sheriff Software, invites people not just to be the object of the attention of the CCTV cameras that relentlessly gaze upon us, but also to use them, turn the scrutiny back on the police and even play traffic cop.

The first piece in the series is JayWalking, a piece of software that scans traffic lights at intersections in different countries, checks whether the light is red or green and spots anyone braving the red light and jaywalking. Visitors to DocLab are given the opportunity to witness any infraction and decide whether or not to send the police a screenshot that proves the transgression. The consequence of the public’s decision is made even more tangible by a counter at the bottom of the screen that shows the amount of the fine for the offense in the country where it is being committed.
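The decision logic described above can be sketched in a few lines. This is an illustrative reconstruction, not Depoorter's actual code: the function names and the fine amounts are hypothetical placeholders, standing in for whatever computer-vision pipeline and per-country fine table the installation really uses.

```python
# Hypothetical per-country fine table in euros (placeholder values,
# not real data) feeding the counter at the bottom of the screen.
FINES_EUR = {"US": 150, "DE": 5, "NL": 60}


def check_infraction(light_state: str, pedestrian_crossing: bool, country: str):
    """Decide, for one frame of detections, whether to flag a jaywalker.

    Returns a (flagged, fine) pair: flagged is True only when the light
    is red AND someone is crossing; fine is the local amount to display.
    """
    if light_state == "red" and pedestrian_crossing:
        return True, FINES_EUR.get(country, 0)
    return False, 0
```

The sketch deliberately keeps the human in the loop: it only flags a candidate infraction; the visitor still decides whether the screenshot goes to the police.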

Will we report the unsuspecting jaywalker? Will we click on the button that can send a screenshot of the violation to the nearest police station?

Are we going to empathize with our fellow pedestrians? Or are we going to point the finger and divulge their minor crimes? I doubt that people at DocLab will be willing to snitch on jaywalkers when there is a group of people around. But how different would it be if they were alone at the moment of making the decision? JayWalking reminded me of an experiment that took place in 2006, when Shoreditch residents were given access to live CCTV footage of their neighbourhood on their own TV sets. People were invited to tune in to the “community safety channel” and report any suspicious behaviour to the local police by text. Apparently, local CCTV cameras attracted viewing figures with an “equivalent reach of prime time, week-day broadcast programming”.

Then, of course, there’s the not so minor detail of face recognition systems. I guess the JayWalking screenshots will only show the blurry silhouette of the offender. But what will happen when, one day, surveillance cameras are equipped with automated facial recognition technology?

While JayWalking enables people to spy on other people, the second work in the Sheriff Software series lets people watch the watchers. Called Seattle Crime Cams, the piece relies on the Seattle Area Traffic and Cameras system, which monitors city traffic.

Seattle not only shares the live stream of its CCTV network, it also shares the dispatches from the Seattle Police Department radio. Depoorter’s Seattle Crime Cams will connect police calls with the live stream of the nearest webcam. The public will be able to witness incoming calls that report a traffic incident or a robbery and see how long it takes for the police to arrive at the scene.
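The core matching step, connecting an incoming call to the nearest camera, amounts to a nearest-neighbour lookup over the camera coordinates. The sketch below is an illustration under my own assumptions, not the artist's code: the camera ids and coordinates are invented for the example.

```python
import math

# Hypothetical registry of traffic cameras: id -> (latitude, longitude).
CAMERAS = {
    "cam_pike_and_4th": (47.6097, -122.3331),
    "cam_broadway": (47.6205, -122.3212),
    "cam_alaskan_way": (47.6026, -122.3393),
}


def nearest_camera(call_lat: float, call_lon: float) -> str:
    """Return the id of the camera closest to a reported incident."""
    def dist(cam_id: str) -> float:
        lat, lon = CAMERAS[cam_id]
        # Plain Euclidean distance on degrees is a fine approximation
        # at city scale; a real system might use haversine distance.
        return math.hypot(lat - call_lat, lon - call_lon)
    return min(CAMERAS, key=dist)
```

Feeding each dispatch call's coordinates into `nearest_camera` and switching the display to that camera's stream would reproduce the behaviour the piece describes.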

Seattle Crime Cams turns us into ultimate long-distance disaster tourists, virtually present at the scene of the crime in Seattle. In this city, which is filled to the brim with traffic cameras, the police make the calls they receive available online. Using the latest calls, the closest live webcams are constantly zooming in on the very latest violations.

Installation view of Sheriff Software (Seattle Crime Cams) at the DocLab: Seamless Reality exhibition, part of IDFA International Documentary Filmfestival Amsterdam. Photo credit: Nichon Glerum for IDFA DoLab

Installation view of Sheriff Software (Seattle Crime Cams) at the DocLab: Seamless Reality exhibition. Photo credit: Nichon Glerum for IDFA DoLab

Installation view of Sheriff Software (Seattle Crime Cams) at the DocLab: Seamless Reality exhibition, part of IDFA International Documentary Filmfestival Amsterdam. Photo credit: Nichon Glerum for IDFA DoLab

Sheriff Software premieres at IDFA DocLab, a ridiculously interesting program of screenings, performances, talks, exhibitions and other events that explore the future of documentary storytelling. Think augmented reality, artificial intelligence, live cinema and interactive experiments. The installation is part of the DocLab: Seamless Reality exhibition (19-29 November) and will also be one of the highlights of DocLab Live: The Art of Artificial Intelligence (23 November.) The program is organized by IDFA (the International Documentary Festival in Amsterdam) and the Flemish Arts Centre De Brakke Grond.

DocLab: Seamless Reality exhibition, part of IDFA International Documentary Filmfestival Amsterdam. Photo credit: Nichon Glerum for IDFA DoLab

The Influencers: Former MI5 spy Annie Machon on why we live in a dystopia that even Orwell couldn’t have envisioned

I’d always wanted to go to The Influencers festival. So i went. Last week. No, i’ve no idea what took me so long. Based in Barcelona, the event looks at some of the most radical, provocative and socially-engaged forms of media art through documentary screenings, workshops, performances and talks. I’ll come back with more details about the programme but today i just want to share the notes i took during Annie Machon’s keynote presentation on the evening of Thursday the 22nd of October.

Photo by The Influencers

There were LOTS of people in the audience

Annie Machon is an intelligence expert and author who worked for 6 years as an intelligence officer for MI5, the UK domestic counter-intelligence and security agency. Together with her ex-partner, David Shayler, she resigned in the late 1990s to blow the whistle on the spies’ incompetence and crimes.

In 2005, Machon published her first book, Spies, Lies and Whistleblowers: MI5, MI6 and the Shayler Affair, in which she offers criticism of the intelligence agencies based on her observations of the two whilst in the employment of MI5.

Machon started off by saying that she had never been interested in becoming a spy. She applied to work for the Foreign Office but got a letter from the Ministry of Defense suggesting she might be interested in working with them. She went through 10 months of recruitment: they were looking for a new generation of counter-terrorism officers.

One of Machon’s first jobs at MI5 consisted of investigating fellow citizens who might be involved in ‘subversion’. Spying on political activism had massively increased and reached ridiculous proportions. Machon gave the example of a schoolboy doing some homework about the communist party. He wrote a letter to the party asking for more information about their activities and his letter was intercepted. That’s how a schoolboy got a file at MI5.

Civil liberties activists, journalists, musicians, etc. had a file at MI5. So did many prominent politicians. When Labour won the elections in 1997, almost all senior members of the party -and that includes Tony Blair and Home Secretary Jack Straw- had a file, because some of those ministers had been involved in left-wing politics in their youth.

Which means that the spies have secret information on people who are supposed to be their political bosses, and that makes for a preoccupying ‘tail wagging the dog’ situation.

But what Machon found most upsetting while she was working at MI5 was the discovery that the spies had lied to the government on several occasions about mistakes they had made. She said that many IRA bombings could have been avoided had MI5 agents been more competent. There were also illegal phone taps on journalists, and people were wrongly sentenced to prison even though MI5 or MI6 had evidence that would have shown they were innocent.

She gave the example of two students wrongly accused of attacking the Israeli embassy in London in 1994. MI5 had documents that could have exonerated them. But the agency refused to disclose the evidence because, under the secrecy laws, they didn’t have to. They said nothing and the young people were both sentenced to 20 years in prison.

Shayler also found evidence that MI6 had funded attempts by Islamic extremist terrorists to assassinate Gaddafi in 1996. The plot failed and Gaddafi survived.

This plot amounted to state-sponsored terrorism. The spying agencies broke the law and there was no way they could justify their scheme by claiming it was ‘public defense’.

Libyans step on a carpet featuring Libyan leader Muammar Gaddafi. Photograph: John Moore/Getty Images

She and David Shayler tried raising their concerns from the inside but no one would listen. So they both resigned and set out to tell what they knew to the media in the hope that the scandal would lead to reforms and greater accountability for the spying agencies.

However, upon entering MI5, both had signed the Official Secrets Act, which makes it illegal to say anything about their job. In brief, it’s a crime to report a crime, and Shayler faced 6 years in prison for revealing the crimes of the agency.

The story about the spies’ crimes broke in 1997 and the couple had to flee the country; they wanted to stay free to have a chance to argue their case.

Annie with David Shayler outside the Old Bailey in 2002, at the start of his trial for breach of the Official Secrets Act. Photo: The Mirror

After a couple of years hiding and living in Europe, Shayler decided to go back to the UK. He wanted a day in court to explain why they had resigned and to talk about the crimes of the agencies. In the end, he was never allowed to say anything. But at the trial, the judge concluded that what Shayler had done was not motivated by greed and that no life had been put in danger by their revelations. Journalists were present in the room. Yet, the day after, all of them wrote the exact opposite of the judge’s conclusions.

There has never been any enquiry from the government into Shayler’s allegations.

Shayler and Machon separated, but both found post-whistleblowing life hard: your reputation is destroyed, you find it difficult to earn money, your social life is affected, etc. But her experience taught her 2 valuable lessons:

1. How easily the media can be controlled, especially in the UK. After Shayler’s conviction, they reported the exact opposite of what the judge had said.

There are two ways to manipulate a journalist.

First, there is the soft method. They invite the journalists into the ‘secret circle’ and give them scoops that will boost their careers. In exchange, the journalists are invited to report back to MI5 or MI6 if ever they hear of anything that might embarrass the spy agencies.

Then, there is the hard way. MI5 has at its disposal a battery of laws that enable them to attack any uncooperative journalist.

For example, terrorism laws can be used against reporters to force them to expose their sources or to gag reports. There are also libel laws to sue journalists. As a result, self-censorship mechanisms have taken hold. Machon explained that senior journalists end up collaborating with senior military officials and spies to decide whether a piece of news can or cannot be made public. There is a term for that: the D-notice system. The Official Secrets Act can also be used to gag the media.

MI6 even has an “Information Operation” section to plant fake stories and control the way media break news.

2. The second important lesson was the importance of privacy. Shayler and Machon always assumed that their whole life was listened to. Which made it difficult to carry on a relationship. In the ’90s, surveillance was resource-intensive for spies. Now, post-Snowden, it’s not about targeting someone anymore: all of us should be living with a sense of being under surveillance. She noticed that there was a great deal of outrage about the NSA revelations in countries such as Germany or Brazil. But not so much in the UK. Machon even talked about UK spy agency GCHQ prostituting itself to the NSA. An example of that is the Tempora operation, which involves GCHQ tapping fibre-optic cables to collect global email messages, Facebook posts, internet histories and calls, and sharing them with the NSA.

But what if you don’t do anything wrong?

Well, what you do online might still be watched without your consent or knowledge. She gave the example of how the Optic Nerve program collected Yahoo webcam images in bulk. 10% of the conversations taking place on these webcams were very sexually explicit. If you were one of the people who did sexy things in front of their webcam with a partner living in another country, you had done nothing wrong. Yet, you were still running the risk of being spied on.

And if you feel you are being watched you start to self-censor, you pay attention to the kind of culture you can access, you rein in your freedom of speech. It’s similar to what happened when people distrusted their own flat, even their own family members, because they were afraid of the Stasi.

For Machon, if we don’t have privacy, we can’t have a functional democracy. In 1948, the Universal Declaration of Human Rights stated that we have the right to privacy (see article 12.)

However, there are ways to fight back!

1. There is the democratic approach: concerned citizens should ask their representatives to act on their behalf, have laws put in place that would further protect privacy and achieve greater transparency and accountability from the spying agencies.

2. The guerrilla warfare way: Wikileaks, which protects its sources and keeps the information online; encryption tools; the Tor anonymity network, etc. Machon recommends going to a CryptoParty, where you’ll be shown the basics of cryptography tools such as Tor, disk encryption and virtual private networks.
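The idea behind Tor that a CryptoParty would introduce is layered ("onion") encryption: the sender wraps a message in one layer per relay, and each relay can peel off only its own layer. The toy sketch below illustrates the layering idea only; it uses XOR with a hash-derived keystream as a stand-in and is NOT real cryptography, nor how Tor actually derives keys.

```python
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from a key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def xor_layer(data: bytes, key: bytes) -> bytes:
    """Apply or remove one encryption layer (XOR is its own inverse)."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


def wrap(message: bytes, relay_keys: list[bytes]) -> bytes:
    """Sender side: add one layer per relay, outermost layer first peeled."""
    for key in reversed(relay_keys):
        message = xor_layer(message, key)
    return message


def unwrap(blob: bytes, relay_keys: list[bytes]) -> bytes:
    """Relay chain: each relay in turn removes its own layer."""
    for key in relay_keys:
        blob = xor_layer(blob, key)
    return blob
```

Each relay only ever sees one layer come off, so no single relay knows both who sent the message and what it says; that separation, not this toy cipher, is what the real network provides.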

We are living a dystopia that even Orwell couldn’t have envisioned.

Image from the film Nineteen Eighty-Four, directed by Michael Radford and based on George Orwell’s novel of the same name. Seen here, Party members at the Two Minutes Hate, and a large screen featuring the face of Big Brother. Image via

3. The third way we can fight back is by looking into Code Red, an advocacy group on digital rights that Machon recently launched together with privacy activist Simon Davies. The advisory group of the project includes Jacob Appelbaum, crypto pioneer Whitfield Diffie, security guru Bruce Schneier and computer scientist and former NSA employee turned whistleblower William Binney, among many others.

Code Red aims to build bridges between communities of lawyers, whistleblowers, journalists, activists, etc. It will also create a clearing house for information in the anti-surveillance movement and will support whistleblowers and sources.