Transparency, manipulation & reclaiming data

[First published on LinkedIn June 30, 2023]

This blog is the second part of the iLearn trilogy, written in reflection on my secondment courtesy of the OpenInnoTrain project, funded by the European Commission and hosted by Anne-Laure Mention. This part is heavily inspired by my conversations with Dan Harris and David Rousell from the Creative Agency (RMIT University). The discussions focused on my team's Academy of Finland-funded project, BUGGED (The Emotional Experience of Privacy, Cyber Security and Ethics in Everyday Pervasive Computing), and the work that David, Dan and colleagues have been doing over the years.

For a while now, my team (Hanna-Kaisa Alanen, Ville Vakkuri and Satu Rantakokko) and I have been negotiating the significance of the General Data Protection Regulation (GDPR). From a research perspective, there are several interesting and somewhat unnerving dimensions to the regulation. From the outset, there seem to be pro-citizen intentions behind GDPR. The thought of being able to control one's data through understanding its collection and processing, and the ability to 'opt in' or 'opt out', seems like one step forward. Yet, as many of you will have already noticed, there are a few pitfalls to this first step. Hands up, those of you who actually read every cookie notice you encounter? And do you accept the cookie statement and data collection before or after reading?

How do these statements make you feel? Do you feel in control of where your data is going? Do they really increase your awareness? Or do they leave you with some doubts and hesitations, even though you did accept, because you wanted access to the content on the other side of the statement?

So, there's a shadow of doubt, a taste of darkness and possible deceit, when entering the website of your favorite artist, entertainment site, supermarket, or even public service. The statement was placed there in your best interest, but you haven't spent the time familiarizing yourself with it adequately, and now you are the lazy, negligent user.

Then, what about the spaces in which you don't 'opt in' or 'opt out'? Unless, of course, you do your homework well and choose routes known not to have Closed-Circuit Television (CCTV), for instance (see the fascinating work on CCTV detection by Andrei Costin's team), you will no doubt be leaking data with every step you take.

While planning this current visit, I deliberated over data collection in relation to which part of the BUGGED project I would focus on (Sensations of Privacy (degree of spookiness), Ethics & Behaviour (boundaries), Threads of Experience (stories), and Design4Comfort (principles)). In the first draft of my research plan I mentioned 'data collection'. The plan boomeranged back from administration, highlighting the complexity of importing data into Europe thanks to GDPR. Naturally, this is complex and baffling in itself: assuming that cloud storage would be used, even European data becomes international.

Yet, I began considering a few aspects of this matter. Firstly, contemporary challenges to research. Already, research ethics, ethical clearances, and the demands of ethical clearance application processes, committee meetings, and amendments have meant that, under time pressure, researchers are discouraged from engaging with topics and target groups requiring clearance. But now, in relation to GDPR, it feels that this ethical clearance process is multiplied into a chain of events that does not stop with the publication of anonymized results, but extends into a line of responsibility and accountability in which the guardianship of data (where it is stored, how it is accessed) lasts for years after the research is published. Moreover, not only are the researchers responsible for this data, but so too are many individuals and teams across institutions.

Thus, another aspect of GDPR that I began to consider was whether it is intended to protect people so much as to frighten them. A whole new economy driven by data has emerged. That's true. Has GDPR stopped data trade? You can answer that yourselves. Certainly, internet scamming, fraud and identity theft are rampant, and we need protection from these. Definitely. But what about the rest of it?

Talking about this with David Rousell and Dan Harris was empowering. I had begun to take an approach of fully identifiable data collection, in which participants have agency and attribution. While I was patting myself on the back for originality, David was quick to share his experiences of reclaiming data through projects with children and young people, in which he and colleagues such as Liz de Freitas projected electrodermal activity (EDA) data onto walls, playing with the performativity of the children, their bodies, and their interaction with their own sub- and unconscious data. This data represents just a portion of the pervasive data collection practices that the world of commerce engages in to access the unconscious psycho-physiological systems resting at the core of human existence. In their work, Rousell and colleagues embody this data through iMotions technology and enable children to reclaim what is theirs.

These performative and participatory acts connect with movements of sousveillance (to watch from below), in which surveillance can be understood as the individual gaze from the inside out. Steve Mann made the term famous via his cyborg existence and experiences of cyborg discrimination. And, if we think about this for a moment: the technology that was (and is) used by organizations and institutions to surveil citizens had been implanted into and worn on Mann (i.e., cameras, sensors, the Sequential Wave Imprinting Machine, etc.). Mann has equipped himself with, and advanced, the very technology of the organizations that monitor populations. He is effectively able to sense and record how societal institutions sense and record us. Yet, ingenious as these ideas are, many of his quirky innovations are being swallowed by mainstream corporations, once again, to satisfy people's desires for augmentation while allowing the big machine to access the most intimate and sensitive areas of human life.

This becomes an invisible watchdog that places individuals in a constant struggle: wanting to be a part of society while being painfully aware that this participation may come at the cost of autonomy and safety. I like the idea of transparency, particularly from the perspective of research. Why, in the name of GDPR, should people want to be turned into serial numbers? To my mind, voluntarily offering insight that is then attributed to the individuals themselves (not to the thoughts and ideas of the researcher) heightens people's Voice. In an era of autonomisation and data-isation, don't we want to be heard, for the humans that we are?
