Presenting research on ‘automated powerlessness’ at interdisciplinary international conferences

Innovation Fellow Carolina Are talks about presenting her research on 'automated powerlessness' on online platforms such as Instagram and TikTok at international conferences.

I have just concluded a season of conferences presenting my first paper arising from my work at the Centre for Digital Citizens, titled ‘An autoethnography of automated powerlessness: lacking platform affordances in Instagram and TikTok account deletions.’ Now published in Media, Culture and Society, this paper starts from my own experiences of social media de-platforming and offers pointers for a discussion of more equitable platform governance.

My paper blends the fields of creator studies, feminist digital sociology and platform governance studies. Thanks to its interdisciplinary nature, I presented it (both virtually and in person) throughout the spring, summer and early autumn at conferences such as the ICA postconference in Paris, Global Perspectives on Platforms and Cultural Production at the University of Amsterdam, the European Feminist Research Conference in Milan and the Trust and Safety conference at Stanford, California.

I’m a researcher and a social media creator myself. Ahead of joining the CDC and during my PhD, I supported myself through teaching online and offline pole dancing classes, and through brand partnerships on my Instagram and TikTok, where I have a combined following of over 350,000.

But when first TikTok and then Instagram deleted my accounts in 2021, I suddenly found myself without the tools that were essential to my livelihood. This disempowering experience reveals the lack of affordances in platform governance, which has so far largely focused on removing user content and accounts, leaving users very little chance to appeal, or even to speak to someone within the platform to attempt to recover their profile.

#GDC_UVA22 and the field of Creator Studies

In June, I spoke at Global Perspectives on Platforms and Cultural Production at the University of Amsterdam, a conference setting the tone and the future agenda for the field of Creator Studies.

Framed and identified by Queensland University of Technology’s Distinguished Professor Stuart Cunningham and the University of Southern California’s Clinical Professor David Craig, Creator Studies is an interdisciplinary field focused on “the dynamics of new forms of cultural production across social media platforms from diverse fields, methods, and epistemologies.”

As part of my #GDC_UVA22 Suppressing Sex panel, I argued that after creating positive, inclusive spaces for nudity and sexuality, platforms now govern those who post this content as if they’re having sex by default, bringing a lack of consent to their governance. In my Media, Culture and Society paper, I describe this phenomenon as ‘automated powerlessness,’ which replicates forms of gender-based violence by simultaneously sexualising and silencing users.

Given my experiences as both a researcher and a creator, it was an honour to also be asked to join the conference’s Creator Studies roundtable, discussing where our field is going next. We discussed five main ways in which our reliance on platforms plays out in our research:

1. Most researchers are platform users themselves. This means that, although we strive to remain objective and detached from our findings, we bring our own personal knowledge and user experience to platforms as we use them to gather data and participants;

2. Precisely because researchers are also users, we inevitably face similar challenges to the users who inform our research – and that includes opaque platform governance. Here, I shared my experiences of circulating a survey for my current CDC project only for it to be flagged as a ‘dangerous’ or ‘spam’ link by Instagram, while other researchers spoke about being shadowbanned (i.e. finding their profiles or content hidden but not outright deleted) while attempting to reach out to sex workers for their research;

3. In a networked world, social media are not only a crucial participant outreach tool – they are also a research promotion and public relations tool that raises researchers’ profiles…

4. … but this has its own challenges, because with this visibility comes the threat of platform censorship, of online abuse by certain internet subcultures, and of an excessive, unpaid and unsupported amount of digital labour for the sake of impact;

5. Precisely because of these challenges, we discussed the need to hold platform power accountable, embracing the activist potential of research to improve freedom of expression and work for all users.

Trust and Safety… for whom?

At Stanford University’s Trust and Safety conference, I presented my Media, Culture and Society paper to an entirely different audience: far from content creation and the creative industries, the conference largely focused on making the internet safer by anticipating and removing risks. My talk initially seemed to cut against that grain, provocatively asking: making the internet safer for whom?

Safety for one category of users should not come at the expense of another. But in the aftermath of FOSTA/SESTA – an exception to the 1996 US Telecommunications Act, approved in 2018, that makes platforms liable for promoting sex trafficking – the safety and content of those who post nudity, sex and sex work have become expendable. In the paper I explain that, blended with platforms’ commercial interests, the concept of ‘safety’ is being co-opted by social media companies to remove content they and their investors may find objectionable.

By sharing my disempowering experiences of de-platforming at the Trust and Safety conference, attended by researchers and Big Tech workers alike, I argued that the ability to work through and exist on social media should be an element of trust and safety, too. If citizens are removed from digital society with no form of recourse or appeal, they are not safe: they are excluded from networking, from work opportunities, from earning a living and from keeping in touch with their loved ones.

However much the push for less censorship may have stood out at #TSRC22, most researchers agreed on one connecting theme: platform transparency. While transparency alone may not solve the problems of platform governance, knowing more about platform processes, together with better communication with both users and researchers, may help society as a whole understand the challenges platforms face and highlight avenues to improve the detection of genuinely harmful content (false negatives) while minimising wrongful removals (false positives).

Addressing platform governance inequalities through research

Presenting the issue of de-platforming through a freedom of expression lens is particularly important where sex and nudity are concerned: still taboo, this content and these behaviours are central to human life, yet they are often conflated with danger, particularly post-FOSTA/SESTA.

Still, as a creator myself and as someone researching how Big Tech governance affects the lives and livelihoods of marginalised creators such as sex workers, I found this season of conferences incredibly hopeful, generative and energising: the growing body of work on de-platforming and its impact on users’ freedom of expression and income will hopefully mean that better solutions can also be designed.