Technology is an extension of societal mores and seeps into our lives, shaping how we breathe, what we know, and how we move, sometimes leaving an indelible digital mark. Since the rise of Covid-19, technology has played a major role in how we understand the progression of the disease, how we communicate with each other, and how our movement is now—suddenly—linked to an airborne disease. At a moment when computers, cellular phones, and online conferences can help support social distancing and minimize the risk of spreading the novel coronavirus, it is also important to think about the ways that digital technology—whether cameras, surveillance systems, or artificial intelligence software—does the work of what Lisa Nakamura terms “cybertyping”:[1] that is, the ways that the Internet disseminates and commodifies images through racialized technology. We know that technology can have a deleterious impact on our data privacy and on the environment. However, it is worth unpacking the politics of data, privacy, and the environment through an intersectional lens, especially as we think of the invisibility and hypervisibility of marginalized groups.
In recent weeks, there have been ample debates about the politics of contact tracing apps, especially how they can potentially intrude on people’s privacy. Some governments argue that tracking the movement of people could provide public health data that would help spot clusters of novel coronavirus infection. In Europe, contact tracing apps have emerged as part of a broader public health policy to track the progression of Covid-19. Italy introduced its Immuni platform in mid-June, France has activated its StopCovid app, and Germany (where I live) launched its Corona-Warn-App in June 2020.
While the General Data Protection Regulation governs data protection and privacy in the European Union and European Economic Area, one major issue is that tracing apps open up a host of questions regarding the power of tech firms to accumulate and access people’s data. This has led some officials to conclude that tracing apps are not worth the risk of privacy intrusion. In Norway, officials halted the use of the Smittestopp app—created to combat the novel coronavirus—after the country’s data-protection authority raised alarms. The Norwegian government’s concerns extend beyond the latest coronavirus to the processing of personal data. Yet the issue runs much deeper than that.
Data privacy is an intersectional issue, one that should consider how marginalized groups are disproportionately targeted or harmed by data technologies. While Black people living in the United States are disproportionately infected with and dying from Covid-19, very little is being done to address that disparity. The consequences of contact tracing apps extend from the preexisting inequalities they reproduce to the carbon footprint of the technology itself. This means it is necessary to think deeply about the gender and racial absences in tech design. Ruha Benjamin’s latest monograph on racist technology is a work of politics and sociology that explores how social relations, particularly of race and power, shape the digital landscape by inquiring into the design practices of the tech industry. Borrowing from the legal scholar Michelle Alexander, Benjamin argues that we live in a new era, a space where the data divide manifests as “the New Jim Code.”[2] Blackness, in some cases, is made hypervisible, especially in a moment when activists are fighting for their lives.
Unfortunately, concerns about data privacy do not just involve African Americans; other groups are affected too. As the Washington Post reported in 2019, companies such as Palantir have been criticized for using the data they collected to help health agencies to also find and remove undocumented people. Journalists have also cautioned that contact tracing apps could inadvertently compromise the privacy of domestic abuse survivors if contact details were sent to their perpetrators.
An intersectional approach to data technologies would try to unpack not only how these data are biased in their outcomes, but also how they are biased by design. As Caroline Criado Perez notes, “[t]he gender data gap isn’t just about silence. These silences, these gaps, have consequences.”[3] What Criado Perez points to is how coding structures neglect women and make them invisible. However, one should go even further and point out that the gender binary and cisgender centrism ignore the concerns of gender non-binary and transgender people. As Britt Rusert’s Fugitive Science shows, there is a long history of Black scientists, scholars, and artists resisting and subverting racist science. In the twenty-first century, racism is a classification practice that takes on new life in technology.
One area that is often left out of the conversation on data studies is the way these technologies impact the environment. Indeed, we should be concerned about the relationship between digital technologies and surveillance. At the same time, it is worth noting the ways that the tools we use—from cellular phones to data centers—have a deleterious impact on the environment. As Greenpeace reported, smartphones and data centers harm the environment because both carry a significant carbon footprint. The phones themselves consume relatively little energy, but their production has long-term environmental impacts.
As media scholar Athina Karatzogianni notes, critical digital activism plays a pivotal role in generating social movements from below. A right to privacy is one we should all have, one we can effectively shape and collectively demand. Data for Black Lives, a non-profit organization, has been doing phenomenal work on how the use of data needs to move away from unaccountable, racist policies toward practices rooted in collective action.
Over the past few months, the world has come to know more about the life of the novel coronavirus—a disease that cannot merely be framed as the common flu. We have to reckon with the global impact of the disease and the extent to which healthcare workers and medical institutions can provide treatment. In situations where providers have limited resources, or depending on how the pandemic materializes in the Global South, a comprehensive solution based on mutual aid, material assistance, and universal healthcare will have to emerge. As Vijay Prashad remarked in February 2020, this is a time for solidarity, not stigma. That means we need to be critical of the surveillance laws that continue to devastate the lives of women, people of color, and people from the Global South.
[1] Lisa Nakamura, Cybertypes: Race, Ethnicity, and Identity on the Internet (New York, NY: Routledge, 2002).
[2] Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge, UK: Polity Press, 2019), 48.
[3] Caroline Criado Perez, Invisible Women: Exposing Data Bias in a World Designed for Men (London: Chatto & Windus, 2019), xi.
Cover image credit: Creative Commons.
[Image description: Street-level posters on a street light, one reading BIG DATA IS WATCHING YOU]