In one of their previous blog posts, Bertrand de La Chapelle and Lorrayne Porciuncula conceptualized the Datasphere as a complex system in which massive amounts of data, people, and norms interact. Re-imagining data in such a constellation allows us to look with fresh eyes at the phenomenon of data itself – not as a resource to capitalize on but as a new sphere to explore and govern for shared benefit. This new concept also invites a certain boldness of imagination. Mine leads me to ask how the Datasphere could become a system free of abuse and inequality. Yet if the Datasphere includes norms and people, there is a risk of simply transferring into it the structures of abuse and subordination that exist in other forms of relationships. Without early, deliberate intervention, the Datasphere will become yet another frontier where entrenched relationships of power ossify. This is not the Datasphere we want. In this blog post, I provide a brief overview of the literature in the emerging field of feminist critique of data-driven phenomena and of existing data protection frameworks, with the aim of drawing lessons for a just and inclusive Datasphere.
Opposing inequality needs to be deliberate
Many authors exploring new technologies and fields of science from a feminist perspective underline how these systems tend to amplify pre-existing power relations based on the exploitation and exclusion of certain groups. For example, Mar Hicks in Sexism is a Feature, Not a Bug tells the story of how the computing industry in its early days relied on skilled female workers, only to exclude them once computing jobs gained prestige. In a similar vein, research exploring the coding of gender on Facebook showed that while the platform allowed users to choose non-binary gender identities, these collapsed into binary distinctions at deeper levels of the data structure. This insincere maneuver allowed Facebook to cater to the needs of advertisers – the platform’s main source of income – while presenting an image of inclusiveness to its users.
Sometimes the problem of exclusion and discrimination is presented as simply a problem of ‘bad code’: easy to fix, and not calling for any fundamental changes to the system. However, as Gurumurthy and Chami indicate, this framing “detract[s] attention from digital capitalism’s primitive instinct for “datafying” the social and mining the “datafied” social, ad infinitum”.1
This means that when conceptualizing the Datasphere, one needs to be very deliberate and vigilant in naming and challenging the discrimination, abuse, and exclusion that might otherwise penetrate the system as part of the pre-existing setup.
Power structures matter
Connected to the previous point, feminist researchers underline the importance of power structures in the context of data. One of the main contributions of feminist scholarship is challenging pre-existing, accepted structures and exposing how they enable certain people to remain in power (while exploiting others) in a deliberate albeit hidden manner.2
In their recent book Data Feminism, Catherine D’Ignazio and Lauren Klein present seven principles for feminist data practice – principles that are also relevant to conceptualizing the Datasphere. The first principle, according to these authors, is to examine power: to analyze how power operates in the world. The second is to challenge power by contesting unequal power structures.3 Unequal power structures are those built for the benefit of certain privileged groups, functioning in ways that perpetuate their position at the cost of those in marginalized positions – whether because of gender, race, age, or disability; feminist scholarship is relevant for any group excluded from the system of power attribution. D’Ignazio and Klein give a stark example: an Amazon recruitment algorithm trained on the historical data of past job applicants. Since past Amazon applicants were mostly male, the algorithm learned to screen out female applicants (the company eventually abandoned the technology).
As mentioned above, systems of power are embedded in various societal structures and translated into technological systems. Even the most complete datasets, or AI systems yielding the most accurate results, might be used for oppression and exploitation – technological solutions and improvements will not solve the problem of unjust technologies.4 Noticing the injustices created by power systems requires a lot of attention and discipline: they are so pervasive that it is easy to simply accept that ‘this is the way things have always been done’.5 The concept of the Datasphere is useful not only because it provides a good analytical tool for talking about modern systems of data governance but also because it creates momentum. The Datasphere Initiative calls on us to escape inertia, look at data critically, and imagine anew what might be possible. Such a constitutive moment is precisely when we are more likely to examine, and then challenge, the power structures around data that create unjust outcomes.
Feminist data sovereignty and digital self-determination – the new perspectives
Feminist scholarship provides not only a critique and a framework for approaching data from the perspective of power; it also helps us think about new modes of governing data, which could be operationalized in the Datasphere. For example, Gurumurthy and Chami discuss approaches to data governance focused on a collective interest in data, contesting the individualistic framework set forth in the European Union’s General Data Protection Regulation (and later echoed around the world in various data protection laws). These scholars challenge two prevailing understandings of data sovereignty: sovereignty as belonging to an autonomous individual, and sovereignty as states protecting their data resources. Instead, they propose the “idea of sovereignty as collective will formation – the right to democratically determine the ends to which data and data-enabled intelligence will be used”.6
Such a feminist approach to data sovereignty would enable individuals to participate in decision-making about data and to use data commons for the public good, beyond platform and government structures. This is where data trusts and data cooperatives could serve as examples of putting feminist thinking into novel forms of data governance, precisely in the spirit of the Datasphere.
Similarly, in recent years another concept has drawn the attention of scholars and governments seeking to rethink the position of humans in digital ecosystems: digital self-determination.7 In particular, the Swiss initiative8 on digital self-determination in the context of data spaces has laid down a framework for using this concept to rethink our current approach to data. The inescapable conundrum is this: how do we protect people from data-driven abuses (such as discrimination, surveillance, manipulation, disempowerment, or the extraction of economic gains) while at the same time opening up the possibility of sharing data and harnessing its enormous potential for social good?
While the concept of digital self-determination is still fuzzy, the main goal of introducing it into discussions on the future of digital ecosystems is to bridge the gap between protecting individuals and allowing widespread use of data. In this vein, proponents of digital self-determination suggest that we re-focus the discussion from the omnipresent topic of personal data protection to the possibilities of using data in trusted and secure data spaces. Moreover, the idea of digital self-determination urges us to perceive the self not as a hyper-rational, disconnected agent but as a social, connected being, participating in communal life and exchanging both personal and non-personal data to realize various personal and societal needs.
Here, too, feminist scholarship might be useful and should not be overlooked in further developing the concept of digital self-determination. Of particular interest in this context is the feminist exploration of the concept of the self. One of the great contributions of the vast feminist scholarship in this area is challenging the traditional Western ideas of selfhood underpinning our notions of freedom, autonomy, and empowerment, all of which are relevant to digital self-determination. Feminist writers point out that these traditional concepts are presented in the literature from the individualistic perspective of a rational being exercising free choice, unhindered by social relationships of subordination or the limits of the body. Since this approach does not reflect the lived experience of many groups in society, including women, many feminist writers focus instead on the process of constructing the self, which is embedded in social relationships and highly contextual.9
My final point is thus that feminist writing can and should serve as a source of ideas for imagining the Datasphere, its normative layers, and its modes of governance. Feminist scholars have explored the many illusions, injustices, and biases inherent in traditional ways of going about different aspects of life, and their critique captures the current Zeitgeist of rethinking human interaction with data.
1 Gurumurthy, A., & Chami, N. (2022). Beyond data bodies: New directions for a feminist theory of data sovereignty. Data Governance Network Working Paper 24; see also Theilen, J. T., Baur, A., Bieker, F., Ammicht Quinn, R., Hansen, M., & González Fuster, G. (2021). Feminist data protection: an introduction. Internet Policy Review, 10(4).
2 Suárez-Gonzalo, S. (2019). Personal data are political. A feminist view on privacy and big data. Recerca. Revista de Pensament i Anàlisi, 24(2), pp. 173-192.
3 D’Ignazio, C., & Klein, L. F. (2020). Data feminism. The MIT Press, p.17.
4 See Kalluri, P. (2020). Don’t ask if artificial intelligence is good or fair, ask how it shifts power. Nature, 583(7815).
5 According to Nancy Fraser, noticing injustice is often hard because it requires resources of which those in marginalized positions might be deprived; moreover, power structures are deliberately obfuscated. Fraser, N. (2012). On justice. Lessons from Plato, Rawls and Ishiguro. New Left Review 74, March-April 2012.
6 Gurumurthy, A., & Chami, N. (2022), above, p. 9.
7 The concept of digital self-determination is currently explored by the International Network on Digital Self-determination, which includes the Directorate of International Law of the Swiss Federal Department of Foreign Affairs, in cooperation with the Office for Communications of the Swiss Federal Department of the Environment, Transport, Energy and Communications, the Berkman Klein Center at Harvard University, the Centre for AI and Data Governance at Singapore Management University, and the Technical University of Munich.
8 See the report Creating trustworthy data spaces based on digital self-determination, submitted by DETEC and the FDFA to the Federal Council on 30 March 2022.
9 For example: Butler, J. (2005). Giving an Account of Oneself. Fordham University Press; Brison, S. (2017). Personal Identity and Relational Selves. In: Garry, A., Khader, S. J., & Stone, A. (Eds.), The Routledge Companion to Feminist Philosophy (1st ed.).