Why diversification in tech and data is not enough – a feminist perspective

Have you heard of a feminist approach to data protection? I am quite intrigued by the possibility of understanding, criticising and reimagining data protection in feminist terms. 

An editorial published in Internet Policy Review explores some current topics in data protection from a feminist perspective. Below I have summarised some of the talking points that stood out to me, and mixed in some important aspects of automated decision-making that fit right in there too.

Technology is embedded in social relations. Feminism challenges the status quo and the promise of objectivity through technology. 

Under this lens, technology, surveillance and data processing reproduce, entrench and deepen various forms of discrimination, marginalisation and oppression already present in society. 

What we call ‘data bias’ is just a particular form of discrimination and marginalisation. The term can be useful for drawing attention to discrimination, but it also flattens the many layers of problems that exist. 

→ It signals that there is an easy fix by diversifying the underlying data set or the team behind it. 

This in turn leads to gaslighting and tokenisation of people from marginalised groups, especially women of colour. While diverse teams are sorely lacking, diversification alone will not fix the many problems that arise from discrimination, marginalisation, and exclusion, as the sketch below illustrates. 
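To make this concrete, here is a minimal, hypothetical sketch (all names, numbers, and rules are invented) of why ‘just diversify the data set’ falls short: even with a perfectly balanced set of people, a decision rule can keep discriminating through a proxy feature that correlates with group membership.

```python
# Hypothetical illustration: a "balanced" data set still leaks bias
# through a proxy feature. All names and numbers are invented.
import random

random.seed(0)

def make_person(group):
    # The group is never given to the decision rule directly,
    # but 'postcode' correlates strongly with it (a proxy).
    postcode = "A" if (group == "marginalised") == (random.random() < 0.9) else "B"
    return {"group": group, "postcode": postcode}

# A data set diversified to contain both groups in equal numbers.
people = [make_person("marginalised") for _ in range(500)] + \
         [make_person("majority") for _ in range(500)]

def credit_offer(person):
    # A rule learned from historical outcomes can key on the proxy:
    # postcode "A" gets the worse offer, regardless of the person.
    return "high-interest" if person["postcode"] == "A" else "standard"

# Measure how offers split across groups despite the balanced data.
for g in ("marginalised", "majority"):
    worse = sum(1 for p in people
                if p["group"] == g and credit_offer(p) == "high-interest")
    total = sum(1 for p in people if p["group"] == g)
    print(f"{g}: {worse}/{total} receive the high-interest offer")
```

Balancing the group counts changes nothing here: the harm lives in the proxy and in the historical rule, which is a structural problem, not a sampling one.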

We can already see and feel the effects of automated decision-making and social sorting. It is clear that the problem is large and structural: algorithms segregate users into specific groups and offer or withhold products, services, or prices on the basis of an affinity to that group, entrenching a system of marginalisation and exclusion. This is called discrimination by association: discrimination based on being sorted into a group. 
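A hedged sketch of what such social sorting can look like in code (the affinity groups, prices, and sorting rule are entirely invented): each user is assigned to an affinity group, and the price they see depends only on that group membership, not on anything they individually did or agreed to.

```python
# Hypothetical illustration of discrimination by association:
# users are sorted into affinity groups, and what they are offered
# depends on the group, not on the individual. All values invented.

AFFINITY_PRICES = {
    "high-value": 19.99,  # gets the discount tier
    "low-value": 29.99,   # pays more, or is excluded from offers
}

def assign_affinity_group(user: dict) -> str:
    # A stand-in for an opaque behavioural model: device type and
    # neighbourhood income act as signals of inferred "value".
    if user["device"] == "flagship" and user["area_income"] > 50_000:
        return "high-value"
    return "low-value"

def quote_price(user: dict) -> float:
    # The user never sees or consents to the group assignment;
    # they simply receive a different price than their neighbour.
    return AFFINITY_PRICES[assign_affinity_group(user)]

users = [
    {"name": "u1", "device": "flagship", "area_income": 72_000},
    {"name": "u2", "device": "budget",   "area_income": 72_000},
    {"name": "u3", "device": "flagship", "area_income": 31_000},
]
for u in users:
    print(u["name"], "->", quote_price(u))
```

The point of the sketch is that the harm attaches to the group label itself: everyone sorted into ‘low-value’ is treated worse by association, whatever their actual circumstances.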

Don’t want to miss new posts? Then don’t forget to like, subscribe and follow this space.
