
What AI can teach us about stereotypes

Originally posted on The Horizons Tracker.

One of the main concerns with AI technologies today is the fear that they will propagate the various biases that already exist in society. A recent Stanford study1 turned this concern around, however, highlighting how AI can also hold a mirror up to society and shed light on the biases within it.

The study used word embeddings to map relationships and associations between words, and through those associations to measure how gender and ethnic stereotypes have changed over the last century in the United States. The algorithms were trained on text from a huge canon of books, newspapers and other sources, and the resulting associations were compared with official census demographic data and with societal changes, such as the women's movement.
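
To give a flavour of how such a measurement can work, here is a minimal Python sketch. It assumes word vectors have already been loaded into a plain dictionary mapping words to NumPy arrays; the word lists and function names are illustrative only, and the score is a simplified relative-distance measure rather than the study's exact metric.

```python
import numpy as np

def group_vector(embeddings, group_words):
    """Average the vectors of a set of group words (e.g. 'she', 'her', 'woman')."""
    return np.mean([embeddings[w] for w in group_words if w in embeddings], axis=0)

def relative_bias(embeddings, neutral_words, group_a, group_b):
    """
    Rough bias score for a list of neutral words (adjectives or occupations):
    the average difference between each word's distance to the group A vector
    and its distance to the group B vector. Negative values mean the neutral
    words sit closer to group A in the embedding space; positive, closer to B.
    """
    va = group_vector(embeddings, group_a)
    vb = group_vector(embeddings, group_b)
    diffs = [np.linalg.norm(embeddings[w] - va) - np.linalg.norm(embeddings[w] - vb)
             for w in neutral_words if w in embeddings]
    return float(np.mean(diffs))

# Hypothetical usage, assuming `embeddings` maps words to NumPy vectors
# (e.g. vectors trained on one decade's newspaper text):
# score = relative_bias(embeddings,
#                       neutral_words=["intelligent", "logical", "brilliant"],
#                       group_a=["he", "him", "man"],
#                       group_b=["she", "her", "woman"])
```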

“Word embeddings can be used as a microscope to study historical changes in stereotypes in our society,” the authors say. “Our prior research has shown that embeddings effectively capture existing stereotypes and that those biases can be systematically removed. But we think that, instead of removing those stereotypes, we can also use embeddings as a historical lens for quantitative, linguistic and sociological analyses of biases.”

Dissecting society

The researchers used embeddings to single out specific occupations and adjectives that tended to be associated with women or with particular ethnic groups in each decade from 1900 to the present day. These embeddings were trained on newspaper articles, whilst also drawing on the work of fellow Stanford researchers who had developed embeddings trained on large text datasets, such as the American English books in Google Books.

The biases located by the embeddings were then compared to the demographic changes identified in each official census undertaken during the period.
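
As a rough illustration of that comparison step, the sketch below (again Python, with assumed decade-indexed dictionaries) lines up a bias score computed from each decade's embeddings against a matching census figure and reports their correlation. The input names and the use of a simple Pearson correlation are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def compare_with_census(bias_by_decade, census_by_decade):
    """
    Align two decade-indexed series -- an embedding-derived bias score and a
    census statistic (e.g. the share of women employed in a set of
    occupations) -- and return their Pearson correlation.
    """
    decades = sorted(set(bias_by_decade) & set(census_by_decade))
    bias = [bias_by_decade[d] for d in decades]
    census = [census_by_decade[d] for d in decades]
    return float(np.corrcoef(bias, census)[0, 1])

# Hypothetical usage, assuming dictionaries keyed by decade, e.g.
# bias_by_decade   = {1900: 0.35, 1910: 0.31, ...}  # from decade-specific embeddings
# census_by_decade = {1900: 0.18, 1910: 0.21, ...}  # from census tables
# r = compare_with_census(bias_by_decade, census_by_decade)
```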

The analysis found a clear shift in how gender was portrayed throughout the 20th century, with the portrayal generally improving over that time.

For instance, adjectives such as 'intelligent' and 'logical' were far more often associated with men in the first half of the 20th century, but this gap has narrowed considerably (although it still remains) in the decades closer to the present day.

There was also a shift in attitudes towards Asians and Asian Americans. In the early part of the 20th century, words like 'barbaric' and 'cruel' were commonly used to describe people with Asian surnames, but by the end of the century considerable progress had been made: by the 1990s, the most common adjectives were 'passive' and 'sensitive'.

“The starkness of the change in stereotypes stood out to me,” the authors say. “When you study history, you learn about propaganda campaigns and these outdated views of foreign groups. But how much the literature produced at the time reflected those stereotypes was hard to appreciate.”

The work underlines the potential for AI to provide us with greater insight into the biases that exist in society, although it is still some way off being able to detect the biases inherent in its own workings. Hopefully that will be a task for future research.

Article source: What AI Can Teach Us About Stereotypes.

Header image source: Image 60512 by WikiImages on Pixabay is in the Public Domain.

Reference:

  1. Garg, N., Schiebinger, L., Jurafsky, D., & Zou, J. (2018). Word embeddings quantify 100 years of gender and ethnic stereotypes. Proceedings of the National Academy of Sciences, 115(16), E3635-E3644.

Adi Gaskell

I'm an old school liberal with a love of self-organizing systems. I hold a master's degree in IT, specializing in artificial intelligence, and enjoy exploring the edge of organizational behavior. I specialize in finding the many great things that are happening in the world, and helping organizations apply these changes to their own environments. I also blog for some of the biggest sites in the industry, including Forbes, Social Business News, Social Media Today and Work.com, whilst also covering the latest trends in the social business world on my own website. I have also delivered talks on the subject for the likes of the NUJ, the Guardian, Stevenage Bioscience and CMI, whilst also appearing on shows such as BBC Radio 5 Live and Calgary Today.
