Search engines can reveal some unattractive biases when you look at them closely. Check out what Google autofills, or displays in its top results, and you will see just how entrenched our collective prejudices are regarding stuff such as "feminine" work or black criminality.
Now, thanks to a new paper by a trio of researchers at Brazil's Universidade Federal de Minas Gerais, you can add a new prejudice to the list: Google's depictions of female beauty, they found, are both ageist and racist.
The paper - which has been submitted to the International Conference on Social Informatics but has not yet been published - looks at how Google and Bing represent female beauty in their image search results, particularly when it comes to different age and racial groups.
To do that, a graduate student and professors Virgilio Almeida and Wagner Meira Jr scraped the top 50 images for "beautiful woman" and "ugly woman" across dozens of international versions of Google and Bing. They then passed those 2,000-plus images through a programme, Face++, which estimates a subject's age, race and gender with 90 per cent accuracy.
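The aggregation step behind the paper's per-country breakdowns can be sketched roughly as follows. The record format below is a stand-in for the per-face attributes a tool such as Face++ returns; the field names and the helper `summarise` are assumptions for illustration, not the paper's actual code or schema:

```python
from collections import Counter
from statistics import median

def summarise(records):
    """Summarise face-attribute records for one query term.

    Each record is a dict like {"race": "White", "age": 24} --
    a stand-in for the attributes estimated per detected face.
    Returns the share of each race and the median estimated age.
    """
    races = Counter(r["race"] for r in records)
    total = len(records)
    shares = {race: count / total for race, count in races.items()}
    ages = [r["age"] for r in records]
    return shares, median(ages)

# Toy data echoing the US "beautiful" breakdown reported below
beautiful = [{"race": "White", "age": 22}] * 8 + [{"race": "Black", "age": 25}] * 2
shares, med_age = summarise(beautiful)
print(shares["White"])  # 0.8
print(med_age)          # 22.0
```

Comparing the two summaries for "beautiful woman" versus "ugly woman" in each country then yields the kind of race and age contrasts the paper reports.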
Sadly, the race and age breakdown of the "beautiful" pictures versus the "ugly" ones is probably what you would expect. In almost every country analysed, white women appear more often in the "beautiful" results, while black and Asian women appear more often in the "ugly" ones.
Blackness is considered less attractive in 86 per cent of the countries surveyed on Google, including nations such as Nigeria, Angola and Brazil, where there is a predominance of people with black or brown skin.
Likewise, beauty is associated almost exclusively with youth - extreme youth, in some countries. In Japan and Malaysia, for instance, queries for "beautiful woman" do not turn up women much older than 23.
In the United States, searches for "beautiful" women return pictures that are about 80 per cent white, of women roughly between the ages of 19 and 28. Searches for "ugly" women return pictures that are roughly 60 per cent white and 30 per cent black, of women who fall into the 30 to 50 age range.
This sort of bias has been observed on search engines before, of course - although, in a typical chicken/egg quandary, it remains unclear whether the results shape society, or society shapes the search results.
The researchers behind those earlier studies did not fault Google, though - rather, they concluded that the search engine surfaced racist and homophobic suggestions because its users kept entering them.
The authors of this latest paper are not quite so comfortable absolving Google and Bing of blame.
One of the co-authors, Prof Almeida, a visiting professor at Harvard, acknowledged that pre-existing social biases shape image search results - someone had to upload, tag and post these photos in the first place, after all.
But "the way search engines index and rank images" could also contribute to the creation, or at least the reinforcement, of stereotypes, he explained by e-mail.
"We do not have (enough) information about the techniques used by search engines to rank images and photos," he added.
It is unlikely that researchers such as Prof Almeida and his co-authors will get access to that type of information: Google and Bing guard their ranking algorithms like state secrets.
But the professor says that he and Prof Meira have other graduate students who are working on "techniques to increase (the) transparency of platforms" and that both computer scientists remain interested in auditing, and even reverse-engineering, them.
Until they have more complete information, however, they are hopeful that companies such as Google and Microsoft will begin re-evaluating their algorithms.
"Given the importance of search engines as sources of information," they write, "we suggest that they analyse the prominent presence of negative stereotypes and find algorithmic ways to minimise the problem."