Google shows you better job ads if it thinks you’re a man
Earlier research from 2013 found that Google searches for black-identifying names, such as "DeShawn" and "Darnell", were more likely to generate ads containing the word "arrest" than searches for white-identifying names, such as "Geoffrey" and "Jill".
A new study has found that women are much less likely than men to be shown ads for high-paying jobs through Google. Researchers from Carnegie Mellon University and the International Computer Science Institute (ICSI) say that Google's ad-targeting algorithm discriminates between internet users, though exactly how the ads are tracked and targeted remains a mystery.
In addition to the findings regarding gender discrimination, another experiment showed that some changes in ads presented to users based on their browsing activity are not transparently explained via the Ad Settings page.
However, the researchers cannot say exactly who is responsible for results like these, because today's ad-serving systems are intricate and their designs are largely opaque to the public. The example above does not break any specific privacy rules, though Google's policy forbids targeting on the basis of health conditions. Anupam Datta, one of the researchers, said he hopes other organizations will use tools such as AdFisher to monitor the behaviour of their ad-targeting software, and that regulatory agencies such as the Federal Trade Commission will use the tool to help spot abuses.
“I think our findings suggest that there are parts of the ad ecosystem where kinds of discrimination are beginning to emerge and there is a lack of transparency”, Anupam Datta, an associate professor at Carnegie Mellon University, told MIT Technology Review. “This is concerning from a societal standpoint”.
Even companies that run online ad networks don’t have a good idea of what inferences their systems draw about people and how those inferences are used, says Datta.
The researchers created 17,370 fake user profiles, used them to visit jobseeker sites, and collected some 600,000 adverts for analysis.
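The article describes the methodology only in outline, so the following is a hypothetical sketch, not the AdFisher tool itself: it shows one way to test whether a particular ad is served at different rates to two groups of simulated profiles that differ only in declared gender. The counts and the 0.01 significance threshold are invented purely for illustration.

```python
# Hypothetical illustration of comparing ad delivery between two profile groups.
# This is NOT the researchers' actual code; the counts below are made up.
from scipy.stats import chi2_contingency

# Rows: profile group; columns: [times the target ad was shown, times it was not]
observed = [
    [1852, 8148],   # profiles that declared themselves male (invented counts)
    [318,  9682],   # profiles that declared themselves female (invented counts)
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-squared = {chi2:.1f}, p = {p_value:.2e}")

# A very small p-value means the ad was not shown to the two groups at the same
# rate, i.e. delivery differs with the declared gender of the profile.
if p_value < 0.01:
    print("Ad delivery differs significantly between the two profile groups.")
```

At the scale reported in the study (hundreds of thousands of adverts), even modest differences in delivery rates would show up clearly in a test of this kind.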
Google did not officially respond to the researchers when they reported their findings a year ago.