Online Ads’ Black Box a Mystery, Even to Companies Themselves
July 8, 2015

Google users who identify themselves as men are more likely to see ads for high-paying executive jobs. Why? The answer to questions like this may not be entirely clear — even to the online ad companies themselves.
Such were the findings of a research paper presented at the 15th Privacy Enhancing Technologies Symposium, which emphasized the complex, opaque, and sometimes discriminatory effects of online advertising. The researchers used a tool called “AdFisher” to run automated experiments and record how Google delivered ads across the web. In one of their experiments, they found that male users — as identified in their Google Ad Settings — were statistically more likely than female users to see ads for high-paying executive jobs. (Notably, previous research by Harvard professor Latanya Sweeney showed how Google search ads could reproduce racial biases.)
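The core statistical idea behind experiments like this is a two-sample significance test: create matched groups of simulated users that differ only in one profile attribute, count the ads each group receives, and ask whether the observed difference could plausibly arise by chance. The sketch below illustrates that idea with a simple permutation test; the data and function names are invented for illustration and are not taken from the AdFisher tool itself.

```python
import random

def permutation_test(group_a, group_b, trials=10_000, seed=0):
    """Two-sample permutation test on the difference in mean ad counts.

    Returns the fraction of random label shufflings whose difference in
    means is at least as large as the observed one (a two-sided p-value).
    """
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = group_a + group_b  # new list; inputs are not mutated
    n_a = len(group_a)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n_a]) / n_a -
                   sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return extreme / trials

# Hypothetical counts of executive-job ads seen per simulated browsing session
male_counts = [12, 15, 11, 14, 13, 16, 12, 15]
female_counts = [4, 6, 5, 3, 7, 5, 4, 6]
p = permutation_test(male_counts, female_counts)
print(p)  # a small p-value means the gap is unlikely to be chance
```

A small p-value here would say only that ad delivery differed between the groups, not *why* it differed — which is exactly the attribution problem the researchers describe below.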
To be sure, the researchers noted important limitations to their findings. Because the online ad industry is so complex and involves so many different actors, it is difficult, if not impossible, to “assign responsibility for [their] findings to any single player…” For example, Google’s algorithms may have inferred that more men were clicking on particular ads, or an advertiser may have specifically chosen to target men, observed Carnegie Mellon professor Anupam Datta.
But that opaqueness is revelatory in and of itself. In some cases, “[e]ven companies that run online ad networks don’t have a good idea of what inferences their systems draw about people and how those inferences are used,” said Datta. In fact, Microsoft is now paying these researchers to help it figure out what its own ad systems are doing, reports the MIT Technology Review.
Online advertising companies and the data broker industry use large amounts of data and sophisticated analytics to target people based on race, sex, and financial vulnerability. But those issues are just one piece of a much larger puzzle: If large companies like Google or Microsoft don’t even understand exactly what’s going on with their own systems, we have a long way to go.