
Facebook’s latest algorithm problem: allowing advertisers to reach ‘Jew haters’

One of Facebook’s biggest selling points to advertisers is that you can use the company’s vast data trove to target users based on almost any personal characteristic.

But the wide scope of Facebook’s targeting capabilities was revealed on Thursday when ProPublica discovered that you could target people using anti-Semitic phrases, including “Jew hater” and “How to burn jews.”

Slate did a quick follow-up and found that Facebook also enabled targeting of other hateful groups, such as the “Ku-Klux-Klan.”

The way this works is that advertisers using Facebook’s automated ad-buying software can target users based on specific information those users have added to their profiles. Users can enter whatever they want on their profile under categories like field of study, school, job title or company. Facebook’s algorithm then surfaces these labels when ad buyers (or journalists) go looking for them.

In this case, users were entering things like “Jew hater” under “field of study,” which meant it showed up in the targeting search results, and was an actual option for ad buyers.
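The mechanism described above can be sketched in a few lines. This is a hypothetical illustration only, not Facebook's actual system: it assumes profile fields are indexed verbatim, so any free-text entry, however offensive, becomes a searchable targeting category for ad buyers. The field names, data, and placeholder label are all invented for the example.

```python
from collections import Counter

# Hypothetical user-supplied profile data (free text, unmoderated).
# "<hateful phrase>" stands in for the kind of offensive entry
# ProPublica found surfacing as a targeting option.
profiles = [
    {"field_of_study": "Computer Science"},
    {"field_of_study": "<hateful phrase>"},
    {"field_of_study": "Computer Science"},
]

def build_targeting_index(profiles):
    """Count how many users list each value. Because the index is built
    from raw user input, every value becomes a selectable category."""
    return Counter(p["field_of_study"] for p in profiles)

def search_categories(index, query):
    """Roughly what an ad buyer's search box would return for `query`:
    every indexed label containing the query, with its audience size."""
    return [(label, n) for label, n in index.items()
            if query.lower() in label.lower()]

index = build_targeting_index(profiles)
# Searching for the phrase surfaces it as a targetable category,
# because nothing in the pipeline ever vetted the user-entered text.
results = search_categories(index, "hateful")
```

The design flaw the sketch highlights is the absence of any moderation step between user input and the advertiser-facing index, which is the gap Facebook's statement says it has “more work to do” on.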

Facebook issued a statement saying that it would remove the inappropriate categories, adding that the company “[has] more work to do” in keeping this kind of targeting off the site. [You can read the full statement below.]

But the issue was yet another example of what can happen when the algorithms that drive Facebook’s business, and that determine what you do and don’t see in News Feed, aren’t properly managed.

It’s been a bad year for Facebook algorithms, starting with the realization this spring that the company’s News Feed algorithm had been abused to help spread misinformation during last year’s US presidential election.

More recently, Facebook admitted that “inauthentic accounts” from Russia bought $100,000 worth of political advertising during the same US election. The accounts were able to make the purchases because algorithms, not humans, were approving and facilitating the transactions. (It’s still unknown whether there are more, similar ads that remain unaccounted for.)