Americans hear about what the United Nations calls the continuing genocide of Rohingya Muslims in Myanmar and may think it's tragic and outrageous. Myanmar has called the million or more Rohingya who lived there, mostly in Buddhist-majority Rakhine state, "illegal immigrants."
Rohingya have lived in Myanmar for centuries. But they are not considered citizens.
Myanmar's military began forced removals of the Rohingya last year, and hundreds of thousands are now in crowded camps, vulnerable to starvation, murder, rape and disease. This week, Facebook released a report from Business for Social Responsibility that showed how the company's social media platform has been used to incite violence against the Rohingya. At least 25,000 people have been killed. Myanmar isn't widely wired for the Internet, and the regime is dominated by generals who try to suppress news and have jailed reporters. But people in Myanmar have cellphones.
Facebook says about 20 million people in the country use the platform to connect with each other. But many of them have also read malicious messages aimed at the Rohingya and outright fake news, including a false and inflammatory chain letter claiming that Rohingya Muslims planned to attack Buddhists. Many of those lies and distortions came from sham accounts run by Myanmar's military. Facebook's ad-targeting tools have shown similar blind spots elsewhere. A year ago, ProPublica found that advertisers could target "Jew haters" and other anti-Semitic terms. This time, under the "interests" category, which advertisers use to direct ads to a relevant audience, the term "white genocide conspiracy theory" was presented as a group available for targeting.
Facebook said about 168,000 people fell into that bucket: those who "expressed an interest in or like pages related to 'white genocide conspiracy theory.'" As with other interest categories, humans had to approve this one. Interest-based target audiences are updated all the time so marketers can use current references to reach people.
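To see how such a category surfaces to advertisers, consider Facebook's public Marketing API, which exposes a targeting search endpoint. The sketch below is a minimal illustration, not Facebook's internal tooling; the API version, placeholder token, and response fields shown are assumptions about how the endpoint was generally documented around that time.

```python
# Minimal sketch (not Facebook's internal tooling): querying the public
# Marketing API's targeting search for interest categories. The API
# version, placeholder token, and response fields are assumptions.
import requests

ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"  # requires an ad-account token
API_VERSION = "v3.2"                # roughly current in late 2018

def search_ad_interests(query):
    """Return interest-targeting categories whose names match `query`."""
    resp = requests.get(
        f"https://graph.facebook.com/{API_VERSION}/search",
        params={"type": "adinterest", "q": query, "access_token": ACCESS_TOKEN},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("data", [])

# Each result typically carries an id, a display name, and an estimated
# audience size: the "bucket" of users described above.
for interest in search_ad_interests("conspiracy theory"):
    print(interest.get("name"), interest.get("audience_size"))
```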
For instance, a new pop star or movie might need to be added as an interest for targeting. Also, popular conversations permeating the culture can lead to new interest-based target audiences, as in this case. The interests are sometimes suggested by an algorithm that looks at terms and hashtags from new pages on Facebook.
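Facebook has not published that algorithm, but the description suggests a simple frequency-based pipeline. Here is a hypothetical sketch of what mining terms and hashtags from new pages might look like; the function names, data shapes, and threshold are all invented for illustration.

```python
# Hypothetical sketch of a frequency-based interest-suggestion step;
# Facebook has not published its actual algorithm.
from collections import Counter
import re

HASHTAG_RE = re.compile(r"#(\w+)")

def suggest_interests(new_pages, min_pages=50):
    """Propose candidate interest categories from page names and hashtags.

    `new_pages` is a list of dicts like {"name": ..., "posts": [...]}.
    Per Facebook's description, candidates are not published directly;
    they go into a queue for human approval first.
    """
    counts = Counter()
    for page in new_pages:
        terms = {page["name"].lower()}
        for post in page.get("posts", []):
            terms.update(tag.lower() for tag in HASHTAG_RE.findall(post))
        counts.update(terms)  # count each term at most once per page
    # Terms that recur across many new pages become candidate categories.
    return [term for term, n in counts.items() if n >= min_pages]

# candidates = suggest_interests(pages_created_this_week)
# human_review_queue.extend(candidates)   # hypothetical approval step
```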
Ultimately, however, all the interest-based categories pass through human approval, and this one should not have been accepted, according to Joe Osborne, a Facebook spokesman. This category was approved by Facebook in August, which would have been around the time that President Trump was tweeting about alleged "white genocide" in South Africa.
There have been media reports of racial tension in South Africa and rising fear, particularly on the right wing of the political spectrum, that the country's white population was under threat. The term "white genocide" is often thrown around by white supremacists looking to stoke fear about minorities.
The alleged Pittsburgh synagogue shooter appeared to harbor beliefs about Jewish people orchestrating a genocide against white people. Facebook does not say exactly how it comes up with the groups, but there are pages on the site with content related to the subject.
The Intercept said that, as further targeting options, Facebook's automated system suggested audiences interested in far-right media figures like Tucker Carlson. The interest group related to white genocide conspiracies has since been removed from Facebook.
Facebook said a few advertisers found the interest group and used it for their campaigns. The ads themselves were not offensive, however, and would have been approved under any targeting parameters, Osborne said. One ad was for a lecture about conspiracies, and another was about news stories that referenced "white genocide," according to Osborne. Facebook would not have checked those campaigns for their targeting parameters because all the options were presumably vetted before entering the system.
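In other words, the per-campaign review looks at the ad's content, while targeting categories are trusted on the assumption that they were vetted once at entry. Below is a schematic sketch of that division of labor, with entirely hypothetical names, rules, and data shapes.

```python
# Schematic sketch of the review gap described above; all names, rules,
# and data shapes here are hypothetical.
from dataclasses import dataclass, field

APPROVED_INTERESTS = {"white genocide conspiracy theory"}  # vetted once at entry

@dataclass
class Ad:
    creative_text: str
    interest_ids: list = field(default_factory=list)

def violates_content_policy(text):
    # Stand-in for Facebook's real creative screening.
    banned_terms = {"example banned phrase"}
    return any(term in text.lower() for term in banned_terms)

def review_campaign(ad):
    # The creative itself is screened for policy violations...
    if violates_content_policy(ad.creative_text):
        return "rejected: creative"
    # ...but targeting is only checked for membership in the approved
    # set, so a bad category that slipped past the initial human
    # approval sails through every later campaign review.
    if all(i in APPROVED_INTERESTS for i in ad.interest_ids):
        return "approved"
    return "rejected: unknown targeting category"

print(review_campaign(Ad("Lecture on conspiracy theories",
                         ["white genocide conspiracy theory"])))  # "approved"
```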
This is a very complex problem.
ProPublica was previously able to find "Jew haters" as a targeting option because thousands of Facebook's more than 2 billion users had typed that term into the fields asking for their "job" and "education" experiences. Facebook did not account for the likelihood that those self-reported fields could be abused when it opened them up for targeting. Since then, Facebook has required human approval of all employment and education targeting categories.
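As an illustration of that failure mode, here is a hypothetical sketch of how unreviewed free-text profile fields could be aggregated straight into targeting options; the threshold, field names, and data shapes are assumptions, not Facebook's actual pipeline.

```python
# Hypothetical sketch of the failure mode ProPublica exposed; not
# Facebook's actual pipeline.
from collections import Counter

def derive_targeting_options(profiles, min_users=1000):
    """Aggregate self-reported "job"/"education" strings into ad-targeting
    categories. Without a human gate, any string typed by enough users,
    including abusive ones like "Jew hater", becomes a targetable audience.
    """
    counts = Counter()
    for profile in profiles:
        for field_name in ("job", "education"):
            value = profile.get(field_name, "").strip().lower()
            if value:
                counts[value] += 1
    return {term for term, n in counts.items() if n >= min_users}

# Per the article, Facebook's fix was to add the missing gate: every
# candidate category now passes human approval before advertisers see it.
```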