Facebook to ramp up ‘integrity efforts’ in India
The company also revealed that it was yet to witness “anything in terms of coordinated behaviour (multiple actors trying to spread misinformation) in India”. But Facebook expects such activity to emerge closer to elections.
In an interaction with ET, Katie Harbath, Facebook’s global politics and government outreach director, said the company was close to appointing an election integrity head, a position it had been trying to fill for the past three months. She also said that for Facebook founder and CEO Mark Zuckerberg and COO Sheryl Sandberg, the Indian polls are a “top priority” and that “they remain very involved in Facebook’s election integrity plans”.
“We’re very close. But what I will say is that there is a lot of hiring we have done across the company, in a lot of different locations (for election integrity work). That we are still hiring people should not take away from the fact that we have been working on the Indian elections since October 2017, when we first started pulling together teams and started thinking about 2019 elections and state elections in between,” she said.
The social media company also wants to step up its enforcement of transparency for political advertisements, a move it started in early December last year. “We have been exploring a variety of different options and the best ways to execute that. Some of the things we are trying to work through, for instance: how do we balance people’s privacy (providing a name can cause problems) versus the public’s right to know who is behind these (political) ads,” Harbath said.
This move could potentially include entity verification, which Facebook has rolled out in the US. Entity verification essentially covers unofficial Facebook pages (e.g., Indians for India, or Americans for America), which typically use obscure names and act as proxies for political leaders or parties. Last year, on the eve of the US midterm elections, Vice News, an American media outlet, bought ads on behalf of top senators and the US vice-president.
In a subsequent story, Vice wrote that “Facebook knew who was behind the ads internally, but externally, what Facebook users would see was a completely made up ‘paid for’ information.”
“We’re now looking, country by country, at what the different options could be to make it clear for people who is funding the ads, and also to ensure that (slip-ups) like the ones mentioned in the Vice article can’t get through our approval system,” Harbath added.
Facebook will also use its “learnings from the recent global and Indian election cycle” as part of its preparation for the 2019 general elections.
In Brazil, a non-English-speaking country, Facebook got a glimpse of the language-based weaponisation of social platforms ahead of elections. India could present the company with a challenge of a scale it hasn’t witnessed before. Harbath said Facebook’s experience in Brazil, particularly with the language aspect, could only help the company in India.
“One of the biggest learnings was having a combination of native speakers and other integrity teams together in the rooms. Particularly, being able to explain what the content might say, and give context,” she said.
She added, “Because, even with a translation but without context, we might think something was violating our standards. Putting Portuguese speakers in Menlo Park (Facebook’s headquarters), interacting with teams, helped put things in context.”