Oversight board will be a level above 30k content moderators: Brent C Harris, Facebook
Brent C Harris, director of governance and global affairs, Facebook, sees the oversight board as a group of independent experts who will take some hard decisions on content. In an exclusive interview with ET, Harris, who is building the oversight board, says he expects the board to start work by end-2019, and discusses its role, its independence, why social media can't have a Hyde Park kind of place, and more.
Facebook has 30,000 people reviewing content. What's the difference between what they do and what the board will do?
The board will be a level of appeal beyond the 30,000 people who work on content moderation. It will provide an opportunity for users to say, “I am not sure a right decision was made at that level and I want to have people who are beyond Facebook to make a decision,” and that's what we will provide. We believe that will add greater procedural fairness and additional due process to the system.
How can you call this an external or independent oversight board if Facebook is forming it and paying for it?
Right now, we are paying for it. We are not sure who else will or who else should pay for it. But we believe it's our responsibility to build it. We are looking into how board members can exercise judgement independently of Facebook. We intend to build that in as part of the contractual arrangement. Besides, board members will have no incentive or interest that is tied to Facebook. We are (still) figuring out ways in which we can create that independence.
They may not get some perks, but they will still be on the Facebook payroll. Won't that compromise the independence of the board?
We are thinking about creating an independent trust that has its own governance and provides payments and salaries for (board) members. So, any compensation they receive is truly at arm’s length from the company.
Is the board an admission by Facebook that it can't handle misinformation, and that someone else should try?
Absolutely not. The intention of the board is to help us exercise our responsibility better and more transparently. We are providing a mechanism where the conclusions we reach can be reviewed by an outside group of people. They could say, "The way you did this looks spot on," or they could say, "This is actually something you could have thought about differently."
Who could be a typical board member?
(It would be) people from a wide range of backgrounds. We are looking for people who can really do the job very well as individuals. Their role is not to represent a particular interest or a particular place. Members should be in a position to take cases, hear disputes on whether or not content should stay up or come down, deliberate with other members, and be able to make and explain powerful decisions on what content is allowed on Facebook.
There's really a wide range of backgrounds that people could come from. You could envision people who have been judges or lawyers, experts from academia, voices from civil society and so on.
How many queries is your 30,000-strong team unable to address that could go to this board? Who decides on them at present?
At present, we decide. That's partly what prompted us to think about the board. There are two types of things that could go to the board. One is the hard questions we at Facebook raise and ask the board to address. We envision the board for the really difficult cases, where there are really good arguments on either side and really compelling perspectives on what to do. From our perspective, there are dozens of these cases every year, where there is real debate in the company on which way to go.
The second would be users who want to appeal content moderation decisions.
Where will the board be located?
It's possible that the board's head office will be in India, in New Delhi. We heard a lot of feedback asking us not to have it in the US. There's a chance it will be in Delhi, or at least likely somewhere in Asia.
People will have opinions and there will always be some community or group that does not agree. Within that, is there room for a Hyde Park kind of place where you can go and say anything? Can Hyde Park happen on digital media?
It's about the people who are part of the community: what do they expect when they come onto the platform? We have a responsibility as a company. When you go onto the site, there is a reasonable expectation of being safe, and we take that seriously. That is different from Hyde Park. But it's a good point.
People share their experiences and creativity, and that's balanced against safety. There's a limit to how much voice people can have if it ends up harming (other) people.
How involved is the top leadership of Facebook in the oversight board?
Mark Zuckerberg is very personally engaged, and so are (chief operating officer) Sheryl Sandberg and (head of global affairs) Nick Clegg. Besides, we have a cross-functional team of around 100 people with expertise in regional (matters), public policy, products, communications and so on. We are also doing workshops around the world to get feedback on the oversight board.