theodore M I R A L D I mpa ... editor, publisher, writer

Wednesday, May 25, 2016

Why Mark Zuckerberg changed how Facebook chooses what is news



KRISTEN SOLTIS ANDERSON

 The decentralized and entrepreneurial wild west of America's tech industry ought to be a natural ally of America's liberty movement. Yet the gulf between Silicon Valley and conservatism feels pretty wide these days, especially on the heels of allegations that Facebook slighted conservative news in its "Trending Topics" feature.
Last week, Facebook held a meeting between CEO Mark Zuckerberg, other major company executives, and a handful of conservative media figures to discuss the issue. I was among those present, and though the meeting was deemed "off-the-record," plenty of debate has spilled out into the public in the days since our Menlo Park get-together.
Glenn Beck, posting at Medium, wrote that he was "disturbed" by the conversation in the meeting, lambasting participants for advocating that Facebook increase their understanding of conservatives through either hiring or "diversity training." Taking the other view, Tucker Carlson took aim at Beck's description of the meeting, in particular the "total mischaracterization" of the arguments being made in favor of ideological diversity, noting that if Facebook had had more conservatives in their ranks, problems like this would be less likely to crop up.
Facebook should, of course, be able to operate however they please. But by inviting a room full of conservatives to come together to discuss how Facebook can do better, they wisely and voluntarily chose to bring in fresh perspectives to help them identify any blind spots.
This issue isn't ultimately about conservatism, however. To me, this issue raises a broader discussion about the role of data versus human judgment, and why even well-intentioned people are fallible when trying to fill the gaps where algorithms fall short.
In the case of Facebook's Trending Topics, if the top stories on Facebook were being identified purely by an algorithm, the story that sparked the controversy might never have happened. (This hands-off, data-driven approach is how Facebook's News Feed product works, and is similar to how Netflix recommends what movies you might want to watch by looking at data about your past actions.)
But in the case of Trending Topics, humans — imperfect and biased as we all are — were needed to step in where algorithms weren't enough. For instance, without a light touch of human judgment, "lunch" would be a trending topic by noon every day, and #JeSuisCharlie and "Charlie Hebdo" might be treated as separate topics instead of being counted as the same news topic. Such a light touch theoretically shouldn't be problematic.
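To make that idea concrete, here is a minimal, purely illustrative sketch in Python of how a trending pipeline might layer light curation rules on top of raw counts: suppressing everyday chatter like "lunch" and merging aliases like "#JeSuisCharlie" and "Charlie Hebdo" into one topic. This is an assumption-laden toy example, not Facebook's actual code; the term lists and function names are invented for illustration.

from collections import Counter

# Hypothetical illustration only; not Facebook's actual algorithm.
# Terms that spike every day regardless of news value.
EVERYDAY_TERMS = {"lunch", "monday", "weekend"}

# Aliases a human curator would fold into the same news topic.
ALIASES = {"#jesuischarlie": "charlie hebdo"}

def trending(mentions, top_n=3):
    """Rank topics by mention count after merging aliases and
    dropping everyday chatter."""
    counts = Counter()
    for term in mentions:
        topic = ALIASES.get(term.lower(), term.lower())
        if topic in EVERYDAY_TERMS:
            continue
        counts[topic] += 1
    return counts.most_common(top_n)

# Both the hashtag and the name count toward a single topic,
# while "lunch" never surfaces no matter how often it appears.
sample = ["#JeSuisCharlie", "Charlie Hebdo", "lunch", "lunch", "election"]
print(trending(sample))  # [('charlie hebdo', 2), ('election', 1)]

The point of the sketch is simply that the curation layer is small and reasonable on its face, which is why the bias question arises in who writes and applies those rules rather than in the counting itself.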
But because we are all human, we are all imperfect. And we all have biases we can't see. Of course, nobody says they have biases. "I really just try to look at the facts and be impartial," is a refrain I've heard in a thousand focus groups.
Many conservatives in the Facebook meeting noted that they launched conservative media properties as a response to bias. While journalists will often say they are objective, many conservatives see bias in things like the choice of stories that mainstream outlets push to the fore.
What is extraordinary about a platform like Facebook is that it allows conservatives to bypass the filters, conscious and unconscious. News stories about people using Second Amendment rights to defend themselves or heartwarming stories of churches doing good work can make their way to people who want to read them, without a news editor in Manhattan deciding if they're really "news." Social media has, in many ways, democratized news and created a true "marketplace of ideas," which is also a core part of Facebook's stated mission.
The way Facebook's Trending Topics product was set up, it had a chance to further that goal. Rather than an editor picking and choosing from the get-go, Facebook's algorithm would surface the "most talked about" items.
So far, so good.
But imagine that you're 24 years old, working as a contractor for Facebook. You've probably got a degree from an elite institution, and get much of your news from watching John Oliver. Suddenly, a story breaks on a conservative website you've never heard of about, say, a pro-life march. You don't have anything against conservatives; you're just cleaning up the results you're given. But this article isn't on your list of trusted news sources, and are there really that many people talking about it?
You can see how, even with good intentions, things could go awry. Adding a feature that decides what is "trending," even if it is mostly driven by data, meant Facebook risked replicating the editorial processes that conservatives had fought against.
On Monday, Facebook proposed changes to Trending Topics that aim to address some of these concerns, and said they will continue to refine the product. I genuinely believe Facebook wants to be a marketplace of ideas rather than a media property with an editorial slant. I think, at a minimum, they know that's a smarter business decision, in addition to being more on-mission.
When people are making judgment calls about whether they're getting signal or noise from algorithms, it would be a smart business decision to have a lot of voices involved in deciding what is and isn't "real" news. Stating this certainly isn't blackmail or a list of demands, and it hardly constitutes a breach of conservative principle, as some in the meeting have now alleged. Greater cross-pollination between the tech world and the conservative community would be a wonderful step, and Facebook has a chance to be a leader in that regard.
Facebook has the right to do whatever they wish with their company. I'm heartened to see that they want to do better.
Source: http://www.washingtonexaminer.com/why-mark-zuckerberg-changed-how-facebook-chooses-what-is-news/article/2592197
