#Facebook News Bias Scandal#

Fears of Facebook Bias Seem to Be Overblown

Focus should be on service’s news feed, rather than its trending topics

Of the 1,500 or so posts pumped out by the average Facebook user’s friends every day, that user only looks at about 300. PHOTO: JAAP ARRIENS/ZUMA PRESS

By CHRISTOPHER MIMS

Updated May 16, 2016 12:17 a.m. ET


Two competing narratives emerged last week after an article in tech blog Gizmodo accused Facebook of suppressing conservative news in its “trending topics” feature. Both are distractions from what I believe is the real issue.

The first narrative is based on the allegations in the Gizmodo article: that Facebook’s news curators, who select the trending-topics items, are consciously or unconsciously biased against conservative news services and topics.

In response, Sen. John Thune asked Facebook for more details about how it picks trending topics in the blink-and-you’ll-miss-it box in the upper right-hand corner of its home page. Then, Facebook released its guidelines for picking these topics, a 28-page document seemingly designed to eliminate bias, but revealing just how much human editors shape the process.


While Facebook has denied allegations that its “trending topics” feature is biased, the social network acknowledges it uses human curators to complement algorithms in delivering news to users. PHOTO: GETTY IMAGES

All in, it’s an interesting story about the unexpected presence of humans in a process that Facebook had suggested was algorithmically driven, with the possibility that bias seeps in—but it hardly seems to warrant the attention it generated.

Others argued that concern over trending topics was misplaced, because the feature occupies little real estate on Facebook’s Web service, and doesn’t initially appear on mobile devices. Thus was born the second narrative: To find whether real bias exists on Facebook, examine the news feed—that river of baby pictures, jokes, updates from your friends and occasional links to news stories where people spend the vast majority of their time on Facebook.

Here, those people said, is where the algorithm that determines what appears in your personalized feed, and in what order, does its dastardly work. You see, of the 1,500 or so posts pumped out by the average Facebook user’s network of friends every day, that user only looks at about 300. The trick is which ones.
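To make that filtering concrete, here is a minimal sketch of how a ranker of this general shape works: score every candidate post, sort, and keep the top few hundred. The signals used below (`author_affinity`, `content_weight`, time decay) loosely echo the long-retired “EdgeRank” formula Facebook once described publicly; they are hypothetical stand-ins, and Facebook’s current algorithm is undisclosed and far more complex.

```python
from dataclasses import dataclass
import time

@dataclass
class Post:
    author_affinity: float  # how often the viewer interacts with the poster (hypothetical signal)
    content_weight: float   # e.g., a photo vs. a plain status update (hypothetical signal)
    created_at: float       # Unix timestamp of the post

def score(post: Post, now: float) -> float:
    # Toy EdgeRank-style score: affinity x content weight x time decay.
    age_hours = max((now - post.created_at) / 3600.0, 0.0)
    return post.author_affinity * post.content_weight / (1.0 + age_hours)

def rank_feed(candidates: list[Post], limit: int = 300) -> list[Post]:
    """From roughly 1,500 candidate posts, keep only the top `limit`, best first."""
    now = time.time()
    return sorted(candidates, key=lambda p: score(p, now), reverse=True)[:limit]
```

Whatever the real signals are, the structural point survives: a scoring function, not the reader, picks which 300 of the 1,500 posts surface.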

Many people are concerned that Facebook has created a so-called filter bubble, in which it shows users only what they want to see, to entice them to spend more time on the network. Such concerns are heightened because little is known about Facebook’s algorithm.

Facebook CEO Mark Zuckerberg with the social network’s news feed in 2013. An algorithm determines what is seen. PHOTO: JEFF CHIU/ASSOCIATED PRESS

That people like to see things that conform to their pre-existing notions is well known—it’s a part of what psychologists call confirmation bias.

Claiming that Facebook is contributing to our age of hyper-partisanship by only showing us things that fit our own personal slant is, ironically, an example of confirmation bias, because the evidence around it is mixed.

After an exhaustive search of the literature on filter bubbles, Frederik J. Zuiderveen Borgesius, a researcher at the Personalised Communication project at the University of Amsterdam, and five co-authors concluded that such concerns might be overblown. “In spite of the serious concerns voiced, at present there is no empirical evidence that warrants any strong worries about filter bubbles,” Mr. Zuiderveen Borgesius wrote in an email.


The authors examined not only Facebook but other online services, including Google search. Mr. Zuiderveen Borgesius’s conclusion: We don’t have enough data to say whether Facebook is biasing the news its readers see, or—and this is even more important—whether it affects their views and behavior.

Facebook’s opacity aside, where does the hand-wringing come from? Two places, I think: the first is that everyone in the media is terrified of Facebook’s power to determine whether individual stories and even entire news organizations succeed or fail. The second is an ancient fear that, by associating only with people like ourselves, and being selective in what we read, we are biasing ourselves unduly.

Before the filter bubble, there was the so-called echo chamber.

A search of Google’s Ngram—a service that tracks the frequency with which words or phrases appear in books—reveals that “echo chamber” first gained popularity in the late 1930s. I asked the Journal’s resident etymologist, columnist Ben Zimmer, about the earliest use of the term “echo chamber” in its modern sense. He found this, from Blackwood’s Edinburgh Magazine, published in 1840 in Scotland:

“Since the year 1813, the interest in things German, both in this country and in France, has been steadily on the increase; foreign criticism has become now something better than an echo-chamber for the bandying about of mutual misunderstandings.”

Facebook didn’t exist in 1840. “Mass media” meant pamphlets and thin, poorly circulated newspapers. But even then, humanity’s natural tendency to associate with like minds and to seek out voices that echo our own was a source of consternation.

Here’s the question we should be asking about “bias” in Facebook’s news feed: Is it substantially worse than in the heyday of newspapers and magazines, when readers in major cities could choose to get their news from among a dozen or more publications tailored to their biases, or in the age of cable news, in which enormous profit has been reaped from continuing this tradition?

Is it possible that Facebook’s algorithm produces a news feed that might even be a less-biased news source than what came before? Facebook, after all, is simply performing the same function that gossip and social stratification have performed since the dawn of civilization: allowing us to filter what we hear to precisely the degree we please.

—Follow Christopher Mims on Twitter @Mims or write to christopher.mims@wsj.com.

