
Working the Refs: ‘How Facebook Wrote Its Rules to Accommodate Trump’

Elizabeth Dwoskin, Craig Timberg, and Tony Romm, reporting for The Washington Post in a remarkable story that takes a while to get to the juicy parts:

The document, previously unreported and obtained by The Post, weighed four options. They included removing the post for hate speech violations, making a one-time exception for it, creating a broad exemption for political discourse and even weakening the company’s community guidelines for everyone, allowing comments such as “No blacks allowed” and “Get the gays out of San Francisco.”

Facebook spokesman Tucker Bounds said the latter option was never seriously considered.

The document also listed possible “PR Risks” for each. For example, lowering the standards overall would raise questions such as, “Would Facebook have provided a platform for Hitler?” Bickert wrote. A carveout for political speech across the board, on the other hand, risked opening the floodgates for even more hateful “copycat” comments.

Ultimately, Zuckerberg was talked out of his desire to remove the post in part by Kaplan, according to the people. Instead, the executives created an allowance that newsworthy political discourse would be taken into account when making decisions about whether posts violated community guidelines.

I don’t get the “on the other hand” here regarding whether Facebook’s rules would have provided a platform for Adolf Hitler. A blanket “carveout” for “political speech” and “newsworthy political discourse” certainly would have meant that Adolf Hitler would have been able to use Facebook as a platform in the 1930s. That sounds histrionic to modern ears, but Hitler wasn’t universally seen as Hitler, the unspeakably evil villain, until it was too late. Infamously, as late as August 1939 — 1939! — The New York Times Magazine saw fit to run a profile under the headline “Herr Hitler at Home in the Clouds” (sub-head: “High up on his favorite mountain he finds time for politics, solitude and frequent official parties”).

An “anything goes” exception for political speech from world leaders is the exact same hand that would have offered Hitler a platform. It’s why we have goddamn Nazis crawling out of the woodwork now.

Two months before Trump’s “looting, shooting” post, the Brazilian president [Jair Bolsonaro] posted about the country’s indigenous population, saying, “Indians are undoubtedly changing. They are increasingly becoming human beings just like us.”

Thiel, the security engineer, and other employees argued internally that it violated the company’s internal guidelines against “dehumanizing speech.” They were referring to Zuckerberg’s own words while testifying before Congress in October in which he said dehumanizing speech “is the first step toward inciting” violence. In internal correspondence, Thiel was told that it didn’t qualify as racism — and may have even been a positive reference to integration.

Thiel quit in disgust.

If that post is not dehumanizing, what is? If that post is acceptable on Facebook, what isn’t?

Facebook’s security engineers in December 2016 presented the findings of a broad internal investigation, known as Project P, to senior leadership on how false and misleading news reports spread so virally during the election. When Facebook’s security team highlighted dozens of pages that had peddled false news reports, senior leaders in Washington, including Kaplan, objected to immediately shutting them down, arguing that doing so would disproportionately affect conservatives, according to people familiar with the company’s thinking. Ultimately, the company shut down far fewer pages than originally proposed while it developed a policy to handle these issues.

A year later, Facebook considered how to overhaul its scrolling news feed, the homepage screen most users see when they open the site. As part of the change to help limit misinformation, it changed its news feed algorithm to focus more on posts by friends and family versus publishers.

In meetings about the change, Kaplan questioned whether the revamped algorithm would hurt right-leaning publishers more than others, according to three people familiar with the company’s thinking who spoke on the condition of anonymity for fear of retribution. When the data showed it would — conservative-leaning outlets were pushing more content that violated its policies, the company had found — he successfully pushed for changes to make the new algorithm what he considered more evenhanded in its impact, the people said.

Complaints about bias are only legitimate if the underlying accusation of unfairness is credible. Let’s say there’s a basketball game in which the refs call 10 fouls against one team and only 1 against the other. Were the refs biased? We can’t know from those facts alone. What matters is how many fouls each team actually committed. If one team actually did commit 10 fouls and the other 1, then the refs are fair and the results are fair. If both teams actually committed, say, 5 fouls apiece, but the refs called 10 violations against one team and only 1 against the other, then the refs are either crooked or just plain bad and the results are, indeed, unjust.

Joel Kaplan wanted Facebook to call the same number of fouls against both sides no matter what fouls were actually committed. And that’s exactly how Facebook has run its platform. If you don’t punish cheaters and liars you’re rewarding them. Far from being biased against Republicans in the U.S. and right-wing nationalists and authoritarians around the globe, Facebook has been biased for them.