Facebook and the Future

Aspen Institute

5 min read
5 key ideas
Audio & text

What's inside?

Facebook is proactively self-regulating to combat fake news and to safeguard elections.

Editorial Rating

8

Qualities

  • Analytical
  • Overview
  • Hot Topic

Recommendation

From inception, Facebook has championed conversation and community. But the use of Facebook advertising tools to flood Americans with disinformation and hate speech in 2015–2016 was a wake-up call for the company. In this Aspen Ideas Festival exchange, Facebook’s chief product officer Christopher Cox sits down with Wired editor in chief Nicholas Thompson to discuss how Facebook will prevent future abuses. getAbstract suggests this talk to Facebook users and anyone concerned about social media hacking.

Summary

Facebook once saw its platform and users as a “force for good.” However, the gaming of the platform to subvert the 2016 US presidential election challenged this view. Since then, the company has become more expert at recognizing disinformation and spam. In the trade-off between privacy and easy use, Facebook protocols have shifted toward privacy. And in a trade-off between safer communities and free speech, the platform has increased limits on free speech. Facebook now employs hundreds of people to help protect elections in more than 40 countries. These staffers work with election commissions and other experts. In 2018, they took down 10,000 pages of disinformation before the ...

About the Speaker

Christopher Cox is the chief product officer of Facebook, overseeing its family of apps: Facebook, Instagram, WhatsApp and Messenger.
