Facebook and the Future


Aspen Institute

5 min read
5 Take-aways
Audio & Text

What's inside?

Facebook is proactively self-regulating to combat fake news and to safeguard elections.

Editorial Rating

8

Qualities

  • Analytical
  • Overview
  • Hot Topic

Recommendation

From its inception, Facebook has championed conversation and community. But the use of its advertising tools to flood Americans with disinformation and hate speech in 2015 and 2016 was a wake-up call for the company. In this Aspen Ideas Festival exchange, Facebook chief product officer Christopher Cox sits down with Wired editor in chief Nicholas Thompson to discuss how Facebook will prevent future abuses. getAbstract suggests this talk to Facebook users and anyone concerned about the manipulation of social media.

Summary

Facebook once saw its platform and users as a “force for good.” However, the gaming of the platform to subvert the 2016 US presidential election challenged this view. Since then, the company has become more adept at recognizing disinformation and spam. In the trade-off between privacy and ease of use, Facebook's protocols have shifted toward privacy. And in the trade-off between safer communities and free speech, the platform has tightened limits on speech. Facebook now employs hundreds of people to help protect elections in more than 40 countries. These staffers work with election commissions and other experts. In 2018, they took down 10,000 pages of disinformation before the ...

About the Speaker

Christopher Cox is chief product officer of Facebook and its family of apps, including Instagram, WhatsApp and Messenger.


Comment on this summary