Summary of How Do You Teach a Car That a Snowman Won’t Walk Across the Road?


Rating

8

Qualities

  • Engaging
  • Innovative

Recommendation

Headlines about artificial intelligence make lofty claims about machines outperforming humans. However, the technology isn’t there yet, as computer science professor and author Melanie Mitchell explains in an engaging essay for Aeon. Humans have inherent “core knowledge” that AI simply lacks, and instilling that understanding in machines is a central challenge in developing better systems. Those in the AI sector should keep Mitchell’s message on their radar.

About the Author

Melanie Mitchell is a computer science professor at Portland State University and the Santa Fe Institute. Her books include Complexity: A Guided Tour and Artificial Intelligence: A Guide for Thinking Humans.


Summary

AI starts out as a “blank slate” that doesn’t have the well-rounded “core knowledge” of a human being.

People understand that if a ball rolls into the road, a child might soon follow, and that a snowman isn’t going to dislodge itself from a snowbank to cross the street. AI doesn’t grasp those concepts. Self-driving cars, for instance, are often rear-ended because they brake suddenly for objects that human drivers wouldn’t worry about.

While AI may perform better than humans in certain capacities, such as language processing, the technology can make unexpected errors that a human ...

