Summary of How Do You Teach a Car That a Snowman Won’t Walk Across the Road?


Headlines about artificial intelligence make lofty claims about machines outperforming humans. However, the technology isn’t there yet, as computer science professor and author Melanie Mitchell explains in an engaging essay for Aeon. Humans possess inherent “core knowledge” that AI simply lacks, and instilling that understanding in machines is a central challenge in developing better systems for the future. Those in the AI sector should keep Mitchell’s message on their radar.

About the Author

Melanie Mitchell is a computer science professor at Portland State University and the Santa Fe Institute. Her books include Complexity: A Guided Tour and Artificial Intelligence: A Guide for Thinking Humans.


AI starts out as a “blank slate” that doesn’t have the well-rounded “core knowledge” of a human being.

People understand that if a ball rolls out into the road, a child might soon follow, and that a snowman isn’t going to dislodge itself from a snowbank to cross the street. AI doesn’t grasp those concepts. Self-driving cars, for instance, are often rear-ended because they brake suddenly for objects that human drivers wouldn’t worry about.

While AI may perform better than humans in certain capacities, such as language processing, the technology can make unexpected errors that a human ...

