Summary of A New Model and Dataset for Long-Range Memory


Editorial Rating

  • Analytical
  • Scientific
  • Hot Topic


One of the challenges for AI systems is that, as they become more sophisticated and ingest ever larger datasets, they need ever more processing power to achieve marginally better results. Even the biggest server farms have their limits, yet users don’t want to wait more than a few seconds for an answer. A major task in AI research is therefore to determine how to reduce mountains of raw data to hills of important data. The latest proposal from Google’s DeepMind lab is to use sleep as an inspiration for data processing. Your eureka moment might have come one afternoon during a power nap.

About the Authors

Jack Rae and Timothy Lillicrap are researchers at DeepMind, Google’s AI laboratory in London.


A major problem in AI is reducing extremely large datasets into more manageable dimensions for processing and analysis.

A simple model of human memory is the neural network, in which every piece of information can interact with every other. Unfortunately, letting everything interact with everything produces dauntingly large computations: kilobytes of input data can balloon into gigabytes of parameters to process, which limits the practical utility of such networks in AI. Researchers need better models that combine the flexibility of neural networks with more practical information-processing demands.
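As a rough back-of-the-envelope illustration (not taken from the paper), here is why all-pairs interaction blows up so quickly: storing one interaction score per pair of items requires an n-by-n table, so memory grows with the square of the input length. The function name and byte sizes below are illustrative assumptions.

```python
# Rough sketch (illustrative, not from the paper): if every item in a
# sequence can interact with every other, the interaction table is an
# n x n matrix, so storage grows quadratically with sequence length.

def pairwise_interaction_bytes(n_items: int, bytes_per_entry: int = 4) -> int:
    """Bytes needed to store one 4-byte score per ordered pair of items."""
    return n_items * n_items * bytes_per_entry

for n in (1_000, 10_000, 100_000):
    gb = pairwise_interaction_bytes(n) / 1e9
    print(f"{n:>7} items -> {gb:.3f} GB of pairwise scores")
# A sequence of 100,000 items (only a few hundred kilobytes of raw text)
# already needs 40 GB just to hold the pairwise scores.
```

This is the sense in which kilobytes of data can turn into gigabytes of intermediate state, and why models that prune or compress old information are attractive.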

Human memory exists at different timescales and different levels of detail.

Human memory is not simply a collection...


More on this topic

The AI Organization
The AI Advantage
Deep Medicine
Reprogramming the American Dream
Electric Brain
Data Feminism
