Ever been totally dominated by the computer player in a video game? A new artificial intelligence system takes on all comers at a handful of old Atari titles, and it does so after learning the rules bit by bit, the way a human would. Its creators claim this is just the beginning of what it can do. In a few years, it may be driving you to work.
“What we’re trying to do is use the human brain as an inspiration,” Google DeepMind researcher Demis Hassabis told reporters in a telephone conference call about the research, published in Thursday’s issue of the journal Nature. “This is the first rung of the ladder to showing that a general learning system that goes from end to end, from pixels to actions, can work even on tasks that humans find difficult.”
This article originally appeared on Recode.net.