In the last five years, Artificial Intelligence has made striking progress, now defeating humans at subtle strategy games such as Go, and even Poker. However, these algorithms run on traditional processors, whose architecture is radically different from that of the biological neural networks they are inspired by. This slows them down considerably and requires massive amounts of electrical power, more than ten thousand times what the brain typically needs to function. This energy dissipation is not only becoming an environmental issue; it also limits the size of the neural networks that can be simulated. We are at a point where we need to rethink the way we compute and build hardware chips directly inspired by the architecture of the brain. This is a challenge: unlike current electronic systems, the brain is a massively parallel network that closely entangles memory and processing.
In this talk, I will review current efforts to build the neuromorphic chips of the future, which involve redesigning CMOS circuits and inventing novel nanodevices for synapses and neurons. I will present recent achievements in neuromorphic computing, highlight current challenges, and discuss the fascinating prospects of this emerging field.