Using The Brain To Inspire The Next Generation Of Computing For AI

For a while now there has been a slightly wonky perception that AI approaches such as neural networks take their inspiration from the brain.  The notion isn't especially accurate, yet it has taken hold nonetheless.  A recent study from IBM researchers suggests we might actually be getting a bit closer to such a comparison, however.

The team are working on a new computer architecture that represents a major shift from the von Neumann architecture underpinning today's computers.  The traditional approach relies on separate, siloed components, such as the CPU, input/output devices and a memory unit, with data constantly shuttled between them.  The researchers instead propose a brain-inspired system in which processing and memory coexist in the same physical devices, cutting out that data shuttling and thereby increasing the efficiency of the system.
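To make the contrast concrete, here is a toy Python sketch of the difference.  This is not IBM's implementation; the noise model and function names are illustrative assumptions.  The point is simply that a von Neumann machine pays a memory-to-CPU round trip per operand, while a memory crossbar reads out an entire matrix-vector product in one parallel analog step.

```python
import numpy as np

# Toy illustration only (not IBM's implementation): a von Neumann machine
# moves every operand from memory to the CPU and back, one at a time.
def von_neumann_mvm(weights, x):
    out = np.zeros(weights.shape[0])
    for i in range(weights.shape[0]):
        for j in range(weights.shape[1]):
            out[i] += weights[i, j] * x[j]  # fetch, compute, write back
    return out

# In-memory computing: the weights live as device conductances in a
# crossbar; applying the input voltages x yields the whole product at
# once.  Modeled here as a single matrix product plus device noise.
def in_memory_mvm(conductances, x, noise=0.02):
    rng = np.random.default_rng(0)
    ideal = conductances @ x  # one parallel analog step
    return ideal * (1 + noise * rng.standard_normal(ideal.shape))

W = np.random.default_rng(1).standard_normal((4, 8))
x = np.ones(8)
print(von_neumann_mvm(W, x))
print(in_memory_mvm(W, x))
```

In a real crossbar the multiply happens via Ohm's law and the accumulate via Kirchhoff's current law, which is why the whole product costs one read rather than a memory round trip per weight.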

“If you look at human beings, we compute with 20 to 30 watts of power, whereas AI today is based on supercomputers which run on kilowatts or megawatts of power,” they explain. “In the brain, synapses are both computing and storing information. In a new architecture, going beyond von Neumann, memory has to play a more active role in computing.”

Diverse inspiration

The team drew on diverse inspiration for their work.  First, they sought to exploit the state dynamics of memory devices to perform a range of computational tasks within the memory itself, in much the same way the brain handles memory and processing together.  They also drew on the brain's synaptic network structures to design arrays of phase-change memory devices that accelerate the kind of training performed in artificial neural networks.  Finally, they drew on the dynamic and stochastic nature of the brain's neurons and synapses to create a computational substrate for spiking neural networks.
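As a rough illustration of that last idea, the sketch below models a stochastic leaky integrate-and-fire neuron in Python.  Every parameter here is hypothetical, and the injected noise merely stands in for the device-level variability of phase-change neurons; none of it is taken from the researchers' own designs.

```python
import numpy as np

rng = np.random.default_rng(42)

def stochastic_lif(inputs, weights, threshold=1.0, leak=0.9, noise_std=0.05):
    """Leaky integrate-and-fire neuron with noisy membrane dynamics.

    All parameters are hypothetical; the noise term stands in for the
    stochastic behaviour of a phase-change device.
    """
    v = 0.0
    spikes = []
    for x in inputs:
        # leaky integration of weighted inputs, plus device-like noise
        v = leak * v + weights @ x + noise_std * rng.standard_normal()
        if v >= threshold:
            spikes.append(1)
            v = 0.0  # reset the membrane after firing
        else:
            spikes.append(0)
    return spikes

# Drive the neuron with random binary spike trains on three synapses.
T = 20
inputs = rng.integers(0, 2, size=(T, 3)).astype(float)
weights = np.array([0.4, 0.3, 0.5])
print(stochastic_lif(inputs, weights))
```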

In initial tests, the new system has achieved a level of performance that surprised even the researchers themselves.

“We always expected these systems to be much better than conventional computing systems in some tasks, but we were surprised how much more efficient some of these approaches were,” they say.

For the initial tests, the researchers ran an unsupervised machine learning algorithm on both a conventional computer and the new computational memory platform and compared the performance.
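The write-up doesn't spell out the algorithm, but one commonly cited in-memory benchmark of this kind is unsupervised detection of correlated binary streams: each device's conductance is nudged up whenever its input fires together with a burst of collective activity, so the correlated subset emerges without any labels.  The toy Python version below is an assumption-laden sketch of that idea, not the researchers' actual protocol.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical setup: N binary processes, of which the first n_corr share
# a hidden driver and are therefore mutually correlated.
N, T, n_corr = 100, 2000, 10
driver = rng.random(T) < 0.1                # shared hidden event stream
streams = rng.random((N, T)) < 0.05         # independent background activity
streams[:n_corr] |= driver                  # first n_corr follow the driver

# Unsupervised rule: potentiate the "conductance" of every device that
# fires during a burst of collective activity.
conductance = np.zeros(N)
for t in range(T):
    active = streams[:, t]
    if active.sum() > N * 0.07:             # momentary population burst
        conductance[active] += 1.0

detected = np.argsort(conductance)[-n_corr:]
print(sorted(detected))                     # recovers indices 0..9
```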

“We could achieve 200 times faster performance in the phase change memory computing systems as opposed to conventional computing systems,” the team explain. “We always knew they would be efficient, but we didn’t expect them to outperform by this much.”
