Computer modeling is essential to understanding how the brain creates and retains complex information, such as memories. But developing such models is a challenging task.

A symphony of biochemical and electrical signals, along with a jumble of connections between neurons and other cell types, creates the hardware that allows the brain to hold memories. However, capturing this process in a computer model for further analysis has proven difficult, as neuroscientists are still learning about the biology of the brain.

Researchers at the Okinawa Institute of Science and Technology (OIST) have now modified the Hopfield network, a popular computer model of memory, taking inspiration from biology to improve its performance. They found that the new network could hold significantly more memories while more accurately reflecting the wiring of neurons and other brain cells.

Hopfield networks store memories as patterns of weighted connections between neurons. The network is first “trained” to encode a set of patterns; researchers can then test its memory by presenting it with vague or incomplete versions of those patterns and seeing whether it identifies each one as a pattern it already knows.
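As a concrete illustration, here is a minimal Python sketch of such a classical Hopfield network. It is an illustrative toy, not the researchers’ code, and the network size, number of patterns, and noise level are arbitrary choices.

```python
import numpy as np

# Toy classical Hopfield network: pairwise Hebbian weights only.
# Illustrative sketch, not the OIST team's code.

def train(patterns):
    """Hebbian learning: each stored pattern (a +1/-1 vector) adds an
    outer-product term to the pairwise weight matrix; no self-connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

def recall(W, state, steps=10):
    """Asynchronous updates: each neuron takes the sign of its weighted
    input until the state settles, ideally onto a stored pattern."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store two random patterns, then recall one from a corrupted copy.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 100))
W = train(patterns)
noisy = patterns[0] * rng.choice([1, -1], size=100, p=[0.8, 0.2])  # flip ~20% of bits
print(np.array_equal(recall(W, noisy), patterns[0]))  # True in most runs
```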

But in classical Hopfield networks, the model’s neurons only make “pairwise” connections with one another. Each pairwise connection simulates a synapse, the point where two brain neurons connect and interact. The real brain, however, relies on far more complicated connectivity to perform cognitive functions: neurons have complex branches called dendrites that provide many different sites for connection, and astrocytes, another class of brain cell, also influence the connections between neurons.

Thomas Burns, a Ph.D. student in the group of Professor Tomoki Fukai, head of OIST’s Neural Coding and Brain Computing Unit, said: “It’s just not realistic that there are only pairwise connections between neurons in the brain.”

Mr. Burns created a modified Hopfield network in which not only pairs of neurons, but also sets of three, four or more neurons could be linked, as can occur in the brain via astrocytes and dendritic trees.

Although these so-called “set-wise” connections were possible, the new network had the same total number of connections as the old one. The team found that a network with a mix of pairwise and set-wise connections performed best and retained the most memories. They expect it to perform at least twice as well as a conventional Hopfield network.
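The paper’s exact construction is not reproduced here, but the following Python sketch illustrates the idea under stated assumptions: a fixed “budget” of connections is split between pairwise and triplet (set-wise) terms, weights are set Hebbian-style from the stored patterns, and recall updates each neuron by the sign of its combined local field. The 50/50 split and the random sampling of connections are hypothetical choices for illustration.

```python
import numpy as np

# Hypothetical sketch: split a fixed connection "budget" between pairwise
# and triplet (set-wise) connections. Details (the 50/50 split, random
# sampling, network size) are illustrative assumptions, not the paper's model.

rng = np.random.default_rng(1)
n, budget = 60, 600                       # neurons; total connections allowed
pairs = {tuple(sorted(rng.choice(n, size=2, replace=False)))
         for _ in range(budget // 2)}
triples = {tuple(sorted(rng.choice(n, size=3, replace=False)))
           for _ in range(budget - budget // 2)}

patterns = rng.choice([-1, 1], size=(3, n))

# Hebbian-style weights: product of pattern entries over each connection.
w_pair = {c: sum(p[c[0]] * p[c[1]] for p in patterns) for c in pairs}
w_tri = {c: sum(p[c[0]] * p[c[1]] * p[c[2]] for p in patterns) for c in triples}

def local_field(i, s):
    """Total input to neuron i from every pair and triplet containing it."""
    h = 0.0
    for (a, b), w in w_pair.items():
        if i == a:
            h += w * s[b]
        elif i == b:
            h += w * s[a]
    for c, w in w_tri.items():
        if i in c:
            j, k = (x for x in c if x != i)
            h += w * s[j] * s[k]
    return h

def recall(s, steps=10):
    s = s.copy()
    for _ in range(steps):
        for i in rng.permutation(n):
            s[i] = 1 if local_field(i, s) >= 0 else -1
    return s

noisy = patterns[0] * rng.choice([1, -1], size=n, p=[0.9, 0.1])  # flip ~10%
print(np.mean(recall(noisy) == patterns[0]))  # fraction of bits recovered
```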

Mr. Burns said: “You need a combination of functions in a certain balance. You should have individual synapses, but you should also have some dendritic trees and some astrocytes.”

The team plans to continue working with their modified Hopfield networks to make them even more powerful. Mr. Burns is also exploring ways to get the network’s memories to communicate with each other, as they do in the human brain.

Mr. Burns said: “Our memories are multifaceted and vast. We still have a lot to discover.”

Journal reference:

  1. Thomas F Burns, Tomoki Fukai. Simplicial Hopfield Networks. ICLR 2023 poster.