Biology

In a first, memory shown to have two distinct past and future paths

For the first time, scientists have teased apart two parts of our memory that make the all-important whole

For the first time, scientists have found that the region of the brain that houses memory is made up of not one but two systems: one that deals with past experiences of times and places, and one that is predictive, actively constructing future behaviors.

Cornell University researchers found that in the hippocampus, one type of neural code underpins our ability to ‘connect the dots’, such as remembering where to go to pick up a loaf of bread. The second type of neural code, the predictive half, formulates new plans on the fly. In the same example, if the bakery was unexpectedly closed, this second system would devise a plan B and predict the path to take, shaping more flexible behaviors.
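In software terms, the two codes can be loosely analogized as a stored sequence that is replayed verbatim (associative memory) versus a search for a fresh route when the stored one fails (prediction and planning). The sketch below is purely illustrative and is not the study's model; the map, location names, and functions are invented for the analogy.

```python
from collections import deque

# Hypothetical map of locations the "rat" can travel between.
CITY = {
    "home": ["corner", "park"],
    "corner": ["bakery", "park"],
    "park": ["market"],
    "bakery": [],
    "market": [],
}

# Associative code: a memorized sequence, replayed as-is.
STORED_PATH = ["home", "corner", "bakery"]

def recall_path():
    """Replay the memorized route verbatim."""
    return list(STORED_PATH)

def plan_path(start, goal, closed=()):
    """Predictive code: breadth-first search for a new route,
    skipping closed locations (e.g. the shut bakery)."""
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        for nxt in CITY.get(path[-1], []):
            if nxt in closed or nxt in path:
                continue  # avoid closed spots and loops
            if nxt == goal:
                return path + [nxt]
            queue.append(path + [nxt])
    return None  # no route found
```

Here `recall_path()` stands in for replaying the learned route, while `plan_path("home", "market", closed=("bakery",))` stands in for the flexible plan-B behavior when the usual destination is unavailable.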

In a rat study, the scientists used advanced optogenetics to disable one type of memory at a time, and through this were able to identify and isolate the two distinctive functions of memory. This, understandably, holds exciting promise for the treatment of memory and learning issues that present in Alzheimer’s disease and dementia.

“We uncovered that two different neural codes support these very important aspects of memory and cognition, and can be dissociated, as we did experimentally,” said Antonio Fernandez-Ruiz, assistant professor of neurobiology and behavior at Cornell’s College of Arts and Sciences (A&S).

The researchers placed a set of electrodes in the rats’ brains to track all firing neurons simultaneously, then used optogenetics to manipulate that activity. A virus was used to subdue, but not completely silence, one precise set of neurons.

Using this method in a region of the hippocampus related to task learning, the scientists observed how the rats were able to learn a path from point A to point D with a reward at the end; however, the memory failed to ‘stick’ because of the induced misfiring. When the experiment was repeated after sleep, the rats could remember points A and D, but not the path they should use to reach the reward.

“That sequence of steps is encoded in the brain as a sequence of cells firing,” Fernandez-Ruiz said. “The way we will remember this in the future is that when we are sleeping, the same sequence of activity is replayed, so the same neurons that encode [the path] will fire in the same order.”

Essentially, during sleep the neurons couldn’t fire in sequence to solidify the memory, so while a rat could remember the two points, the association between them was not retained.

In another experiment, rats were similarly treated and had the task of finding a new path each day to collect a reward. When particular neurons had been scrambled, the rats couldn’t remember how to reach the goal.

“That behavior required it to form a map, and required planning and prediction capabilities, and remember it to guide its movements,” Fernandez-Ruiz said.

And in another test, rats were tasked with associating a location with a reward. When the scientists inhibited the predictive code, the rats’ associative memory remained intact, showing for the first time that these two sides of the same coin are in fact quite distinct.

“By looking at which type of memory deficits occur in a patient,” Fernandez-Ruiz said, “we can try to infer what type of underlying neuronal mechanism has been compromised, which will help us develop more targeted interventions.”

The hippocampus, named after the seahorse genus Hippocampus due to its similar shape, is an area of particular interest for research into Alzheimer’s disease treatment. Cognitive decline in dementia greatly impacts the hippocampus, causing debilitating loss of memory and navigation skills.

The study was published in the journal Science.

Source: Cornell University

4 comments
notarichman
So using this theory in computer programming...especially for AI use, it would be helpful to have at least two paths for learning/memory. But maybe multiple paths?
One example i'm trying to think of is...an improvement path, i.e. in memory a rat found one path to the food, in learning the rat knows of another source of food and
checks whether there is food at the memory site, in the improvement path the rat tries to come up with 1. a shorter path 2. a quicker path 3. a less dangerous path
4. a new source of food that the rat has yet discovered. 5. a way/place to store food for the future 6. a way to preserve food so it doesn't spoil 7. a way to prepare the
food so it is more digestible. 8. a way to produce/grow food for the future.
Audrey
This is what normal people call muscle memory.
TechGazer
This reveals new opportunities for AI development. Humans are stuck with what evolved, but AIs could use different and possibly more effective techniques. Instead of modelling just historic and predictive pathways, it could add in some completely new ones, such as "translate this into words and use your language processor" or even "translate this into haiku poetry and compare with other poems". Once AIs start being creative about creating new AIs, we'll probably be quite amazed at the results ... and hope that they don't decide that exterminating humans is a good idea.
anthony88
This may have applications in the field of neurolinguistics. The ability to recall past events and then retell them coherently and with the correct tense structure is difficult for second language learners whose first language has a different way of connecting form and meaning to the language being learned. In addition, the ability to correctly and automatically produce the grammar used for past versus current events may be explained partly by this Cornell study.