Kaster, Marvin; Czappa, Fabian; Butz-Ostendorf, Markus; Wolf, Felix (2024)
Building a realistic, scalable memory model with independent engrams using a homeostatic mechanism.
In: Frontiers in Neuroinformatics, 2024, 18
doi: 10.26083/tuprints-00027314
Article, Secondary publication, Publisher's Version
Full text: fninf-18-1323203.pdf (CC BY 4.0 International - Creative Commons, Attribution; 4 MB)
Supplement: Image_1.pdf (CC BY 4.0 International - Creative Commons, Attribution; 151 kB)
Item Type: | Article |
---|---|
Type of entry: | Secondary publication |
Title: | Building a realistic, scalable memory model with independent engrams using a homeostatic mechanism |
Language: | English |
Date: | 13 May 2024 |
Place of Publication: | Darmstadt |
Date of primary publication: | 19 April 2024 |
Place of primary publication: | Lausanne |
Publisher: | Frontiers Media S.A. |
Journal or Publication Title: | Frontiers in Neuroinformatics |
Volume of the journal: | 18 |
Collation: | 21 pages |
DOI: | 10.26083/tuprints-00027314 |
Origin: | Secondary publication DeepGreen |
Abstract: | Memory formation is usually associated with Hebbian learning and synaptic plasticity, which change the synaptic strengths but omit structural changes. A recent study suggests that structural plasticity can also lead to silent memory engrams, reproducing a conditioned learning paradigm with neuron ensembles. However, that study is limited by its method of synapse formation, which permits only one memory engram to form. To overcome this, our model allows many engrams to form simultaneously while retaining high neurophysiological accuracy, e.g., as found in cortical columns. We achieve this by substituting the random synapse formation with the Model of Structural Plasticity. As a homeostatic model, neurons regulate their activity by growing and pruning synaptic elements based on their current activity. Forming synapses based on the Euclidean distance between neurons with a scalable algorithm allows us to easily simulate 4 million neurons with 343 memory engrams. These engrams do not interfere with one another by default, yet we can change the simulation parameters to form long-reaching associations. Our model's analysis shows that homeostatic engram formation requires a certain spatiotemporal order of events. It predicts that synaptic pruning precedes and enables synaptic engram formation and that it does not occur as a mere compensatory response to enduring synapse potentiation, as in Hebbian plasticity with synaptic scaling. Our model paves the way for simulations addressing further inquiries, ranging from memory chains and hierarchies to complex memory systems comprising areas with different learning mechanisms. |
Uncontrolled Keywords: | learning, memory, homeostatic plasticity, structural plasticity, scalable |
Identification Number: | Artikel-ID: 1323203 |
Status: | Publisher's Version |
URN: | urn:nbn:de:tuda-tuprints-273142 |
Classification DDC: | 000 Generalities, computers, information > 004 Computer science; 600 Technology, medicine, applied sciences > 610 Medicine and health |
Divisions: | 20 Department of Computer Science > Parallel Programming |
Date Deposited: | 13 May 2024 13:31 |
Last Modified: | 17 Sep 2024 04:42 |
SWORD Depositor: | Deep Green |
URI: | https://tuprints.ulb.tu-darmstadt.de/id/eprint/27314 |
PPN: | 521331722 |
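The homeostatic mechanism summarized in the abstract — neurons grow vacant synaptic elements when their activity is below a setpoint, prune them when it is above, and form synapses with a probability that falls off with the Euclidean distance between neurons — can be sketched as follows. This is a minimal illustration only; the class names, the setpoint value, and the Gaussian distance kernel are assumptions for the sketch, not the paper's actual implementation.

```python
import math

class Neuron:
    """Toy neuron for a homeostatic structural-plasticity sketch."""
    def __init__(self, pos, target_activity=0.5):
        self.pos = pos                  # spatial position (tuple of floats)
        self.activity = 0.0             # current, normalized activity
        self.target = target_activity   # homeostatic setpoint (assumed value)
        self.vacant_elements = 0        # synaptic elements not yet bound in a synapse

def update_elements(n, rate=1):
    """Homeostasis: grow vacant synaptic elements while activity is below
    the setpoint; prune them while it is above."""
    if n.activity < n.target:
        n.vacant_elements += rate
    elif n.activity > n.target:
        n.vacant_elements = max(0, n.vacant_elements - rate)

def connection_probability(a, b, sigma=2.0):
    """Distance-dependent pairing: a Gaussian kernel of the Euclidean
    distance between two neurons (the kernel shape is an assumption)."""
    d = math.dist(a.pos, b.pos)
    return math.exp(-d * d / (2.0 * sigma * sigma))
```

In a full simulation, vacant elements of nearby neurons would be paired into synapses according to this probability, which is what makes spatially local engrams form without interfering with one another by default.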