Research Article

Sketch of a novel approach to a neural model

[version 1; peer review: awaiting peer review]
PUBLISHED 18 Feb 2025

Abstract

There is room on the inside. In this paper, we lay out a novel model of neuroplasticity in the form of a horizontal-vertical integration model of neural processing. The horizontal plane consists of a network of neurons connected by adaptive transmission links. This fits with standard computational neuroscience approaches. Each individual neuron also has a vertical dimension with internal parameters steering the external membrane-expressed parameters, which determine neural transmission. The vertical system consists of (a) external parameters at the membrane layer, divided into compartments (spines, boutons), (b) internal parameters in the sub-membrane zone and the cytoplasm with its protein signaling network, and (c) core parameters in the nucleus for genetic and epigenetic information. In such models, each node (= neuron) in the horizontal network has its own internal memory. Neural transmission and information storage are systematically separated, an important conceptual advance over synaptic weight models. We discuss the membrane-based (external) filtering and selection of outside signals for processing: not every transmission event leaves a trace. We also illustrate the neuron-internal computing strategies, from intracellular protein signaling to the nucleus as the core system. We want to show that the individual neuron has an important role in the computation of signals. Many assumptions derived from the synaptic weight adjustment hypothesis of memory may not hold in a real brain. We present the neuron as a self-programming device, rather than one passively determined by ongoing input. We believe a new approach to neural modeling will benefit the third wave of AI. Ultimately, we strive to build a flexible memory system that processes facts and events automatically.

Keywords

plasticity, learning, neural model, vertical integration, internal parameters, neuroplasticity, memory, synaptic plasticity

Introduction: The neuron as a system with internal and external parameters

The current framework

Experimental research on neurons, and as a consequence theoretical analysis, is often divided into electrophysiology and molecular biology. The first deals with membrane potentials, spikes, and activations in networks of neurons linked via synapses; the second deals with intracellular signaling, genetic and epigenetic expression, and the regulation of receptors, ion channels, transporters, etc. In both cases, signal-induced processes of plasticity are investigated. But a theoretical understanding of neural plasticity both in neural networks (the horizontal plane) and in the molecular interactions of single neurons (the vertical dimension) is missing.

The master framework for neural plasticity focuses on associative synaptic plasticity, usually in the form of long-term potentiation or depression.1–6 Many interesting and relevant criticisms and alternative suggestions have been raised.7–13 In this paper, we discuss the experimental results from a new perspective, with the goal of finding a better, adequate functional description for a neuron model in the context of a network. Complex dynamic models of intracellular signaling14–16 and genetic read-out17–19 already exist, as well as elaborate simulation models.20 But simulation models attempt to match actual biological processes precisely. Accordingly, they cover only very small parts of systems in order to model them with high accuracy. Here we are not interested in that. Instead, we want to raise the question of how the neuron is organized as a complex system, how it processes and stores information, and how this may apply to computational models of its interactions.

To illustrate this, Figure 1 contrasts the conventional model (A) with a more realistic picture (B). We see that a direct link from synaptic AMPA/NMDA receptor activation to the nucleus and back (the basis of associative synaptic plasticity theory) is unsubstantiated, because too many factors intervene. Also, the neuron receives neuromodulatory (NM) input, and its dendritic ion channels are plastic. This alters the simplistic picture of synaptic plasticity.

Figure 1. A shows an outline of the classical synaptic plasticity model.

AMPA (A) and NMDA (N) receptor proteins in postsynaptic position receive signals from a paired, presynaptic neuron. Calcium influx activates CaMKII and possibly other proteins, which activate ERK and then transcription factors (TF) in the cytoplasm; these enter the nucleus, where new AMPA receptor protein is produced and transported back to the potentiated synapse. B shows a slightly more complex picture, with input at the synapse from a paired neuron and input at synapse, spine, and dendrite from neuromodulation. Both mRNA in the spine and DNA in the nucleus can produce new proteins. GPCRs control ion channels, and the levels of both are also mRNA/DNA-regulated, not fixed. Activated TFs (or ERK) cause nuclear read-out. The connection to inputs is unspecified. No specific model is implied.

As a first approach, we suggest operating with the very general abstraction of high-level functional parameters. We model membrane properties by a set of external parameters, localized at the synapse, the spine (where present), the dendrite, the soma, or the axonal bouton. External parameters respond to signals from the outside, but they are also guided and controlled by submembrane processes. We describe submembrane or cytoplasmic proteins and processes by internal parameters. While external parameters influence neural transmission properties visible to other neurons, internal parameters have only indirect effects on neural transmission and remain hidden from other neurons. Together, external and internal parameters form the ‘processing layer’, the upper part of the ‘vertical’ system of neural plasticity. In contrast, the nucleus contains ‘core’ parameters, both epigenetic (histones) and genetic (DNA). This provides another layer of depth for processing information and long-term memory. Core parameters exchange information with the internal parameters via the nuclear membrane. In this paper, we will not analyze core parameters and their function in detail.
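
To make this abstraction concrete, the following minimal sketch (in Python; all parameter names and values are invented for illustration, not prescribed by the model) shows a neuron carrying the three parameter layers, with only the external layer visible to other neurons:

```python
from dataclasses import dataclass, field

@dataclass
class Neuron:
    """Sketch of the vertical parameter hierarchy; all names are illustrative."""
    # External parameters: membrane-expressed, visible to other neurons,
    # keyed by compartment (synapse, spine, dendrite, soma, bouton).
    external: dict = field(default_factory=lambda: {
        "spine_1/AMPA": 1.0, "spine_1/NMDA": 0.5, "dendrite/Kv": 0.8})
    # Internal parameters: sub-membrane and cytoplasmic, hidden from other neurons.
    internal: dict = field(default_factory=lambda: {
        "CaMKII": 0.2, "PKA": 0.1, "ERK": 0.0})
    # Core parameters: nuclear (genetic/epigenetic), the slowest layer.
    core: dict = field(default_factory=lambda: {
        "AMPA_gene_accessibility": 0.7})

    def visible_state(self) -> dict:
        # Only external parameters shape the transmission seen by other neurons.
        return dict(self.external)
```

Only `visible_state` would enter horizontal network computations; internal and core values act purely through their effects on the external set.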

Artificial neural networks (ANNs) were originally developed to obtain functional, artificial intelligence (AI) models based on ideas about neural interactions. They are centered on the idea of computing graphs with adaptive weighted connections,21 with many variations of that theme, e.g. Ref. 22. While they are suitable for statistical computation, they are highly inaccurate with respect to neurobiological results. On the other hand, for single neurons, internal cellular models, such as biochemical reaction systems and genetic transcription networks, are available. These are usually dynamical systems simulations, which are unsuited for functional computation. They are also sensitive to small parameter changes and work best as small-scale simulation models. Without further elaboration,23 the different model types are highly incompatible. The neuronal cell performs a number of tasks in the wider context of cell-internal mechanisms. Our goal must be to isolate information processing from these complex cellular computations. Dynamical systems models do not focus on the tasks of information processing that interest us. Therefore, we choose to describe the systems of sub-membrane and cytoplasmic intracellular signaling24,25 in a simplified way by amorphous internal parameters, to be elaborated at a later time. A system of membrane, cytoplasmic, and nuclear processes may serve as a first foray into organizing experimental data. It could be replaced at a later time with parameters and interactions derived from mathematical models. Our first goal in this paper is to investigate what insights we can gain from this point of view.

Outline of the new model

External parameters roughly correspond to membrane proteins, such as receptors and ion channels. They undergo plasticity on at least three time scales: fast (milliseconds; (de)sensitization), intermediate (seconds to minutes; endocytosis, insertion), and slow (hours; new proteins and morphological features like spines). Only the fast responses, which consist of protein conformational changes, do not need the participation of the internal, vertical processing system. The endocytosis/insertion of membrane proteins requires a system of internal interactions. The long-term ubiquitination of proteins and the generation of new proteins use core parameters.

External parameters respond to outside signals, but internal parameters guide the response. Spines and boutons contain hundreds of protein species which interact to orchestrate membrane protein position and efficacy.26,27 The set of internal parameters is the first line in orchestrating the plasticity of external parameters. We hypothesize that internal parameters constitute an internal generative model of the external membrane. From a computational point of view, this is a very important observation, and some experimental evidence exists for it.28 Within the context of a neural group, this means that synaptic and membrane plasticity is neither exclusively nor directly dependent on the signals among the neurons. Instead, the activity of the internal system is necessary for selecting and responding to signals.28,29 The response of the neuron cannot be determined from the signals it receives alone.

The internal system contains localized elements in distinct positions (e.g. at dendritic spines, axonal boutons, at a synapse). It also has a central ‘workspace’ in the form of the cytoplasm with its organelles and protein complexes. One task of the processing layer is to take the localized, time-structured signal and produce a spatially centralized, temporally integrated signal of transcription factors, which move to the nucleus and cause DNA read-out of individual genes or genetic programs. This is not a simple task. Signals are selected at the membrane. They are then shaped and re-structured in time. They need to be sorted or transferred to protein complexes in the cytoplasm, and ultimately they activate the proteins which enter the nucleus. In the nucleus, epigenetic adaptation regulates access to DNA, transforming signals by enhancement and suppression. It is clear, then, that signals from the periphery have to be processed on several levels, and with several types of control structures, before they can actually be used to control or regulate DNA read-out, or be stored in the epigenetic layer around it.

Signals are accumulated and integrated, temporally and spatially, before the central core is activated. Signals pass from the cytoplasm to the nucleus usually in the form of transcription factors activating coordinated sets of genes. Thus whole genetic programs can be started, such as morphological growth of spines, axons, or dendrites. Individual genes (e.g. AMPA receptors) may also be transcribed during periods of neural plasticity. The nucleus has DNA as permanent storage for protein information. There is also an epigenetic histone system which regulates the accessibility of DNA.30 Histones are phosphorylated or methylated in a reversible manner and code for access to genes.

The other task of the internal parameter system in the sub-membrane processing layer is to enact short-term plasticity and directly shape external membrane expression on the basis of existing information (i.e. a generative model). Signals from the environment arrive fast, and they are ordered in space and time. When they arrive at the membrane, they are transformed at the membrane layer through internal parameters by feedback cycles and similar control structures (cf. Figure 2). These can suppress, lengthen, or augment a signal by engaging the intracellular signaling network. The result is new external membrane expression as a combination of the stored generative model and incoming signals.

Figure 2. Regulation and plasticity of external parameters.

Ion channels and receptors at the spine or the membrane receive signals and activate an internal system which (a) changes properties of receptors/ion channels short-term (regulation loop) or (b) continues to signal via transcription factors to the nucleus (memory loop). Activating the memory loop may require stronger signals and additional readiness on the part of the internal system.

The membrane with its many inserted proteins defines the neuron’s response to outside signals in the form of membrane potential and action potentials. This controls neural transmission, i.e. the horizontal interactions between neurons. The membrane compartmentalizes into dendritic branches, spines and synapses, and axonal branches and boutons. Outside signals are therefore received by a highly structured system. Signals can be located by origin, and are processed in a spatially distributed way. Synaptic signals, which are brief (milliseconds), produce calcium transients from NMDA receptors and voltage-gated calcium channels (VGCCs). Calcium buffering proteins and calcium release from internal stores31,32 contribute to the signal. There is evidence that calcium is a major signal for the induction of plasticity and that different shapes of the signal result in different outcomes.33 Outside signals govern the short-term external parameter response. The selection and filtering of membrane signals by the internal system integrates information from internal and external sources.

In this context, it is interesting that the intracellular system itself is devoid of memory. It cannot respond to signals by adaptive plasticity or regulate its own plasticity.34–36 There is temporal integration of membrane signals, and there are temporally extended intracellular responses, such as G protein-mediated signaling by small molecules or the kinase/phosphatase system.37 But only the abundances of proteins, via mRNA translation, can change the system’s properties. This may take hours to be reset and recalculated. This observation strengthens the view of the internal parameter system as a processing layer, programmable by signals from the outside and updated from inside via parameter re-sets such as local mRNA translation and the core DNA transcription system (cf. Figure 2, Figure 6).

To summarize, the membrane layer is adaptive on several time scales, and in the long term by genetic read-out. Under certain conditions, protein abundances in the processing layer or at the membrane may change, adapting the neuron to a new environment. A special case of signal-induced long-term membrane plasticity is associative synaptic plasticity, such as long-term potentiation/depression (LTP/LTD). But it is not the only expression of plasticity at the neuronal membrane, and associativity is just one of the properties which may cause it. Synaptic plasticity, while it may be fairly reliably triggered by signals in the short term, requires special conditions in the postsynaptic density to be expressed as a long-term property.38

Signal selection and filtering

External parameters at the membrane

In our conceptual model, parameters stand for proteins/molecules, sometimes for groups of them, or variants of the same molecule. A precise matching of parameters to biochemistry is not intended at this point, but it is a step that can always be added. The values of these parameters – external at the membrane, internal near the membrane or localized somewhere in the cytosol, or core parameters in the nucleus – change by computations or by outside signals. Outside signals are first sensed by external parameters. They are processed with the help of internal parameters, i.e. filtered or selected, stored or dropped. In response to signals, external parameters may return to their previous values (homeostatic regulation) or be re-set to new values (adaptive regulation).

Outside signals are first pre-processed by external parameters (receptors and ion channels). Immediate and reversible plasticity of receptors and ligand-gated ion channels is the first response; this usually happens by protein conformational changes. Calcium and small molecules, like cAMP, as well as the kinase/phosphatase system, engage the internal parameter system. Control structures and computations by internal parameters decide the fate of signals (cf. Figure 2). There is direct regulation of ion channels and receptors by internal parameters, the ‘regulation loop’, which operates within seconds or minutes. This does not constitute lasting memory. Memory requires mRNA read-out and new protein translation. In Figure 2, this is exemplified by transcription factors, which engage the nucleus in a slower loop, the ‘memory’ loop. In principle, both the ‘regulation’ and the ‘memory’ loop may occur within a dendritic spine, using mRNA translation as the memory loop (not shown).
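
A minimal sketch of these two loops (all thresholds, gains, and time constants below are invented for illustration): the regulation loop relaxes the external parameter back to a stored set point, while the memory loop fires only for strong signals and re-sets the set point itself, standing in for transcription and translation:

```python
def process_signal(signal, ext, setpoint, memory_threshold=2.0, gain=0.5):
    """One processing step per signal; all constants are illustrative."""
    ext += signal                                # immediate external response
    if signal > memory_threshold:
        # Memory loop: nuclear read-out re-sets the stored value
        # (hours in biology, instantaneous in this toy version).
        setpoint += 0.5 * (signal - memory_threshold)
    # Regulation loop: seconds-to-minutes relaxation toward the stored value.
    ext -= gain * (ext - setpoint)
    return ext, setpoint

ext, setpoint = 1.0, 1.0
for s in [0.3, 0.2, 3.0, 0.1]:          # only the strong signal leaves a trace
    ext, setpoint = process_signal(s, ext, setpoint)
    print(f"signal={s:.1f}  external={ext:.2f}  setpoint={setpoint:.2f}")
```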

We hypothesize that membrane proteins have a strong tendency towards homeostasis, i.e. a return to existing values after short-term disruption.39 But in some situations, the accumulation of traces from each signaling event predominates. For the synaptic ligand-gated ion channels AMPA and NMDA, a number of protocols exist which cause lasting plasticity (up- or down-regulation40). It is also known that ion channels integrated in the membrane are variable in their expression levels (Figure 3A, B). The factors influencing this, often summarily referred to as intrinsic plasticity,41,42 are less well explored (cf. below, co-regulation of ion channels). Calcium modulation has often been indicated as a major factor in initiating plasticity.43–46 Calcium enters the cell via various ion channels and receptors. The shape of the calcium transients matters, with a preference for brief, strong signals. High-amplitude calcium transients directly engage intracellular signaling to begin making changes.47–49 Intracellular calcium buffers, and mechanisms for release from intracellular stores, influence calcium transients and are an example of the integration of outside signals and internal parameters.

Figure 3. A, B show the variability of ion channel density and the selective modulation of a neuron’s activation function. Three different ion channels are analyzed; cf. Ref. 52 for details. C shows a model neuron with NM-modulated re-design of its dendritic function.

Red dots show putative spine inhibition (in red-circled areas), blue lines show the decreased dendritic transmission, marked by yellow arrows.

Neuromodulation (NM) via G-protein coupled receptors (GPCRs)50 also influences signal selection. This is strongly based on external-internal integration. G proteins are internal proteins which have a number of functions, including transiently regulating the ion channels in the membrane, reducing or enhancing their efficacy. The ion channels determine the excitability of the neuron, conditional on which (central) NM signal modulates them. NM-ion channel interactions can reversibly remodel dendrites on intermediate time scales (Figure 3C). The activation of different NMs highlights the set of neurons that are modulated by them.51

There is a notable difference in plasticity between spiny and aspiny neurons. Spines are localized on projection neurons in areas of strong, fine-tuned adult plasticity: pyramidal neurons in cortex, hippocampus, and amygdala, medium spiny neurons in striatum, and certain dendritic regions of Purkinje cells in cerebellum. Dendritic spines (and possibly axonal boutons) act as cellular compartments with a highly reduced intracellular signaling apparatus.53 In contrast, synapses of aspiny neurons do not have a dedicated local system of intracellular signaling. For spiny neurons, spine removal and generation is an efficient modification of existing information. Aspiny plasticity would require more substantial remodeling of global, rather than compartmental, parameters. This fits with the idea that learning at these synapses is much more restricted, at least in adults.

Models for external-internal integration

Several models are possible for the integration of the external with the internal parameter system. For instance, we may assume that external parameters fluctuate in response to signals, following a random walk, and need reinforcement by internal parameters to store a value. Such a model could emulate modern feature selection methods54 very effectively. Alternatively, external parameters receive outside signals, but they are corrected by values from the internal parameters. If internal and external sets match, there is no plasticity. Otherwise, the internal parameter system selectively overrides the external parameters with stored values. Only a few external parameters can change internal values. The common theme is that outside signals are ignored or passed on depending on the responsivity of the neuron, as well as the strength of the signal itself. This is related to the problem of feature selection for learning and memory.54 The signal is selected if it is either strong (high amplitude, co-occurring at various sites), or repeating (at the same sites), or long-lasting (even at a lower level of signaling). Even fairly weak signals can be retained if they combine within a specified time frame and the neuron is highly responsive. Even strong signals fail to register if the signal is isolated and the neuron has low responsivity. Experimentally, the effect of responsivity is studied along the lines of Refs. 28, 55, 56. Theoretically, the question arises which feature selection processes are most useful. These questions have to be explored further.
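
These selection criteria can be condensed into a predicate. The sketch below is one possible reading of the paragraph above; all thresholds, and the way responsivity scales them, are our own illustrative assumptions:

```python
def select_signal(amplitude, repeats, duration, responsivity,
                  amp_th=2.0, rep_th=3.0, dur_th=10.0):
    """Decide whether an outside signal is retained.

    A signal is selected if it is strong, repeating, or long-lasting.
    Responsivity scales all thresholds: a highly responsive neuron
    retains even weak signals, an unresponsive one drops isolated
    strong signals (illustrative thresholds).
    """
    scale = 1.0 / max(responsivity, 1e-6)     # low responsivity raises the bar
    return (amplitude >= amp_th * scale
            or repeats >= rep_th * scale
            or duration >= dur_th * scale)

print(select_signal(amplitude=1.0, repeats=1, duration=2.0, responsivity=3.0))   # True: weak but responsive
print(select_signal(amplitude=3.0, repeats=1, duration=1.0, responsivity=0.3))   # False: strong but isolated
```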

Outside signals to a neuron have the potential to activate transcription factors as immediate early genes (e.g. Arc, CREB) within minutes (5-20 minutes for Arc in hippocampal neurons57,58). These transfer to the nucleus59,60 and may start new transcription. But they have to pass the intracellular signaling system, which has a large number of control structures such as thresholds, feedback loops, feed-forward loops, and antagonistic signaling.61 For instance, negative feedback loops serve to broaden a signal and extend its duration. They may also dampen a signal below threshold.62 Signals may be modified such that later signals in a sequence are suppressed. A sequence of signals may be enhanced to reach a threshold.63 These control structures are activated by signals, but they depend on internal parameters. In this sense, internal control structures serve to adapt a signal and influence whether it is passed on to the next layer.

The associative synaptic model of plasticity assumes that the activated synapse signals to the core, and the core signals back to the relevant synapse. A synapse therefore needs a tag to be identified. This “synaptic tag” hypothesis has been much discussed, but has turned out not to be experimentally verifiable.64–66 Instead, mRNA translation can be local to the spine, for spiny neurons. But in general, genetic read-out affects the whole neuron. This concerns the external and the internal parameter system. Spatial and temporal integration of signals happens in the internal layer, and the internal system is important for performing computations on localized, distributed signals. Localized plasticity can be achieved by the combination of global activation from the core and local parameters. Local sites signal to the global core; the central, global site signals back to all sites which are responsive. Synapses can thus be differentially regulated in spite of a global bias term.

Vertical computation

Naive and mature neurons

In our view, each neuron represents a processor which processes information according to its own program. These processors have been set up by evolution as neuronal types with their own genetic equipment. It is important to realize that any neural computation is performed by a set of heterogeneous neurons which are adaptive but retain their cellular identity. Evolution has done the ‘main work’ in establishing the neuronal types. For instance, there are pyramidal neurons, a type of excitatory projection neuron, with specific types such as cortical vs. hippocampal pyramidal neurons. Among cortical neurons, there are surface-layer vs. deep-layer pyramidal neurons. As we increasingly learn, there are even a number of genetically distinct deep-layer cortical pyramidal neuron types.67 And there are many other neuronal types, for instance, types of cortical interneurons. To model neurons, the evolutionary processes do not need to be re-created (even though that could be very interesting). Instead, neurons could be set up from a set of pre-established neuronal types. Each of these neurons then has a genetic make-up and provides a type-specific programmable unit. Adaptation, or ‘programming’, serves to individualize the response.

We hypothesize that neurons may change their phenotype as a result of adaptation. A neuron which has a generic program for a neuronal type may be called a “naive” neuron. When it has acquired a specific program, it turns into a “mature” neuron.29,68 A neuron can be programmed by the complex control structures of protein signaling and acquire an adapted internal model. We may distinguish between the responses of naive neurons with default control structures vs. mature neurons with a learned internal parameter structure. This concept needs more exploration; there is probably a continuum. Evidence for this idea exists in, e.g., Ref. 69, where the lifetime distribution of PSD-95, a synaptic marker, was tracked on neurons in cortex and hippocampus. The authors found short-lifetime PSD-95 at young or innate (‘naive’) neurons, and longer PSD-95 lifetimes at aged (‘mature’) neurons. Possibly it is also the case that ‘naive’ neurons are dominated by a single NM, and only mature neurons have acquired noticeable responses to several NMs.

Naive neurons exist with default configurations, possibly with random elements. Exposure to patterned signals lets them acquire a specific generative internal-external model which reflects the neuron’s experience and expectation. This is a ‘mature’ neuron; more precisely, this is the process of maturation. A mature neuron may still re-learn, and possibly even re-set to the naive state under exceptional circumstances. An outside signal has a different effect on a naive neuron, where it may be imprinted, than on a mature neuron, where it is matched or non-matched. In the naive neuron, there is no individualized control structure to filter the signal. Internalization involves setting up a copy of the external parameter set as the basis of a generative model, or as a “prior” for later processing.70 In a mature neuron, the signal is filtered and then matched to the existing parameter set. A neuron may contain both naive and mature sites (especially at spines); it could be ‘chimeric’ with respect to maturity. Spines could be either naive or mature on the same neuron, and a dendrite could be ‘bit-coded’ for grade of maturity. A naive site will store or read in a value; a mature site already contains a value and is protected against overwrite.

Programming a neuron

Internal signaling has several known mechanisms which fulfill the idea of a programmable system. Internal signaling can cause activation or de-activation of a dendritic branch. It can reset presynaptic vesicle pools (e.g. by cAMP dependence). It guides spine maturation and spine decay. It performs spatial integration on the spine (e.g. AMPA reservoirs on the spine shaft moving to the spine synapse71). There are local buffers (CaMKII), and in the cytoplasm there are timed delays, queues, and feedback cycles, which suppress a signal or enhance it. There is spatial integration over several dendritic compartments.50 There is antagonistic signaling61 and there are transactivation or overflow switches.72

The internal signaling system passes outside signals either to actuators in the membrane or on to further processing in the nucleus. The cytoplasm acts as a shared access system for the dendritic compartments and protein complexes. From here, inputs from many dendritic compartments are routed into a single nuclear compartment. (The axonal system is not further described here.) The internal system guides signals from a localized system either to a central, queued system or back to locations on the membrane.

In Ref. 20, we employed a transfer function approximation for G protein coupled receptor (GPCR) regulation. Transfer functions make it possible to generate (context-dependent) look-up tables for endpoints of internal parameters in response to outside signals. This is the basis for learning internal parameters by changes in protein expression.
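
As a toy illustration of such a look-up table (the transfer functions in Ref. 20 were fitted to simulation data; the Hill form and all constants here are assumptions):

```python
import numpy as np

def hill(x, vmax=1.0, k=0.5, n=2.0):
    """Assumed Hill-type transfer function: steady-state endpoint of an
    internal parameter as a function of outside signal strength."""
    return vmax * x**n / (k**n + x**n)

# Context-dependent look-up tables: one endpoint table per context,
# here distinguished only by an invented expression level.
inputs = np.linspace(0.0, 2.0, 9)
table = {"baseline": hill(inputs),
         "high_expression": hill(inputs, vmax=2.0)}
for context, endpoints in table.items():
    print(context, np.round(endpoints, 2))
```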

A ubiquitous motif in protein interactions is antagonistic signaling. This occurs when an input is linked to an output by two antagonistic pathways (positive and negative).61 Examples are D1/D2-type dopaminergic signaling, or beta/alpha-adrenergic signaling, cf. Figure 4A. The antagonistic motif allows re-scaling of inputs which vary over a large range, compressing multiple orders of magnitude to a single-scale value. Antagonistic signaling also allows for peak transients when one pathway is slightly faster than the other. These features are useful basic properties for any system that handles signals.
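
A sketch of the motif (kinetics and constants invented): a fast positive and a slow negative arm acting on the same target produce a peak transient, and at steady state the push-pull arrangement compresses inputs spanning several orders of magnitude onto a single scale:

```python
import math

def transient(signal, t, tau_pos=1.0, tau_neg=3.0):
    """Target activation at time t: the fast positive arm minus the
    lagging negative arm yields a peak transient (illustrative kinetics)."""
    pos = signal * (1.0 - math.exp(-t / tau_pos))
    neg = signal * (1.0 - math.exp(-t / tau_neg))
    return pos - neg

def compressed(signal, k_pos=0.1, k_neg=100.0):
    """Steady state: two saturating arms with separated half-points
    map a wide input range onto a single-scale output (illustrative)."""
    return signal / (signal + k_pos) - signal / (signal + k_neg)

for s in [0.01, 0.1, 1.0, 10.0]:
    print(f"input {s:6.2f} -> peak {transient(s, 1.0):.3f}, "
          f"steady {compressed(s):.3f}")
```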

Figure 4. Control structures in vertical computation.

A: Antagonistic signaling.61 A signal (calcium, dopamine) has both positive and negative effects on a target, often with a time difference. B: Overflow switches.72 Reorganizing structure in intracellular signaling. The orange pathways are only activated with high expression and activation of the beta-2 adrenergic receptor, the G protein Gi, or EGFR. Signaling via the MAPK pathway from adrenergic activation at b1, b2, and a1 receptors can be switched on or off.

In Ref. 72, we showed how a strong signal at a receptor, together with high protein expression at a target pathway (one not usually involved in signaling by this receptor), results in activation of the target. This activates a different pathway from the canonical one (‘transactivation’). The result is signaling by a pathway which happens only under special conditions. We called this an overflow switch, a motif in computation which emphasizes the mechanical control aspect of internal signaling, cf. Figure 4B.
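
In code form, the overflow switch reduces to a conjunction: the non-canonical target pathway responds only when receptor activation and target-pathway expression are both high (thresholds invented; the biochemistry in Ref. 72 is graded, not binary):

```python
def overflow_switch(receptor_signal, target_expression,
                    signal_th=0.8, expression_th=0.8):
    """Route a receptor signal: the canonical pathway always responds;
    the non-canonical ('transactivated') pathway switches on only
    under overflow conditions (illustrative thresholds)."""
    canonical = receptor_signal                     # usual route, graded
    overflow = (receptor_signal > signal_th
                and target_expression > expression_th)
    noncanonical = receptor_signal * target_expression if overflow else 0.0
    return canonical, noncanonical

print(overflow_switch(0.5, 0.9))    # weak signal: canonical pathway only
print(overflow_switch(0.9, 0.9))    # overflow: both pathways active
```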

Molecular abundances are important not just for establishing (high expression) or removing (low expression) connections between pathways. They also dictate the temporal execution of the biochemical reactions underlying intracellular signaling.73 High expression means that reactions are fast and homeostatic endpoints are reached in a short time. When protein expression is low, internal reactions are slowed down and signals are processed sluggishly, cf. Figure 5A. In principle, this allows neurons to act at different time scales. An example of the power of the global workspace is shown in Figure 5A. Using the different types of buffers available, signals can be re-sorted in time. A typical example is the protein CaMKII, which buffers calcium by autocatalytic activity and releases it after some time, depending on the activation status of each buffer molecule. This makes it possible to create a reversed sequence, and it therefore provides a basis for queued access to nuclear transcription. The sequence reversal motif, i.e. the reordering of signals in time, together with other motifs, makes it possible to produce canonical signal shapes. This would make it easier for other calcium-receptive pathways to process the signal. A sequence of synaptic inputs, sorted and combined from their spatial distribution into larger components (e.g. like a bit code which is translated into ‘words’), could provide a shape of signal that can be easily read.
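
The sequence-reversal motif can be sketched as buffers whose hold time shrinks with arrival time, so that later signals are released first (delays invented; in biology, release depends on the saturation state of each buffer molecule):

```python
import heapq

def reverse_sequence(signals, base_delay=10.0):
    """Re-sort signals in time: a buffer holds an early signal longer
    than a late one, so the release order reverses the arrival order
    (illustrative CaMKII-like buffering)."""
    releases = []
    for arrival, name in signals:
        hold = base_delay - 2.0 * arrival       # earlier capture, longer hold
        heapq.heappush(releases, (arrival + hold, name))
    return [heapq.heappop(releases) for _ in range(len(releases))]

arrivals = [(1.0, "A"), (2.0, "B"), (3.0, "C")]
print(reverse_sequence(arrivals))    # released as C (t=7), B (t=8), A (t=9)
```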

Figure 5. Timing in Vertical Computation.

A: Temporal re-sorting shows how buffers can re-order the timing of signals for activating transcription factors (TFs). Buffer C reacts first because it is saturated. B: Changing the concentration of proteins in a biochemical reaction network (by 100%, details in Ref. 73). Speed changes are shown in red (faster) and blue (slower) hues. (B1) Increasing a target protein (DARPP-32) has a narrow, clustered response. (B2) Increasing both kinase (PKA) and phosphatase (PP2A) leads to a strong speedup of a widespread number of reactions.

There is mRNA-mediated plasticity at the spine, which remains localized74 or spreads locally along the dendrite. But if sufficient signals combine and the nucleus is involved, we assume that neural remodeling may affect the membrane globally, not just an activated synapse. Some effects may occur in the global workspace, the cytoplasm, rather than at dedicated membrane sites.

To summarize, outside signals are received by external parameters and activate a program on the internal system which determines the system response.75 The internal computational system, which filters, temporally re-sorts, combines, and releases signals, is a programmable and re-programmable system. The system contains parameters which program its functionality. It can be reset by the core, an additional system. Re-programming the system requires new transcription or mRNA translation. To investigate the power of such systems, explicit programming, agent-based and dynamical systems approaches, as well as machine learning techniques could be suitable.

Combining vertical and horizontal functions

The individual neuron and its interaction in groups

In our view of horizontal-vertical computation, spiking activity is distributed over a group of connected neurons. Suitably binned, these are spatio-temporal patterns. They are reflected as signals at a neuron or a neuronal site (like a spine). At the neuron, the processing of the signals is determined by external, internal and core parameters. If a signal produces a match with the dendritic structure of the neuron, the neuron spikes. Neuronal spiking transmits a signal to all other neurons which are connected by synapses. But we assume that transmission of signals is not sufficient to result in parameter adaptation. It only leads to adaptation if the signal is sufficiently strong and internal parameters permit it.

The signal that the individual neuron receives is defined by the pattern, i.e. the spatio-temporal spiking distribution on the horizontal layer.70 These spiking patterns appear as spatio-temporal patterns (STPs) over a group of neurons. Over time, patterns undergo transformation through the variability of the neurons which carry them.76,77 To simplify discussions of horizontal interactions, we assume neurons interact only by spiking. This means neurons tell each other how active they are, and when they are active. Neural transmission, the horizontal interaction between neurons, is explicitly determined by external membrane parameters. These parameters can be local to the synapse (e.g. AMPA, NMDA). They may affect a dendrite (e.g. ion channels). Or they may become activated by neuromodulation (e.g. different kinds of GPCRs). Other factors (such as the activity of transporters) also exist.

For each neuron, there is a set of external and internal parameters that defines the state of the neuron. External parameters respond to outside signals through the horizontal layer. Structured groups of neurons generate patterns. Neural types from the same brain region form ensembles.30 Alternatively, a group can be a neural loop,78 where neurons are located in different brain areas and consist of different neural types. For efficient computation in a network of processors, such as neurons, a small bandwidth of communication is preferable. It is an interesting, rarely investigated hypothesis that evolution has developed neurons to be as independent as possible from outside signals.

An example is the co-regulation of ion channels.39,79,80 Co-regulation requires genetic read-out. It shows strong internal bias, i.e. genetic conservatism for a neuronal type, and does not require network-based regulation.81,82 The neuron itself “knows” its well-balanced state. If each processor element monitors its own stability, the construction of horizontal models becomes much easier. If the neuron is mostly independent and stable as a processor, the capacity for outside signals to create significant change, beyond the random-walk fluctuations at the membrane,83 is quite limited. Change is probably not automatic with each spike or transmission event. In this sense, the neuron is autonomous, even though outside signals can re-program its state.
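
A caricature of such self-monitored stability (loosely inspired by published activity-target models of channel co-regulation; the activity model, weights, and learning rate below are all invented): the neuron compares its own activity to an internal target and adjusts channel densities without any network involvement:

```python
def co_regulate(channels, activity_target, steps=200, lr=0.05):
    """Each neuron monitors its own activity and nudges channel densities
    toward an internal target (illustrative; real co-regulation acts
    through genetic read-out over hours to days)."""
    # Invented linear activity model: excitability rises with Na and CaV
    # densities and falls with K density.
    weights = {"Na": 1.0, "CaV": 0.5, "K": -0.8}
    for _ in range(steps):
        activity = sum(weights[c] * g for c, g in channels.items())
        error = activity_target - activity
        for c in channels:
            # Positively weighted channels scale up on positive error,
            # negatively weighted channels scale down, and vice versa.
            direction = 1.0 if weights[c] > 0 else -1.0
            channels[c] *= 1.0 + lr * error * direction
    return channels

print(co_regulate({"Na": 1.0, "CaV": 0.6, "K": 1.2}, activity_target=0.5))
```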

Neurons as programmable elements

An adaptive weighted neural network is a simple data structure (a graph) with only one degree of freedom (weights) for recording past experience. Accordingly, it stores data (or data-classification pairs) and generalizes by interpolation. In contrast, a horizontal-vertical network is a network of programmable processors. This system can do more than adapt to outside signals and store traces of them. It can select and filter, i.e. ignore many outside signals if there is no internal reinforcement. It can also self-program, i.e. self-adapt its internal processing structure. The self-programming mechanism relies essentially on an existing external-internal processor, plus feedback from the slow core to adjust it. A stream of outside signals (data) provides the input (Figure 6). The processor holds a set of stored functions, the data set the parameters for these functions, and the core adjusts the probabilities and rewrites the functions.

f851a29d-ec57-4810-8b64-07d7b049b0e5_figure6.gif

Figure 6. Building blocks for self-programming of neurons.

Outside signals (Data) are processed by external parameters (EXT) which interact with internal parameters (INT). The EXT/INT system changes over time. The decisive feature for adaptation is the feedback between the core and internal systems. Note that there is no direct access between external parameters and the core and that data are maximally separated (filtered) from the core system.

These features define the capacity for self-programming for a neuron within a network (a minimal sketch follows the list):

  • preset ranges for fast adaptation of external parameters to outside signals

  • rates of change for parameters (slow and fast) to address the stability-flexibility dichotomy

  • re-programming by genetic core control of protein abundances

  • programs in the genetic core which allow complex morphological adaptation such as spine growth, dendrite remodelling, axonal branching
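
The following toy unit combines the four building blocks (every number and name is an invented placeholder; in particular, the ‘core’ here is reduced to a single abundance variable):

```python
class SelfProgrammingUnit:
    """Toy neuron combining the four building blocks above (illustrative)."""

    def __init__(self):
        self.ext = 1.0                              # fast external parameter
        self.ext_range = (0.5, 2.0)                 # (1) preset adaptation range
        self.fast_rate, self.slow_rate = 0.3, 0.01  # (2) stability vs flexibility
        self.abundance = 1.0                        # protein abundance, core-set

    def on_signal(self, signal):
        """Fast adaptation of the external parameter, clipped to its range."""
        lo, hi = self.ext_range
        self.ext = min(hi, max(lo, self.ext + self.fast_rate * signal))

    def core_update(self, demand):
        """(3) The genetic core slowly re-programs protein abundance,
        which in turn re-scales what the fast system can do."""
        self.abundance += self.slow_rate * (demand - self.abundance)
        self.ext_range = (0.5 * self.abundance, 2.0 * self.abundance)

    def grow_spine(self):
        """(4) Stub for a morphological core program: adds a new
        external parameter set rather than changing an existing one."""
        return {"new_spine/AMPA": 0.1 * self.abundance}
```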

Interestingly, internal signaling networks, as a type of biochemical reaction network, guarantee a return to homeostasis.20,36,84 This means that these signaling networks have no memory of their own. They react to signals, but then return to their preset values. They are thus protected against signal-based memory. Only changes in protein abundances re-program the system, and these are under the control of another system, namely RNA translation or DNA transcription. So the system itself has no memory of the actual processing it performs. This means that it acts as a processing layer between membrane and nucleus. What can be programmed? The neuron has a number of control parameters available which govern neural transmission, i.e. neuronal behavior in interaction with other neurons on the horizontal plane.

These parameters code for membrane excitability, i.e. firing threshold and membrane resting potential; after-depolarization/after-hyperpolarization, i.e. refractory reset properties; spike latency, axonal delay, degree of myelination,85 etc. They set the firing rate and the capacity for synchronization. The placement of ion channels, GABA-A,86 and glutamate receptors influences the capacity of the dendrites to route excitations to the soma. There are other membrane properties which also constitute programmable elements but which influence neural transmission in an indirect way. NM receptors or GPCRs (including GABA-B, mGlu) are regulated in response to ligand availability; it seems both homeostatic and positive adaptive regulation of GPCRs occur. GPCRs activate internal parameters (e.g. Gα, Gβγ, cAMP) near the membrane. They can modulate membrane properties directly, by changing the properties of ion channels, often tuning them up or down. They also signal to the nucleus, directly (PKA, PKC) or indirectly (MAPK, ERK), using the internal parameter system.20,72 In this way, GPCRs provide a significant part of the filtering system for outside signals.

To build a generative model for the external parameter set of a neuron, internal parameters can exchange or restore external parameters (e.g. they modulate receptor endocytosis followed by insertion or ubiquitination). They can protect parameters against overwrite (e.g. by placing SK channels next to calcium channels/AMPA/NMDA receptors87). They can also modify receptor complexes (e.g. co-localization of D1/NMDA receptors88). The internally regulated, targeted placement of neuromodulator receptors makes it possible to switch between several modalities on short time scales (seconds to minutes) by activation of neuromodulatory signals.51

Recently it has been argued that novel behavioral signals may at first affect epigenetic features, i.e. histones, while read-out of DNA happens only after repetition.89 Perhaps, in a naive neuron, signals first affect the accessibility of DNA, and further confirmation is needed before DNA programs are actually read out. Programming a neuron’s core identity may take several steps (Figure 6). A complex program guides the actions that re-define a neuron’s morphology or alter its membrane composition. For instance, the core contains programs that grow or mature a spine, produce and insert neuromodulatory receptors, move new AMPA receptors to the postsynaptic density,90 convert silent NMDA receptors, or balance and co-regulate ion channels.42 The precise mechanisms by which the neuron self-programs its many features are still a matter of further research.

Horizontal interactions in the network

The horizontal system performs massively parallel search for matches of signals to neurons. If there is a match, the neuron’s spiking activity changes (in most cases it increases), a subgraph of connected neurons is activated, and new patterns arise. Excitation in a neuron produces additional spikes, which is relevant for the function of the horizontal layer. Strong matching signals may generate internal signals and re-program a neuron. For a non-matching signal, there are no additional spikes. Internal suppression will stop the outside signal from passing into the processing layer.

STPs rely on membrane (synaptic and non-synaptic) properties of neurons which are subject to parameter adaptation. STPs undergo constant transformation, even in the absence of perception. Thus a pattern is transformed via the signals it produces at individual neurons. As a result, the pattern may match existing stored templates. In this way a pattern could become adapted to existing knowledge. While patterns are formed, new perceptual input may be suppressed.

Neuromodulation has an interesting role in that it codes for cellular identity in a combinatorial fashion.91 It can thus serve as labeling, both for neurons and for connections (nodes and edges of a graph structure). Such labeling is extremely useful because it allows for constrained inferences.51 For instance, a dopamine D2 receptor may adjust GIRK channels, and subsequently the spike pattern reflects hyperpolarized membranes at all affected neurons. This is a temporary change which requires a dopamine signal, and which labels neurons during computation.

Based on experimental evidence, neuronal groups may enter into different coordinated states.92 Parameter adjustment (internalization or externalization) may happen in specific states or regimes. We may introduce an objective function, “strain”, defined by high spiking activity and synchronization. Within a neural group, focal areas of high activation, such as those resulting from sensory input, may experience high “strain”. To define strain, one could use a normalized value to measure excitation (taking peaks and temporal spacing (“bursting”) into account) and a measure of synchronization (by co-occurrence of spikes, or membrane potential fluctuations). The system then aims to minimize strain over a group. Accordingly, a group is selected for memorization of a pattern, not just a single neuron.30 Only under high strain may we assume vertical computation to be initiated. The objective function builds up in the horizontal plane and is resolved by adjusting parameters in vertical computation. A neuron can “absorb” strain from the representation. Parameter adaptations allow a return to a well-adjusted horizontal regime of low strain. High strain may cause a network to remove information from the external membrane, or, vice versa, to read out parameters so as to reduce activations.
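
One way to operationalize ‘strain’ on binned spike trains is sketched below (the normalization and the equal weighting of excitation and synchrony are our own assumptions):

```python
import numpy as np

def strain(spikes, w_exc=0.5, w_sync=0.5):
    """Objective function over a neural group. 'spikes' is a binary
    (neurons x time bins) array; both terms lie in [0, 1] (illustrative)."""
    excitation = spikes.mean()                   # overall firing level
    per_bin = spikes.mean(axis=0)                # fraction of group co-active
    # Variance of the population rate, normalized by its value under
    # perfect synchrony; near 0 for independent firing, near 1 for lockstep.
    p = per_bin.mean()
    synchrony = per_bin.var() / max(p * (1.0 - p), 1e-9)
    return w_exc * excitation + w_sync * min(synchrony, 1.0)

rng = np.random.default_rng(0)
independent = rng.random((20, 100)) < 0.2            # asynchronous group
lockstep = np.tile(rng.random(100) < 0.2, (20, 1))   # fully synchronized group
print(strain(independent), strain(lockstep))         # lockstep strains more
```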

Overall, we anticipate that a network of processors with their own individual memory, connected by binary-type signals and participating in low-dimensional representational configurations, will offer a highly flexible and sophisticated framework suitable for complex computations of the kind that humans perform.9,12

Discussion

From a cognitive and behavioral perspective, we know that episodic memory is not a reversible operation where a blank slate is inscribed and wiped again, but an operation where permanent changes occur when memories are stored. A developmental trajectory exists from neonate to juvenile to adult to aged individual. Only a fraction of events is stored completely. Much information is deemed not worth keeping and is not kept. In episodic memory, many representations of events undergo abstraction and become part of a complex (schema) to which they add in small ways. Often, information that is sufficiently represented by existing knowledge is either summed or discarded.

In a very general way, neuronal plasticity can be seen as a form of difference learning between stored information and new, incoming signals. Difference learning is also known as “predictive” or “generative learning”. It is substantially different from the associative learning that Hebb and others postulated. We outlined a model with external parameters at the membrane, which guide transmission of information. Internal parameters, hidden from interactions, store and process information and are capable of re-setting external values. The core system programs and re-programs the processing system. The difference between outside signals and internal information drives adaptation. It forces a system, such as the brain, to remain an open system with respect to its environment. Such a system lacks the properties of an algorithmic or formally ‘closed’ system. Memory management is central in such a system. It ensures consistency, promotes signal loss where appropriate,93 and prevents ‘catastrophic forgetting’.68,94

The synaptic theory of memory is difficult to reconcile with neurobiological data, since synapses have high turnover.95,96 This means that information cannot remain physically located at the synapses long-term without being lost or overwritten. Various schemes have been suggested to counteract this problem.96 The basic idea is that, as part of the synaptic information is lost, it is retrieved by other synapses in a pattern-completion process. Memory is thus constantly moved among neurons and connections.

Any neural system must contain elements with high stability - which provide the backbone of the knowledge structure and are protected against change - and elements which flexibly adjust to a current situation. Neuronal types, defined by genetics and the product of evolutionary learning, are essentially stable, and they form part of a background structure. The stability-flexibility trade-off is probably at the center of the capacity of brains to build structured representations.94

Neurons as processors operate with graded stability (from outside to inside). This principle seems very intuitive and sound (Figure 7), but it is unusual in computational theory. Memorization involves transforming transient information into stable information elements. Axonal/dendritic morphology is also a stable component. It requires core-mediated programs for alteration. Concentric abstraction of temporal depth provides short-term adaptation, long-term stability and the capacity to store information into increasingly stable formats (Figure 7).

Figure 7. The neuron, drawn as a prototypical cell, acts as a processor which receives fast signals and responds with fast (external) and slow (core) adaptations.

A major problem with the synaptic hypothesis is that adjusting weights in graphs has weak expressive power. Treating neurons as adjustable activation functions improves the efficiency of implementation, but does not extend the expressive power of the system.97–99 While such systems can store large amounts of data, they overwrite storage by recency. They do not function for more complex hierarchical knowledge and action plans, like language generation,100 although they may mimic it by pattern generation for images or text. Certain modifications of weight-adjusting neural networks have been proposed on technical grounds22,101 to solve problems of insufficient memory performance, cf. Figure 8. A mathematical theory of the expressive power of various neuronal representation systems is still lacking. This would involve comparing graph-theoretic to vertical-horizontal models.

Figure 8. A. The Neural Turing Machine (NTM)22 is a combination of two recurrent neural networks, one for memory (slow) and one for control/access (fast). B. A long short-term memory (LSTM) cell101 uses neurons as input (i) and output (o) gates to ‘memory cells’ (c) and computes a hidden state for a memory cell that is separate from its output value.

Many diseases of the brain relate to processes, molecules, and interactions that are insufficiently described with the concept of synaptic plasticity. For instance, it was recently found that ketamine drives genetic expression of the KCNQ2 channel, which increases the IM (K+) current.102–104 In hippocampus, this reduces bursting and the lack of reset after spike firing by reducing the afterdepolarization (ADP). It also reduces spontaneous spiking at the excitatory synapse. This latter effect may be responsible for the long-lasting anti-depressant effect of single-dose ketamine.105 Here a one-time increase in the genetic expression of an ion channel produces behavioral changes lasting many weeks. This type of result is easy to incorporate into a model of the horizontal-vertical type, but not into a synaptic weight adjustment model.

It is also important to realize that remembering is a biochemical event. Neural ‘loop’ structures connect cortical regions with deep brain regions. Often, a cocktail of neurochemicals (monoamines, neuropeptides, hormones, etc.) is released upon recall. This underpins the ‘emotional’ component of memory.

Horizontal-vertical integration models can be considered a substantial innovation, especially compared to synaptic associative weight adjustment models. They use neuron models with internal memory to build complex processor networks, combining data and experiments from electrophysiology and molecular biology. The proposed neuron model makes it possible to realize self-programming of neurons and to build complex models of cognition. The brain, of course, has many additional components, such as the capillary and glymphatic networks, glia cells, and the signals that pass the blood-brain barrier. In the future, more comprehensive brain models may encompass such additional components.

An important advance of the proposed horizontal-vertical model is to offer a blueprint for theoretical neuroscience, with applications in both artificial intelligence and disease modeling. There is an unmet need for brain theories and models with AI functionality. Such theories must model the cellular processes exhibited by the neuron; only then can they adequately capture their disruption within a disease process.

Conclusion

The neuron is a highly specialized cell type, strongly compartmentalized with extensive axons and dendrites. Each compartment, dendritic or axonal, connects directly and specifically to other neurons. For this reason, there has always been a strong focus within neuroscience on the synapse as a specialized connective element. The postsynaptic density protein composition and the presynaptic vesicle release mechanism are very intricate cellular structures. This has led theoretical neuroscience to focus on synaptic weights as the main memory mechanism. The experimental literature on neuronal plasticity, however, shows that synapses are only one component of plasticity. Intrinsic forms of plasticity via ion channels and NM receptors have been documented in detail.42,61,97,106–111 Especially in the field of disease modeling, the internal dimension of the neuron, with plasticity in proteins and the nucleus, has been shown to be of particular significance.

The development of a theoretical framework, linking all those aspects of neural plasticity, has been missing so far. We have begun to fill the void by re-thinking the concept of neural plasticity from the perspective of the individual neuron. We believe that even synaptic plasticity is incompletely understood when placed outside of the context of the individual neuron and its cellular functioning. Purely horizontal models with weight adaptation, like current neural network models, can only respond to current input. They have no internal component to set and re-set responsivity and to generate or eliminate adaptation of membrane models. That is very restrictive. Huge networks with billions of parameters are needed to solve simple problems by storing millions of patterns. These restrictions are systematic and mathematically explainable, since the expressive power of networks with adjustable weights is weak.

A programmable memory at each neuronal site allows for more complex and concise operations. For instance, patterns from the horizontal plane can get stored into a small set of dedicated neurons for a particular problem. Such models may then be used to build larger knowledge structures.

We believe that the “third wave of AI” will have to employ some kind of horizontal-vertical brain model. Linking intracellular intelligence with large-scale neuron modeling may make it possible to achieve a truly intelligent, self-programmable system. An immediate challenge will be to create models which run in a self-contained way and build up internal structures from the initial set-up and the processing of input patterns.
