In this section the main principles and technical (programming) solutions are discussed.
What time-aware simulation means is discussed in section Time awareness. In neuronal operations, an artificial delay for signalling is introduced. That delay is calculated from the conduction velocity (the speed of the neural impulse) and the distance between the computing objects (mainly the presynaptic and postsynaptic neurons).
The synapses have a rather independent life of their own, so they are discussed in section The synapses.
Initially, the neurers are in state NeurerOperatingBit_t::nsb_Resting. In this state the synaptic inputs are effective, and (tentatively) they are treated as the initiation of an action potential.
As discussed in section Time awareness, the simulators work with some reasonably sized time slot, and the events happening within that time slot happen "at the same time". This is the time resolution. The time is internally stored as microseconds, and the simulated time is expressed in milliseconds, with two digits after the decimal point.
Our central routine AbstractNeuron::MembranePotentialContribution_Add handles changes in the membrane potential. It is called for both current changes and voltage changes; the caller can be either a computing or a signalling routine.
According to Koch [7] (Eq. 14.20 and Fig. 14.8), "The evolution of the network, often termed neurodynamics, is governed by a coupled system of single-cell equations. For any one cell i (out of n such cells) it takes the form"
\[ C\frac{dV_i(t)}{dt} = -\frac{V_i(t)}{R} + I_i(t) + \sum_{j=1}^{n}w_{i,j}g(V_j(t)) \]
"where \(f_j\) is the firing rate of the \(j\)th neuron and \(I_i\), corresponds to the external current injected into the cell. In our usual circuit idiom in which both \(f_i\) and \(V_i\) are voltages, the synaptic coupling between the two neurons, that is, \(w_{i,j}\) has the dimension of a conductance. Negative, hyperpolarizing synapses are implemented by inverting the output of the amplifier. The evolution of the circuit in Figure Fig_Koch_14_8 can be expressed by an equation of the form shown above.
"Circuit model of two interacting mean rate neurons. Such continuous output units constitute the standard working horse of neural networks. Common to all is that the coupling among neurons is characterized by a scalar \(w{i,j}\) that can take on any real value, depending on whether the synapse is inhibitory or excitatory. The interaction among synaptic inputs is strictly linear. Local learning rules of the type discussed in Sec. 13.5 are used for determining the amplitude of the \(w{i,j}\)s. A qualitatively very similar model of linear synaptic interactions has been used in the neural network community for studying the computational power of networks of spiking units."
That is, to calculate the change in the membrane potential, we need to consider contributions from the actual membrane potential, the current injected into the cell, and the synaptic coupling between the neurons. These contributions need to be calculated at different frequencies.
The equation above can be rewritten from a differential to a difference equation and rearranged as
\[ \Delta V_i(t) = -\frac{V_i(t)}{RC} {\Delta t} + \frac{I_i(t)}{C} {\Delta t} + \frac{\Delta t}{C}\sum_{j=1}^{n}w_{i,j}\,g(V_j(t)) \]
This form suggests that \(\Delta V_i(t)\) has three independent contributions: from the voltage \(V_i\), from \(I_i\), and from the \(w_{i,j}\) synaptic couplings. The first term on the right-hand side makes evident that we recalculate the potential at a \(\frac{\Delta t}{RC}\) fraction of the time constant of the \(RC\) circuit; the second term represents the charge contribution due to the current injected into cell \(i\) during the \(\Delta t\) period; and the third term represents the cross-talk between the synapses: some portion of the charge integrated by the other synapses (how much of the transmitter delivers charge to a foreign synapse).
Given that in most cases the synapses provide their input at more-or-less different times, and their contributions also differ, it is worth handling those contributions separately in the numerical solution.
One must distinguish the structural and the functional roles of synapses. The synapses are separated from (and, at the same time, connected to) the neuron's membrane. They closely cooperate with their membrane: they contribute immediately (constructively or destructively) to the membrane's potential; the membrane changes their synaptic weights; furthermore, an increased synaptic weight enables them to initiate increasing the conduction velocity (via myelinating the axon delivering the neural impulse to their input) [8]. Their situation shows some resemblance to the role of registers in computer processors: they temporarily store the information the neuron computes with, and furthermore they change the synaptic weight. Similarly to registers, they enable quick access to their content; in contrast with registers, their number is not limited, and they can receive information directly from the outside world, without needing I/O instructions and processing time. Another crucial difference is that the synapses can initiate a change of the conduction velocity, unlike technological neurons. This ability is decisive: biological neurons are able to perform both short-term and long-term learning, while technological systems need to introduce non-biological learning methods and "learning On/Off" switches.