Rate-coded neurons

Defining parameters and variables

Let’s consider first a simple rate-coded neuron of the leaky-integrator type, which simply integrates the weighted sum of its excitatory inputs:

\tau \frac{d v(t)}{dt} = ( B - v(t) ) + \sum_{i}^{\text{exc}} w_i \, r_{i}(t)

r(t) = ( v(t) )^+

where v(t) represents the membrane potential of the neuron, \tau its time constant, B its baseline firing rate, r(t) its instantaneous firing rate, i an index over all excitatory synapses of this neuron, and w_i the efficiency (weight) of the synapse connecting it to the pre-synaptic neuron whose firing rate is r_{i}(t).

It can be implemented in the ANNarchy framework with:

import ANNarchy as ann

LeakyIntegratorNeuron = ann.Neuron(
    parameters="""   
        tau = 10.0
        baseline = -0.2
    """,
    equations = """
        tau * dv/dt + v = baseline + sum(exc)
        r = pos(v)
    """
)

The only required variable is r, which represents the instantaneous firing rate and will be used to propagate activity in the network. All other parameters and variables are freely decided by the user.
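
Once defined, the neuron type can be assigned to a population and simulated. Below is a minimal sketch (the population size and simulation duration are arbitrary):

pop = ann.Population(geometry=100, neuron=LeakyIntegratorNeuron)

ann.compile()       # generate and compile the C++ code
ann.simulate(100.)  # simulate the network for 100 ms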

Custom functions

Custom functions can also be defined when creating the Neuron type and used inside the equations field:

LeakyIntegratorNeuron = ann.Neuron(
    parameters="""   
        tau = 10.0
        baseline = -0.2
    """,
    equations = """
        tau * dv/dt + v = baseline + sum(exc)
        r = sigmoid(v)
    """,
    functions = """
        sigmoid(x) = 1.0 / (1.0 + exp(-x))
    """
)

Make sure that the name of the function does not overlap with existing mathematical functions (cos, exp), existing parameters or variables (tau, r), or reserved names (pos, t, dt).
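
If the same function is needed by several neuron or synapse types, it can also be declared once at the global level. This is a hedged sketch assuming the global add_function() helper accepts the same definition string:

# Declare the function once, so any Neuron or Synapse can call it in its equations
ann.add_function('sigmoid(x) = 1.0 / (1.0 + exp(-x))')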

Predefined attributes

The ODEs can depend on other parameters or variables of the neuron (e.g. r depends on v), but not on unknown names. ANNarchy already defines the following variables and parameters for a neuron (a usage sketch follows the list):

  • variable t: time in milliseconds elapsed since the creation of the network.
  • parameter dt: the discretization step, the default value is 1 ms.
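
For instance, t can be used directly in the equations, e.g. to switch the baseline input off after some time. This is a hedged sketch relying on the if/else syntax of the equation parser; the neuron name and values are illustrative:

SwitchingNeuron = ann.Neuron(
    parameters="""
        tau = 10.0
    """,
    equations="""
        baseline = if t < 100.0: 0.5 else: 0.0
        tau * dv/dt + v = baseline + sum(exc)
        r = pos(v)
    """
)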

Weighted sum of inputs

The sum(target) term gives direct access to the weighted sum of all inputs to the neuron having the corresponding target (see Projections for how the target is defined). These inputs are organized in a data structure called Dendrite.

It is possible to modify how weighted sums are computed when creating a rate-coded synapse.
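
For instance, this hedged sketch assumes the psp argument described in the rate-coded synapses section and replaces the default w * pre.r term by a logarithmic one:

NonlinearSynapse = ann.Synapse(
    psp = "w * log(1.0 + pre.r)"
)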

Note

The connection type, e.g. exc or inh, needs to match the name used as the target parameter when creating a Projection. If such a projection does not exist when the network is compiled, the weighted sum is set to 0.0 for all neurons.

Using only sum() in the equations sums over all defined targets. For example, if two projections with targets "exc" and "inh" reach a neuron, sum() is equivalent to sum(exc) + sum(inh). Inhibitory weights must then be defined as negative.
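
A hedged sketch of how targets relate to projections (population sizes and weights are illustrative):

pop1 = ann.Population(100, LeakyIntegratorNeuron)
pop2 = ann.Population(100, LeakyIntegratorNeuron)

# The target must match the argument of sum() in the post-synaptic equations
proj = ann.Projection(pre=pop1, post=pop2, target='exc')
proj.connect_all_to_all(weights=ann.Uniform(0.0, 1.0))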

Global operations

Global operations on the population, such as its maximal activity, can be used inside the neuron definition through one of the following operations:

  • min(v) for the minimum: \min_i v_i,
  • max(v) for the maximum: \max_i v_i,
  • mean(v) for the mean: \frac{1}{N} \sum_i v_i,
  • norm1(v) for the L1-norm: \frac{1}{N} \sum_i |v_i|,
  • norm2(v) for the L2-norm: \frac{1}{N} \sum_i v_i^2.

Example where neurons react to their inputs only when they exceed the mean activity over the population:

WTANeuron = ann.Neuron(
    parameters="""   
        tau = 10.0
    """,
    equations = """
        input = sum(exc)
        tau * dr/dt + r = pos(input - mean(input))
    """
)

Note

The global operations are computed using the values at the previous time step (like weighted sums), not in the step currently being evaluated. There is therefore implicitly a delay of dt, which cannot be changed.


Copyright Julien Vitay, Helge Ülo Dinkelbach, Fred Hamker