Equation parser

A Neuron or Synapse instance is primarily defined by two sets of values which must be specified in its constructor:

  • Parameters are values such as time constants which are constant during the simulation. They can be the same throughout the population/projection (global), or take different values for each neuron/synapse (local). In projections, they can also be semiglobal: one value per post-synaptic neuron.

  • Variables are neuronal variables (for example the membrane potential or the firing rate) or synaptic variables (for example the synaptic efficiency) whose values evolve with time during the simulation. The equation ruling their evolution (which may or may not be an ordinary differential equation) is described using a specific equation-oriented language.

Parameters

Parameters are provided to Neuron or Synapse models using dictionaries:

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        baseline = -0.1,
    )
)

A multiline string can also be passed (see below).

As a neuron/synapse type is likely to be reused in different populations/projections, it is good practice to set reasonable initial values in the neuron/synapse type, and to adapt them to the corresponding populations/projections later on if needed.

Local vs. global parameters

Neural parameters can be either local (each neuron has a different value of the parameter) or global (the value is the same for all neurons of a population - but can be different in two populations using the same neuron type).

The default locality of a parameter is global. The Parameter dataclass lets you tell ANNarchy that the parameter has to be local, i.e. that it should take one value per neuron in the population.

neuron = ann.Neuron(
    parameters = dict(
        tau = ann.Parameter(10.0),
        baseline = ann.Parameter(-0.1),
    )
)

ann.Parameter() can also be used to define global parameters explicitly, but it is more verbose:

neuron = ann.Neuron(
    parameters = dict(
        tau = ann.Parameter(10.0, locality='global'), # or just: tau = 10.0
        baseline = ann.Parameter(-0.1, locality='global'), # or just: baseline = -0.1
    )
)

For synapses, there are three different localities (see Locality):

  • 'local': one value per synapse in the projection (e.g. the synaptic efficiency).
  • 'semiglobal': one value per post-synaptic neuron in the projection.
  • 'global': one value per projection.

These strings can be passed to the Parameter dataclass. 'global' is still the default.
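
For illustration, here is a minimal sketch of a synapse mixing the three localities (the parameter names and values are arbitrary):

synapse = ann.Synapse(
    parameters = dict(
        w_init = ann.Parameter(0.0, locality='local'),      # one value per synapse
        theta = ann.Parameter(1.0, locality='semiglobal'),   # one value per post-synaptic neuron
        eta = 0.1,                                           # global: one value for the projection
    )
)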

Type of the parameter

Parameters have floating-point precision by default. If you want to force a parameter to be an integer or a boolean, you can set the type argument of Parameter:

neuron = ann.Neuron(
    parameters = dict(
        number_spikes = ann.Parameter(0, type=int),
        activated = ann.Parameter(False, locality='global', type=bool),
    )
)
"""

Constants

Alternatively, it is possible to use external constants in the parameter definition by referring to their name (see the Constants section below):

tau_exc = ann.Constant('tau_exc', 10.0)

neuron = ann.Neuron(
    parameters = dict(
        tau = 'tau_exc'
    ),
)

The advantage of this method is that if a parameter value is "shared" across several neuron/synapse types, you only need to change the value once, instead of in each neuron/synapse definition.
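
For example, reusing the tau_exc constant defined above, two neuron types can share the same value, so a single change of the Constant definition is enough (a sketch with arbitrary equations):

# Both neuron types read tau from the same constant.
excitatory = ann.Neuron(
    parameters = dict(tau = 'tau_exc'),
    equations = ['tau * dr/dt + r = sum(exc)'],
)
inhibitory = ann.Neuron(
    parameters = dict(tau = 'tau_exc'),
    equations = ['tau * dr/dt + r = sum(exc) - sum(inh)'],
)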

Variables

Time-varying variables are defined using a list of strings (or ann.Variable instances):

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        baseline = -0.1,
    ),
    equations = [
        'noise += Uniform(-0.1, 0.1)',
        'tau * dv/dt  + v = baseline + noise',
        'r = pos(v)',
    ]
)

The evolution of each variable with time can be described through a simple equation or an ordinary differential equation (ODE). ANNarchy provides a simple parser for mathematical expressions, whose role is to translate a high-level description of the equation into an optimized C++ code snippet.

The equation for one variable can depend on parameters, other variables (even those declared later) or constants. Variables are updated in the same order as their declaration in the list (see Numerical methods, as this order influences how ODEs are solved).

As it is only a parser and not a solver, some limitations exist:

  • Simple equations must have only the name of the variable on the left side of the equation. Variable definitions such as r + v = noise are forbidden, as it would be impossible to guess which variable should be updated.
  • ODEs allow more freedom on the left side, but only one variable should appear as a gradient: the one which will be updated. The following definitions are equivalent and will lead to the same C++ code:
tau * dv/dt  = baseline - v

tau * dv/dt  + v = baseline

tau * dv/dt  + v -  baseline = 0

dv/dt  = (baseline - v) / tau

In practice, ODEs are transformed using Sympy into the last form (only the gradient stays on the left) and discretized using the chosen numerical method (see Numerical methods).

Flags

Locality and type

Like parameters, variables defined using the ann.Variable() class accept the 'local', 'global' and 'semiglobal' localities, as well as the int or bool flags for their type.

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        baseline = -0.1,
    ),
    equations = [
        ann.Variable('noise += Uniform(-0.1, 0.1)', locality='global'),
        ann.Variable('tau * dv/dt  + v = baseline + noise'),
        ann.Variable('r = pos(v)'),
    ]
)

Here, there will be only one value of noise for the whole population.

Initial value

The initial value of the variable (before the first simulation step) is 0.0 by default (or 0 / False, depending on the type). It can also be specified using the init attribute:

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        baseline = -0.1,
        init_v = -0.1,
    ),
    equations = [
        ann.Variable('noise += Uniform(-0.1, 0.1)', init=0.1),
        ann.Variable('tau * dv/dt  + v = baseline + noise', init='init_v'),
        ann.Variable('r = pos(v)', init=ann.Uniform(0.0, 1.0)),
    ]
)

Acceptable values are:

  • a float/int/bool matching the type of the variable, which will be the same for all neurons/synapses.
  • a RandomDistribution object (see Random Distributions), allowing to initialize the variable randomly for each neuron/synapse.
  • the name of a parameter of the same neuron model, or of a constant.

When using a random distribution, the values will be sampled after creation of the Population/Projection object, using its size (number of neurons/synapses) and the locality of the variable (a global variable only needs one value).

The initial value can be changed after the Population or Projection objects are created (see Populations).
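
A minimal sketch, assuming the attribute access described in Populations (the population size and values are arbitrary):

net = ann.Network()
pop = net.create(100, neuron)   # 100 neurons using the neuron type defined above

# Overwrite the default initial value of v before the first call to simulate():
pop.v = -0.5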

Min and Max values of a variable

Upper- and lower-bounds can be set using the min and max attributes:

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        baseline = -0.1,
        init_v = -0.1,
    ),
    equations = [
        ann.Variable('noise += Uniform(-0.1, 0.1)', init=0.1),
        ann.Variable('tau * dv/dt  + v = baseline + noise', init='init_v', min=0.0, max=10.0),
        ann.Variable('r = v', init=ann.Uniform(0.0, 1.0)),
    ]
)

At each step of the simulation, after the update rule is calculated for v, the new value will be compared to the min and max value, and clamped if necessary.

Like init, min and max can be either single values, constants or parameter names.
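
For instance, the bounds can be read from parameters of the same neuron (a sketch with arbitrary names):

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        v_min = 0.0,
        v_max = 10.0,
    ),
    equations = [
        # v is clamped between the values of the v_min and v_max parameters
        ann.Variable('tau * dv/dt + v = sum(exc)', min='v_min', max='v_max'),
        ann.Variable('r = v'),
    ]
)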

Numerical method

The numerical method used for a single ODE can be explicitly set by specifying the method flag:

neuron = ann.Neuron(
    parameters = dict(
        tau = 10.0,
        baseline = -0.1,
        init_v = -0.1,
    ),
    equations = [
        ann.Variable('noise += Uniform(-0.1, 0.1)', init=0.1),
        ann.Variable('tau * dv/dt  + v = baseline + noise', init='init_v', min=0.0, max=10.0, method='exponential'),
        ann.Variable('r = v', init=ann.Uniform(0.0, 1.0)),
    ]
)

The available numerical methods are described in Numerical methods.

Constants

Constants can be created by the user and used inside any equation. They must define a unique name and a floating-point value.

Constants can either be declared at the global level:

tau = ann.Constant('tau', 10.0)

neuron = ann.Neuron(
    equations = "tau * dr/dt + r = sum(exc)"
)

or from a Network instance:

net = ann.Network()
tau = net.constant('tau', 10.0)
neuron = ann.Neuron(
    equations = "tau * dr/dt + r = sum(exc)"
)

Neuron or Synapse models do not have to define a parameter tau to use the constant: it is available everywhere. If a Neuron/Synapse redefines a parameter called tau, the constant is no longer visible to that object.

Constants can be manipulated as normal floats to define complex values:

tau = ann.Constant('tau', 20)
factor = ann.Constant('factor', 0.1)
real_tau = ann.Constant('real_tau', tau*factor)

neuron = ann.Neuron(
    equations = [
        'real_tau * dr/dt + r = 1.0'
    ]
)

Note that changing the value of a global constant impacts all networks using it. When the constant was created at the network level, changing its value only impacts that network.

Changing the value of a constant can only be done through the set() method:

tau = ann.Constant('tau', 20)
tau.set(10.0)

or:

tau = net.constant('tau', 10.0)
tau.set(10.0)

Allowed vocabulary

The mathematical parser relies heavily on the one provided by SymPy.

Numerical values

All parameters and variables implicitly use double floating-point precision, except when stated otherwise with the int or bool keywords. You can use numerical constants within the equation; they will be automatically converted to this precision:

tau * dv / dt  = 1 / pos(v) + 1

The constant π is available under the literal form pi.

Operators

  • Additions (+), subtractions (-), multiplications (*), divisions (/) and power functions (^) are of course allowed.
  • Temporal gradients are allowed only for the variable currently described. They take the form:
dv / dt  = A

with a d preceding the variable’s name and terminated by /dt (with or without spaces). Gradients must be on the left side of the equation.

  • To update the value of a variable at each time step, the operators =, +=, -=, *=, and /= are allowed.

Parameters and Variables

Any parameter or variable defined in the same Neuron/Synapse can be used inside an equation. User-defined constants can also be used. Additionally, the following variables are pre-defined:

  • dt : the discretization time step for the simulation. Using this variable, you can define the numerical method by yourself. For example:
tau * dv / dt  + v = baseline

with the explicit (forward) Euler method would be equivalent to:

v += dt/tau * (baseline - v)
  • t : the time in milliseconds elapsed since the creation of the network. This allows, for example, generating oscillating variables:
f = 10.0 # Frequency of 10 Hz
phi = pi/4 # Phase
ts = t / 1000.0 # ts is in seconds
r = 10.0 * (sin(2*pi*f*ts + phi) + 1.0)
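
Wrapped in a neuron definition, this could look like the following sketch (frequency and phase become parameters; the values are arbitrary):

oscillator = ann.Neuron(
    parameters = dict(
        f = 10.0,                   # frequency in Hz
        phi = ann.Parameter(0.0),   # per-neuron phase
    ),
    equations = [
        # t is in milliseconds, so it is divided by 1000 to obtain seconds
        'r = 10.0 * (sin(2*pi*f*t/1000.0 + phi) + 1.0)',
    ]
)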

Random number generators

Several random generators are available and can be used within an equation, for example:

  • Uniform(min, max) generates random numbers from a uniform distribution in the range [min, max].
  • Normal(mu, sigma) generates random numbers from a normal distribution with mean mu and standard deviation sigma.

See the manual and the reference for more details on the available distributions.
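
For example, additive Gaussian noise on a rate-coded neuron can be sketched as follows (the values are arbitrary; the noise is redrawn at every time step):

noisy = ann.Neuron(
    parameters = dict(tau = 10.0),
    equations = [
        'noise = Normal(0.0, 0.1)',              # new sample at each step and for each neuron
        'tau * dr/dt + r = sum(exc) + noise',
    ]
)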

Mathematical functions

Most mathematical functions of the cmath library are understood by the parser, for example:

cos, sin, tan, acos, asin, atan, exp, abs, fabs, sqrt, log, ln

The positive and negative parts of a term are also defined, with short and long versions:

r = pos(mp)
r = positive(mp)
r = neg(mp)
r = negative(mp)

A piecewise linear function is also provided (linear when x is between a and b, saturated at a or b otherwise):

r = clip(x, a, b)
Note

In such simple cases, it might be more readable to use the min/max attributes:

neuron = ann.Neuron(
    equations = [
        ann.Variable('r = pos(v)'),
        # is equivalent to:
        ann.Variable('r = v', min=0.0),
    ]
)

For integer variables, the modulo operator is defined:

x += 1 : int
y = modulo(x, 10)
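
With the list-based interface, the same could be sketched with ann.Variable objects, assuming ann.Variable accepts the same type argument as ann.Parameter:

counter = ann.Neuron(
    equations = [
        ann.Variable('x += 1', type=int),           # integer counter incremented every step
        ann.Variable('y = modulo(x, 10)', type=int), # type=int assumed to work as for ann.Parameter
    ]
)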

When using the power function (r = x^2 or r = pow(x, 2)), the cmath pow(double, int) method is used. For small exponents (quadratic or cubic functions), it can be extremely slow compared to r = x*x or r = x*x*x. Unfortunately, Sympy automatically transforms r = x*x into r = pow(x, 2). We therefore advise using the built-in power(double, int) function instead:

r = power(x, 3)

These functions must be followed by a set of matching brackets:

tau * dmp / dt + mp = exp( - cos(2*pi*f*t + pi/4 ) + 1)

Conditional statements

Python-style

It is possible to use Python-style conditional statements as the right term of an equation or ODE. They follow the form:

if condition : statement1 else : statement2

For example, to define a piecewise linear function, you can nest different conditionals:

r = if mp < 1. :
        if mp > 0.:
            mp
        else:
            0.
    else:
        1.

which is equivalent to:

r = clip(mp, 0.0, 1.0)

The condition can use the following vocabulary:

True, False, and, or, not, is, is not, ==, !=, >, <, >=, <=
Note

The and, or and not logical operators must be used with parentheses around their terms. Example:

var = if (mp > 0) and ( (noise < 0.1) or (not(condition)) ):
            1.0
        else:
            0.0

The keyword is is equivalent to ==, and is not is equivalent to !=.

When a conditional statement is split over multiple lines, the flags must be set after the last line:

rate = if mp < 1.0 :
          if mp < 0.0 :
              0.0
          else:
              mp
       else:
          1.0 : init = 0.6

An if a: b else: c statement must be exactly the right term of an equation. It is for example NOT possible to write:

r = 1.0 + (if mp > 0.0: mp else: 0.0) + b

Ternary operator

The ternary operator ite(cond, then, else) (ite stands for if-then-else) is available to ease the combination of conditionals with other terms:

r = ite(mp > 0.0, mp, 0.0)
# is exactly the same as:
r = if mp > 0.0: mp else: 0.0

The advantage is that the conditional term is not restricted to the right term of the equation, and can be used multiple times:

r = ite(mp > 0.0, ite(mp < 1.0, mp, 1.0), 0.0) + ite(stimulated, 1.0, 0.0)

Custom functions

To simplify the writing of equations, custom functions can be defined either globally (usable by all neurons and synapses) or locally (only for the particular type of neuron/synapse) using the same mathematical parser.

Global functions can be defined using the add_function() method:

ann.add_function('sigmoid(x) = 1.0 / (1.0 + exp(-x))')

With this declaration, sigmoid() can be used in the declaration of any variable, for example:

neuron = ann.Neuron(
    equations = [
        'r = sigmoid(sum(exc))'
    ]
)

Functions must be one-liners, i.e. they are defined on a single line and return a single value. They can use as many arguments as needed, but they are totally unaware of the context: all the needed information should be passed as arguments (except constants, which are visible to the function).

The types of the arguments (including the return value) are by default floating-point. If other types should be used, they should be specified at the end of the definition, after the : sign, with the type of the return value first, followed by the type of all arguments separated by commas:

ann.add_function('conditional_increment(c, v, t) = if v > t : c + 1 else: c : int, int, float, float')
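
Once declared, such a function can be called like any built-in one, for example (a sketch; the counter n should additionally be flagged as an integer, as described in the Flags section):

neuron = ann.Neuron(
    parameters = dict(threshold = 1.0),
    equations = [
        'v = sum(exc)',
        'n = conditional_increment(n, v, threshold)',  # increments n whenever v exceeds the threshold
    ]
)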

Local functions are specific to a Neuron or Synapse class and can only be used within this context (if they have the same name as global functions, they override them). They can be passed as a multi-line string argument to the constructor of a neuron or synapse (see below):

functions = """
    sigmoid(x) = 1.0 / (1.0 + exp(-x))
    conditional_increment(c, v, t) = if v > t : c + 1 else: c : int, int, float, float
"""

Previous string-based interface (<5.0)

Prior to version 5.0, the default interface was using multiline strings to define both parameters and variables. This interface still works, but might become deprecated in a future release.

LeakyIntegratorNeuron = ann.Neuron(
    parameters = """
        tau = 10.0
        baseline = -0.2
    """,
    equations = """
        tau * dv/dt  + v = baseline + sum(exc)
        r = pos(v)
    """
)

When using strings, parameters and variables are by default local, i.e. unique to each neuron in a population or to each synapse in a projection. For parameters, this is the opposite of the new default behavior with dictionaries. To make a parameter global, you need to pass a flag:

  • If population is passed after a colon ‘:’ in a neuron, the parameter will be common to all neurons of the population.
parameters = """
    tau = 10.0                   # equivalent to local
    baseline = 0.0 : population  # equivalent to global
"""
  • If the postsynaptic flag is passed after a colon ‘:’ in a synapse, the parameter will be common to all synapses of a post-synaptic neuron, but can differ from one post-synaptic neuron to another. If the projection flag is passed, the parameter will be common to all synapses of a projection.
parameters = """
    wmin = 0.0                # equivalent to local
    wmax = 10.0 : projection  # equivalent to global
    eta = 0.5 : postsynaptic  # equivalent to semiglobal
"""

Variables are also local by default, and can be modified by the same flags. The init, min, max flags and the numerical method (without method=) can be passed after the colon like in a regular ann.Variable().

equations = """
    tau * dv/dt  + v = baseline + sum(exc) : init=-1.0, max=10.0, exponential
    r = v : min=0.0
"""