ANNarchy 4.8.2

TimedArray

inputs.TimedArray.TimedArray(
    self,
    rates=None,
    geometry=None,
    schedule=0.0,
    period=-1.0,
    name=None,
    copied=False,
)

Data structure holding sequential inputs for a rate-coded network.

The input values are stored in the (recordable) attribute r, without any further processing. You will need to connect this population to another one using the connect_one_to_one() method.

By default, the firing rate of this population will iterate over the different values step by step:

import numpy as np
import ANNarchy as ann

inputs = np.array(
    [
        [1, 0, 0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
        [0, 0, 1, 0, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 1, 0, 0, 0, 0, 0, 0],
        [0, 0, 0, 0, 1, 0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0, 1, 0, 0, 0, 0],
        [0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
        [0, 0, 0, 0, 0, 0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0, 0, 0, 0, 1, 0],
        [0, 0, 0, 0, 0, 0, 0, 0, 0, 1]
    ]
)

inp = ann.TimedArray(rates=inputs)

pop = ann.Population(10, ...)

proj = ann.Projection(inp, pop, 'exc')
proj.connect_one_to_one(1.0)

ann.compile()

ann.simulate(10.)

This creates a population of 10 neurons whose activity changes during the first 10*dt milliseconds of the simulation. After that, the last input is kept (i.e. 1 for the last neuron).
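
As the inputs are stored in the recordable attribute r, a Monitor can be used to check that the vectors are presented one timestep after another. A minimal sketch, assuming the monitor on inp is created before the call to simulate() in the example above:

# Record the rates of the TimedArray itself (r is recordable).
m = ann.Monitor(inp, 'r')

ann.simulate(10.)

# One row per timestep: the successive input vectors.
data = m.get('r')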

If you want the TimedArray to “loop” over the different input vectors, you can specify a period for the inputs:

inp = ann.TimedArray(rates=inputs, period=10.)

If the period is smaller than the length of the rates, the last inputs will not be set.
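
For instance, a sketch reusing the inputs array above: with the default schedule (one input per timestep, here dt = 1 ms) and a period of 5 ms, only the first five input vectors would ever be presented before the cycle restarts.

# Only the first five rows of `inputs` are used; the last five are never shown.
inp = ann.TimedArray(rates=inputs, period=5.)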

If you do not want the inputs to be set at every step, but for example every 10 ms, you can use the schedule argument:

inp = ann.TimedArray(rates=inputs, schedule=10.)

The input [1, 0, 0, …] will stay for 10 ms, then [0, 1, 0, …] for the next 10 ms, and so on.

If you need a less regular schedule, you can specify it as a list of times:

inp = ann.TimedArray(rates=inputs, schedule=[10., 20., 50., 60., 100., 110.])

The first input is set at t = 10 ms (r = 0.0 in the first 10 ms), the second at t = 20 ms, the third at t = 50 ms, etc.

If you specify fewer time points than there are rows in the rates array, the remaining inputs will be ignored.

Scheduling can be combined with periodic cycling (see the sketch after the following example). Note that you can use the reset() method to manually reinitialize the TimedArray, so that the times become relative to that call:

ann.simulate(100.) # ten inputs are shown with a schedule of 10 ms
inp.reset()
ann.simulate(100.) # the same ten inputs are presented again.
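
As a sketch of combining both arguments (assuming the inputs array from the first example), a schedule of 10 ms together with a period of 100 ms presents each of the ten inputs for 10 ms and then restarts the whole sequence:

# Each input vector is shown for 10 ms; after 100 ms the sequence starts again.
inp = ann.TimedArray(rates=inputs, schedule=10., period=100.)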

Parameters

rates (np.ndarray, default: None)
    Array of firing rates. The first axis corresponds to time, the others to the desired dimensions of the population.
geometry (int | tuple, default: None)
    Desired dimensions of the population. This argument is only considered if rates is None.
schedule (float, default: 0.0)
    Either a single value or a list of time points where inputs should be set. By default, the inputs are set at every timestep.
period (float, default: -1.0)
    Time when the timed array will be reset and start again, allowing cycling over the inputs. By default, there is no cycling.

Methods

update
    Set a new list of inputs. The first axis corresponds to time, the others to the desired dimensions of the population.

update

inputs.TimedArray.TimedArray.update(rates, schedule=0.0, period=-1)

Set a new list of inputs. The first axis corresponds to time, the others to the desired dimensions of the population. Note that the geometry is set during the construction of the object.

Parameters

rates (required)
    Array of firing rates. The first axis corresponds to time, the others to the desired dimensions of the population.
schedule (default: 0.0)
    Either a single value or a list of time points where inputs should be set. By default, the inputs are set at every timestep.
period (default: -1)
    Time when the timed array will be reset and start again, allowing cycling over the inputs. By default, there is no cycling.
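
A minimal usage sketch for update(), assuming a TimedArray created with geometry=10 only, whose inputs are supplied later (the new rates must match that geometry):

import numpy as np
import ANNarchy as ann

# The geometry is fixed at construction and cannot be changed by update().
inp = ann.TimedArray(geometry=10)

# ... define populations and projections, then ann.compile() ...

# Provide ten input vectors of size 10, shown every 10 ms and cycling every 100 ms.
inp.update(np.eye(10), schedule=10., period=100.)

ann.simulate(100.)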

Copyright Julien Vitay, Helge Ülo Dinkelbach, Fred Hamker