ANNarchy 4.8.2
ANNtoSNNConverter

extensions.ann_to_snn_conversion.ANNtoSNNConverter.ANNtoSNNConverter(
    self,
    input_encoding='CPN',
    hidden_neuron='IaF',
    read_out='spike_count',
    **kwargs,
)

Converts a pre-trained Keras model (saved as a .keras file) into an ANNarchy spiking neural network.

The implementation of the present module is inspired by the SNNToolbox (Rueckauer et al. 2017), and is largely based on the work of Diehl et al. (2015). We provide several input encodings, as suggested in the work of Park et al. (2019) and Auge et al. (2021).

Constraints on the ANN

It is not possible to convert any Keras model into an ANNarchy SNN: some constraints have to be respected.

  • The only allowed layers in the ANN are:
    • Dense
    • Conv2D
    • MaxPooling2D
    • AveragePooling2D
    • as well as non-neural layers such as Dropout, Activation, BatchNorm, etc.
  • The layers must not contain a bias term, including the convolutional layers:
x = tf.keras.layers.Dense(128, use_bias=False, activation='relu')(x)
x = tf.keras.layers.Conv2D(64, kernel_size=(5,5), activation='relu', padding='same', use_bias=False)(x)
  • The first layer of the network must be an Input:
inputs = tf.keras.Input(shape = (28, 28, 1))
  • Pooling must be performed explicitly by MaxPooling2D/AveragePooling2D; strides are ignored.

Please be aware that the module is very experimental and the conversion may not work for many different reasons. Feel free to submit issues.

Processing Queue

The pre-trained ANN model to be converted should be saved in the Keras format (extension .keras). The saved model is transformed layer by layer into a feed-forward ANNarchy spiking network. The structure of the network remains the same as in the original ANN, while the weights are normalised. Please note that the current implementation focuses primarily on the correctness of the conversion; the computational performance, especially of converted CNNs, will be improved in future releases.
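Assuming a bias-free model saved as model.keras and a 2-dimensional input array X_test (both placeholders), a minimal conversion pipeline using only the constructor and methods documented on this page might look like the following sketch (the exact import path may differ between ANNarchy versions):

converter = ANNtoSNNConverter(input_encoding='CPN', read_out='spike_count')

# Transform the Keras model into a spiking network, layer by layer
net = converter.load_keras_model('model.keras', show_info=True)

# Present each sample for 1000 steps (1 second of biological time)
predictions = converter.predict(X_test, duration_per_sample=1000)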

Note

While the neurons are conceptually spiking neurons, there is one peculiarity: in addition to the spike event (stored automatically by ANNarchy), each event is recorded in an additional mask array. This mask value decays exponentially in the absence of further spike events; the decay is controlled by the mask_tau parameter of the population. The projections (either dense or convolutional) use this mask as pre-synaptic input, not the generated list of spike events.
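The mask dynamics can be sketched in plain NumPy under the assumption that a spike event resets the corresponding mask entry to 1 (the documentation only states that the mask decays exponentially between events, so the reset value is an illustrative assumption):

```python
import numpy as np

def update_mask(mask, spiked, dt=1.0, mask_tau=20.0):
    # Exponential decay towards zero with time constant mask_tau
    mask = mask * np.exp(-dt / mask_tau)
    # Assumption: a spike event resets the mask entry to 1
    mask[spiked] = 1.0
    return mask

mask = np.zeros(3)
mask = update_mask(mask, spiked=np.array([0]))       # neuron 0 spikes once
for _ in range(20):                                  # 20 ms without spikes
    mask = update_mask(mask, spiked=np.array([], dtype=int))
```

After 20 ms of silence with mask_tau = 20 ms, the mask of neuron 0 has decayed to exp(-1) of its peak value, while silent neurons stay at zero.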

Input Encoding

  • Poisson (“CPN”)

This encoding uses a Poisson-like process where the pixel values of the image are used as the spiking probability of each individual neuron.
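The idea can be sketched with one Bernoulli draw per neuron and per time step (function name and seed are illustrative, not part of the converter's API):

```python
import numpy as np

def poisson_encode(image, steps, rng=None):
    """At each step, every neuron spikes with a probability
    equal to its normalised pixel value."""
    rng = rng or np.random.default_rng(42)
    pixels = image.ravel()                       # (input_size,)
    return rng.random((steps, pixels.size)) < pixels

# A white pixel (1.0) spikes every step, a black pixel (0.0) never
spikes = poisson_encode(np.array([0.0, 0.5, 1.0]), steps=1000)
rates = spikes.mean(axis=0)                      # empirical firing probabilities
```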

  • Intrinsically Bursting (“IB”)

This encoding is based on the Izhikevich (2003) model that comprises two ODEs:

\begin{cases} \frac{dv}{dt} = 0.04 \cdot v^2 + 5.0 \cdot v + 140.0 - u + I \\ \\ \frac{du}{dt} = a \cdot (b \cdot v - u) \\ \end{cases}

The parameters a to d are chosen according to Izhikevich (2003). The provided input image is injected as the current I.
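The two ODEs above can be integrated with a simple forward-Euler scheme; the regular-spiking parameter set and the 30 mV spike threshold are taken from Izhikevich (2003), while the function itself is only an illustration of the dynamics, not the converter's internal code:

```python
import numpy as np

def izhikevich(I, T=200, dt=1.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Forward-Euler integration of the Izhikevich model; the pixel
    intensity enters as the constant current I."""
    v, u, spikes = c, b * c, []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
        u += dt * (a * (b * v - u))
        if v >= 30.0:                 # spike threshold from the paper
            spikes.append(step)
            v, u = c, u + d           # reset
    return spikes

driven = izhikevich(I=10.0)           # a constant input current drives spiking
silent = izhikevich(I=0.0)            # without input the neuron stays at rest
```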

  • Phase Shift Oscillation (“PSO”)

Based on the description by Park et al. (2019), the spiking threshold v_\text{th} is modulated by an oscillation function \Pi, while the membrane potential simply follows the input current.

\begin{cases} \Pi(t) = 2^{-(1+ \text{mod}(t,k))}\\ \\ v_\text{th}(t) = \Pi(t) \, v_\text{th}\\ \end{cases}
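The modulation \Pi(t) is straightforward to compute; the period k = 8 below is an illustrative choice, not a documented default:

```python
def phase_mask(t, k=8):
    """Pi(t) from the equation above: repeats with period k and
    halves at every step within the period."""
    return 2.0 ** -(1 + t % k)

# Over one period the threshold factor runs 1/2, 1/4, ..., 1/2**k
factors = [phase_mask(t) for t in range(8)]
```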

  • User-defined input encodings

In addition to the pre-defined models, one can define custom models using the Neuron class of ANNarchy. Please note that a mask variable needs to be defined, as it is fed into the subsequent projections.

Read-out Methods

In a classification task, the neuron with the highest activity corresponds to the class to which the presented input belongs. The highest activity can, however, be determined in different ways. We currently support three methods, selected by the read_out parameter of the constructor:

  • Maximum Spike Count

read_out = 'spike_count' : the number of spikes emitted by each neuron is recorded and the index of the neuron(s) with the maximum number is returned.

  • Time to Number of Spikes

read_out = 'time_to_first_spike' or read_out = 'time_to_k_spikes': the simulation is stopped as soon as a single neuron has emitted its first (respectively its k-th) spike, and the rank(s) of that neuron is returned. For the second mode, an additional argument k must also be provided.

  • Membrane potential

read_out = 'membrane_potential': pre-synaptic events are accumulated in the membrane potential of each output neuron. The index of the neuron(s) with the highest membrane potential is returned.
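The 'spike_count' read-out, together with the multiple flag of predict, can be sketched in plain NumPy; the random tie-breaking shown here is an illustrative assumption, not necessarily the converter's behaviour:

```python
import numpy as np

def read_out(counts, multiple=False, rng=None):
    """Return the index of the neuron with the maximum spike count;
    on a tie, return all winners (multiple=True) or pick one at random."""
    winners = np.flatnonzero(counts == counts.max())
    if multiple:
        return winners.tolist()
    rng = rng or np.random.default_rng(0)
    return int(rng.choice(winners))

counts = np.array([3, 7, 7, 1])      # spike counts of four output neurons
tied = read_out(counts, multiple=True)
single = read_out(counts)
```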

Izhikevich (2003) Simple Model of Spiking Neurons. IEEE Transactions on Neural Networks 14(6). doi:10.1109/TNN.2003.820440

Diehl PU, Neil D, Binas J, et al. (2015) Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing. 2015 International Joint Conference on Neural Networks (IJCNN), 1-8. doi:10.1109/IJCNN.2015.7280696

Rueckauer B, Lungu I, Hu Y, et al. (2017) Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification. Frontiers in Neuroscience 11. doi:10.3389/fnins.2017.00682

Park S, Kim S, Choe H, et al. (2019) Fast and Efficient Information Transmission with Burst Spikes in Deep Spiking Neural Networks.

Auge D, Hille J, Mueller E, et al. (2021) A Survey of Encoding Techniques for Signal Processing in Spiking Neural Networks. Neural Processing Letters 53:4693-4710. doi:10.1007/s11063-021-10562-2

Parameters

Name Type Description Default
input_encoding str a string representing which input encoding should be used: ‘CPN’, ‘IB’ or ‘PSO’. 'CPN'
hidden_neuron str neuron model used in the hidden layers. Either the default integrate-and-fire (‘IaF’) or an ANNarchy Neuron object. 'IaF'
read_out str a string selecting the read-out method: ‘spike_count’, ‘time_to_first_spike’, ‘time_to_k_spikes’ or ‘membrane_potential’. 'spike_count'

Methods

Name Description
get_annarchy_network Returns the ANNarchy.Network instance.
load_keras_model Loads the pre-trained model provided as a .keras file.
predict Performs the prediction for a given input array.

get_annarchy_network

extensions.ann_to_snn_conversion.ANNtoSNNConverter.ANNtoSNNConverter.get_annarchy_network(
)

Returns the ANNarchy.Network instance.

load_keras_model

extensions.ann_to_snn_conversion.ANNtoSNNConverter.ANNtoSNNConverter.load_keras_model(
    filename,
    directory='annarchy',
    scale_factor=None,
    show_info=True,
)

Loads the pre-trained model provided as a .keras file.

In tf.keras, the weights can be saved using:

model.save("model.keras")

Parameters

Name Type Description Default
filename str path to the .keras file. required
directory str sub-directory where the generated code should be stored (default: “annarchy”) 'annarchy'
scale_factor float allows fine-grained control of the weight scale factor. By default (None), the factor increases by one with each layer depth. If a scalar value is provided, the same value is used for every layer; otherwise a list can be provided to assign the scale factors individually. None
show_info bool whether the network structure should be printed to the console (default: True) True

Returns

Name Type Description
Network An ANNarchy.Network instance.

predict

extensions.ann_to_snn_conversion.ANNtoSNNConverter.ANNtoSNNConverter.predict(
    samples,
    duration_per_sample=1000,
    multiple=False,
)

Performs the prediction for a given input array.

Parameters

Name Type Description Default
samples set of inputs to present to the network. The function expects a 2-dimensional array (num_samples, input_size). required
duration_per_sample the number of simulation steps for one input sample (default: 1000, i.e. 1 second of biological time) 1000
multiple if several output neurons reach the criterion, return the full list of winners instead of randomly choosing one. False

Returns

Name Type Description
list[int] A list of predicted class indices for each sample.

Copyright Julien Vitay, Helge Ülo Dinkelbach, Fred Hamker