ANNarchy 4.8.2
List of notebooks

This section lists the sample models provided in the examples/ directory of the source code.

The Jupyter notebooks can be downloaded from:

https://github.com/ANNarchy/ANNarchy.github.io/tree/master/notebooks

Rate-coded networks

  • Echo-state networks: the rate-coded version of reservoir computing (Jaeger, 2001).
  • Neural Field: a simple rate-coded model without learning using neural fields.
  • BCM learning rule: basic demonstration of the Intrator & Cooper BCM learning rule.
  • Bar learning: an implementation of the bar learning problem, illustrating synaptic plasticity in rate-coded networks.
  • Miconi: a reward-modulated recurrent network based on Miconi (2017).
  • Structural Plasticity: a dummy example demonstrating structural plasticity.
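As a rough sketch of the rule demonstrated in the BCM notebook, the Intrator & Cooper update dw/dt = η · y (y − θ) · x, with a sliding threshold θ that tracks the running average of y², can be written in plain NumPy. All parameter values below (eta, tau_theta, the random inputs) are illustrative assumptions, not taken from the notebook:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one rectified linear neuron with 10 inputs, BCM plasticity.
n_inputs = 10
w = rng.uniform(0.0, 0.1, n_inputs)   # synaptic weights
theta = 0.0                            # sliding modification threshold
eta = 0.001                            # learning rate (illustrative)
tau_theta = 100.0                      # threshold time constant, in steps (illustrative)

for step in range(1000):
    x = rng.uniform(0.0, 1.0, n_inputs)     # presynaptic rates
    y = max(0.0, float(w @ x))              # postsynaptic rate (rectified)
    # BCM rule: LTP when y > theta, LTD when y < theta
    w += eta * y * (y - theta) * x
    w = np.clip(w, 0.0, None)               # keep weights non-negative
    # The threshold tracks the running average of y**2,
    # which stabilizes the rule (runaway potentiation raises theta)
    theta += (y**2 - theta) / tau_theta

print(round(theta, 3), round(float(w.mean()), 3))
```

The sliding threshold is what distinguishes BCM from plain Hebbian learning: the fixed point sits where the output rate satisfies y = θ, rather than growing without bound.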

Spiking networks

  • AdEx: how the AdEx neuron model (adaptive exponential integrate-and-fire) can reproduce various firing patterns observed in vivo (Naud et al., 2008).
  • PyNN/Brian: a set of single neuron models reproducing various examples from PyNN and Brian.
  • Izhikevich: an implementation of the simple pulse-coupled network described in (Izhikevich, 2003). It shows how to build a simple spiking network without synaptic plasticity.
  • Synaptic transmission: a simple example of synaptic transmission between spiking neurons.
  • Gap Junctions: an example using gap junctions.
  • Hodgkin-Huxley: a single Hodgkin-Huxley neuron.
  • COBA and CUBA: an implementation of the balanced network described in (Vogels and Abbott, 2005). It shows how to build a simple spiking network using integrate-and-fire neurons and sparse connectivity.
  • STP: an example of short-term plasticity based on the model of Tsodyks, Uziel and Markram (2000).
  • STDP I and II: two simple examples using spike-timing dependent plasticity (STDP).
  • Ramp and SORF: two examples of homeostatic STDP based on the model of Carlson, Richert, Dutt and Krichmar (2013).
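The pulse-coupled network notebook builds on the Izhikevich (2003) point-neuron model, dv/dt = 0.04 v² + 5v + 140 − u + I and du/dt = a(bv − u), with the reset v ← c, u ← u + d when v reaches 30 mV. As a minimal illustration of the dynamics, a single regular-spiking neuron can be integrated in pure Python; the constant input current I and the simulation length are illustrative choices, not taken from the notebook:

```python
# Euler integration of a single Izhikevich neuron with the
# regular-spiking parameter set (a, b, c, d) from Izhikevich (2003).
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = -65.0, b * -65.0    # initial membrane potential and recovery variable
I = 10.0                   # constant input current (illustrative choice)
dt = 1.0                   # ms; v is integrated in two half-steps for stability

spikes = []
for t in range(1000):                     # simulate 1000 ms
    for _ in range(2):                    # two 0.5 ms sub-steps for v
        v += 0.5 * dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                         # spike: record and reset
        spikes.append(t)
        v, u = c, u + d

print("spikes:", len(spikes))
```

With these parameters the neuron fires tonically; swapping in the other (a, b, c, d) sets from the paper yields the bursting and chattering patterns that the notebook explores at the network level.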

Advanced features

  • Hybrid networks: a simple hybrid network with both rate-coded and spiking sub-parts.
  • Parallel simulations: shows how to call parallel_run to run several networks in parallel.
  • Bayesian optimization: a demo showing how to use hyperopt to search for hyperparameters of a model.

Extensions

  • Image and Convolution: shows how to use the ImagePopulation class of the image extension to feed images and video streams directly into a rate-coded network. Also demonstrates the convolution extension.
  • Logging with tensorboard: a simple basal ganglia model to show how to use the tensorboard extension.
  • BOLD monitoring I and II: a showcase of the bold extension, which allows recording BOLD signals from a network.
  • ANN2SNN I and II: demonstrates the ANN-to-SNN conversion tool on the MNIST dataset, for both an MLP and a CNN.
Copyright Julien Vitay, Helge Ülo Dinkelbach, Fred Hamker