#!pip install ANNarchy

ANN-to-SNN conversion - MLP
This notebook demonstrates how to transform a fully-connected neural network trained with tensorflow/keras into a spiking neural network (SNN) usable in ANNarchy.
The methods are adapted from the original models used in:
Diehl et al. (2015) “Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing” Proceedings of IJCNN. doi: 10.1109/IJCNN.2015.7280696
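The core idea of Diehl et al.'s data-based weight normalization is to rescale each layer's weights by the maximum activation observed on training data, so that ReLU activations map onto firing rates within the spiking neurons' dynamic range. A minimal sketch of that idea (not ANNarchy's actual implementation; the function name is ours):

```python
import numpy as np

def normalize_weights(weights, activations):
    """Data-based weight normalization (sketch, after Diehl et al., 2015).

    weights[l]     : weight matrix of layer l
    activations[l] : activations of layer l recorded on training data
    Each layer is scaled by lambda_{l-1} / lambda_l, where lambda_l is the
    maximum activation of layer l (inputs assumed normalized to [0, 1]).
    """
    normalized = []
    prev_lambda = 1.0
    for W, A in zip(weights, activations):
        lam = max(float(np.max(A)), 1e-9)   # largest activation of this layer
        normalized.append(W * prev_lambda / lam)
        prev_lambda = lam
    return normalized
```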
import numpy as np
import matplotlib.pyplot as plt
import tensorflow as tf
print(f"Tensorflow {tf.__version__}")
Tensorflow 2.17.0
First we need to download and process the MNIST dataset provided by tensorflow.
# Download data
(X_train, t_train), (X_test, t_test) = tf.keras.datasets.mnist.load_data()
# Normalize inputs
X_train = X_train.reshape(X_train.shape[0], 784).astype('float32') / 255.
X_test = X_test.reshape(X_test.shape[0], 784).astype('float32') / 255.
# One-hot output vectors
T_train = tf.keras.utils.to_categorical(t_train, 10)
T_test = tf.keras.utils.to_categorical(t_test, 10)

Training an ANN in tensorflow/keras
The tensorflow.keras network is built using the functional API.
The fully-connected network has two hidden layers of 128 ReLU neurons each (without biases), dropout at 0.5 after each hidden layer, and a softmax output layer with 10 neurons. We use the standard SGD optimizer and the categorical crossentropy loss for classification.
def create_mlp():
    # Model
    inputs = tf.keras.layers.Input(shape=(784,))
    x = tf.keras.layers.Dense(128, use_bias=False, activation='relu')(inputs)
    x = tf.keras.layers.Dropout(0.5)(x)
    x = tf.keras.layers.Dense(128, use_bias=False, activation='relu')(x)
    x = tf.keras.layers.Dropout(0.5)(x)
    x = tf.keras.layers.Dense(10, use_bias=False, activation='softmax')(x)
    model = tf.keras.Model(inputs, x)
    # Optimizer
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.05)
    # Loss function
    model.compile(
        loss='categorical_crossentropy', # loss function
        optimizer=optimizer,             # learning rule
        metrics=['accuracy']             # show accuracy
    )
    print(model.summary())
    return model

We can now train the network and save it in the Keras format.
# Create model
model = create_mlp()
# Train model
history = model.fit(
    X_train, T_train,     # training data
    batch_size=128,       # batch size
    epochs=20,            # maximum number of epochs
    validation_split=0.1, # fraction of the training data used for validation
)
model.save("runs/mlp.keras")
# Test model
predictions_keras = model.predict(X_test, verbose=0)
test_loss, test_accuracy = model.evaluate(X_test, T_test, verbose=0)
print(f"Test accuracy: {test_accuracy}")
Model: "functional"
Layer (type)               Output Shape    Param #
input_layer (InputLayer)   (None, 784)     0
dense (Dense)              (None, 128)     100,352
dropout (Dropout)          (None, 128)     0
dense_1 (Dense)            (None, 128)     16,384
dropout_1 (Dropout)        (None, 128)     0
dense_2 (Dense)            (None, 10)      1,280
Total params: 118,016 (461.00 KB)
Trainable params: 118,016 (461.00 KB)
Non-trainable params: 0 (0.00 B)
None
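Since the network has no biases, the parameter count of the summary can be checked by hand: each Dense layer contributes inputs × outputs weights.

```python
# 784 -> 128 -> 128 -> 10, no biases anywhere
params = 784 * 128 + 128 * 128 + 128 * 10
print(params)  # 118016, matching the summary
```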
Epoch  1/20 - accuracy: 0.6384 - loss: 1.0975 - val_accuracy: 0.9132 - val_loss: 0.3398
Epoch  2/20 - accuracy: 0.8283 - loss: 0.5697 - val_accuracy: 0.9305 - val_loss: 0.2446
Epoch  3/20 - accuracy: 0.8619 - loss: 0.4616 - val_accuracy: 0.9410 - val_loss: 0.2030
Epoch  4/20 - accuracy: 0.8810 - loss: 0.4061 - val_accuracy: 0.9488 - val_loss: 0.1795
Epoch  5/20 - accuracy: 0.8926 - loss: 0.3682 - val_accuracy: 0.9530 - val_loss: 0.1647
Epoch  6/20 - accuracy: 0.9008 - loss: 0.3422 - val_accuracy: 0.9572 - val_loss: 0.1514
Epoch  7/20 - accuracy: 0.9079 - loss: 0.3192 - val_accuracy: 0.9600 - val_loss: 0.1412
Epoch  8/20 - accuracy: 0.9128 - loss: 0.3026 - val_accuracy: 0.9622 - val_loss: 0.1350
Epoch  9/20 - accuracy: 0.9179 - loss: 0.2862 - val_accuracy: 0.9653 - val_loss: 0.1246
Epoch 10/20 - accuracy: 0.9224 - loss: 0.2710 - val_accuracy: 0.9645 - val_loss: 0.1228
Epoch 11/20 - accuracy: 0.9234 - loss: 0.2644 - val_accuracy: 0.9660 - val_loss: 0.1187
Epoch 12/20 - accuracy: 0.9264 - loss: 0.2542 - val_accuracy: 0.9680 - val_loss: 0.1125
Epoch 13/20 - accuracy: 0.9299 - loss: 0.2455 - val_accuracy: 0.9692 - val_loss: 0.1085
Epoch 14/20 - accuracy: 0.9322 - loss: 0.2389 - val_accuracy: 0.9697 - val_loss: 0.1075
Epoch 15/20 - accuracy: 0.9341 - loss: 0.2299 - val_accuracy: 0.9708 - val_loss: 0.1010
Epoch 16/20 - accuracy: 0.9362 - loss: 0.2216 - val_accuracy: 0.9723 - val_loss: 0.0991
Epoch 17/20 - accuracy: 0.9373 - loss: 0.2173 - val_accuracy: 0.9735 - val_loss: 0.0957
Epoch 18/20 - accuracy: 0.9393 - loss: 0.2095 - val_accuracy: 0.9732 - val_loss: 0.0948
Epoch 19/20 - accuracy: 0.9392 - loss: 0.2100 - val_accuracy: 0.9742 - val_loss: 0.0929
Epoch 20/20 - accuracy: 0.9404 - loss: 0.2053 - val_accuracy: 0.9730 - val_loss: 0.0918
Test accuracy: 0.9656999707221985
plt.figure(figsize=(12, 6))
plt.subplot(121)
plt.plot(history.history['loss'], '-r', label="Training")
plt.plot(history.history['val_loss'], '-b', label="Validation")
plt.xlabel('Epoch #')
plt.ylabel('Loss')
plt.legend()
plt.subplot(122)
plt.plot(history.history['accuracy'], '-r', label="Training")
plt.plot(history.history['val_accuracy'], '-b', label="Validation")
plt.xlabel('Epoch #')
plt.ylabel('Accuracy')
plt.legend()
plt.show()
Initialize the ANN-to-SNN converter
We first create an instance of the ANN-to-SNN conversion object. The constructor takes an input_encoding argument, which selects the type of input encoding.
Three encodings are available: intrinsically bursting ('IB'), phase shift oscillation ('PSO'), and Poisson ('poisson').
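Poisson encoding is conceptually the simplest of the three: each input neuron fires stochastically at a rate proportional to its pixel intensity. The following numpy sketch illustrates the idea; the function name, maximum rate, and time step are illustrative assumptions, not ANNarchy's implementation:

```python
import numpy as np

def poisson_encode(image, duration=200, dt=1.0, max_rate=100.0, rng=None):
    """Encode normalized pixel intensities in [0, 1] as Poisson spike trains.

    At every time step, each pixel fires with probability rate * dt, where
    the rate is proportional to the pixel intensity (up to max_rate Hz).
    Returns a boolean array of shape (timesteps, n_pixels).
    """
    rng = np.random.default_rng() if rng is None else rng
    pixels = image.reshape(-1)                     # flatten, e.g. (784,)
    steps = int(duration / dt)
    # firing probability per step: rate [Hz] * dt [ms] / 1000
    p = pixels * max_rate * dt / 1000.0
    return rng.random((steps, pixels.size)) < p

# A fully bright pixel fires roughly max_rate * duration / 1000 = 20 times
# over a 200 ms presentation; a black pixel never fires.
img = np.zeros((28, 28))
img[14, 14] = 1.0
spikes = poisson_encode(img, duration=200, max_rate=100.0)
print(spikes.shape)  # (200, 784)
```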
from ANNarchy.extensions.ann_to_snn_conversion import ANNtoSNNConverter
snn_converter = ANNtoSNNConverter(
    input_encoding='IB',
    hidden_neuron='IaF',
    read_out='spike_count',
)
ANNarchy 5.0 (5.0.2) on linux (posix).
We then pass the TensorFlow model, stored as a .keras file, to the conversion tool. The print-out of the imported network's structure can be suppressed by passing show_info=False to load_keras_model.
net = snn_converter.load_keras_model("runs/mlp.keras", show_info=True)
WARNING: Dense representation is an experimental feature for spiking models, we greatly appreciate bug reports.
* Input layer: input_layer, (784,)
* InputLayer skipped.
* Dense layer: dense, 128
weights: (128, 784)
mean -0.0041567073203623295, std 0.05278029292821884
min -0.35302844643592834, max 0.22772397100925446
* Dropout skipped.
* Dense layer: dense_1, 128
weights: (128, 128)
mean 0.003172823693603277, std 0.10160214453935623
min -0.2697935104370117, max 0.37318307161331177
* Dropout skipped.
* Dense layer: dense_2, 10
weights: (10, 128)
mean -0.0007412515697069466, std 0.21533343195915222
min -0.4427783191204071, max 0.490240216255188
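The weight statistics printed above are what the Diehl et al. (2015) balancing procedure operates on. As a rough illustration of the model-based normalization idea from that paper, each layer can be rescaled so that no neuron receives a summed positive input above 1, assuming inputs in [0, 1]. This is a hypothetical helper, not the converter's actual code:

```python
import numpy as np

def model_based_normalization(layers):
    """Sketch of model-based weight normalization (Diehl et al., 2015).

    For each weight matrix of shape (n_out, n_in), the largest possible
    positive input to any neuron (the sum of its positive weights, assuming
    inputs in [0, 1]) is used as a per-layer scale factor.
    """
    normalized = []
    for W in layers:
        max_pos_input = np.maximum(W, 0.0).sum(axis=1).max()
        normalized.append(W / max_pos_input)
    return normalized

# stand-ins for the two weight matrices reported above
rng = np.random.default_rng(42)
layers = [rng.normal(0.0, 0.05, (128, 784)), rng.normal(0.0, 0.2, (10, 128))]
scaled = model_based_normalization(layers)
# after scaling, no neuron's summed positive input exceeds 1
print(max(np.maximum(W, 0.0).sum(axis=1).max() for W in scaled))
```

Data-based normalization, also proposed in the paper, replaces the worst-case bound by the maximum activation actually observed on training data.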
Once the network has been built successfully, we can evaluate it on all MNIST test samples. The argument duration_per_sample specifies how long each image is simulated; here, 200 ms seems to be sufficient.
predictions_snn = snn_converter.predict(X_test, duration_per_sample=200)
100%|██████████████████████████████████████████████████████████████████████████| 10000/10000 [00:13<00:00, 742.44it/s]
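With read_out='spike_count', the prediction for each sample is the output neuron that emitted the most spikes during the presentation window. A self-contained sketch of that readout rule (hypothetical helper, not the ANNarchy API):

```python
import numpy as np

def spike_count_readout(output_spikes):
    """Predict the class whose output neuron fired most often.

    output_spikes: boolean array (timesteps, n_classes) for one sample.
    """
    counts = output_spikes.sum(axis=0)   # spikes per output neuron
    return int(np.argmax(counts))

# toy example over a 200-step presentation of one sample
spikes = np.zeros((200, 10), dtype=bool)
spikes[::50, 1] = True    # neuron 1 fires 4 times
spikes[::10, 3] = True    # neuron 3 fires 20 times
print(spike_count_readout(spikes))  # → 3
```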
Using the recorded predictions, we can now compute the accuracy over all presented samples with scikit-learn.
from sklearn.metrics import classification_report, accuracy_score
print(classification_report(t_test, predictions_snn))
print("Test accuracy of the SNN:", accuracy_score(t_test, predictions_snn))
              precision    recall  f1-score   support
0 0.97 0.99 0.98 980
1 0.98 0.98 0.98 1135
2 0.96 0.96 0.96 1032
3 0.94 0.96 0.95 1010
4 0.97 0.95 0.96 982
5 0.95 0.95 0.95 892
6 0.96 0.97 0.96 958
7 0.96 0.96 0.96 1028
8 0.95 0.95 0.95 974
9 0.95 0.95 0.95 1009
accuracy 0.96 10000
macro avg 0.96 0.96 0.96 10000
weighted avg 0.96 0.96 0.96 10000
Test accuracy of the SNN: 0.9606
For comparison, here is the performance of the original ANN in Keras:
print(classification_report(t_test, predictions_keras.argmax(axis=1)))
print("Test accuracy of the ANN:", accuracy_score(t_test, predictions_keras.argmax(axis=1)))
              precision    recall  f1-score   support
0 0.97 0.99 0.98 980
1 0.98 0.98 0.98 1135
2 0.97 0.96 0.96 1032
3 0.96 0.97 0.96 1010
4 0.96 0.97 0.97 982
5 0.96 0.96 0.96 892
6 0.96 0.97 0.96 958
7 0.96 0.97 0.97 1028
8 0.97 0.95 0.96 974
9 0.97 0.94 0.96 1009
accuracy 0.97 10000
macro avg 0.97 0.97 0.97 10000
weighted avg 0.97 0.97 0.97 10000
Test accuracy of the ANN: 0.9657

