SNN-aware-dropout

Motivation

Applying standard dropout techniques directly to SNNs is challenging. Unlike traditional DNNs, SNNs mimic the signaling of biological neurons and carry information in the temporal patterns of neuron activity: a neuron generates a spike only when its membrane potential exceeds a threshold. Consequently, the output of an SNN activation layer, which imparts non-linearity to the previous layer's output before passing it to the next layer, is zero most of the time. Standard dropout is therefore largely ineffective in SNNs: dropping neurons that generate no spikes has no effect, while the sparse spikes of active neurons are thinned out further, harming learning. Different approaches are needed for SNNs, which has motivated research that adjusts removal probabilities based on membrane potential or synaptic weights.
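As a rough illustration of this point (a toy example, not code from this repository), applying ordinary dropout to a sparse binary spike train mostly zeroes entries that are already zero, and any spike it does hit is lost:

import tensorflow as tf

# A toy spike train: 1 where a neuron fired, 0 elsewhere (spikes are sparse).
spikes = tf.constant([0., 0., 1., 0., 0., 0., 1., 0., 0., 0.])

# Standard dropout zeroes entries at random and rescales the survivors by
# 1/(1 - rate). Most targets are already zero, so nothing changes there,
# but each of the few informative spikes has a 50% chance of being erased.
dropped = tf.nn.dropout(spikes, rate=0.5)
print(dropped.numpy())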

However, the positive impact of the dropout layer cannot be ignored. Inspired by its role, I therefore propose several SNN-aware noise addition layers.

Example Usage

import layers_new.dropout_custom

# Insert the custom dropout layer into the model like a standard Keras layer.
x = layers_new.dropout_custom.Dropout_custom10(0.5, name='dropout_custom')(x)

layers_new and dropout_custom are the directory and .py file names, respectively.
Dropout_custom10 is the name of the customized dropout class you want to use.
0.5 is the dropout rate.
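The Dropout_custom10 implementation itself is not reproduced in this README. Purely as a sketch of what an SNN-aware noise addition layer could look like (the class name SpikeAwareNoise, the Gaussian noise form, and the meaning of rate below are assumptions for illustration, not this repository's actual design), such a layer might perturb only the positions where spikes occurred, rather than zeroing random units:

import tensorflow as tf

class SpikeAwareNoise(tf.keras.layers.Layer):
    """Hypothetical sketch: inject noise only where the input spiked."""

    def __init__(self, rate=0.5, **kwargs):
        super().__init__(**kwargs)
        self.rate = rate  # assumed here to control the noise strength

    def call(self, inputs, training=None):
        if not training:
            # Leave inference untouched, like standard dropout.
            return inputs
        # Gaussian noise, masked so that silent (zero-output) neurons are
        # unaffected and only active spikes are perturbed.
        noise = tf.random.normal(tf.shape(inputs), stddev=self.rate)
        spike_mask = tf.cast(inputs > 0.0, inputs.dtype)
        return inputs + noise * spike_mask

Such a layer would be inserted into a model exactly like the Dropout_custom10 call above.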

About

Develop SNN-aware Noise Addition Layers
