🔬 Nano size Theano LSTM module
Updated Nov 16, 2016 - Python
TensorFlow implementation of the method from Variational Dropout Sparsifies Deep Neural Networks, Molchanov et al. (2017)
Implementation of the "Variational Dropout and the Local Reparameterization Trick" paper in PyTorch
Fraternal Dropout (Research)
Neural Machine Translation with Fraternal Dropout
Implementation of key neural network concepts in NumPy
PyTorch implementation of Adaptive Dropout, a.k.a. Standout.
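The entry above names Standout (adaptive dropout), where each unit's dropout probability is computed from the unit's own pre-activation rather than being a fixed constant. A minimal NumPy sketch of that idea (not the linked repo's code; the function name, ReLU choice, and `alpha`/`beta` scaling are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def standout_forward(x, W, b, alpha=1.0, beta=0.0):
    """Adaptive dropout (Standout): the keep probability of each hidden
    unit is a sigmoid of the same affine transform that produces its
    activation, so strongly-activated units are kept more often."""
    a = W @ x + b                                   # pre-activation
    h = np.maximum(a, 0.0)                          # ReLU (assumed here)
    p = 1.0 / (1.0 + np.exp(-(alpha * a + beta)))   # per-unit keep prob
    mask = (rng.random(h.shape) < p).astype(h.dtype)
    return h * mask, p
```

Each forward pass samples a different mask, so the keep probabilities adapt per example instead of using one global rate.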
Code for the paper "Curriculum Dropout", ICCV 2017
MNIST classification using a convolutional neural network. Techniques such as data augmentation, dropout, and batch normalization are implemented.
Investigating the Behaviour of Deep Neural Networks for Classification
Implementation of DropBlock in PyTorch
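DropBlock, referenced above, drops contiguous square regions of a feature map rather than independent units, which works better for convolutional layers where nearby activations are correlated. A rough NumPy sketch of the idea (not the linked repo's code; the block-center rate `gamma` below is a simplified form of the paper's expression):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropblock(feat, block_size=3, drop_prob=0.1):
    """Zero out block_size x block_size squares of a 2-D feature map,
    then rescale the survivors, inverted-dropout style."""
    h, w = feat.shape
    # Simplified rate for block centers so the expected dropped
    # fraction is roughly drop_prob (the paper adds an edge correction).
    gamma = drop_prob / (block_size ** 2)
    centers = rng.random((h, w)) < gamma
    mask = np.ones((h, w))
    for i, j in zip(*np.nonzero(centers)):
        i0, i1 = max(i - block_size // 2, 0), min(i + block_size // 2 + 1, h)
        j0, j1 = max(j - block_size // 2, 0), min(j + block_size // 2 + 1, w)
        mask[i0:i1, j0:j1] = 0.0        # drop the whole block
    keep = mask.mean()
    return feat * mask / max(keep, 1e-8)
```

With `drop_prob=0` the function is the identity, matching inference-time behavior.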
Feed-forward neural network that classifies Facebook post likes as low, moderate, or high; backpropagation is implemented with a decaying learning rate.
Attempt to reproduce the toy experiment of http://bit.ly/2C9Z8St with an ensemble of nets and with dropout.
Short description for quick search
Repository with code for improving on dropout using the Stochastic Delta Rule
Training and deploying a neural network model (using Keras) to recognize handwritten digits
Two-hidden-layer neural network with 99% accuracy. A dropout regularization scheme is also implemented and available as an option. Please read the report for a full implementation description.
Neural network implementation in NumPy and Keras, with batch normalization, dropout, L2 regularization, and optimizers
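Several of the repos above implement plain dropout from scratch in NumPy. The core technique they share is inverted dropout: zero each unit with probability `p` at train time and scale the survivors by `1/(1-p)` so the expected activation is unchanged, making inference a no-op. A minimal sketch (names are illustrative, not any particular repo's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, train=True):
    """Inverted dropout: train-time masking with rescaling;
    identity at test time."""
    if not train or p == 0.0:
        return x
    # mask entries are 0 (dropped) or 1/(1-p) (kept and rescaled)
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask
```

Because the rescaling happens during training, no weight scaling is needed at deployment, which is why most frameworks implement this variant.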