NN Type  Features  Library used  Application  Link  Added on 
Feedforward NN  Sigmoid activation function, gradient descent method  theano theano.tensor theano.tensor.nnet  XOR problem  Beginner Tutorial: Neural Nets in Theano  20161010 

Feedforward NN  L1 and L2 regularization, stochastic gradient descent optimization, sigmoid (or tanh) activation function, 1 hidden layer  theano theano.tensor  MNIST digit classification  Multilayer Perceptron  20161010 

Recurrent NN  Backpropagation through time (gradient descent), graph output for error  theano theano.tensor cPickle  Computing the dot product of two vectors  Implementing a recurrent neural network in python  20161010 
Feedforward NN  Backpropagation algorithm using Adam, an efficient variant of gradient descent; 1 hidden layer with input units = 28*28; training and testing on a subset of images.  TensorFlow  image recognition, to identify digits from a given 28 x 28 image  An Introduction to Implementing Neural Networks using TensorFlow  20161010 
Deep Belief Network  Uses a series of hidden layers; each hidden layer is an unsupervised Restricted Boltzmann Machine.
The output of each RBM in the hidden layer sequence is used as input to the next. The final hidden layer then connects to an output layer.  sklearn nolearn.dbn  to classify images from the MNIST dataset  Getting Started with Deep Learning and Python  20161010 
Feedforward NN  Backpropagation algorithm with a very short python implementation. Sigmoid activation function.  numpy  Predicting column Y based on input columns X, similar to the XOR operation  A Neural Network in 11 lines of Python: A bare bones neural network implementation to describe the inner workings of backpropagation.  20161010 
Convolutional Neural Network  2-step training for an image classification problem: Feature Extraction: extracting new features.
Model Training: utilizing a clean dataset composed of the image features and the corresponding labels. Transfer learning is also used for training convolutional neural networks  Caffe  Cat/dog image classifier. The dataset is from Kaggle and is comprised of 25,000 images of dogs and cats.  A Practical Introduction to Deep Learning with Caffe and Python  20161010 
Feedforward NN  Stochastic gradient descent learning algorithm. Gradients are calculated using backpropagation.  numpy  image classification to recognize handwritten digits  Using neural nets to recognize handwritten digits  20161010 
Convolutional Neural Network  Supports several layer types (fully connected, convolutional, max pooling, softmax),
and activation functions (sigmoid, tanh, and rectified linear units, with more easily added).
Can run on CPU/GPU  theano theano.tensor cPickle gzip  image classification  Deep learning  20161010 
Feedforward NN  Gradient descent, backpropagation, 3 layers (1 input, 1 output, 1 hidden layer)  sklearn  Data classification  IMPLEMENTING A NEURAL NETWORK FROM SCRATCH IN PYTHON - AN INTRODUCTION  20161010 
Recurrent neural network toolbox for Python and Matlab  Levenberg-Marquardt algorithm (a second-order Quasi-Newton optimization method) for training, which is much faster than first-order methods like gradient descent.
Real-Time Recurrent Learning (RTRL) algorithm and Backpropagation Through Time (BPTT) algorithm are implemented  pyrenn  pyrenn allows creating a wide range of (recurrent) neural network configurations; examples also include a feed forward neural net  A Recurrent Neural Network Toolbox for Python and Matlab - Pyrenn  20161010 
LSTM Recurrent Neural Network  Long Short-Term Memory Network (LSTM), deep network architecture using stacked LSTM networks  Keras, sklearn  Time series prediction  Time Series Prediction with LSTM Recurrent Neural Networks in Python with Keras  20161010 
LSTM Recurrent Neural Network  Long Short-Term Memory Network (LSTM), naive LSTM network  Keras  Sequence prediction problem of learning the alphabet: given a letter of the alphabet, predict the next letter.  Understanding Stateful LSTM Recurrent Neural Networks in Python with Keras  20161010 
LSTM Recurrent Neural Network  Long Short-Term Memory Network (LSTM), one or two hidden LSTM layers, dropout, the output layer is a Dense layer using the softmax activation function, ADAM optimization algorithm is used for speed  Keras  Text generation: generating new sequences of characters.  Text Generation With LSTM Recurrent Neural Networks in Python with Keras  20161010 
LSTM Recurrent Neural Network  Long Short-Term Memory Network (LSTM), various layers are used: Embedding layer for representing each word, Dropout layer, one-dimensional CNN and max pooling layers, LSTM layer, Dense output layer with a single neuron and a sigmoid activation.
Log loss is used as the loss function (binary_crossentropy in Keras).
ADAM optimization  Keras  Sequence classification.  Sequence Classification with LSTM Recurrent Neural Networks in Python with Keras  20161010 
Feedforward NN  Two hidden layers, softmax activation function, model is trained using Stochastic Gradient Descent (SGD)  Keras, sklearn.preprocessing, sklearn.cross_validation  Image classification  A simple neural network with Python and Keras  20161010 
Convolutional Neural Network  Transfer learning, gradient descent  Theano theano.tensor Keras  Image recognition, to identify digits from a given 28 x 28 image.  Fine-tuning a Keras model using a Theano trained Neural Network & Introduction to Transfer Learning  20161010 
Convolutional Neural Network  Convolutional Neural Networks (CNNs) pretrained on the ImageNet dataset.  Keras imagenet_utils  image classification  Image classification with Python and Keras  20161010 
Convolutional Neural Network  Dropout layer; implementation of summaries to keep track of and visualize various quantities during training and evaluation  Tensorflow  Sentence Classification  IMPLEMENTING A CNN FOR TEXT CLASSIFICATION IN TENSORFLOW  20161010 
Convolutional Neural Network  CNN with a small number of network parameters; Word2Vec is trained on the training dataset  Keras Theano  Sentence Classification  Convolutional Neural Networks for Sentence Classification  20161010 
Convolutional Neural Network  Dropout regularization  mxnet  sentence sentiment classification  Text Classification Using a Convolutional Neural Network on MXNet  20161010 
Convolutional Neural Network and RNN  Recurrent Neural Networks (LSTM, GRU, Attentional RNN), Convolutional Neural Networks  Keras  Question Answering  Deep Language Modeling for Question Answering using Keras  20161010 
Convolutional Neural Network  Extracting feature vectors from different layers of CNN  Caffe  Extract Feature Vectors using CNN  How To Extract Feature Vectors From Deep Neural Networks In Python Caffe  20161010 
Convolutional neural network  Building a simple ConvNet architecture with some convolutional and pooling layers; training the ConvNet as a feature extractor and then using it to extract features before feeding them into different models; prediction and confusion matrix; filter visualization  Lasagne nolearn  Digit image classification  Deep learning  20161010 
LSTM Recurrent Neural Network  Adagrad method for optimization, weighted training, denoising.  Theano  Learning and predicting sine waves  Predict Time Sequence with LSTM  20161010 
MxNET  Extracts the features from the images that are used to train a supervised classifier
Trained on the ImageNet dataset  MxNet Library  Image classification  Network of pretrained neurons applied to the classification of images (French)  20170303 
Feedforward NN  Feed Forward Pass, Backward Propagation  Theano  XNOR function  Practical Guide to implementing Neural Networks in Python (using Theano)  20170303 
Feedforward NN   SciKit Learn 0.18  Predict type of tumor based on the Breast Cancer Data Set, which has several features of tumors with a labeled class indicating whether the tumor is malignant or benign  A Beginner's Guide to Neural Networks with Python and SciKit Learn 0.18!  20170303 
Feedforward NN  Gradient descent, backpropagation  numpy  Predict test score based on how many hours we sleep and how many hours we study the night before.  ARTIFICIAL NEURAL NETWORK (ANN) - INTRODUCTION  20170303 
ADAPTIVE LINEAR NEURON (Adaline)  LINEAR (IDENTITY) ACTIVATION FUNCTION WITH STOCHASTIC GRADIENT DESCENT (SGD)  numpy    SINGLE LAYER NEURAL NETWORK: ADAPTIVE LINEAR NEURON USING LINEAR (IDENTITY) ACTIVATION FUNCTION WITH STOCHASTIC GRADIENT DESCENT (SGD)  20170303 
Feedforward NN  Different types of neural networks are considered  numpy, Keras, Theano  Image Classification of MNIST images  ARTIFICIAL NEURAL NETWORK (ANN) 9  DEEP LEARNING II : IMAGE RECOGNITION (IMAGE CLASSIFICATION)  20170303 
neural network library for python  Interface to use training algorithms from scipy.optimize
Flexible network configurations and learning algorithms. You may change: train, error, initialization and activation functions
Unlimited number of neural layers and number of neurons in layers
Variety of supported types of Artificial Neural Network and learning algorithms  neurolab  Different types of neural networks can be created  neurolab 0.3.5  20170303 
Generative Adversarial Networks (GAN)  GAN has two competing neural network models: Generator takes noise as input and generates samples. Discriminator receives samples from both the generator and the training data, and has to be able to distinguish between the two sources.  tensorflow  Learn to create data that is similar to data that we give them  An introduction to Generative Adversarial Networks (with code in TensorFlow)  20170505 
Feedforward NN  Rectifier (relu) activation function on the first two layers and the sigmoid function in the output layer
gradient descent algorithm "adam"  Keras  Binary classification problem (onset of diabetes as 1 or not as 0)  Develop Your First Neural Network in Python With Keras Step-By-Step  20170505 
Spiking Neural Networks (SNN)  The site has a tutorial with math explanation  pylab scipy.sparse  Spiking neural network simulation  PYTHON TUTORIAL: HOW TO WRITE A SPIKING NEURAL NETWORK SIMULATION FROM SCRATCH  20170505 
Spiking Neural Networks (SNN)  Includes modified learning and prediction rules which could be realised in hardware and are energy efficient
Spike-Time-Dependent Plasticity (STDP) algorithm is used to train the network.  numpy  Classification  Pure python implementation of SNN  20170505 
Spiking Neural Networks (SNN)  open source  brian2  Simulator for spiking neural networks  The Brian spiking neural network simulator  20170505 
Spiking Neural Networks (SNN)  Different spiking neuron models: reflex neuron model, habituation, positive feedback neuron response and many others   Different Spiking Neuron Models  Spiking Neural Networks (SNN)  20170505 
Self-Organising Maps (SOM)  The input data is randomly initialized 3D colours;
normalization is included  numpy pyplot patches  Dimension reduction, converting 3D colours into a 2D map  Self-Organising Maps: In Depth  20170505 
Convolutional Neural Network  tflearn: Deep learning library featuring a higher-level API for TensorFlow.  tflearn  Object recognition in images using deep learning  Machine Learning is Fun! Part 3: Deep Learning and Convolutional Neural Networks  20170505 
TensorFlow Neural Network  3 layer deep neural network  tensorflow  Image classification of MNIST images (set of 28x28 pixel grayscale images which represent handwritten digits)  Python TensorFlow Tutorial  Build a Neural Network  20170505 
Feedforward NN  scaling, one hidden layer  sklearn  Image classification of MNIST images (set of 28x28 pixel grayscale images which represent handwritten digits)  Neural Networks Tutorial  A Pathway to Deep Learning  20170505 
Keras Neural Network  Rectifier activation in the hidden layer, ADAM gradient descent optimization algorithm with a logarithmic loss function  pandas keras sklearn  Multiclass classification problems for the Iris dataset  Multi-Class Classification Tutorial with the Keras Deep Learning Library  20170505 
Echo Recurrent Neural Network  Visualization; input is a random binary vector, the output is the "echo" of the input, shifted echo_step steps to the right  tensorflow  EchoRNN that remembers the input data and then echoes it after a few timesteps  How to Build a Recurrent Neural Network in TensorFlow  20170505 
LSTM Recurrent Neural Network  Multilayered LSTM deep architecture  tensorflow  TensorFlow MultilayerLSTM  Using the Multilayered LSTM API in TensorFlow  20170505 
Convolutional Neural Network  ReLU node activations, softmax classification layer to output the 10 digit probabilities  tensorflow  Digit Image classification of MNIST  Convolutional Neural Networks Tutorial in TensorFlow  20171028 
LSTM Recurrent Neural Network  Network structure: 1 input layer (consisting of a sequence of size 50) which feeds into an LSTM layer with 50 neurons, that in turn feeds into another LSTM layer with 100 neurons which then feeds into a fully connected normal layer of 1 neuron with a linear activation function which will be used to give the prediction of the next time step  Keras  LSTM NEURAL NETWORK FOR TIME SERIES PREDICTION  LSTM NEURAL NETWORK FOR TIME SERIES PREDICTION  20171028 
Convolutional Neural Network  Showing also how to download trained model from the community in the Caffe Model Zoo and use it  caffe  image classification  Deep learning tutorial on Caffe technology : basic commands, Python and C++ code.  20171028 
Convolutional Neural Network  Convolutional Neural Network with caffe  caffe  image classification  Deep Learning With Caffe In Python  Part I: Defining A Layer  20171028 
CNN, RNN  2 stacked LSTM  keras  financial time series forecasting, stock data forecasting  Neural networks for algorithmic trading. Part One: Simple time series forecasting  20171028 
Convolutional Neural Network  Two-stream CNN  TensorFlow (framework of choice), Gensim (NLP library)  Predicting social matches in LinkedIn data  TWO STREAM CONVOLUTIONAL NETWORK FOR PREDICTING SOCIAL MATCHES IN LINKEDIN DATA  20171028 
Convolutional Neural Network  Convolutional layer is added to the Walk-Forward Analysis.  CNTK  Time Series  Convolutional Neural Network for Time Series  20171028 
Convolutional Neural Network  2D convolutional layer, 2D max pooling layer  keras  To classify the MNIST handwritten digit dataset  Keras tutorial  build a convolutional neural network in 11 lines  20171028 
Convolutional Neural Network  TensorFlow Library. The post includes example of TensorFlow NN and CNN  TensorFlow  To classify the MNIST handwritten digit dataset  First steps with TensorFlow using Python  20171028 
LSTM Recurrent Neural Network  multilabel classification  Tensorflow  Time series classification  Multilabel time series classification with LSTM  20171028 
LSTM Recurrent Neural Network   Keras  Time Series Prediction  Time Series Analysis using Recurrent Neural Networks LSTM  20171028 
CNN   TensorFlow  Human Activity Recognition  Implementing a CNN for Human Activity Recognition in Tensorflow  20171028 
RNN   N/A  Simple toy example  Anyone Can Learn To Code an LSTM-RNN in Python (Part 1: RNN)  20171028 
CNN  2D also included  Keras  Time Series prediction  Example of using Keras to implement a 1D convolutional neural network (CNN) for timeseries prediction.  20171028 
RCNN   Lasagne  This project provides the solution of team daheimao for the Kaggle Grasp-and-Lift EEG Detection Competition.  Kaggle Grasp-and-Lift EEG Detection Competition  20171028 
Generative Adversarial Networks (GAN)  The dataset of images is used (black and white)  Keras keras_adversarial  Generate digits by training a GAN on the Identify the Digits dataset  Introductory guide to Generative Adversarial Networks (GANs) and their promise!  20171028 
MultiLayer Perceptron  2 hidden layers of neurons  tensorflow  using the MNIST database of handwritten digits  aymericdamien/TensorFlow-Examples  20171028 
MultiLayer Perceptron  variation of different hyperparameters is explored  Keras  Time Series Forecasting (Sales Data)  Exploratory Configuration of a Multilayer Perceptron Network for Time Series Forecasting  20171028 
MultiLayer Perceptron  In this post a multilayer perceptron (MLP) class based on the TensorFlow library is discussed.  TensorFlow  Stock Market Prediction  Stock Market Prediction Using MultiLayer Perceptrons With TensorFlow  20171028 
MultiLayer Perceptron  Tensorflow vs Theano Benchmark  Theano Tensorflow   MultiLayer Perceptron Networks in Theano and TensorFlow: An Implementation and Benchmark  20171028 
CNN  There is also a great tutorial  keras  Image classification  Architecture of Convolutional Neural Networks (CNNs) demystified  20171028 
RNN  The dataset consists of thousands of five-sentence stories (dataset is from ROCStories)  keras  The task is to predict the final sentence in each story  A simple recurrent neural network language model with Keras  20171028 
Keras Neural Network  multilayer perceptron NN  keras  Classification: Predicting Wine Types (Red or White)  Keras Tutorial: Deep Learning in Python  20171028 
TensorFlow  sparse softmax cross entropy between logits and labels  TensorFlow  Traffic sign visualization  TensorFlow Tutorial For Beginners  20171028 
TensorFlow  FeedForward Neural Network (FFNN)  TensorFlow  Binary classification problem to classify colors into either red or blue based on the three RGB color channels  TensorFlow: Building FeedForward Neural Networks StepbyStep  20171028 
CNN  The project is using FER2013 Faces Database, a set of 28,709 pictures of people displaying 7 emotional expressions (angry, disgusted, fearful, happy, sad, surprised and neutral). 67% accuracy is reported  TensorFlow  Recognition of emotions from images  Emotion recognition using DNN with tensorflow, mood recognition using convolutional neural network  20180105 
LSTM  The project uses the One Billion Word Benchmark for Measuring Progress in Statistical Language Modeling as data input to the NN. For misspelled words artificial noise was used  keras  Spelling correction  Deep Spelling: Rethinking spelling correction in the 21st century  20180105 
RNN    Build a Language Model using a Recurrent Neural Network. For a sentence of m words, a language model allows us to predict the probability of observing the sentence  Recurrent Neural Networks Tutorial, Part 2 - Implementing a RNN with Python, Numpy and Theano  20180105 
LSTM  Dual encoder LSTM   retrieval-based neural network model that can assign scores to potential responses given a conversation context.  Deep Learning for Chatbots, Implementing a Retrieval-Based Model in Tensorflow  20180105 
CNN    Text Classification  Implementing a CNN for Text Classification in TensorFlow  20180105 
CNN  3-layer convolutional neural network with 2 dense layers.
Layers: Embedding, Convolution1D, Flatten, Dropout, Dense
Accuracy 87%  keras  Sentiment analysis  How to implement Sentiment Analysis using word embedding and Convolutional Neural Networks on Keras.  20180105 
CNN  Each word is represented by an embedded vector  keras  Short Text Categorization using Deep Neural Networks and WordEmbedding Models  Short Text Categorization using Deep Neural Networks and WordEmbedding?Models  20180105 
CNN  Neural Network is combined with reinforcement learning for game development  TensorFlow  Game Development  Using Machine Learning Agents in a real game: a beginner guide  20180105 
CNN  LeNet model  keras  handwritten digit classification for MNIST dataset  Visualizing the Convolutional Filters of the LeNet Model  20180105 
CNN  ImageNet model  keras  image classification  How convolutional neural networks see the world  20180105 
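Many of the "Feedforward NN" rows above share the same core recipe: one hidden layer, sigmoid activations, and full-batch gradient descent via backpropagation (e.g. the Theano XOR tutorial and "A Neural Network in 11 lines of Python"). A minimal numpy sketch of that shared pattern follows; the layer sizes, random seed, and iteration count are illustrative choices, not taken from any single tutorial in the table.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

np.random.seed(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # XOR inputs
y = np.array([[0], [1], [1], [0]])              # XOR targets

W0 = 2 * np.random.random((2, 4)) - 1  # input -> hidden weights
W1 = 2 * np.random.random((4, 1)) - 1  # hidden -> output weights

# mean absolute error before any training, for comparison
err0 = np.mean(np.abs(y - sigmoid(sigmoid(X @ W0) @ W1)))

for _ in range(10000):
    h = sigmoid(X @ W0)      # forward pass: hidden activations
    out = sigmoid(h @ W1)    # forward pass: network output
    # backpropagation: deltas are errors scaled by the sigmoid derivative
    out_delta = (y - out) * out * (1 - out)
    h_delta = (out_delta @ W1.T) * h * (1 - h)
    W1 += h.T @ out_delta    # full-batch gradient descent step
    W0 += X.T @ h_delta

err = np.mean(np.abs(y - out))  # mean absolute error after training
print(err0, err)
```

Because each delta multiplies the error by the sigmoid derivative, the weight updates implement plain gradient descent on the squared error; the listed tutorials layer regularization, momentum, or Adam on top of this same skeleton.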