Deep Learning (http://deeplearning.net/)



1. Deep Learning Research Groups

University of Toronto – Machine Learning Group (Geoff Hinton, Rich Zemel, Ruslan Salakhutdinov, Brendan Frey, Radford Neal)

Université de Montréal – LISA Lab (Yoshua Bengio, Pascal Vincent, Aaron Courville, Roland Memisevic)

New York University – Yann LeCun’s and Rob Fergus’ group

Stanford University – Andrew Ng’s group

UBC – Nando de Freitas’s group

Google Research – Jeff Dean, Samy Bengio, Jason Weston, Marc’Aurelio Ranzato, Dumitru Erhan, Quoc Le, et al.

Microsoft Research – Li Deng et al.

SUPSI – IDSIA (Jürgen Schmidhuber’s group)

UC Berkeley – Bruno Olshausen’s group

University of Washington – Pedro Domingos’ group

IDIAP Research Institute – Ronan Collobert’s group

University of California Merced – Miguel A. Carreira-Perpiñán’s group

University of Helsinki – Aapo Hyvärinen’s Neuroinformatics group

Université de Sherbrooke – Hugo Larochelle’s group

University of Guelph – Graham Taylor’s group

University of Michigan – Honglak Lee’s group

Technical University of Berlin – Klaus-Robert Müller’s group

Baidu – Kai Yu’s group

Aalto University – Juha Karhunen’s group

U. Amsterdam – Max Welling’s group

U. California Irvine – Pierre Baldi’s group

Ghent University – Benjamin Schrauwen’s group

University of Tennessee – Itamar Arel’s group

IBM Research – Brian Kingsbury et al.

University of Bonn – Sven Behnke’s group

Gatsby Unit @ University College London – Maneesh Sahani, Yee-Whye Teh, Peter Dayan

Computational Cognitive Neuroscience Lab @ University of Colorado Boulder


2. Deep Learning Software

Theano – CPU/GPU symbolic expression compiler in Python (from LISA lab at University of Montreal)
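As a quick illustration of what "symbolic expression compiler" means in practice, here is a minimal sketch (assuming a standard Theano install; the variable names are illustrative):

```python
# Build a symbolic graph, compile it, and get gradients symbolically.
import numpy as np
import theano
import theano.tensor as T

x = T.dmatrix('x')                               # symbolic input matrix
w = theano.shared(np.random.randn(3), name='w')  # shared (trainable) state
y = T.nnet.sigmoid(T.dot(x, w))                  # symbolic expression
f = theano.function([x], y)                      # compile graph for CPU/GPU

g = T.grad(y.sum(), w)                           # symbolic differentiation
grad_f = theano.function([x], g)

print(f(np.random.randn(5, 3)))                  # run the compiled function
print(grad_f(np.random.randn(5, 3)))
```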

Pylearn2 – a library built on Theano, designed to make machine learning research easy (from LISA lab at University of Montreal).

Torch – provides a Matlab-like environment for state-of-the-art machine learning algorithms in Lua (from Ronan Collobert, Clement Farabet and Koray Kavukcuoglu)

DeepLearnToolbox – A Matlab toolbox for Deep Learning (from Rasmus Berg Palm)

Cuda-Convnet – a fast C++/CUDA implementation of convolutional (or, more generally, feed-forward) neural networks from Alex Krizhevsky. It can model arbitrary layer connectivity and network depth: any directed acyclic graph of layers will do. Training is done using the back-propagation algorithm, sketched below.
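The back-propagation loop the entry above refers to, illustrated generically in numpy. This is not cuda-convnet's own interface (cuda-convnet is driven by layer-definition files); it is only a minimal sketch of the underlying algorithm on made-up toy data:

```python
# Generic feed-forward + back-propagation sketch (toy data, squared error).
import numpy as np

rng = np.random.RandomState(0)
X = rng.randn(64, 10)                        # toy inputs
t = rng.randn(64, 1)                         # toy regression targets
W1 = rng.randn(10, 32) * 0.1                 # first-layer weights
W2 = rng.randn(32, 1) * 0.1                  # second-layer weights

for step in range(200):
    h = np.tanh(X @ W1)                      # forward pass, hidden layer
    y = h @ W2                               # forward pass, output layer
    err = y - t                              # dLoss/dy for squared error
    gW2 = h.T @ err                          # gradient w.r.t. W2
    gW1 = X.T @ ((err @ W2.T) * (1 - h**2))  # back-propagate through tanh
    W1 -= 1e-3 * gW1                         # gradient-descent updates
    W2 -= 1e-3 * gW2
```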

Deep Belief Networks – Matlab code for learning Deep Belief Networks (from Ruslan Salakhutdinov).

RNNLM – Tomas Mikolov’s recurrent neural network based language modeling toolkit.

RNNLIB – a recurrent neural network library for sequence learning problems (from Alex Graves). Applicable to most types of spatiotemporal data, it has proven particularly effective for speech and handwriting recognition.

matrbm – a simplified version of Ruslan Salakhutdinov’s RBM code, by Andrej Karpathy (Matlab).

deepmat – Matlab-based deep learning algorithms.

Estimating Partition Functions of RBMs – Matlab code for estimating partition functions of Restricted Boltzmann Machines using Annealed Importance Sampling (from Ruslan Salakhutdinov).
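For context, the quantity this code estimates: AIS expresses the intractable partition function of the target RBM, Z_B, as a product of easier ratios along an annealing path from a tractable base-rate RBM with partition function Z_A. A standard formulation (summarized here, not taken from the package itself):

```latex
\[
  \frac{Z_B}{Z_A} \;\approx\; \frac{1}{M}\sum_{m=1}^{M} w^{(m)},
  \qquad
  w^{(m)} \;=\; \prod_{k=1}^{K}
    \frac{p_k^{*}\bigl(\mathbf{v}_k^{(m)}\bigr)}
         {p_{k-1}^{*}\bigl(\mathbf{v}_k^{(m)}\bigr)},
\]
```

where the p_k^* are unnormalized intermediate distributions interpolating between the base and target RBMs, and each v_k^(m) is drawn by a Markov transition that leaves p_k invariant.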

Learning Deep Boltzmann Machines – Matlab code for training and fine-tuning Deep Boltzmann Machines (from Ruslan Salakhutdinov).

The LUSH programming language and development environment, used at NYU for deep convolutional networks.

Eblearn.lsh is a LUSH-based machine learning library for Energy-Based Learning. It includes code for “Predictive Sparse Decomposition” and other sparse auto-encoder methods for unsupervised learning. Koray Kavukcuoglu provides Eblearn code for several deep learning papers on this page.

Nengo – a graphical and scripting-based software package for simulating large-scale neural systems.

Eblearn is a C++ machine learning library with a BSD license for energy-based learning, convolutional networks, vision/recognition applications, etc. EBLearn is primarily maintained by Pierre Sermanet at NYU.

cudamat is a GPU-based matrix library for Python. Example code for training Neural Networks and Restricted Boltzmann Machines is included.
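A minimal sketch of typical cudamat usage (assumes a CUDA-capable GPU and the cudamat package; the matrices here are illustrative):

```python
# Copy matrices to the GPU, multiply there, copy the result back.
import numpy as np
import cudamat as cm

cm.cublas_init()                              # initialize the GPU context
a = cm.CUDAMatrix(np.random.randn(128, 256))  # host -> device copies
b = cm.CUDAMatrix(np.random.randn(256, 64))
c = cm.dot(a, b)                              # matrix multiply on the GPU
print(c.asarray().shape)                      # device -> host copy: (128, 64)
cm.shutdown()                                 # release the GPU context
```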

Gnumpy is a Python module that interfaces in a way almost identical to numpy, but does its computations on your computer’s GPU. It runs on top of cudamat.
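Because Gnumpy mirrors numpy’s interface, code looks almost unchanged; a small sketch (assuming gnumpy and cudamat are installed; the arrays are illustrative):

```python
# numpy-style calls whose computation runs on the GPU via cudamat.
import gnumpy as gnp

x = gnp.randn(128, 256)            # allocated and filled on the GPU
w = gnp.randn(256, 64)
h = gnp.tanh(gnp.dot(x, w))        # GPU matrix multiply and nonlinearity
m = h.sum(axis=0) / h.shape[0]     # reductions behave like numpy's
print(m.as_numpy_array()[:5])      # explicit copy back to a numpy array
```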

The CUV Library (github link) is a C++ framework with python bindings for easy use of Nvidia CUDA functions on matrices. It contains an RBM implementation, as well as annealed importance sampling code and code to calculate the partition function exactly (from AIS lab at University of Bonn).

3-way factored RBM and mcRBM – Python code calling CUDAMat to train models of natural images (from Marc’Aurelio Ranzato).

Matlab code for training conditional RBMs/DBNs and factored conditional RBMs (from Graham Taylor).

CXXNET – an implementation of deep convolutional neural networks in C++.

mPoT – Python code using CUDAMat and Gnumpy to train models of natural images (from Marc’Aurelio Ranzato).

neuralnetworks – a Java-based GPU library for deep learning algorithms.


3. Deep Learning Tutorials


A. Survey Papers on Deep Learning

Yoshua Bengio, Learning Deep Architectures for AI, Foundations and Trends in Machine Learning, 2(1), pp. 1–127, 2009.

Yoshua Bengio, Aaron Courville, Pascal Vincent, Representation Learning: A Review and New Perspectives, arXiv, 2012.

Deep Learning Code Tutorials

The Deep Learning Tutorials are a walk-through with code for several important Deep Architectures (in progress; teaching material for Yoshua Bengio’s IFT6266 course).

Unsupervised Feature and Deep Learning

Stanford’s Unsupervised Feature and Deep Learning tutorials have wiki pages and Matlab code examples for several basic concepts and algorithms used for unsupervised feature learning and deep learning.

An article about the history of Deep Learning: http://www.wired.com/wiredenterprise/2014/01/geoffrey-hinton-deep-learning


Deep Learning Tutorials – examples of how to do Deep Learning with Theano (from LISA lab at University of Montreal)


B. Videos

  • Deep Learning Representations
Yoshua Bengio’s Google Tech Talk on Deep Learning Representations (Google Montreal, 11/13/2012).

  • Deep Learning with Multiplicative Interactions
Geoffrey Hinton’s talk at the Redwood Center for Theoretical Neuroscience (UC Berkeley, March 2010).

  • Recent Developments on Deep Learning
Geoffrey Hinton’s Google Tech Talk, March 2010.

  • Learning Deep Hierarchies of Representations
A general presentation by Yoshua Bengio, September 2009, also at Google.

  • The Next Generation of Neural Networks
Geoffrey Hinton’s December 2007 Google Tech Talk.

  • Deep Belief Networks
Geoffrey Hinton’s 2007 NIPS tutorial [updated 2009] on Deep Belief Networks (3-hour video, ppt, pdf, readings).

  • Training Deep Networks Efficiently
Geoffrey Hinton’s talk at Google on dropout, “Brains, Sex and Machine Learning”.

  • Deep Learning and NLP
Yoshua Bengio and Richard Socher’s tutorial “Deep Learning for NLP (without Magic)” at ACL 2012.

  • Tutorial on Learning Deep Architectures
Yoshua Bengio and Yann LeCun’s presentation at the ICML Workshop on Learning Feature Hierarchies, June 18, 2009.

Energy-based Learning

[LeCun et al. 2006] A Tutorial on Energy-Based Learning, in Bakir et al. (eds), “Predicting Structured Outputs”, MIT Press, 2006: a 60-page tutorial on energy-based learning, with an emphasis on structured-output models. The tutorial includes an annotated bibliography of discriminative learning, with a simple view of CRFs, maximum-margin Markov nets, and graph transformer networks.
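The tutorial’s core setup, paraphrased: an energy function E(X, Y) scores the compatibility of an input X with a candidate output Y; inference selects the minimum-energy output, and a Gibbs distribution recovers probabilities when they are needed:

```latex
\[
  Y^{*} \;=\; \arg\min_{Y \in \mathcal{Y}} E(X, Y),
  \qquad
  P(Y \mid X) \;=\;
  \frac{e^{-\beta E(X, Y)}}{\int_{y \in \mathcal{Y}} e^{-\beta E(X, y)}\,dy}.
\]
```

Learning then shapes E so that correct outputs have lower energy than incorrect ones, which is what the loss functions surveyed in the tutorial are designed to enforce.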

A 2006 tutorial on Energy-Based Learning given at the 2006 CIAR Summer School: Neural Computation & Adaptive Perception. [Energy-Based Learning: slides in DjVu (5.2MB), slides in PDF (18.2MB)] [Deep Learning for Generic Object Recognition: slides in DjVu (3.8MB), slides in PDF (11.6MB)]

ECCV 2010 Tutorial

Feature Learning for Image Classification (by Kai Yu and Andrew Ng): introduces a paradigm of feature learning from unlabeled images, with an emphasis on applications to supervised image classification.

NIPS 2010 Workshop

Deep Learning and Unsupervised Feature Learning: basic concepts about unsupervised feature learning and deep learning methods with links to papers and code.

Summer Schools

Graduate Summer School: Deep Learning, Feature Learning: IPAM summer school about deep learning.

Online Courses

Geoffrey Hinton’s online Neural Networks course on Coursera.

International Conference on Learning Representations (ICLR 2014): https://www.youtube.com/playlist?list=PLhiWXaTdsWB-3O19E0PSR0r9OseIylUM8
