Invitation to the Workshop on Neural Networks and Learning Systems (NNLS), September 24-28, 2018
We would like to invite you to the Workshop on Neural Networks and Learning Systems (NNLS), beginning on the afternoon of Monday, September 24, 2018, and running until Friday, September 28, 2018. Please see the attachment for more details.
Learning is a process by which a system improves its performance from experience. Machine learning is the field of study that gives computers the ability to learn without being explicitly programmed. Machine learning is used in cases where plenty of data is available and there is an intuition that a certain rule exists, but the rule is not explicitly known or cannot be expressed mathematically. Neural networks (NNs) are a class of machine learning models capable of achieving human-like performance. Various NNs are designed for the two types of learning, supervised and unsupervised: the Multilayer Perceptron (MLP) and the Radial Basis Function (RBF) NN operate in supervised mode, whereas Kohonen's Self-Organizing Feature Map (SOFM) and Adaptive Resonance Theory (ART) work in an unsupervised manner. The Support Vector Machine (SVM) and Deep Neural Networks (DNNs) are widely used recent learning models that exhibit encouraging performance.
In this series of lectures, the focus will be on introducing neural networks as machine learning tools. Some of the popular neural networks, working in both supervised and unsupervised frameworks, will be discussed thoroughly. This will be followed by a discussion of today's thrust area: Deep Learning. Applications of these topics to problems in pattern recognition and image processing will also be presented.
Mon 24/09/2018, Training Room 4/1
01:00 P.M.-05:00 P.M. Machine Learning
Defining a learning problem; introduction to machine learning; relation between machine learning, statistics, and artificial intelligence; types of learning (based on information available, role of the learner, type of output); training, testing, and cross-validation; measuring prediction performance; overfitting; general model of a learning system and its components; eager and lazy learners; batch versus online learning; applications
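To make the training, testing, and cross-validation ideas above concrete, here is a small sketch of k-fold cross-validation in plain Python. The toy 1-nearest-neighbor classifier and the data are invented for illustration only; they are not workshop material.

```python
# Toy k-fold cross-validation for measuring prediction performance.
# The classifier and data below are illustrative stand-ins, not workshop code.

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds."""
    fold_size = n // k
    return [list(range(i * fold_size, (i + 1) * fold_size)) for i in range(k)]

def cross_validate(xs, ys, k, fit, predict):
    """Average test accuracy over k train/test splits."""
    accs = []
    for test_idx in k_fold_indices(len(xs), k):
        test = set(test_idx)
        train_x = [x for i, x in enumerate(xs) if i not in test]
        train_y = [y for i, y in enumerate(ys) if i not in test]
        model = fit(train_x, train_y)
        correct = sum(predict(model, xs[i]) == ys[i] for i in test_idx)
        accs.append(correct / len(test_idx))
    return sum(accs) / k

if __name__ == "__main__":
    # Stand-in 1-NN classifier on 1-D data: class is the sign of x.
    xs = [-4, -3, -2, -1, 1, 2, 3, 4]
    ys = [0, 0, 0, 0, 1, 1, 1, 1]
    fit = lambda X, Y: list(zip(X, Y))
    predict = lambda model, x: min(model, key=lambda p: abs(p[0] - x))[1]
    print(cross_validate(xs, ys, 4, fit, predict))
```

In practice the data would be shuffled before splitting; contiguous folds are used here only to keep the sketch short.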
Tue 25/09/2018, Computer Room 1/3
10:00 A.M.-12:00 P.M. Linear Classifiers
Minimum distance classifier, k-nearest neighbor, Perceptron, Support Vector Machine
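As a flavor of this session, here is a minimal sketch of the classic Perceptron learning rule on a linearly separable toy problem. The AND-gate data and the learning-rate/epoch settings are illustrative choices, not workshop code.

```python
# Minimal Perceptron trainer on a linearly separable problem (the AND gate).
# Hyperparameters and data are illustrative, not the workshop's own examples.

def train_perceptron(samples, labels, lr=0.1, epochs=50):
    """Learn weights w and bias b with the classic Perceptron update rule."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            # Predict with a hard-threshold activation.
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y
            # Weights change only on misclassification (err is 0 otherwise).
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

if __name__ == "__main__":
    X = [(0, 0), (0, 1), (1, 0), (1, 1)]
    T = [0, 0, 0, 1]  # logical AND: linearly separable
    w, b = train_perceptron(X, T)
    print([predict(w, b, x) for x in X])  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the Perceptron convergence theorem guarantees this loop finds a separating hyperplane; XOR, by contrast, would never converge, which motivates the multilayer networks covered on Wednesday.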
Wed 26/09/2018, Training Room 4/1
09:00 A.M.-12:00 P.M. Neuro-computing
Brain versus digital computer, biological neural network (NN) and artificial NN, general framework of NNs, activation function, characteristics of NNs, advantages of NNs, popularly used NN models for supervised and unsupervised learning, Perceptron, linear separability, cascading layers, Multilayer Perceptron, parameter updating, examples and applications
01:30 P.M.-03:30 P.M. Deep Learning: Convolutional Neural Network (CNN)
Thu 27/09/2018, Training Room 4/1
10:00 A.M.-12:00 P.M. Kohonen's Self-Organizing Feature Map Neural Networks
Hebbian learning, competitive learning, self-organization, feature map, properties of the feature map, self-organizing feature map, architecture of Kohonen's net, learning algorithm, structure of the neighborhood and neighborhood functions, tuning of parameters, pros and cons of SOFM, applications
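The learning algorithm listed above (winner selection, neighborhood function, parameter tuning) can be sketched compactly for a toy one-dimensional map. All sizes and parameter schedules here are illustrative assumptions, not the workshop's settings.

```python
import math
import random

# Toy 1-D Kohonen SOFM: map 2-D inputs onto a chain of 10 units.
# Learning rate, sigma, and their decay schedules are illustrative choices.

random.seed(0)
UNITS = 10
weights = [[random.random(), random.random()] for _ in range(UNITS)]

def winner(x):
    """Index of the best-matching unit (smallest squared Euclidean distance)."""
    return min(range(UNITS),
               key=lambda j: sum((xi - wi) ** 2
                                 for xi, wi in zip(x, weights[j])))

def train(data, epochs=100, lr0=0.5, sigma0=3.0):
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1 - frac)               # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5   # shrinking neighborhood radius
        for x in data:
            c = winner(x)
            for j in range(UNITS):
                # Gaussian neighborhood: units close to the winner move more.
                h = math.exp(-((j - c) ** 2) / (2 * sigma ** 2))
                weights[j] = [wi + lr * h * (xi - wi)
                              for wi, xi in zip(weights[j], x)]

if __name__ == "__main__":
    data = [[random.random(), random.random()] for _ in range(200)]
    train(data)
    # After training, inputs from opposite corners map to different units.
    print(winner([0.1, 0.1]), winner([0.9, 0.9]))
```

The cooperative update of the winner's neighbors is what produces the topology-preserving feature map discussed in this session.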
01:30 P.M.-03:30 P.M. Kernel Tricks, RBF Networks
Fri 28/09/2018
10:00 A.M.-12:00 P.M. Applications of Neural Networks, at Training Room 4/1
01:30 P.M.-03:30 P.M. Workshop Wrap-up, at Meeting Room 3/1