Fundamentals of Supervised learning with a focus on Deep learning

Activity: Participating in or organising an event (Participation in workshop, seminar, course)

Description

In this lecture, we present the theory of machine learning according to the framework developed by Vapnik. In particular, we introduce the notion of consistency, which guarantees that a prediction function can be learned. This study leads to the second induction principle, structural risk minimization, which states that learning is a trade-off between achieving a low empirical error and controlling the capacity of the function class. This first stage will serve as a basis for our description of some classical machine learning algorithms, including neural networks, nowadays referred to as deep learning. For this purpose, we will give a historical perspective of the field, introducing its main challenges, concepts and evolutions. We will present what distinguishes these models from other machine learning or statistical techniques, describing some of the recent advances and trying to highlight some future challenges.
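
As an illustration of the trade-off behind structural risk minimization (a sketch, not part of the original abstract), the classical VC-type generalization bound for binary classification with 0-1 loss can be stated as follows, where $R(f)$ is the expected risk, $R_{\mathrm{emp}}(f)$ the empirical risk over $n$ samples, $h$ the VC dimension of the function class, and $\eta$ the confidence parameter: with probability at least $1 - \eta$,

\[
R(f) \;\le\; R_{\mathrm{emp}}(f) \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\eta}{4}}{n}} .
\]

Structural risk minimization then amounts to minimizing this upper bound over a nested sequence of function classes of increasing capacity, balancing the empirical error term against the capacity term.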
Period: 29 Oct 2018 – 31 Oct 2018
Event type: Education
Location: Louvain-la-Neuve, Belgium