Statistical Physics of Deep Learning

Activity: Participation in or organisation of an event (participation in a workshop, seminar, or course)

Description

The success enjoyed over the last decade by deep neural networks in machine learning tasks raises fundamental questions about their working principles. Their surprising generalization capability, and the reason why they work optimally in a strongly overparametrised regime, are still largely open problems. The architecture and training protocol for a given task are still chosen mainly on the basis of empirical knowledge rather than theoretical guidelines.

Methods rooted in statistical physics, such as the theory of disordered systems, phase transitions and chaos, have begun to provide conceptual insights into these questions. At the same time, deep learning algorithms have been applied to a wide array of problems faced by physicists, ranging from particle physics and cosmology to many-body physics and biological physics. Neural networks are thus becoming an important topic in the training of physicists, computer scientists and applied mathematicians. The school is designed to give an overview of the statistical-mechanical principles underlying deep learning.

The school is aimed primarily at the growing audience of early-stage researchers (graduate students, advanced master's students and postdocs) interested in fundamental aspects of machine learning, beyond a simple black-box approach. The lectures will provide a critical introduction to deep neural networks rooted in a statistical physics perspective, and will expose participants to applications across a wide range of problems, mainly of a physical nature.

The school is addressed to PhD students and postdocs and is designed around lectures, hands-on tutorials and seminars on specific topics.
Period: 17 June 2022
Event type: Course
Location: Como, Italy
Degree of recognition: International