• Welcome to "A rigorous approach to AI/Machine Learning, with applications"

  • Description:

    The course consists of three interconnected parts. The first part reviews the fundamental mathematical tools needed to understand why approximation with neural networks works. The second part takes this further and investigates in more detail the best convergence rates achievable for given data. The third part focuses on some concrete neural networks and their implementation, and makes the connection with other approximation methods used in Topological Data Analysis. Here are some more details:

    1. Horia Cornean, November 3 and 4, 2022: The basic mathematical concepts behind the approximation of continuous functions with neural networks. The main reference is the paper Multilayer Feedforward Networks With a Nonpolynomial Activation Function Can Approximate Any Function, Neural Networks 6(6), 861-867 (1993). Very roughly speaking, the main result states that any "nice" function can be arbitrarily well approximated by a neural network with at least one hidden layer, provided that the hidden layer consists of sufficiently many neurons and the activation function is not a polynomial. To ease the understanding of the arguments, I will review, explain, and motivate a few fundamental ingredients: the convergence of discrete Fourier series, the Stone-Weierstrass approximation theorem, and the Baire category theorem.
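    The flavour of this universal approximation result can be illustrated numerically. The sketch below is not part of the course material; it fixes random hidden weights and biases for a single tanh hidden layer and fits only the output weights by linear least squares (all numbers are invented for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Target: a continuous function on [0, 1]
    f = lambda x: np.sin(2 * np.pi * x)

    # One hidden layer with a non-polynomial activation (tanh).
    # Hidden weights and biases are drawn at random; only the output
    # layer is fitted, by linear least squares.
    n_hidden = 100
    w = rng.uniform(-10, 10, n_hidden)
    b = rng.uniform(-10, 10, n_hidden)

    x = np.linspace(0, 1, 200)
    H = np.tanh(np.outer(x, w) + b)           # hidden-layer outputs, shape (200, 100)
    c, *_ = np.linalg.lstsq(H, f(x), rcond=None)

    error = np.max(np.abs(H @ c - f(x)))      # worst-case error on the grid
    print(error)
    ```

    Replacing tanh by a polynomial activation, say squaring, would make the hidden layer span only polynomials of bounded degree, so the fit could no longer be made arbitrarily good; this is consistent with the non-polynomial hypothesis in the theorem.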

    2. Morten Nielsen, November 7 and 8, 2022: General approximation with deep neural networks. The main reference is the paper Approximation Spaces of Deep Neural Networks, Constructive Approximation (2021), https://doi.org/10.1007/s00365-021-09543-4. We will review some classical results in (nonlinear) approximation theory, with a particular focus on spline approximation. Based on this, we will measure a network's complexity by its number of connections or by its number of neurons, and consider the class of functions for which the error of best approximation by networks of a given complexity decays at a certain rate as the complexity budget increases. It will be shown that some functions of very low smoothness can nevertheless be well approximated by neural networks, provided these networks are sufficiently deep.
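    The two complexity measures mentioned above can differ substantially for networks with the same neuron budget. A small sketch (the layer widths are hypothetical, not taken from the lectures; biases are ignored for simplicity):

    ```python
    # widths[0] is the input dimension; the remaining entries are layer widths
    def complexity(widths):
        # connections = weights between consecutive layers (biases ignored)
        connections = sum(a * b for a, b in zip(widths, widths[1:]))
        # neurons = all non-input units, hidden and output
        neurons = sum(widths[1:])
        return connections, neurons

    # A deep and a shallow network with the same neuron budget:
    print(complexity([1, 10, 10, 10, 1]))   # deep: three hidden layers of width 10 → (220, 31)
    print(complexity([1, 30, 1]))           # shallow: one hidden layer of width 30 → (60, 31)
    ```

    Both networks have 31 neurons, but the deep one has almost four times as many connections, which is why the two complexity measures can lead to different approximation classes.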

    3. Christophe Biscio, November 14, 15, 16, 21 and 22, 2022: Neural networks and clustering algorithms in Python. The main references are Deep Learning with PyTorch by Eli Stevens, Luca Antiga, and Thomas Viehmann, and Persistence Theory: From Quiver Representations to Data Analysis by Steve Y. Oudot. These lectures target newcomers to machine learning and AI. In the first part, we will learn how to use Python and its library PyTorch to import data in a suitable format. We will then show, on various examples, how to implement several neural networks, including the ones discussed in the previous lectures. In the second part, we will review some clustering algorithms and show how mathematical topology can be used to improve existing clustering methods, either by determining the number of clusters or by improving robustness against noise. To that end, we will present the ToMaTo algorithm and illustrate its use on several datasets with the Python library GUDHI.
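    As a taste of this part, here is a minimal NumPy-only sketch of the mode-seeking idea behind ToMaTo; the real algorithm, available in GUDHI, merges clusters by density prominence, and the data and parameters below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Two well-separated Gaussian blobs (toy data)
    X = np.vstack([rng.normal(0.0, 0.3, size=(100, 2)),
                   rng.normal(5.0, 0.3, size=(100, 2))])
    n = len(X)

    # k-nearest-neighbour graph and a simple density estimate
    k = 10
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    knn = np.argsort(D, axis=1)[:, 1:k + 1]        # k neighbours, self excluded
    density = 1.0 / D[np.arange(n)[:, None], knn].mean(axis=1)

    # Prominence threshold: np.inf merges all adjacent basins;
    # a finite tau would keep only peaks of prominence above tau.
    tau = np.inf
    order = np.argsort(-density)                   # process by decreasing density
    parent = np.full(n, -1)

    def find(i):                                   # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in order:
        higher = [j for j in knn[i] if density[j] > density[i]]
        if not higher:
            parent[i] = i                          # i is a new density peak
            continue
        # attach i to the cluster of its highest-density neighbour
        r = find(higher[np.argmax(density[higher])])
        parent[i] = r
        # merge neighbouring clusters whose peak is not prominent enough
        for j in higher:
            rj = find(j)
            if rj != r and density[rj] - density[i] < tau:
                lo, hi = (rj, r) if density[r] >= density[rj] else (r, rj)
                parent[lo] = hi
                r = hi

    labels = np.array([find(i) for i in range(n)])
    print(len(set(labels)))                        # number of clusters found
    ```

    Lowering tau from infinity to a finite value keeps only the peaks whose density prominence exceeds tau, which is how ToMaTo selects the number of clusters from the persistence of the density modes.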

    Prerequisites: Basic knowledge of mathematics and statistics from standard courses such as Calculus and Linear Algebra. The student is also expected to be familiar with (but not an expert in) concepts such as continuity, differentiability, convergence of series, metric spaces, and abstract vector spaces.

    Evaluation: The students are expected to participate on at least 7 days of the course and to actively engage in a number of exercises.

    Organizer: Professor Horia Cornean, e-mail: cornean@math.aau.dk

    Lecturers: Professor Horia Cornean, Professor Morten Nielsen, Associate Professor Christophe Biscio

    ECTS: 5

    Time: 3, 4, 7, 8, 14, 15, 16, 21, 22 November 2022, each day 8:15-16:15

    Place: Department of Mathematical Sciences, Skjernvej 4A, 9220 Aalborg, Room ??. The course is planned in hybrid form; the lectures will also be streamed via Zoom.

    Number of seats: 40

    Deadline: 13 October 2022