Course title: A rigorous approach to AI/Machine Learning, with applications in Topological Data Analysis and Computational Algebraic Topology.
Course description: The course consists of three interconnected parts. The first part will review some of the fundamental mathematical tools needed to understand why approximation with neural networks works. The second part will take this further and investigate in more detail the best rates of convergence attainable for given data. The third part will focus on some concrete neural networks and their implementation, and will also make the connection with other approximation methods used in Topological Data Analysis. Here are some more details:
1. Horia Cornean, November 9 and 10, 2023: The basic mathematical concepts behind the approximation of continuous functions with neural networks. The main reference is the paper:
H.N. Mhaskar: Approximation properties of a multilayered feedforward artificial neural network. Advances in Computational Mathematics 1, 61-80 (1993). Link: https://link.springer.com/article/10.1007/BF02070821
Roughly speaking, the main result states that any “nice” function can be approximated arbitrarily well by a neural network with at least one hidden layer, provided that the hidden layer contains sufficiently many neurons and the activation function is not a polynomial. We will analyze in detail the case of two hidden layers with ReLU and sigmoid activation functions and give quantitative bounds on the expected errors. A couple of basic “dimensionality reduction” methods will be explained. To ease the understanding, I will review a few fundamental ingredients, such as the convergence of discrete Fourier series and the Stone-Weierstrass approximation theorem. The participants should be aware that a few mathematical proofs will be presented😊
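To make the constructive idea concrete, here is a minimal Python sketch (an illustration only, not course material): the piecewise linear interpolant of a continuous function at n+1 equispaced knots on [0, 1] can be written exactly as a one-hidden-layer ReLU network, so the classical interpolation error bound becomes a quantitative approximation bound for the network.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_net(f, n, x):
    """One-hidden-layer ReLU network that interpolates f at n+1 equispaced
    knots on [0, 1]: a constructive instance of universal approximation."""
    knots = np.linspace(0.0, 1.0, n + 1)
    vals = f(knots)
    slopes = np.diff(vals) / np.diff(knots)          # slope on each sub-interval
    # Output weights: the first slope, then the change of slope at each
    # interior knot; this reproduces the piecewise linear interpolant exactly.
    weights = np.concatenate(([slopes[0]], np.diff(slopes)))
    # Hidden layer: one ReLU unit per knot except the last one.
    hidden = relu(x[:, None] - knots[None, :-1])
    return vals[0] + hidden @ weights

f = lambda t: np.sin(2 * np.pi * t)
x = np.linspace(0.0, 1.0, 2001)
err = np.max(np.abs(relu_net(f, 50, x) - f(x)))      # O(1/n**2) for smooth f
```

For a twice differentiable f, the uniform error is bounded by max|f''| / (8 n²), so 50 hidden neurons already approximate sin(2πx) to within about 2·10⁻³.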
2. Morten Nielsen, November 13 and 14, 2023: General approximation with deep neural networks. The main reference is the paper:
R. Gribonval, G. Kutyniok, M. Nielsen, F. Voigtlaender: Approximation spaces of deep neural networks. Constructive Approximation (2021). Link: https://doi.org/10.1007/s00365-021-09543-4
We will review some classical results in (nonlinear) approximation theory with a particular focus on spline approximation. Based on this, we will measure a network's complexity by its number of connections or by its number of neurons, and consider the class of functions for which the error of best approximation by networks of a given complexity decays at a certain rate as the complexity budget increases. It will be shown that some functions of very low smoothness can nevertheless be well approximated by neural networks, provided that these networks are sufficiently deep.
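One classical illustration of the power of depth (a Yarotsky-style sketch, offered here as background and not taken from the reference paper) is that composing a two-neuron ReLU "hat" function with itself m times produces a sawtooth with 2^m teeth, and subtracting scaled sawtooths from x approximates x² with error decaying like 4^(-m), i.e. exponentially in the depth:

```python
import numpy as np

def hat(x):
    # Triangle ("hat") function on [0, 1]: realizable by two ReLU neurons.
    return 2 * np.minimum(x, 1 - x).clip(min=0)

def deep_square(x, m):
    """Approximate x**2 on [0, 1] by x minus scaled compositions of the hat
    function; the s-fold composition is a sawtooth with 2**s teeth."""
    g = x.copy()
    out = x.copy()
    for s in range(1, m + 1):
        g = hat(g)           # one more layer of composition
        out -= g / 4 ** s    # each correction shrinks the error by a factor 4
    return out

x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(deep_square(x, 10) - x ** 2))    # error of order 4**(-11)
```

A shallow ReLU network would need exponentially many neurons to match this accuracy for x², which is the flavor of the depth-versus-width trade-offs studied in the lectures.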
3. Yossi Bleile & Matteo Pegoraro, November 15,16 and 17, 2023. Introduction to Computational Algebraic Topology and Topological Data Analysis (TDA).
In this course we will get you started in TDA by presenting some mathematical background on computational algebraic topology and by showing how it can be used to extract valuable information from data. The theoretical part of the course will collect ideas from the following paper:
Chazal, Frédéric, and Bertrand Michel: An introduction to topological data analysis: fundamental and practical aspects for data scientists. Frontiers in Artificial Intelligence 4 (2021): 108. Link: https://www.frontiersin.org/articles/10.3389/frai.2021.667963/full
We will need essentially two theoretical ingredients: simplicial homology and persistence. Informally, homology can be thought of as a way to measure the number of holes in a space. Persistence extends this idea by looking at how such information evolves along a family of spaces, which are usually obtained from data in a few different ways. The combination of the two ideas is often referred to as persistent homology.
After the mathematical formulation of persistent homology has been covered, most of the exercise sessions will focus on the practical possibilities offered by some of the most widely used techniques in TDA. Some knowledge of Python will be assumed.
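As a small taste of the practical part, here is a self-contained Python sketch (an illustration only, independent of the specific TDA libraries used in the exercises) computing the 0-dimensional persistence diagram of a point cloud, i.e. tracking when the connected components of the Vietoris-Rips filtration merge as the scale parameter grows:

```python
import numpy as np

def h0_persistence(points):
    """0-dimensional persistent homology of the Vietoris-Rips filtration,
    via Kruskal-style union-find: processing edges by increasing length,
    each merge of two components kills one of them at that length."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    edges = sorted((d[i, j], i, j) for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]    # path compression
            a = parent[a]
        return a

    deaths = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri
            deaths.append(w)                 # a component dies at scale w
    # Every point is born at scale 0; one component never dies.
    return [(0.0, w) for w in deaths] + [(0.0, np.inf)]

# Two well-separated clusters: one large finite death (~ the gap of 4.9)
# signals the two-cluster structure.
pts = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 0.0], [5.1, 0.0]])
dgm = h0_persistence(pts)
```

Long bars in the resulting diagram correspond to topological features (here: clusters) that persist across many scales, which is exactly the kind of signal the practical sessions will teach you to read off from real data.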
Prerequisites: Basic knowledge of mathematics and statistics from standard courses such as Calculus and Linear Algebra. The student is also expected to be familiar with (but not an expert in) concepts such as continuity, differentiability, convergence of series, metric spaces, and abstract vector spaces.
Evaluation: The students are expected to participate for at least 5 days in the course, and to actively engage in a number of exercises.
Organizer: Professor Horia Cornean, e-mail: cornean@math.aau.dk
Lecturer(s): Professor Horia Cornean, Professor Morten Nielsen, Postdoc Matteo Pegoraro and Postdoc Yossi Bleile.
ECTS: 4
Time: November 9, 10, 13, 14, 15, 16, 17; each day 09:00-15:00. Lectures in the morning, exercises in the afternoon.
Place: Department of Mathematical Sciences, Skjernvej 4A, 9220 Aalborg, Room ??.
Number of seats: 40
Deadline: 19 October 2023
- Teacher: Yossi Bleile
- Teacher: Horia Cornean
- Teacher: Morten Nielsen
- Teacher: Matteo Pegoraro