**Description:**

An intelligent system is expected to generate policies autonomously in order to
achieve a goal, typically to maximize a given reward function or minimize a
given cost function. Reinforcement learning is a family of machine learning
methods that can produce such policies. To learn optimal actions in an
environment it cannot fully model, an intelligent system can use reinforcement
learning algorithms to leverage its experience and discover optimal policies.
Nowadays, reinforcement learning techniques are successfully applied in various
engineering fields, including robotics (DeepMind's walking robots) and
game-playing programs (AlphaGo and TD-Gammon). Developed independently of
reinforcement learning, dynamic programming is a set of algorithms from optimal
control theory that generate policies under the assumption that the environment
is fully known to the intelligent system. Dynamic programming therefore
provides an essential foundation for learning reinforcement learning. The
course aims to build a fundamental understanding of both families of methods,
based on their intimate relation to each other and their application to similar
problems.

The course consists of the following topics: Markov decision processes, dynamic programming for infinite-horizon and stopping-time problems, reinforcement learning, and its convergence proofs.
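As a minimal illustration of the dynamic-programming side of the course, the sketch below runs value iteration on a small Markov decision process. The two-state, two-action MDP, its transition probabilities, and its rewards are all invented for the example:

```python
# A hypothetical 2-state, 2-action MDP: P[s][a] lists (next_state, probability)
# pairs and R[s][a] is the immediate reward. All numbers are invented.
P = {0: {0: [(0, 0.9), (1, 0.1)], 1: [(1, 1.0)]},
     1: {0: [(0, 0.8), (1, 0.2)], 1: [(1, 1.0)]}}
R = {0: {0: 0.0, 1: 1.0},
     1: {0: 5.0, 1: 0.0}}
gamma = 0.9  # discount factor


def q_value(s, a, V):
    # one-step lookahead: immediate reward plus discounted expected value
    return R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])


def value_iteration(tol=1e-9):
    # Repeatedly apply the Bellman optimality operator until convergence.
    V = {s: 0.0 for s in P}
    while True:
        V_new = {s: max(q_value(s, a, V) for a in P[s]) for s in P}
        if max(abs(V_new[s] - V[s]) for s in P) < tol:
            break
        V = V_new
    # greedy policy with respect to the converged value function
    policy = {s: max(P[s], key=lambda a: q_value(s, a, V)) for s in P}
    return V, policy


V, policy = value_iteration()
```

Because the dynamics here are fully known, this is pure dynamic programming; reinforcement learning methods such as Q-learning estimate the same quantities from sampled experience instead.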

**Organizer:** Rafal Wisniewski

**Lecturers:** Rafal Wisniewski, Zheng-Hua Tan

**ECTS:** 2.0

**Time:** 3 - 7 October 2022, from 9 to 12 and 13 to 15 every day

**Place:** TBA

**Number of seats:** 40

**Deadline:** 12 September 2022

**Important information concerning PhD courses:**

We have for some time experienced problems with no-shows for both project and
general courses, and it has now reached a point where we are forced to take
action. The Doctoral School has therefore decided to introduce a no-show fee of
DKK 3,000 for each course where the student does not show up. Cancellations are
accepted no later than 2 weeks before the start of the course. Registered
illness is, of course, an acceptable reason for not showing up on those days.
Furthermore, all courses open for registration approximately four months before
they start, which should also give new students a chance to register for
courses during the year. We look forward to your registrations.

- Teacher: Zheng-Hua Tan
- Teacher: Rafal Wisniewski

**Description:**

Virtually all physical systems are nonlinear in nature. Sometimes, however, it is possible to describe the operation of a physical system by a linear model that varies with respect to some parameters. In this course, we introduce methods for describing, analyzing, and controlling such systems by means of the theory of linear parameter-varying (LPV) systems. LPV system theory has proven very successful in real-world applications where linear system theory is insufficient. This is because the methods for verifying stability and designing controllers with LPV models are numerically tractable, as they are based on linear matrix inequalities (LMIs).

The goal of the course is to introduce fundamental concepts from the theory of linear parameter-varying systems. In particular, we address the following subjects: linear matrix inequalities (LMIs), the descriptor LPV form, the polytopic LPV form, linear fractional representations, stability of LPV systems, performance analysis of LPV systems, and control of LPV systems. If time permits, we will also discuss system identification and model reduction of LPV systems, as well as basic system-theoretic properties such as minimality, controllability, and observability.
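As a small numerical taste of the Lyapunov/LMI machinery, the sketch below solves the Lyapunov equation AᵀP + PA = −Q for a single, hypothetical system matrix and checks that P is positive definite. Quadratic stability of a polytopic LPV system additionally requires one common P satisfying the LMI at every vertex matrix, which is typically found with an SDP solver rather than by the direct solve used here:

```python
import numpy as np

def lyapunov_P(A, Q):
    """Solve A.T @ P + P @ A = -Q for symmetric P via the vec/Kronecker trick."""
    n = A.shape[0]
    I = np.eye(n)
    # vec(A.T P + P A) = (kron(I, A.T) + kron(A.T, I)) vec(P), column-major vec
    M = np.kron(I, A.T) + np.kron(A.T, I)
    P = np.linalg.solve(M, (-Q).flatten(order="F")).reshape((n, n), order="F")
    return 0.5 * (P + P.T)  # symmetrize against round-off

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # hypothetical Hurwitz matrix (eigenvalues -1, -2)
P = lyapunov_P(A, np.eye(2))
stable = bool(np.all(np.linalg.eigvalsh(P) > 0))  # P > 0 certifies stability
```

The existence of such a P > 0 is exactly feasibility of the LMI AᵀP + PA ≺ 0, the simplest instance of the analysis conditions treated in the course.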

**Organizer:** Associate Professor John Leth

**Lecturers:** Associate Professor John Leth and Mihaly Petreczky, CNRS, Ecole Centrale Lille, research group CRIStAL

**ECTS:** 2.0

**Time:** August 15-19, 2022

**Place:** TBA

**Number of seats:** 50

**Deadline:** July 25, 2022

- Teacher: John-Josef Leth
- Teacher: Mihaly Petreczky

**Description:**

In Bayesian learning, the parameters that describe a learning model are treated
as random entities, and the goal is to model the generative mechanism that
produces the data, rather than just to perform input-output predictions. In
this course, Maximum Likelihood and Maximum a Posteriori estimators are
reviewed, and then the Bayesian method is introduced via the notion of the
evidence function and its efficient handling of the complexity-accuracy
trade-off. Then, the power of a prior distribution as a regularizer is
exemplified, and the EM algorithm is introduced in the context of three
applications: regression, mixture modelling, and clustering. Finally, the
variational approximation to EM is introduced and discussed.
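The regularizing effect of a prior can be seen already in the simplest setting: for one-dimensional linear regression with Gaussian noise and a zero-mean Gaussian prior on the weight, the MAP estimate is the maximum-likelihood estimate shrunk toward zero (ridge regression). A sketch with invented data:

```python
# 1-D linear regression y ~ w*x with Gaussian noise (variance s2) and a
# zero-mean Gaussian prior on w (variance p2). The MAP estimate equals ridge
# regression with regularization lam = s2 / p2. All data below is invented.

def ml_estimate(xs, ys):
    # maximum likelihood = ordinary least squares
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def map_estimate(xs, ys, s2, p2):
    lam = s2 / p2  # the prior enters as an L2 penalty of strength lam
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [1.0, 2.0, 3.0]
ys = [2.1, 3.9, 6.2]                          # roughly y = 2x
w_ml = ml_estimate(xs, ys)
w_map = map_estimate(xs, ys, s2=1.0, p2=0.5)  # the prior shrinks w toward 0
```

The stronger the prior (smaller p2) or the noisier the data (larger s2), the more the MAP estimate is pulled toward the prior mean, which is the complexity-accuracy trade-off in miniature.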

**Organizer:** Sergios Theodoridis

**Lecturer:** Sergios Theodoridis

**ECTS:** 3.0

**Time:** May 2-6, 2022, 9 - 12 hrs. all days

**Place:** Seminar room FRB 7B2-107, Fredrik Bajers Vej 7B

**Number of seats:** 30

**Deadline:** May 1, 2022

- Teacher: Sergios Theodoridis

**Description:**

Deep learning is the hottest area of research in machine learning and has recently shown huge success in a variety of areas. Its impact on many applications is revolutionary, which has ignited intensive study of the topic. During the past few decades, the previously prevalent machine learning methods, including support vector machines, conditional random fields, hidden Markov models, and the one-hidden-layer multi-layer perceptron, have found a broad range of applications. While effective in solving simple or well-constrained problems, these methods have one drawback in common: they all have shallow architectures. They generally have no more than one or two layers of nonlinear feature transformations, which limits their performance on many real-world applications. In contrast, the human brain and its cognitive processes, being far more complicated, have deep architectures organized into many hierarchical layers, with information becoming more abstract going up the hierarchy. Interest in deep architectures has been reignited by the many remarkable applications enabled by deep learning techniques.

During the past years, deep learning methods and applications have witnessed unprecedented success. This course gives an introduction to deep learning, both by presenting valuable methods and by addressing specific applications, and covers both the theory and practice of deep learning. The students will also have hands-on exercises experimenting with a variety of deep learning architectures for applications. Topics will include:

- Machine learning fundamentals

- Deep learning concepts

- Deep learning methods including deep autoencoders, deep neural networks, long short-term memory recurrent neural networks, convolutional neural networks, and generative adversarial networks.

- Selected applications of deep learning

- Machine learning operations (MLOps)
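To make the contrast with shallow architectures concrete, the sketch below trains a one-hidden-layer perceptron (the kind of shallow model mentioned above) on the XOR problem with plain gradient descent. It is a hand-written NumPy illustration only; layer sizes, learning rate, and iteration count are invented:

```python
import numpy as np

# One-hidden-layer perceptron trained on XOR with plain gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)  # hidden layer (8 units)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)  # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)            # forward pass: hidden activations
    p = sigmoid(h @ W2 + b2)            # output probabilities
    losses.append(float(np.mean((p - y) ** 2)))
    dp = 2.0 * (p - y) / len(X)         # backprop of the squared error
    dz2 = dp * p * (1.0 - p)            # through the output sigmoid
    dh = dz2 @ W2.T
    dz1 = dh * (1.0 - h ** 2)           # through the hidden tanh
    W2 -= lr * (h.T @ dz2); b2 -= lr * dz2.sum(axis=0)
    W1 -= lr * (X.T @ dz1); b1 -= lr * dz1.sum(axis=0)
```

Deep networks stack many such nonlinear layers, and the same backpropagation mechanics carry over, which is what the architectures listed above build on.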

**Teaching methods:** The course will be taught through a combination of lectures, demos of applications and small exercises.

**Criteria for assessment:** Acceptable exercise solutions and at least 75% participation are required to pass the course.

**Organizer:** Professor Zheng-Hua Tan, e-mail: zt@es.aau.dk

**Lecturers:** Professor Zheng-Hua Tan, e-mail: zt@es.aau.dk

Dr. Ivan Lopez-Espejo, e-mail: ivl@es.aau.dk

Dr. Sven Ewan Shepstone (Bang & Olufsen)

**ECTS:** 2.0

**Time:** 6, 9 and 11 May 2022

**Place:** Frederik Bajers Vej 7C/2-209, 9220 Aalborg

**Number of seats:** 50

**Deadline:** 15 April 2022

- Teacher: Ivan Lopez Espejo
- Teacher: Zheng-Hua Tan

**Description:**

Typical attacks on critical infrastructures: How hackers can abuse the Internet
of Things, and how attacks can be prevented and detected.

In the course, we will discuss public-key cryptosystems, which are widely used for secure data transmission. Public-key cryptography addresses the problem of how two parties can communicate securely when they have not agreed on a secret common key, which is often the case, for example, in communication over the internet. It includes both public-key encryption, which guarantees the secrecy of a message, and digital signatures, which provide authentication and integrity. We will introduce some of the more classical, but still widely used, public-key cryptosystems, such as RSA and ElGamal/Diffie-Hellman, and we will discuss which security properties are usually required of them nowadays. Subsequently, we will investigate the problem of secure multi-party computation, which studies how to "compute on encrypted data": how several mutually distrustful parties can collaborate to jointly perform computations involving private data without actually revealing their private information to the others. We will show how to develop secure optimization and machine-learning algorithms.
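To fix ideas, here is textbook RSA with toy parameters. The key fact is that the public exponent e and private exponent d satisfy e·d ≡ 1 (mod λ(n)), so decryption inverts encryption; real deployments use large primes and padding schemes (e.g., OAEP), never raw textbook RSA:

```python
from math import gcd

# Toy primes only; real RSA keys use primes of 1024+ bits plus padding.
p, q = 61, 53
n = p * q                                      # public modulus
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # Carmichael function lambda(n)
e = 17                                         # public exponent, coprime to lam
d = pow(e, -1, lam)                            # private exponent: e*d = 1 (mod lam)

def encrypt(m):
    # anyone who knows the public key (n, e) can encrypt
    return pow(m, e, n)

def decrypt(c):
    # only the holder of the private exponent d can decrypt
    return pow(c, d, n)

m = 42
c = encrypt(m)
```

Security rests on the hardness of recovering d from (n, e), i.e., essentially on factoring n, which is infeasible for the key sizes used in practice. (The modular inverse via `pow(e, -1, lam)` requires Python 3.8+.)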

**Organizer:** Rafal Wisniewski

**Lecturers:** Rafal Wisniewski, Rasmus Løvenstein Olsen, Jens Myrup Pedersen, Jaron Skovsted Gundersen and Niels Peter Anglov

**ECTS:** 3.0

**Time:** February 28 to March 4, 2022

**Place:** TBA

**Number of seats:** 40

**Deadline:** February 7, 2022

- Teacher: Niels Peter Anglov
- Teacher: Jaron Skovsted Gundersen
- Teacher: Rasmus Løvenstein Olsen
- Teacher: Jens Myrup Pedersen
- Teacher: Rafal Wisniewski

**Description:**

Virtually all physical systems are nonlinear in nature. Sometimes, however, it is possible to describe the operation of a physical system by a linear model, e.g., as a set of ordinary linear differential equations. This is the case, for example, if the mode of operation of the physical system does not deviate too much from the nominal set of operating conditions, i.e., one can linearize the system. Thus, the analysis of linear systems occupies an important place in system theory. But in analyzing the behavior of a physical system, one often encounters situations where the linear (or linearized) model is inadequate or inaccurate; that is when concepts from this course may prove useful.

The goal of the course is to address several main subjects in nonlinear control theory. We extend the notion of Lyapunov stability to systems with inputs (i.e., input-to-state stability), show how Lyapunov stability is related to input-output stability, and present and apply various nonlinear control design tools: passivity-based control, integral control, gain scheduling, Lyapunov redesign, sliding mode control, etc.

If time permits, we will discuss controllability, observability, and minimality of nonlinear systems; elements of geometric control theory, such as feedback linearization; and stabilization of control systems via discontinuous state feedback (sample-and-hold) and via continuous time-varying state feedback.
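As a tiny numerical illustration of one of the design tools above, the sketch below applies a sliding-mode-style switching law u = −k·sign(x) to the scalar system ẋ = u + d(t) with a bounded but unknown disturbance. The system, gain, and disturbance are all invented for the example:

```python
import math

# Scalar system x' = u + d(t) with an unknown but bounded disturbance
# (|d| <= 0.5). The switching law u = -k*sign(x) with k > 0.5 drives x onto
# the sliding surface x = 0 despite the disturbance.

def simulate(x0=1.0, k=1.0, dt=1e-3, T=3.0):
    x, t = x0, 0.0
    while t < T:
        d = 0.5 * math.sin(5.0 * t)                        # matched disturbance
        u = -k * (1.0 if x > 0 else -1.0 if x < 0 else 0.0)
        x += dt * (u + d)                                  # forward-Euler step
        t += dt
    return x

x_final = simulate()  # starts at x0 = 1 and is driven to (a chatter around) 0
```

The discontinuity of the control law is what makes the design robust to any disturbance with |d| < k, at the price of the high-frequency chattering visible in the simulated trajectory near x = 0.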

**Organizer:** Associate Professor John Leth, jjl@es.aau.dk

**Lecturers:** Associate Professor John Leth and Mihaly Petreczky, CNRS, Ecole Centrale Lille, research group CRIStAL

**ECTS:** 2.0

**Time:** August 22-26, 2022

**Place:** TBA

**Number of seats:** 50

**Deadline:** August 1, 2022

- Teacher: John-Josef Leth
- Teacher: Mihaly Petreczky