
Electrical and Electronic Engineering (2026)
Introduction:

Welcome to Statistical learning theory for time-series

Description:

This course provides a comprehensive introduction to the theory, algorithms, and statistical guarantees for learning dynamical systems and performing time-series prediction. The course combines classical modeling techniques with modern deep learning architectures, addressing both the estimation of unknown physical quantities and the construction of predictive models for applications such as forecasting the yearly energy consumption of buildings.

The theoretical foundation lies at the intersection of system identification, econometrics, statistics, and machine learning. Students will study both linear models and nonlinear sequence models, including:

  • Autoregressive and state-space models
  • Parameter estimation and subspace methods
  • Recurrent Neural Networks (RNNs)
  • Transformers for sequential data
  • Deep Structured State-Space Models (SSMs), including Mamba and related architectures
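As a small illustration of the classical end of this toolbox, the sketch below fits an autoregressive model by ordinary least squares on synthetic data. The coefficients and noise level are hypothetical, chosen only to show the estimation recipe: regress each observation on its lagged values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stable AR(2) process: y_t = a1*y_{t-1} + a2*y_{t-2} + noise
a1, a2 = 0.6, -0.3
n = 5000
y = np.zeros(n)
for t in range(2, n):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + 0.1 * rng.standard_normal()

# Least-squares estimation: regress y_t on the lagged values (y_{t-1}, y_{t-2})
X = np.column_stack([y[1:-1], y[:-2]])  # lagged regressors
target = y[2:]
coef, *_ = np.linalg.lstsq(X, target, rcond=None)
print(coef)  # approximately [0.6, -0.3]; the estimate tightens as n grows
```

The fact that the estimate concentrates around the true coefficients as the sample size grows is exactly the kind of asymptotic consistency result the course studies.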

In addition to understanding these models, the course emphasizes statistical performance guarantees, covering:

  • Asymptotic consistency for learning linear autoregressive and state-space models
  • Finite-sample error bounds for algorithms based on linear regression and subspace methods
  • Probably Approximately Correct (PAC) and PAC-Bayesian guarantees for learning sequential models, including deep architectures
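To give a flavour of such guarantees, the sketch below evaluates the textbook finite-class PAC bound obtained from Hoeffding's inequality and a union bound (for i.i.d. data and a loss bounded in [0, 1]); the class size, sample size, and confidence level are hypothetical. Guarantees for dependent time-series data, as covered in the course, require more refined arguments.

```python
import math

def pac_bound(num_hypotheses, n, delta):
    """With probability >= 1 - delta, simultaneously for every hypothesis
    in a finite class of the given size, the gap between empirical and
    true risk is at most sqrt(ln(2*num_hypotheses/delta) / (2*n)),
    for i.i.d. samples and a loss bounded in [0, 1]."""
    return math.sqrt(math.log(2 * num_hypotheses / delta) / (2 * n))

# e.g. 10^6 candidate models, 10,000 samples, 95% confidence
print(round(pac_bound(10**6, 10_000, 0.05), 4))  # ≈ 0.0296
```

Note that the bound shrinks at rate 1/sqrt(n) and grows only logarithmically in the number of hypotheses, which is why such guarantees remain meaningful for large model classes.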

The course is conducted as an intensive in-person lecture series. Evaluation is based on attendance and homework assignments.

Modern time-series prediction increasingly relies on deep learning methods such as RNNs, transformers, and structured state-space models, yet these models are often used as black boxes without rigorous performance understanding. This course bridges classical dynamical systems theory with modern sequence modeling, providing both practical tools and theoretical foundations to evaluate when and why these methods work, and under what statistical guarantees they can be trusted.

Learning Objectives:

After completing the course, students will be able to:

  • Formulate and analyze linear and nonlinear dynamical system models for time-series prediction
  • Understand and implement RNNs, transformers, and deep SSMs (e.g., Mamba) for sequential data
  • Derive and interpret asymptotic consistency results for dynamical system estimators
  • Establish finite-sample performance bounds for classical and modern learning algorithms
  • Apply PAC and PAC-Bayesian frameworks to quantify the reliability of deep sequence models
  • Critically evaluate learning methods with provable performance guarantees

Prerequisites: A basic knowledge of mathematics as obtained through undergraduate engineering studies.

Organizer: John Leth

Lecturers: John Leth and Mihaly Petreczky

ECTS: 4

Dates: 1–5 June 2026

Place: Aalborg University

City: Aalborg

Number of seats: 50

Deadline: 11 May 2026

Important information concerning PhD courses: 

There is a no-show fee of DKK 3,000 for each course where the student does not show up. Cancellations are accepted no later than 2 weeks before the start of the course. Registered illness is, of course, an acceptable reason for not showing up. Furthermore, all courses open for registration approximately four months before the start of the course.

We cannot guarantee any seats before the enrolment deadline; all participants will be informed after the deadline, approximately 3 weeks before the start of the course.

For inquiries regarding registration, cancellation or waiting list, please contact the PhD administration at phdcourses@adm.aau.dk. When contacting us please state the course title and course period. Thank you.

