Welcome to Introduction to Bayesian Learning (2021)


Description: This short course is dedicated to Bayesian learning. The Bayesian approach to Machine Learning adopts a radically different viewpoint from the more “standard” approaches. The latter are based on models that are parametrized in terms of a set of unknown parameters, which are considered to be deterministic variables. That is, each parameter, although unknown, corresponds to a specific fixed value.

In contrast, in the Bayesian world, the unknown parameters are treated as random variables. This was a revolutionary idea at the time it was introduced by the mathematician and philosopher Thomas Bayes, and it was later used by the great mathematician Laplace. Even now, after more than two centuries, it may seem strange to assume that a physical phenomenon/mechanism is controlled by a set of random parameters.

However, there is a subtle point here. By treating the underlying set of parameters as random variables, we do not really imply a random nature for them. The associated randomness, expressed in terms of a prior distribution, encapsulates our uncertainty about their values prior to receiving any observations. Stated differently, the prior distribution represents our belief about the different possible values, although only one of them is actually true. From this perspective, probabilities are viewed in a more “open-minded” way, that is, as measures of uncertainty.
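This viewpoint can be illustrated with a minimal numerical sketch (not part of the course material; all names and values here are illustrative assumptions): a single unknown parameter has one true, fixed value, yet our uncertainty about it is encoded as a Gaussian prior, which the observations sharpen into a posterior via the standard conjugate Gaussian update.

```python
import numpy as np

# Bayesian viewpoint in miniature: theta has one true (fixed) value,
# but our uncertainty about it is represented by a prior distribution.
# Gaussian prior + Gaussian observations with known noise variance
# give a Gaussian posterior in closed form (conjugacy).

rng = np.random.default_rng(0)

true_theta = 1.5          # the single "true" value (unknown to the learner)
noise_var = 0.5           # observation noise variance, assumed known

# Prior belief: theta ~ N(0, 4) -- deliberately broad (high uncertainty)
prior_mean, prior_var = 0.0, 4.0

# Observations: y_n = theta + Gaussian noise
y = true_theta + rng.normal(0.0, np.sqrt(noise_var), size=20)

# Closed-form conjugate update of the Gaussian prior
n = len(y)
post_var = 1.0 / (1.0 / prior_var + n / noise_var)
post_mean = post_var * (prior_mean / prior_var + y.sum() / noise_var)

print(post_mean, post_var)  # posterior concentrates around true_theta
```

As more observations arrive, the posterior variance shrinks and the posterior mean moves toward the true value: the prior's "open-minded" uncertainty is gradually replaced by evidence.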

The outline of this course is as follows:


  • Maximum Likelihood and Maximum a-Posteriori Estimators: A Revision
  • The Bayesian Approach: Basic Concepts
  • The Bayesian Approach to Linear Regression: The Gaussian Case
  • The Evidence Function and Occam's Razor Rule
  • The Laplacian Approximation and the Evidence Function
  • Latent Variables and the Expectation-Maximization (EM) Algorithm
  • Linear Regression and the EM Algorithm
  • Gaussian Mixture Models  and the k-Means Algorithm
  • The Lower Bound Interpretation of the EM Algorithm
  • Exponential Family of Distributions
  • Variational Approximation and the Mean Field Approximation Concept: A Discussion



Organizer: Jan Østergaard

Lecturer: Sergios Theodoridis

ECTS: 2.0

Time: 10-12 May 2021 and 25-26 May 2021, 9:00 - 12:00.

Place: Aalborg University

Zip code: 9220

City: Aalborg

Number of seats: 50

Deadline: 19 April 2021


Important information concerning PhD courses: We have for some time experienced problems with no-shows for both project and general courses. It has now reached a point where we are forced to take action. Therefore, the Doctoral School has decided to introduce a no-show fee of DKK 3.000 for each course where the student does not show up. Cancellations are accepted no later than 2 weeks before the start of the course. Registered illness is, of course, an acceptable reason for not showing up on those days. Furthermore, all courses open for registration approximately four months before they start. This will hopefully also give new students a chance to register for courses during the year. We look forward to your registrations.