Description:
In Bayesian learning, the parameters that describe a learning model are
treated as random entities, and the goal is to model the mechanism that
generates the data, rather than just to perform input-output predictions.
In this course, the Maximum Likelihood and Maximum-a-Posteriori estimators
are reviewed, and then the Bayesian method is introduced via the notion of
the evidence function, which deals efficiently with the complexity-accuracy
trade-off. Then, the power of a prior distribution as a regularizer is
exemplified, and the EM algorithm is introduced in the context of three
applications: regression, mixture modelling, and clustering. Finally, the
variational approximation to the EM algorithm is introduced and discussed.
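To give a flavour of one of the listed topics, the following is a minimal, illustrative sketch of the EM algorithm fitting a two-component Gaussian mixture to synthetic data. It is not course material; all variable names, initial values, and the generated data are assumptions made for illustration only:

# Minimal sketch: EM for a 1-D, two-component Gaussian mixture (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data drawn from two Gaussians (the assumed generative mechanism).
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])

# Initial guesses for mixing weights, means, and variances.
pi = np.array([0.5, 0.5])
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

def gauss(x, mu, var):
    # Gaussian density, broadcasting over components.
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    resp = pi * gauss(x[:, None], mu, var)          # shape (N, 2)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibilities.
    Nk = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / Nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(x)

print("weights:", pi, "means:", mu, "variances:", var)

Each iteration alternates between computing component responsibilities (E-step) and re-estimating the mixture parameters from them (M-step), which is the complexity-accuracy mechanism the course develops in more depth.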
Organizer: Sergios Theodoridis
Lecturer: Sergios Theodoridis
ECTS: 3.0
Time: May 2-6, 2022, 9-12 hrs all days
Place: Seminar room FRB 7B2-107, Fredrik Bajers Vej 7B
Number of seats: 30
Deadline: May 1, 2022
Important information concerning PhD courses:
We have for some time experienced problems with no-shows for both project and
general courses. It has now reached a point where we are forced to take action.
Therefore, the Doctoral School has decided to introduce a no-show fee of DKK
3,000 for each course where the student does not show up. Cancellations are
accepted no later than two weeks before the start of the course. Registered
illness is, of course, an acceptable reason for not showing up on those days.
Furthermore, all courses open for registration approximately four months before
they start. This will hopefully also give new students a chance to register for
courses during the year. We look forward to your registrations.