Welcome to Causal Information Theory and Data Compression in Feedback Systems

Description:

Information theory provides the means for characterizing fundamental limits in signal processing, communications, and control problems. For example, information-theoretic quantities have recently been used to establish lower and upper bounds on the minimal data rates required when quantizing and compressing control signals subject to stability and performance constraints in networked control systems.

In this course, the main focus is on the concept of causal information theory, which is important not only in engineering but also in, e.g., economics, bioinformatics, and neuroscience. The main idea is that in systems with feedback, it is not only the amount of shared (mutual) information between random signals that matters, but also the direction of the information flow. We begin the course with a short introduction to non-causal information theory, covering discrete and continuous (differential) entropy, mutual information, information divergence, the entropy gain of (non-minimum-phase) filters, randomized lattice quantization theory, and the rate-distortion theory of (non-)stationary processes. We then introduce the notion of directed (mutual) information and address the causal counterparts to rate-distortion theory, e.g., sequential, causal, and zero-delay source coding. This puts us in a position to better understand the interactions between random signals in feedback systems, and we use this knowledge to relate the mutual information between signals outside the loop of a feedback system to the directed information between signals inside the feedback loop.
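To make the basic non-causal quantities concrete, here is a minimal Python sketch (not part of the course material) that computes the discrete entropy and the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) for a binary symmetric channel with uniform input; the crossover probability 0.1 is an illustrative choice, not a value from the course notes.

```python
import math
from itertools import product

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as an iterable of probabilities."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Joint pmf of (X, Y) for a binary symmetric channel with uniform input
# and crossover probability eps (illustrative values only).
eps = 0.1
joint = {(x, y): 0.5 * ((1 - eps) if x == y else eps)
         for x, y in product((0, 1), repeat=2)}

# Marginals of X and Y.
px = [sum(joint[(x, y)] for y in (0, 1)) for x in (0, 1)]
py = [sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)]

h_x = entropy(px)
h_y = entropy(py)
h_xy = entropy(joint.values())
mi = h_x + h_y - h_xy  # I(X;Y) = H(X) + H(Y) - H(X,Y)

print(f"H(X)   = {h_x:.4f} bits")   # 1.0000 bits (uniform binary input)
print(f"I(X;Y) = {mi:.4f} bits")    # 1 - h_b(0.1) ≈ 0.5310 bits
```

The directed information studied in the course refines this picture for feedback systems: instead of a single mutual information term, it sums conditional mutual informations I(X^i; Y_i | Y^{i-1}) over time, so it is sensitive to the direction of information flow, whereas I(X;Y) above is symmetric in X and Y.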

We apply the developed theory to important signal processing and networked control problems.

Prerequisites:

Knowledge of stochastic processes and probability theory as obtained through undergraduate engineering studies.

Course literature:

  • Introductory chapters on information measures and typicality in either “Elements of Information Theory” by Cover and Thomas or “A First Course in Information Theory” by R.W. Yeung.
  • Sections 1, 3, and 7 in “The Entropy Gain of Linear Time-Invariant Filters and Some of its Implications”. Available on arXiv: http://arxiv.org/abs/1512.03655
  • Sections 1, 2, and 3 in “Fundamental Inequalities and Identities Involving Mutual and Directed Informations in Closed-Loop Systems”. Available on arXiv: http://arxiv.org/abs/1301.6427
  • Sections 1, 2, and 6 in “Improved Upper Bounds to the Causal Quadratic Rate-Distortion Function for Gaussian Stationary Sources”, IEEE Transactions on Information Theory, Vol. 58, No. 5, May 2012.

Organizer: Associate Professor Jan Østergaard
Lecturer: Associate Professor Jan Østergaard

ECTS: 4.0

Time: March 14-18, 2016 (8:30-15:30 every day)

Place:

Monday: Fredrik Bajers Vej 7 – room B2-107, 8:30–12:00, and room B2-104, 12:30–16:30

Tuesday: Niels Jernes Vej 14 – room 3-119, 8:30–16:30

Wednesday: Fredrik Bajers Vej 7 – room B2-107, 8:30–12:00, and room B2-104, 12:30–16:30

Thursday: Niels Jernes Vej 14 – room 3-119, 8:30–16:30

Friday: Fredrik Bajers Vej 7 – room B2-107, 8:30–16:30

Zip code: 9220

City: Aalborg East

 

Number of seats: 30

Deadline: February 22, 2016

Important information concerning PhD courses: Over some time, we have experienced problems with no-shows for both project and general courses, and it has now reached a point where we are forced to take action. The Doctoral School has therefore decided to introduce a no-show fee of DKK 5,000 for each course in which the student does not show up. Cancellations are accepted no later than two weeks before the start of the course. Registered illness is, of course, an acceptable reason for not showing up on those days. Furthermore, all courses open for registration approximately three months before they start, which will hopefully also give new students a chance to register for courses during the year. We look forward to your registrations.