Upper and lower bounds for the mutual information in dynamical networks

Room 0.06
Friday, 3 December, 2010 - 14:00

In this seminar, I will introduce some recent results on the calculation of upper and lower bounds for the rate of information exchanged between two nodes (or two groups of nodes) in a dynamical network, a quantity also known as the mutual information rate (MIR). These bounds require no calculation of probabilities, only of Lyapunov exponents. Their derivation employs the same ideas used by Ruelle to show that the sum of the positive Lyapunov exponents of a dynamical system is an upper bound for its Kolmogorov-Sinai entropy. Since no probabilities need to be estimated, these equations provide a simple way to state whether two nodes are information-correlated, and can be conveniently used to understand the relationship between structure and function (information) in dynamical networks. If the equations of motion of the dynamical network are known, upper and lower bounds for the MIR can be calculated analytically or semi-analytically. If the equations of motion are not known, the same equations can be used to measure how much information is shared between two data sets.
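As a minimal illustrative sketch (not taken from the talk), the Lyapunov exponents entering such bounds can be estimated numerically for a toy "network" of two bidirectionally coupled logistic maps, using the standard Benettin/QR method. The system, the coupling strength, and the use of the difference λ1 − λ2 as a crude upper-bound estimate for the MIR are all assumptions made for illustration only:

```python
import numpy as np

def lyapunov_exponents(alpha=4.0, eps=0.1, n_iter=20_000, n_trans=1_000):
    """Benettin/QR estimate of the two Lyapunov exponents of a pair of
    bidirectionally coupled logistic maps (illustrative toy system):
        x' = (1-eps)*f(x) + eps*f(y)
        y' = (1-eps)*f(y) + eps*f(x),   f(u) = alpha*u*(1-u).
    """
    rng = np.random.default_rng(0)
    x, y = rng.random(2)
    f  = lambda u: alpha * u * (1.0 - u)          # logistic map
    df = lambda u: alpha * (1.0 - 2.0 * u)        # its derivative
    Q = np.eye(2)                                 # orthonormal tangent frame
    sums = np.zeros(2)
    for i in range(n_trans + n_iter):
        # Jacobian of the coupled map at the current state
        J = np.array([[(1 - eps) * df(x), eps * df(y)],
                      [eps * df(x), (1 - eps) * df(y)]])
        x, y = (1 - eps) * f(x) + eps * f(y), (1 - eps) * f(y) + eps * f(x)
        Q, R = np.linalg.qr(J @ Q)                # re-orthonormalise tangent vectors
        if i >= n_trans:                          # skip transient
            sums += np.log(np.abs(np.diag(R)))
    return np.sort(sums / n_iter)[::-1]           # exponents, largest first

lam1, lam2 = lyapunov_exponents()
# Hypothetical illustration: lambda_1 - lambda_2 as an upper-bound
# estimate for the MIR between the two units (assumption, not the
# speaker's exact formula).
mir_upper = max(lam1 - lam2, 0.0)
print(f"lambda_1 = {lam1:.3f}, lambda_2 = {lam2:.3f}, MIR bound ~ {mir_upper:.3f}")
```

Since no invariant probabilities are estimated anywhere in this loop, the computation stays cheap even for long trajectories, which is the practical appeal of the Lyapunov-exponent route described in the abstract.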

Speaker: Murilo Baptista - University of Aberdeen, UK