
Synergy and redundancy – a technical note

May 24, 2012

There has been some discussion about how to define the synergy and redundancy in the information that a set of random variables X1, X2, …, Xn carries about a random variable Y. Now V. Griffith and C. Koch have entered the debate with another measure that has several attractive features. The surprising result is that a set of variables can be both synergistic and redundant. This seems counterintuitive, but it makes sense: one part of the information about Y can be carried synergistically and another part redundantly by the set {Xi}. The definition is fairly intuitive: define a variable Y* that has minimal entropy among all variables Z dependent on Y such that I(Xi : Y) = I(Xi : Z) for all i. Synergy is then the difference between the mutual information I(X1, X2, …, Xn : Y) and the mutual information I(X1, X2, …, Xn : Y*). Redundancy can be defined in a related way. Unfortunately, the need to minimize entropy means that analytical expressions for the synergy may be difficult to obtain.
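To make the definition concrete, consider the standard XOR gate, Y = X1 XOR X2 with independent uniform bits (my own illustration, not an example taken from the paper). Each Xi alone carries zero information about Y, so a constant Z already satisfies the constraints I(Xi : Y) = I(Xi : Z); hence Y* has zero entropy, I(X1, X2 : Y*) = 0, and the full 1 bit of I(X1, X2 : Y) counts as synergy. A minimal Python sketch verifying the mutual informations:

```python
import itertools
import math
from collections import defaultdict

def mutual_information(joint):
    """I(A : B) in bits, given a dict {(a, b): probability}."""
    pa, pb = defaultdict(float), defaultdict(float)
    for (a, b), p in joint.items():
        pa[a] += p
        pb[b] += p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Y = X1 XOR X2, with X1, X2 independent uniform bits.
full = {(x1, x2, x1 ^ x2): 0.25
        for x1, x2 in itertools.product((0, 1), repeat=2)}

# Marginal joints needed for the synergy calculation.
joint_X_Y = {((x1, x2), y): p for (x1, x2, y), p in full.items()}
joint_X1_Y = defaultdict(float)
joint_X2_Y = defaultdict(float)
for (x1, x2, y), p in full.items():
    joint_X1_Y[(x1, y)] += p
    joint_X2_Y[(x2, y)] += p

print(mutual_information(joint_X_Y))   # 1.0 bit: the pair determines Y
print(mutual_information(joint_X1_Y))  # 0.0: each input alone says nothing
print(mutual_information(joint_X2_Y))  # 0.0
# With I(Xi : Y) = 0 for all i, a constant Z meets the constraints,
# so H(Y*) = 0 and synergy = I(X1, X2 : Y) - I(X1, X2 : Y*) = 1 bit.
```

In general the minimization over Z is the hard part; here the constraints trivialize it, which is exactly why XOR is the textbook case of pure synergy.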
