Math Is Fun Forum

Discussion about math, puzzles, games and fun.

#1 2007-10-01 21:58:27

TomAnthony
Member
Registered: 2007-10-01
Posts: 1

Mutual Information from Joint Probability Distribution

Given the following joint probability distribution for two random variables:

           X
Y    |   0   |   1   |   2
-----+-------+-------+-------
  0  |  1/3  |   0   |   0
  1  |  1/12 |  1/6  |  1/12
  2  |   0   |   0   |  1/3


Can anyone confirm the mutual information I(X;Y) between these variables? I have an answer; I am just unsure whether it is correct. Thanks!
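For context, here is a minimal Python sketch of the computation, assuming log base 2 (so the answer comes out in bits):

    from math import log2

    # Joint distribution p(y, x): rows are Y = 0, 1, 2; columns are X = 0, 1, 2.
    p = [
        [1/3,  0,    0   ],
        [1/12, 1/6,  1/12],
        [0,    0,    1/3 ],
    ]

    # Marginal distributions of X and Y.
    px = [sum(row[x] for row in p) for x in range(3)]
    py = [sum(row) for row in p]

    # I(X;Y) = sum over x, y of p(x,y) * log2( p(x,y) / (p(x) * p(y)) ),
    # skipping zero-probability cells, whose contribution is 0.
    mi = sum(
        p[y][x] * log2(p[y][x] / (px[x] * py[y]))
        for y in range(3)
        for x in range(3)
        if p[y][x] > 0
    )

    print(mi)  # about 0.9834 bits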


#2 2007-10-02 02:39:44

John E. Franklin
Member
Registered: 2005-08-29
Posts: 3,588

Re: Mutual Information from Joint Probability Distribution

Were you taught to use logarithm base 2, e, or 10 for this problem?
And since you used a semicolon, is this discrete rather than continuous, i.e. the values are just 0, 1, and 2, not something like 1.483429?
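(A note on units, in case it helps: changing the logarithm base only rescales the result, since log_b x = log_2 x / log_2 b. Base 2 gives bits, base e gives nats, and I_nats = I_bits × ln 2 ≈ 0.693 × I_bits.)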


igloo myrtilles fourmis

