6 Cards in this Set

  • Front
  • Back

What are the 6 points on Shannon entropy pt 1

•If two events are independent, probabilities multiply, so the information gained is additive

•If an event is less likely, more information is gained

•If there are d = 2^r equally likely outcomes, the information is r bits
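These properties can be checked numerically. A small sketch in Python (the `info` helper and the probability values are illustrative, not from the cards):

```python
import math

def info(p):
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

# Independent events: probabilities multiply, information adds
p, q = 0.5, 0.25
assert math.isclose(info(p * q), info(p) + info(q))

# A less likely event carries more information
assert info(0.01) > info(0.5)

# d = 2^r equally likely outcomes: each carries r bits
r = 5
d = 2 ** r
assert math.isclose(info(1 / d), r)
```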


What are the 6 points on Shannon entropy pt 2

•If an event is certain, no information is gained

•If an event is impossible, we would be infinitely surprised if it happened

•If all d outcomes are equally probable, H = log2(d), giving maximum entropy
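These remaining properties can likewise be verified with a short sketch (the `entropy` helper and the example distributions are assumptions for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; certain outcomes (p = 1) contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain event gives no information
assert entropy([1.0]) == 0.0

# A uniform distribution over d outcomes gives H = log2(d), the maximum
d = 8
uniform = [1 / d] * d
assert math.isclose(entropy(uniform), math.log2(d))

# A non-uniform distribution over the same d outcomes has lower entropy
assert entropy([0.7, 0.1, 0.1, 0.05, 0.05, 0, 0, 0]) < math.log2(d)
```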

What is the Shannon conditional entropy

H(X|Y) = H(X,Y) - H(Y)
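This chain-rule formula can be checked on a small joint distribution (the `joint` table and the `H` helper below are hypothetical, chosen only for illustration):

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

H_XY = H(list(joint.values()))  # joint entropy H(X,Y)

# Marginal distribution of Y, then H(Y)
p_y = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1)}
H_Y = H(list(p_y.values()))

# Conditional entropy via the chain rule: H(X|Y) = H(X,Y) - H(Y)
H_X_given_Y = H_XY - H_Y
```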

What is the Shannon entropy for shared information

H(X:Y) = H(Y) - H(Y|X)
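On a hypothetical joint distribution, the shared information can be computed both ways to confirm it is symmetric (all names and values below are illustrative):

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint p(x, y); its marginals are both uniform here
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = [0.5, 0.5]
p_y = [0.5, 0.5]
H_XY = H(list(joint.values()))

# H(X:Y) = H(Y) - H(Y|X), using H(Y|X) = H(X,Y) - H(X)
I = H(p_y) - (H_XY - H(p_x))

# The same value comes from H(X) - H(X|Y): shared information is symmetric
assert math.isclose(I, H(p_x) - (H_XY - H(p_y)))
```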

What is an interpretation

You cannot gain less information from two measurements than from one: H(X,Y) >= H(X)
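A minimal numerical check of this inequality on a hypothetical joint distribution (names and values are illustrative):

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint p(x, y); the marginal of X is uniform here
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = [0.5, 0.5]

# Measuring Y as well can only add information: H(X,Y) >= H(X)
assert H(list(joint.values())) >= H(p_x)
```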

What is the Shannon conditional entropy for uncorrelated variables

H(X|Y) = H(X)
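For independent variables the joint distribution factorises, p(x, y) = p(x) p(y), and the identity follows from the chain rule; a small illustrative check (distributions are made up):

```python
import math

def H(probs):
    """Shannon entropy in bits of a list of probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Independent X and Y: the joint factorises as p(x, y) = p(x) p(y)
p_x = [0.3, 0.7]
p_y = [0.6, 0.4]
joint = [px * py for px in p_x for py in p_y]

# H(X|Y) = H(X,Y) - H(Y) reduces to H(X) when X and Y are uncorrelated
assert math.isclose(H(joint) - H(p_y), H(p_x))
```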