Monday, February 06, 2006

EDD chapter one

Here we go.

Lots of stuff going on.
I'll try to capture some of it here.
It's such a surprise to find one's own journal so packed with lies.
The self may be the easiest to trick.
Watch out.

Just out of the Monday meeting, after a crazy weekend of time travel.
It's fifteen years until yesterday, and I should stop going so far with only a couple of days and no real recovery time to speak of.
Anyway, it's worth it.
Following Friday's Blasting.
I'm still a bit woozy (Latin: "Vertigo," the sensation of instability), I must admit. I don't think anyone notices. Doesn't matter anyway. It's like I'm mining a road with those snakes that spring out of a fake can of peanut brittle. Or a roadside cream pie attack.
Humor with lethal intent.

Information entropy

The basic concept of entropy in information theory has to do with how much randomness (or, alternatively, 'uncertainty') there is in a signal or random event. An alternative way to look at this is to talk about how much information is carried by the signal.
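
To put a number on it: if a source produces outcome x with probability p(x), Shannon's entropy is H = -sum over x of p(x) * log2 p(x), measured in bits. A fair coin flip works out to exactly 1 bit; a coin that always lands heads works out to 0 bits, since the outcome tells you nothing you didn't already know.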

As an example, consider some English text, encoded as a string of letters, spaces, and punctuation (so our signal is a string of characters). Since some characters are not very likely (e.g. 'z') while others are very common (e.g. 'e'), the string of characters is not really as random as it might be. On the other hand, since we cannot predict what the next character will be, it does have some 'randomness'. Entropy is a measure of this randomness, suggested by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
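
For a rough, hands-on illustration, here is a small Python sketch of the idea (the function name and the sample sentence are my own, not anything from Shannon's paper): it estimates the per-character entropy of a string from its observed character frequencies.

    import math
    from collections import Counter

    def char_entropy(text):
        # Empirical entropy in bits per character, estimated from
        # the observed frequency of each character in the string.
        counts = Counter(text)
        n = len(text)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    sample = "the quick brown fox jumps over the lazy dog"
    print(char_entropy(sample))  # about 4.4 bits per character

    # A 27-symbol alphabet (26 letters plus a space) chosen uniformly at
    # random would give log2(27), about 4.75 bits per character; ordinary
    # English prose, with its lopsided letter frequencies ('e' common,
    # 'z' rare), comes in well under that.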
