One type of information is *surprise*: a message is news, and informative, to the extent that it is new or unexpected. Its opposite is the ordinary and expected, which can be filtered out like the carrier of a signal. This type of information is measured by *entropy*: the greater the entropy of a signal or series of messages, the greater its unpredictability. For *n* = 2 (a binary message occurring with probability *p*),

*H* = −*p* log2(*p*) − (1−*p*) log2(1−*p*),

which is a minimum at *p* = 0 or 1 and a maximum at *p* = 1/2.
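The binary entropy above can be sketched in Python (the function name is mine; the standard convention 0·log2(0) = 0 is applied at the endpoints):

```python
import math

def binary_entropy(p):
    """Entropy H(p) of a binary source, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, 0 * log2(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Minimum at p = 0 or 1, maximum at p = 1/2:
print(binary_entropy(0.0), binary_entropy(0.5), binary_entropy(1.0))
```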

The greatest unpredictability is noise, which is a random message. As the news media produces updates by the minute (and social media runs wild), the flood of surprise approaches noise. The news media has become the *noise media*.

This leads to information as *unsurprise*. In a flood of noise the presence of something recognizable is a reduction of surprise and entropy. Sufficiently reduce the noise and the result is a coherent signal. With the expansion of mass and social media today, there is an increasing need for filters and editors to extract meaning. This is measured by *shifted entropy*, in which noise is the minimum and a constant signal the maximum. For *n* = 2,

*N* = −|*p*−1/2| log2(|*p*−1/2|) − (1 − |*p*−1/2|) log2(1 − |*p*−1/2|),

which is a minimum at *p* = 1/2 and a maximum at *p* = 0 or 1.
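Since the formula for *N* is the entropy formula applied to *q* = |*p* − 1/2|, a sketch (again with names of my choosing) can reuse the entropy function:

```python
import math

def binary_entropy(p):
    """Entropy H(p) of a binary source, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, 0 * log2(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def shifted_entropy(p):
    """N(p) = H(|p - 1/2|): minimal for noise (p = 1/2), maximal for a constant signal (p = 0 or 1)."""
    return binary_entropy(abs(p - 0.5))

# Minimum at p = 1/2, maximum at p = 0 or 1:
print(shifted_entropy(0.5), shifted_entropy(0.0), shifted_entropy(1.0))
```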

The fullest information is both surprising and meaningful, a mean between the expected and the unexpected, the carrier and the noise of a channel of communication. This is measured by the mean of entropy and shifted entropy. For *n* = 2,

*M* = ½(−*p* log2(*p*) − (1−*p*) log2(1−*p*) − |*p*−1/2| log2(|*p*−1/2|) − (1 − |*p*−1/2|) log2(1 − |*p*−1/2|)),

which is a minimum at *p* = 0, 1/2, or 1 and a maximum at *p* = 1/4 or 3/4.
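The stated extrema can be checked numerically with a short sketch (function names are mine): *M* equals 1/2 at *p* = 0, 1/2, and 1, and peaks at *p* = 1/4 and 3/4.

```python
import math

def binary_entropy(p):
    """Entropy H(p) of a binary source, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # by convention, 0 * log2(0) = 0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mean_information(p):
    """M(p) = (H(p) + N(p)) / 2, the mean of entropy and shifted entropy."""
    return 0.5 * (binary_entropy(p) + binary_entropy(abs(p - 0.5)))

# M = 1/2 at the minima p = 0, 1/2, 1:
print(mean_information(0.0), mean_information(0.5), mean_information(1.0))

# Scan a grid to locate the maximum:
grid = [i / 1000 for i in range(1001)]
p_max = max(grid, key=mean_information)
print(p_max, mean_information(p_max))  # near p = 1/4 (by symmetry, also 3/4)
```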