One type of information is *surprise*. A message is news, and informative, to the extent that it is new or unexpected. The opposite is the ordinary and expected, which can be filtered out like the carrier of a signal. This type of information is measured by *entropy*: the greater the entropy of a signal or series of messages, the greater its unpredictability. For *n* = 2 possible messages, with probabilities *p* and 1−*p*,

*H* = −Σ_{i} p_{i} log_{2}(p_{i}) = −*p* log_{2}(*p*) − (1−*p*) log_{2}(1−*p*),

which is a minimum at *p* = 0 or 1 and a maximum at *p* = ½.
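A minimal Python sketch of this binary entropy, assuming the standard convention 0 · log 0 = 0, which the formula leaves implicit:

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), with the convention 0 log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.0))  # 0.0  (minimum: a certain message carries no surprise)
print(binary_entropy(0.5))  # 1.0  (maximum: a fair coin is maximally unpredictable)
```

Evaluating at intermediate points (e.g. `binary_entropy(0.25)` ≈ 0.811) confirms the curve rises from 0 at the endpoints to 1 at *p* = ½.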

The greatest unpredictability is noise: a purely random message. As the news media produces updates by the minute (and social media runs wild), the flood of surprise approaches noise. The news media has become the *noise media*.

This leads to information as *unsurprise*. In a flood of noise, the presence of something recognizable is a reduction of surprise and entropy; reduce the noise sufficiently and what remains is a coherent signal. With the expansion of mass and social media today, there is a growing need for filters and editors to extract meaning. This kind of information is measured by *shifted entropy*, for which noise is the minimum and a constant signal the maximum. For *n* = 2,

*N* = −|*p*−½| log_{2}(|*p*−½|) − (1 − |*p*−½|) log_{2}(1 − |*p*−½|),

which is a minimum at *p* = ½ and a maximum at *p* = 0 or 1.
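The shifted entropy is the same binary entropy function applied to |*p* − ½| rather than *p*. A sketch under the same 0 · log 0 = 0 convention:

```python
import math

def shifted_entropy(p):
    """N(p): binary entropy evaluated at q = |p - 1/2|, with 0 log 0 = 0."""
    q = abs(p - 0.5)
    if q == 0.0:
        return 0.0
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

print(shifted_entropy(0.5))  # 0.0  (minimum: pure noise, p = 1/2)
print(shifted_entropy(0.0))  # 1.0  (maximum: a constant signal, p = 0 or 1)
```

This inverts the extrema of *H*: the noisiest distribution now scores lowest and the constant signal highest.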

*Balanced information* measures surprise against a background of bare signal and static: the carrier and the noise of a channel of communication. It is defined as the mean of entropy and shifted entropy, and so differs from entropy (Shannon information) alone. For *n* = 2,

*M* = ½(*H* + *N*) = ½(−*p* log_{2}(*p*) − (1−*p*) log_{2}(1−*p*) − |*p*−½| log_{2}(|*p*−½|) − (1 − |*p*−½|) log_{2}(1 − |*p*−½|)),

which is a minimum (with value ½, not 0, since *H* and *N* are never simultaneously zero) at *p* = 0, ½, or 1 and a maximum at *p* = ¼ or ¾.
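Since *M* is just the average of the two preceding measures, a short sketch can verify the stated extrema numerically (again assuming 0 · log 0 = 0):

```python
import math

def h(p):
    """Binary entropy with the convention 0 log 0 = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def balanced_information(p):
    """M(p) = (H(p) + N(p)) / 2: the mean of entropy and shifted entropy."""
    return 0.5 * (h(p) + h(abs(p - 0.5)))

print(balanced_information(0.5))   # 0.5    (minimum value, shared with p = 0 and 1)
print(balanced_information(0.25))  # ~0.811 (maximum, shared with p = 3/4)
```

Scanning *p* over a fine grid shows *M* never drops below ½: pure noise and a bare constant signal tie at the minimum, while the maxima sit between them at *p* = ¼ and ¾.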