
Information

The concept of information covers signs, signals and messages together with their syntactic, semantic and pragmatic aspects.

The uncertainty, and thus the information content, of a random event i is quantified as the negative logarithm of its probability:

I_i = - ld p_i ,

where ld denotes the binary (base-2) logarithm and p_i the probability of the associated event.
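As a minimal sketch, the information content of a single event can be computed directly from its probability (the function name is chosen here for illustration):

```python
import math

def information_content(p):
    """Information content in bits of an event with probability p,
    using the base-2 (dual) logarithm: I = -ld p."""
    return -math.log2(p)

# A fair coin toss (p = 0.5) carries exactly 1 bit of information.
information_content(0.5)   # 1.0 bit
# A rarer event (p = 0.25) carries more information.
information_content(0.25)  # 2.0 bits
```

Note that the less probable an event is, the larger its information content, which matches the intuition that surprising events are more informative.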

The average information content of a stream produced by an ergodic source (its entropy) is defined by Shannon's equation: H = - Σ p_i ld p_i, summed over all symbols i of the source.
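Shannon's entropy formula can be sketched as a short function that averages the information content over a source's symbol probabilities (names are illustrative):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p_i * ld p_i), in bits per symbol.
    Terms with p = 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A uniform source over 4 symbols yields the maximum of 2 bits per symbol.
shannon_entropy([0.25, 0.25, 0.25, 0.25])  # 2.0
# A biased source carries less information per symbol.
shannon_entropy([0.9, 0.1])                # about 0.469
```

Entropy is maximal when all symbols are equally likely and drops toward zero as the source becomes more predictable.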
