From Encyclopedia of Mathematics

A binary unit of information, numerically equal to the amount of information obtained during a trial with two mutually exclusive equally probable alternatives:

$$1\ \text{bit} = \log_2 2,$$
the logarithm being taken to the base 2. The bit is the most frequently used unit, but other units of information are also employed: the "Hartley" and the "nit", whose definitions involve decimal and natural logarithms, respectively.
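As a quick numerical check of these definitions, the same quantity of information can be computed in all three units simply by changing the base of the logarithm. The following Python sketch is illustrative; the function name is not part of the original text.

```python
import math

def information_content(n_outcomes, base=2):
    """Information gained from a trial with n_outcomes equally probable,
    mutually exclusive alternatives, in units set by the logarithm base:
    base 2 -> bits, base e -> nits (nats), base 10 -> Hartleys."""
    return math.log(n_outcomes, base)

# A trial with two equiprobable alternatives yields exactly one bit:
bits = information_content(2, base=2)         # 1.0 bit
nits = information_content(2, base=math.e)    # about 0.693 nit
hartleys = information_content(2, base=10)    # about 0.301 Hartley
```

The three values are related by the change-of-base formula, so one bit equals $\ln 2 \approx 0.693$ nit and $\log_{10} 2 \approx 0.301$ Hartley.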


The definition as given comes from information theory. In computer science the term "bit" usually refers to the representation of "0" or "1" by a suitable physical device that can be in exactly one of two alternative states, or even to such a device itself. The units "nit" and "Hartley" are virtually unknown in the West.
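In this computer-science sense a bit is simply a binary digit, and a machine word is a sequence of such two-state values. A minimal Python sketch (the helper name is hypothetical) extracting the individual bits of an integer:

```python
def bits_of(n, width=8):
    """Return the binary digits of a non-negative integer n,
    most significant bit first, padded to the given width."""
    return [(n >> i) & 1 for i in reversed(range(width))]

print(bits_of(5))  # [0, 0, 0, 0, 0, 1, 0, 1]
```

Each list element is one bit in the definitional sense above: a choice between two mutually exclusive alternatives, 0 or 1.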

How to Cite This Entry:
Bit. A.V. Prokhorov (originator), Encyclopedia of Mathematics. URL:
This text originally appeared in Encyclopedia of Mathematics - ISBN 1402006098