Bit – definition – A binary digit. A bit takes one of two values, “0” or “1”; eight bits form a byte. The bit is the most basic unit of information in computing and digital communications; the name is a contraction of “binary digit”. A bit represents a logical state with one of two possible values, most commonly written as “1” or “0”, but other representations such as true/false, yes/no, +/−, or on/off are common.
A bit may be physically implemented with any two-state device. The correspondence between its values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
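As a loose illustration of that convention, the following minimal Python sketch (the labels and variable names are assumptions for illustration, not part of any standard) renders the same two-state value under several of the representations named above:

```python
bit = 1  # the logical state; 0 is the other possible value

# One logical bit, several conventional labelings of its two states.
representations = {
    "binary":  str(bit),                  # "1" / "0"
    "boolean": bool(bit),                 # True / False
    "yes/no":  "yes" if bit else "no",
    "on/off":  "on" if bit else "off",
    "sign":    "+" if bit else "-",
}

for name, value in representations.items():
    print(f"{name}: {value}")
```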
The symbol for the binary digit is either “bit”, per recommendation of the IEC 80000-13:2008 standard, or the lowercase character “b”, as recommended by the IEEE 1541-2002 and IEEE Std 260.1-2004 standards.
Units of information
A contiguous group of binary digits is commonly called a bit string, a bit vector, or a single-dimensional (or multi-dimensional) bit array. A group of eight binary digits is called one byte, but historically the size of the byte is not strictly defined. Frequently, half, full, double, and quadruple words consist of a number of bytes which is a low power of two.
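As a sketch of how such a group of bits behaves (plain Python; the variable names and the most-significant-bit-first ordering are illustrative assumptions, not something the text prescribes), the following packs an eight-bit string into one byte and unpacks it again:

```python
bits = [1, 0, 1, 1, 0, 0, 1, 0]  # a bit string of length 8

# Pack the bit string into a single byte, most significant bit first.
byte = 0
for b in bits:
    byte = (byte << 1) | b
print(byte, bin(byte))  # 178 0b10110010

# Unpack the byte back into its eight constituent bits.
unpacked = [(byte >> i) & 1 for i in range(7, -1, -1)]
assert unpacked == bits

# Half, full, double, and quadruple words are correspondingly small
# powers of two in size: on many architectures 2, 4, 8, and 16 bytes.
```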
In information theory, one bit is the information entropy of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. As a unit of information, the bit is also known as a shannon, named after Claude E. Shannon.
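A short worked example makes the definition concrete (plain Python; the helper name binary_entropy is hypothetical): a fair binary variable carries exactly one bit, and a biased one carries less.

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy in bits (shannons) of a {0, 1}-valued variable
    that takes the value 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain, so no information is gained
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))  # 1.0   -> exactly one bit (one shannon)
print(binary_entropy(0.9))  # ~0.469 -> a biased variable conveys less
```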