Bitwidths
Bitwidth is the number of bits used to represent a value in a given context, and it determines the range, precision, and storage cost of that value. It applies to integers, floating-point numbers, and other binary encodings; a bitwidth may be fixed by the language or hardware architecture, or it may vary with the program’s data types.
Common fixed bitwidths include 8, 16, 32, and 64 bits. These widths appear in integer and pointer types across most languages and architectures.
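A minimal C sketch (C is an assumed choice; the text names no language) makes these fixed widths concrete using the exact-width types from <stdint.h>:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* Exact-width types guarantee 8, 16, 32, and 64 bits. */
        printf("int8_t:  %zu bits\n", sizeof(int8_t)  * 8);
        printf("int16_t: %zu bits\n", sizeof(int16_t) * 8);
        printf("int32_t: %zu bits\n", sizeof(int32_t) * 8);
        printf("int64_t: %zu bits\n", sizeof(int64_t) * 8);
        /* Pointer width depends on the target, commonly 32 or 64 bits. */
        printf("void*:   %zu bits\n", sizeof(void *) * 8);
        return 0;
    }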
For integers, unsigned and signed representations differ. With n bits, unsigned integers span 0 to 2^n − 1, while two's-complement signed integers span −2^(n−1) to 2^(n−1) − 1; for example, 8 bits cover 0 to 255 unsigned and −128 to 127 signed.
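A short C sketch illustrating these ranges at 8 bits, using the limit macros from <stdint.h>; the wraparound shown for unsigned arithmetic is standard behavior:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* Unsigned 8-bit range: 0 .. 2^8 - 1 = 255. */
        printf("uint8_t: 0 .. %u\n", (unsigned)UINT8_MAX);
        /* Signed 8-bit two's-complement range: -2^7 .. 2^7 - 1. */
        printf("int8_t:  %d .. %d\n", (int)INT8_MIN, (int)INT8_MAX);
        /* Unsigned arithmetic wraps modulo 2^n. */
        uint8_t x = UINT8_MAX;
        x = x + 1;                        /* 255 + 1 wraps to 0 */
        printf("255 + 1 -> %u\n", (unsigned)x);
        return 0;
    }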
Floating-point numbers use bitwidths that encode a sign, an exponent, and a mantissa. Common IEEE 754 widths are 32-bit (single precision: 1 sign bit, 8 exponent bits, 23 mantissa bits) and 64-bit (double precision: 1 sign bit, 11 exponent bits, 52 mantissa bits).
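This layout can be inspected directly. The following C sketch decodes the three fields of a 32-bit single-precision float by copying its bit pattern into an integer (the example value −6.25f is arbitrary):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        float f = -6.25f;
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);   /* reinterpret the 32-bit pattern */

        unsigned sign     = bits >> 31;            /* 1 bit  */
        unsigned exponent = (bits >> 23) & 0xFFu;  /* 8 bits, biased by 127 */
        unsigned mantissa = bits & 0x7FFFFFu;      /* 23-bit fraction */

        /* Prints: sign=1 exponent=129 (unbiased 2) mantissa=0x480000 */
        printf("sign=%u exponent=%u (unbiased %d) mantissa=0x%06X\n",
               sign, exponent, (int)exponent - 127, mantissa);
        return 0;
    }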
Bitwidths also affect performance and portability. They influence memory usage, alignment, and the interface between software and hardware: wider types consume more storage and may require stricter alignment, and code that assumes one width (for example, a 32-bit pointer) can break on an architecture with a different native word size.
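A brief C sketch of the alignment effect: a struct mixing widths is usually padded beyond the sum of its members (struct Mixed here is a hypothetical example type):

    #include <stdalign.h>
    #include <stdint.h>
    #include <stdio.h>

    struct Mixed {
        uint8_t  a;   /* 1 byte */
        uint64_t b;   /* 8 bytes, typically 8-byte aligned */
    };

    int main(void) {
        /* Padding for alignment usually makes this 16 bytes, not 9. */
        printf("sizeof(struct Mixed) = %zu\n", sizeof(struct Mixed));
        printf("alignof(uint64_t)    = %zu\n", (size_t)alignof(uint64_t));
        /* Pointer width varies by target, a common portability hazard. */
        printf("sizeof(void *)       = %zu\n", sizeof(void *));
        return 0;
    }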