How can I most efficiently count the number of bits required by an integer (log base 2) in C#? For example:
int bits = 1 + log2(100);
=> bits == 7
Slight improvement to Guffa's answer: since the amount added to the result at each step is always a power of two, using bit operations can produce a slight improvement on some architectures. Also, since the context here is bit patterns, hexadecimal is slightly more readable. Each step shifts n down by a power of two and ORs that same power into the result:
int bits = 0;
if (n > 0xffff) {     // more than 16 bits: fold the top half down
    n >>= 16;
    bits = 0x10;
}
if (n > 0xff) {       // more than 8 of the remaining bits
    n >>= 8;
    bits |= 0x8;
}
if (n > 0xf) {
    n >>= 4;
    bits |= 0x4;
}
if (n > 0x3) {
    n >>= 2;
    bits |= 0x2;
}
if (n > 0x1) {
    bits |= 0x1;
}
Further, a check for n == 0 should be added, since the above yields 0 for that input and log(0) is undefined (regardless of base).
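Putting the shift/OR steps and the zero check together, a complete helper might look like the following sketch (the method name BitsRequired is my own, not from the original answer):

```csharp
using System;

static int BitsRequired(int n)
{
    if (n <= 0)
        throw new ArgumentOutOfRangeException(nameof(n), "log2 is undefined for n <= 0");
    int bits = 0;
    if (n > 0xffff) { n >>= 16; bits = 0x10; }
    if (n > 0xff)   { n >>= 8;  bits |= 0x8; }
    if (n > 0xf)    { n >>= 4;  bits |= 0x4; }
    if (n > 0x3)    { n >>= 2;  bits |= 0x2; }
    if (n > 0x1)    {           bits |= 0x1; }
    return bits + 1;   // bits == floor(log2(n)); +1 gives the bit count
}
```

For example, BitsRequired(100) returns 7, matching the question's 1 + log2(100).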
In ARM assembly this algorithm produces very compact code, as the branch after each comparison can be eliminated with conditional instructions, which avoids pipeline flushes. For example:
if (n > 0xff) {
    n >>= 8;
    bits |= 0x8;
}
becomes (let R0 = n, R1 = bits)
CMP   R0, #0xff          ; compare n against 0xff
MOVHI R0, R0, LSR #8     ; if higher: n >>= 8
ORRHI R1, R1, #0x8       ; if higher: bits |= 0x8
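As an aside, on modern .NET (Core 3.0 and later) the BCL exposes this operation directly, compiled down to a single hardware instruction (LZCNT/CLZ) where available, so the hand-rolled version is mostly of interest on older runtimes. A minimal sketch, with the n == 0 convention chosen here (returning 0) being my own assumption:

```csharp
using System.Numerics;

// BitOperations.Log2 returns floor(log2(n)); it is defined to return 0 for input 0,
// so guard zero separately if "bits required" must treat it specially.
static int BitsRequired(uint n) => n == 0 ? 0 : BitOperations.Log2(n) + 1;
```

Usage: BitsRequired(100) yields 7, the same result as the branchy version above.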