I am interested in whether this is an optimal way of calculating the number of bits set in a byte:
// Counts the set bits of a compile-time constant byte:
// each Bn extracts one bit, and RESULT sums them at compile time.
template< unsigned char byte > class BITS_SET
{
public:
    enum {
        B0 = (byte & 0x01) ? 1 : 0,
        B1 = (byte & 0x02) ? 1 : 0,
        B2 = (byte & 0x04) ? 1 : 0,
        B3 = (byte & 0x08) ? 1 : 0,
        B4 = (byte & 0x10) ? 1 : 0,
        B5 = (byte & 0x20) ? 1 : 0,
        B6 = (byte & 0x40) ? 1 : 0,
        B7 = (byte & 0x80) ? 1 : 0
    };
public:
    enum { RESULT = B0 + B1 + B2 + B3 + B4 + B5 + B6 + B7 };
};
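For context, a minimal usage sketch (the example value 0xAB and the surrounding main function are my own, not part of the question). Note that the template argument must be a compile-time constant:

#include <iostream>

int main()
{
    // BITS_SET<0xAB>::RESULT is evaluated entirely at compile time;
    // 0xAB = 10101011 in binary, which has five bits set.
    std::cout << BITS_SET<0xAB>::RESULT << std::endl;  // prints 5
    return 0;
}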
Is it still optimal when the value of the byte is only known at run time? Is it recommended to use this in code?
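For comparison, here is a hedged sketch of what a run-time version might look like, since the template form cannot accept a value that is not a compile-time constant. The function name count_bits is illustrative, not part of the original question; on a C++20 compiler one would more likely use std::popcount from <bit> or a compiler intrinsic:

#include <cstdint>

// Run-time bit count using Kernighan's trick:
// each iteration clears the lowest set bit, so the loop
// runs once per set bit rather than once per bit position.
static unsigned count_bits(std::uint8_t byte)
{
    unsigned count = 0;
    while (byte)
    {
        byte &= byte - 1;  // clear the lowest set bit
        ++count;
    }
    return count;
}

// With C++20, a single call would suffice:
//   #include <bit>
//   unsigned n = std::popcount(byte);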