STL bitset count() implementation

I am wondering how the STL bitset<N> member function count() is implemented, that is, how the number of set bits is counted.

For example, there is a simple approach where we AND the unsigned long representation with 2^i and see whether the result is 2^i; if it is, the ith bit is set, and we count the number of such bits.

This is a simplistic approach (sketched below), and there are known techniques that do better.
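To make that concrete, here is a minimal sketch of the approach I mean (the name naive_count is just illustrative, not from any library):

#include <cstddef>

// Counts set bits by testing each position in turn; one AND per bit.
std::size_t naive_count(unsigned long value)
{
    std::size_t count = 0;
    for (std::size_t i = 0; i < sizeof(unsigned long) * 8; ++i)
    {
        // (1UL << i) is 2^i; the AND is nonzero exactly when bit i is set.
        if (value & (1UL << i))
            ++count;
    }
    return count;
}

This does constant work per bit, so the total work is proportional to the number of bits regardless of how many are actually set.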
You'll need to look up the complexity requirements for bitset; count() will be either constant or linear time.

SGI's reference implementation runs in linear time with respect to the number of bytes needed to store the bits.
It does this by creating a static array of 256 integers, where the value stored at the ith index is the number of bits set in the value i. Counting then takes one table lookup per byte of storage.
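A minimal sketch of that table-based idea (just an illustration in the spirit of the description above, not the actual SGI source; the names bit_count, init_bit_count and table_count are made up):

#include <cstddef>

// bit_count[i] holds the number of set bits in the byte value i.
static unsigned char bit_count[256];

void init_bit_count()
{
    for (int i = 0; i < 256; ++i)
        bit_count[i] = (i & 1) + bit_count[i / 2];  // low bit of i plus bits of i >> 1
}

std::size_t table_count(const unsigned char* bytes, std::size_t nbytes)
{
    std::size_t count = 0;
    for (std::size_t b = 0; b < nbytes; ++b)
        count += bit_count[bytes[b]];               // one table lookup per byte
    return count;
}

That is where the linear-in-bytes behaviour comes from: one lookup and one addition per byte, independent of how many bits are set. (The actual library table is precomputed as a static const array rather than filled in at run time.)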