Converting a 'long' type into a binary string


My objective is to write an algorithm that converts a long number into its binary representation, stored in a string.

Here is my current block of code:

#include <iostream>
#define LONG_SIZE 64; // size of a long type is 64 bits

using namespace std;

string b10_to_b2(long x)
{
    string binNum;
    if(x < 0) // determine if the number is negative; a two's complement number is negative if its first bit is one.
    {
            binNum = "1";
    }
    else
    {
            binNum = "0";
    }
    int i = LONG_SIZE - 1;
    while(i > 0)
    {
            i --;
            if( (x & ( 1 << i) ) == ( 1 << i) )
            {
                    binNum = binNum + "1";
            }
            else
            {
                    binNum = binNum + "0";
            }
    }
    return binNum;
}

int main()
{
    cout << b10_to_b2(10) << endl;
}

The output of this program is:

00000000000000000000000000000101000000000000000000000000000001010

I want the output to be:

00000000000000000000000000000000000000000000000000000000000001010

Can anyone identify the problem? For whatever reason, the function outputs 10 represented in 32 bits, concatenated with another copy of 10 represented in 32 bits.

There are 2 answers

tejas (best answer)

Why would you assume long is 64 bits? Try const size_t LONG_SIZE = sizeof(long) * 8; instead.

Check this: the program works correctly with my changes: http://ideone.com/y3OeB3

Edit: as @Mats Petersson pointed out, you can make it more robust by changing this line

if( (x & ( 1 << i) ) == ( 1 << i) )

to something like

if( (x & ( 1UL << i) ) )

The UL suffix is important; you can see his explanation in the comments.
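
Since the ideone listing isn't reproduced in the answer, here is a sketch of the question's function with both changes applied (a reconstruction, not the exact code behind the link):

#include <iostream>
#include <string>

using namespace std;

string b10_to_b2(long x)
{
    const size_t LONG_SIZE = sizeof(long) * 8; // actual width of long, not an assumed 64
    string binNum = (x < 0) ? "1" : "0"; // sign bit of the two's complement value
    size_t i = LONG_SIZE - 1;
    while(i > 0)
    {
            i--;
            if( x & ( 1UL << i ) ) // 1UL: the shift happens in unsigned long, not int
            {
                    binNum = binNum + "1";
            }
            else
            {
                    binNum = binNum + "0";
            }
    }
    return binNum;
}

int main()
{
    cout << b10_to_b2(10) << endl;
}

Because LONG_SIZE is derived from sizeof(long), i never reaches the width of unsigned long, so every shift is well-defined; on a platform with a 64-bit long this prints a single 64-character copy of the bit pattern of 10.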

Mats Petersson

Several suggestions; a combined sketch follows the list:

  1. Make sure you use a type that is guaranteed to be 64-bit, such as uint64_t, int64_t or long long.
  2. Use the above-mentioned 64-bit type on the left-hand side of the shift (for example 1ULL << i) so that it calculates correctly. The standard only defines a shift when the number of bits shifted is strictly less than the width of the type being shifted, and the literal 1 has type int, which on most modern platforms (evidently including yours) is 32 bits.
  3. Don't put a semicolon at the end of your #define LONG_SIZE - or better yet, use const int long_size = 64; as this allows all manner of better behaviour, for example in the debugger you can print long_size and get 64, whereas printing LONG_SIZE, a macro, will yield an error.
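
Read together, the three suggestions amount to something like the following sketch (one possible reading of them, not Mats Petersson's own code):

#include <cstdint>
#include <iostream>
#include <string>

using namespace std;

const int long_size = 64; // a constant instead of a macro (suggestion 3)

string b10_to_b2(int64_t x) // a type guaranteed to be 64 bits (suggestion 1)
{
    string binNum = (x < 0) ? "1" : "0"; // sign bit
    int i = long_size - 1;
    while(i > 0)
    {
            i--;
            if( x & ( 1ULL << i ) ) // 64-bit left operand, so the shift is defined for every i used here (suggestion 2)
            {
                    binNum = binNum + "1";
            }
            else
            {
                    binNum = binNum + "0";
            }
    }
    return binNum;
}

int main()
{
    cout << b10_to_b2(10) << endl; // one 64-character string ending in 1010
}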