Hex input as a main parameter

I have to write a program where the input is given as parameters to main (the program is started from cmd), and one of the parameters has to be a 128-bit number written in hex. In the program I then operate on individual bytes. So my question is: how do I implement that? What kind of variable should I use, and how do I read the parameters that are given on the command line? I guess it's just my lack of knowledge of the syntax. I would be very grateful for help.
Thanks a lot :) So I have a string argument which is a very big number, and I have to cut it up to make bytes. By the way, later I have to write these bytes into a 4x4 two-dimensional array; maybe I should do that directly while 'taking' the parameters into the program? Is it possible to write them straight into the array like that? Any idea how to do it, anyone?
#include <iostream>
#include <string>
#include <cstdint>
#include <iomanip>
#include <stdexcept>

int main( int argc, char* argv[] )
{
    const std::size_t N = 16 ;
    const std::size_t NHEXDIGITS = N * 2 ;

    if( argc != 2 )
    {
        std::cerr << "usage: " << argv[0] << " <" << N*8 << "-bit value as " << NHEXDIGITS << " hex digits>\n" ;
        return 1 ;
    }

    std::string hex = argv[1] ;
    if( hex.size() > NHEXDIGITS )
    {
        std::cerr << "too many hex digits\n" ;
        return 1 ;
    }

    std::cout << "command line argument: " << hex << '\n' ;
    // if less than N digits, pad with zeroes on the left
    if( hex.size() < NHEXDIGITS ) hex = std::string( NHEXDIGITS - hex.size(), '0' ) + hex ;

    std::uint8_t octets[N]{} ;

    for( std::size_t i = 0 ; i < N ; ++i )
    {
        try
        {
            // convert each set of two characters to an integer (octet)
            // http://en.cppreference.com/w/cpp/string/basic_string/stoul
            std::size_t pos = 0 ;
            octets[i] = std::stoul( hex.substr( i*2, 2 ), &pos, 16 ) ;
            // stoul stops at the first non-hex character; make sure both characters were used
            if( pos != 2 ) throw std::invalid_argument( "bad hex digit" ) ;
        }
        catch( const std::exception& )
        {
            std::cerr << "command line parse error\n" ;
            return 1 ;
        }
    }

    std::cout << "octets: " << std::hex ;
    for( unsigned int v : octets ) std::cout << std::setw(2) << std::setfill('0') << v << ' ' ;
    std::cout << '\n' ;
}

http://coliru.stacked-crooked.com/a/fe9168d7f1eb3886
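
Since you also asked about the 4x4 array: once you have the 16 octets, copying them into a 4x4 array is just index arithmetic. Here is a minimal sketch; whether the bytes go row by row or column by column (AES, for instance, fills its state column-first) depends on your assignment, so treat the ordering below as an assumption.

#include <cstddef>
#include <cstdint>
#include <iomanip>
#include <iostream>

int main()
{
    // the 16 octets parsed above, hard-coded here so the sketch stands alone
    std::uint8_t octets[16] = { 0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77,
                                0x88, 0x99, 0xaa, 0xbb, 0xcc, 0xdd, 0xee, 0xff } ;

    std::uint8_t state[4][4] ;
    for( std::size_t i = 0 ; i < 16 ; ++i )
        state[i/4][i%4] = octets[i] ; // row-major fill; use state[i%4][i/4] for column-major

    std::cout << std::hex << std::setfill('0') ;
    for( std::size_t r = 0 ; r < 4 ; ++r )
    {
        for( std::size_t c = 0 ; c < 4 ; ++c )
            std::cout << std::setw(2) << unsigned( state[r][c] ) << ' ' ;
        std::cout << '\n' ;
    }
}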
I'd worry about splitting them up first, I guess. Look at this:
http://www.cplusplus.com/reference/ios/hex/
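
For instance, std::hex also works on input streams, so one way to split the string up is to feed two characters at a time into a std::istringstream. A small sketch (the 32-digit string is just an example value):

#include <cstddef>
#include <iostream>
#include <sstream>
#include <string>

int main()
{
    std::string hex = "00112233445566778899aabbccddeeff" ; // example input

    for( std::size_t i = 0 ; i + 2 <= hex.size() ; i += 2 )
    {
        unsigned int value = 0 ;
        std::istringstream stm( hex.substr( i, 2 ) ) ;
        if( !(stm >> std::hex >> value) ) { std::cerr << "bad hex digits\n" ; return 1 ; }
        std::cout << value << ' ' ; // one byte per pair of hex digits
    }
    std::cout << '\n' ;
}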

Or, to convert to binary, do something like this:
// needs #include <bitset> and #include <iostream>
int inputNumber = 1231234;

std::bitset<32> x(inputNumber); // prints the bits, most significant first
std::cout << x << '\n';

(change that 32 to 128 if needed, though note that an int only holds 32 bits, so a full 128-bit value would have to be filled in from the string instead)
bitset info:
http://www.cplusplus.com/reference/bitset/bitset/bitset/

This is assuming you're doing this in C++ and not C.
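
If you do want the bitset route for the full 128-bit value, one option is to shift in four bits per hex digit, since bitset supports <<= and |=. A rough sketch, assuming the string has already been checked to contain only hex digits:

#include <bitset>
#include <cctype>
#include <iostream>
#include <string>

int main()
{
    std::string hex = "00112233445566778899aabbccddeeff" ; // example input

    std::bitset<128> bits ;
    for( char c : hex )
    {
        // numeric value of one hex digit (assumes the string was validated beforehand)
        unsigned int digit = std::isdigit( static_cast<unsigned char>(c) )
            ? c - '0'
            : std::tolower( static_cast<unsigned char>(c) ) - 'a' + 10 ;
        bits <<= 4 ;                        // make room for four more bits
        bits |= std::bitset<128>( digit ) ; // OR the digit into the low nibble
    }

    std::cout << bits << '\n' ;
}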
Thank you both, but now I know more exactly what I need. I get a string argument in a function, e.g. 00112233445566778899aabbccddeeff, and I need to make an unsigned char array out of it which looks like this:

unsigned char array[16] = { 0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88, 0x99, 0xaa, 0xbb, 0xcc, 0xdd, 0xee, 0xff };

Yup, I am writing in C++, but I don't think I have to convert it to binary? I tried using strcpy, but I know I can't use hex with it, and I have no idea how to do it another way.
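
A minimal sketch of that conversion, using the same std::stoul approach as above and assuming the argument is exactly 32 valid hex digits (real code should validate both):

#include <cstddef>
#include <iomanip>
#include <iostream>
#include <string>

int main( int argc, char* argv[] )
{
    if( argc != 2 ) return 1 ;
    std::string hex = argv[1] ;       // e.g. "00112233445566778899aabbccddeeff"
    if( hex.size() != 32 ) return 1 ; // assumption: exactly 32 hex digits

    unsigned char array[16] ;
    for( std::size_t i = 0 ; i < 16 ; ++i ) // two hex digits -> one byte
        array[i] = static_cast<unsigned char>( std::stoul( hex.substr( i*2, 2 ), nullptr, 16 ) ) ;

    // array now holds { 0x00, 0x11, ..., 0xff } for the example input
    std::cout << std::hex << std::setfill('0') ;
    for( unsigned int v : array ) std::cout << std::setw(2) << v << ' ' ;
    std::cout << '\n' ;
}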