Instead of 0 and 1, incomprehensible characters are displayed

In the output (below), the encode line shows unreadable characters, although in theory the result of the calculation should be 0110 ^ 1111. What is the problem?

#include <bits/stdc++.h>

unsigned foo(unsigned l2, unsigned start_state2, std::string& key2) {
    uint32_t state = 0b1111111111111111111111111111111;
    uint32_t start_state{ state };
    uint32_t period{ 0 };
    uint32_t bit2 { 0 };
    do
    {
        bit2 = ((state >> 31) ^ (state >> 30) ^ (state >> 29) ^ (state >> 25) ^ state);
        key2[period] = bit2;
        //std::cout << std::bitset<1>(key2[period]) << " ";
        state = (state >> 1) | (bit2 << 31);

        std::cout << std::bitset<32>(state) << std::endl;
        period++;

    } while ((period != 15) && (start_state != state));

    std::cout << "Period: " << period << std::endl;
}

std::string encode2(std::string data, std::string& key2) {
    std::string output = data;
    for(int i=0; i < data.size(); ++i) {
        output[i] = data[i] ^ (key2[i]) /*+ '0'*/;
    }
    return output;
}

int main() {
    unsigned bit2;
    unsigned start_state2;
    unsigned l2 = start_state2;
    std::string key2 = " ";
    foo(l2, start_state2, key2);


    std::string data2 = "1111";
    std::cout << "data: " << data2 << std::endl;

    std::cout << "key:  ";
    for(int i=0; i < data2.size(); ++i) {
        key2[i] += '0';
        std::cout << std::bitset<1>(key2[i]);
    }
    std::cout << std::endl;

    std::string Encode2 = encode2(data2, key2);
    std::string decode2 = encode2(Encode2, key2);
    std::cout << "encode: " << Encode2 << std::endl;
    std::cout << "decode: " << decode2 << std::endl;
}



00111111111111111111111111111111
10011111111111111111111111111111
11001111111111111111111111111111
01100111111111111111111111111111
00110011111111111111111111111111
10011001111111111111111111111111
01001100111111111111111111111111
00100110011111111111111111111111
10010011001111111111111111111111
11001001100111111111111111111111
11100100110011111111111111111111
01110010011001111111111111111111
00111001001100111111111111111111
00011100100110011111111111111111
10001110010011001111111111111111
Period: 15
data: 1111
key:  0110
encode: ├ ╓¤
decode: 1111

prog2222 wrote:
Although, in theory, the result of the calculation should be 0110 ^ 1111. What is the problem?
You seem to be getting confused between your data types.

edit:

I started working on a Linear Feedback Shift Register but I'm not entirely sure I understand what you want to do with it. Have you got something in mind or are you just exploring cryptography?

#include <iostream>
#include <bitset>
#include <sstream>
#include <climits>  // CHAR_BIT, used in show_binary_rep
#include <cstddef>  // std::byte, used in main

template <typename T>
void show_binary_rep(const T& val)
{
    const char* beg = reinterpret_cast<const char*>(&val);
    const char* end = beg + sizeof(val);
    while (beg != end)
        std::cout << std::bitset<CHAR_BIT>(*beg++);
    std::cout << '\n';
}

uint32_t LFSR()
{
    static uint32_t SiftRegister = 43722;
    SiftRegister = ((((SiftRegister >> 31)
        ^ (SiftRegister >> 6)
        ^ (SiftRegister >> 4)
        ^ (SiftRegister >> 2)
        ^ (SiftRegister >> 1)
        ^ SiftRegister)
        & 1)
        << 31)
        | (SiftRegister >> 1);
    return SiftRegister & 1;
}

std::string Encode(const std::string plainText, const std::string key)
{
    auto index{ 0 };
    const auto key_length = key.length();
    std::stringstream ss;
    for (const char c : plainText)
    {
        ss << static_cast<char>(c ^ key[index % key_length]);
        index++;
    }
    return ss.str();
}

int main()
{
    std::byte i{ 1 };
    std::byte t{'1'};
    std::byte b{ 0b00000110 };
    
    std::cout << "Binary representation of the number 1:    "; show_binary_rep(i);
    std::cout << "Binary representation of the character 1: "; show_binary_rep(t);
    std::cout << "Binary representation of a bit pattern:   "; show_binary_rep(b);
    
    std::cout << "-------------------------------------------------------------\n\n";

    std::string text{ "The quick brown fox jumped over the lazy dog." };
    std::string key{ "GreyWolf" };
    std::string cypher{ Encode(text, key) };

    std::cout << "Plain text:   " << text << '\n';
    std::cout << "Cypher text:  " << cypher << '\n';
    std::cout << "Decoded text: " << Encode(cypher, key) << '\n';
}

Binary representation of the number 1:    00000001
Binary representation of the character 1: 00110001
Binary representation of a bit pattern:   00000110
-------------------------------------------------------------

Plain text:   The quick brown fox jumped over the lazy dog.
Cypher text:  ‼→Y&→♣♣,R♂8↑☻F!↔↔Y=→☺▬"▬E▬!
▲F3→Y;♫▬▼g▬
▲y
Decoded text: The quick brown fox jumped over the lazy dog.
#include <bits/stdc++.h>


Please don't do this; just include the header for whatever container or algorithm you need to use, and let the implementation decide what it needs from the bits directory.
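For the posted program, that means spelling out just the headers it actually uses:

```cpp
// Instead of <bits/stdc++.h>, include only what the program uses:
#include <iostream>  // std::cout, std::endl
#include <bitset>    // std::bitset
#include <string>    // std::string
```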
@OP
As people have already written, your code is an uncoordinated mish-mash of strings, characters, integers and bitsets.

In common language it's all over the place like a dog's breakfast ...

You need to organize your thinking by coding something like the following steps, and don't move a millimeter into the next step until the preceding step works perfectly the way you want:

1. Get a string
2. Convert it to a bitset ( #include <bitset> )

3. Get a key as a string
4. Convert it to a bitset

5. Encode via bit-shifting with the 2 bitsets to get the ciphertext.
6. Decode the bitset ciphertext to get the plaintext.
@OP
Along the lines of the previous post, and after playing around with it ...

The aspect that now interests me is deriving the key from a string to ensure there is a further element of randomness, which is fairly easy. There is also no reason why SIZE can't be a much larger number than 8, and with shifts etc. more complex use can be made of just parts of a bitset to encode ASCII.

#include <bitset>
#include <string>
#include <iostream>
#include <vector>
#include <iomanip>

const unsigned SIZE{8};

int main()
{
    // DATA
    std::string data;
    //    std::cout << "Enter data string: ";
    //    std::cin >> data;
    data = "12345ABcdef";
    std::cout << "Data string: " << data << '\n';
    
    for(int i = 0; i < data.length(); i++)
    {
        std::bitset<SIZE> d(data[i]);
        
        std::cout << d << ' ';
    }
    std::cout << '\n';
    
    // KEY
    char key;
    //    std::cout << "Enter key string: ";
    //    std::cin >> key;
    key = '5';
    std::cout << "Key string: " << key << '\n';
    
    std::bitset<SIZE> k(key);
    
    // ENCODE
    std::vector<std::bitset<SIZE>> vec_bitset_data;
    
    for(int i = 0; i < data.length(); i++)
    {
        std::bitset<SIZE> d(data[i]);
        std::bitset<SIZE> encoded = d^k;
        vec_bitset_data.push_back(encoded);
        
        std::cout << encoded << ' ';
    }
    std::cout << '\n';
    
    // DECODE
    for(int i = 0; i < vec_bitset_data.size(); i++)
    {
        std::bitset<SIZE> decrypted = vec_bitset_data[i] ^ k;
        std::cout
        << std::setw(SIZE + 1) << std::left << decrypted;
    }
    std::cout << '\n';
    
    for(int i = 0; i < vec_bitset_data.size(); i++)
    {
        std::bitset<SIZE> decrypted = vec_bitset_data[i] ^ k;
        std::cout
        << std::setw(SIZE + 1) << std::left
        << static_cast<unsigned char>( decrypted.to_ulong() );
    }
    std::cout << '\n';
    
    return 0;
}



Data string: 12345ABcdef
00110001 00110010 00110011 00110100 00110101 01000001 01000010 01100011 01100100 01100101 01100110 
Key string: 5
00000100 00000111 00000110 00000001 00000000 01110100 01110111 01010110 01010001 01010000 01010011 
00110001 00110010 00110011 00110100 00110101 01000001 01000010 01100011 01100100 01100101 01100110 
1        2        3        4        5        A        B        c        d        e        f        
Program ended with exit code: 0
@againtry thanks! Can you explain the results? For example, we have key = 5 (00000101), and for 1 we get 00000100. But 00110001 ^ 00000101 = 00110100, while the program gives 00000100. Why?
You are still getting confused with types.

The 'key' that is used is the ASCII char '5', which has the bit pattern 00110101.
The first char of the data string is the ASCII char '1', which has the pattern 00110001.

00110001 - '1'
00110101 - '5'
======== - ^
00000100 - result

Thanks. I just used 00000101 before, not ASCII.
Thanks. I just used 00000101 before, not ASCII.


@prog2222
Keep in mind that the way you generate your key in the first place doesn't have to be based on the ASCII '5' value, which is 53, i.e. 0011 0101.
Your key was based on the numerical integer value 5, i.e. 00000101.

As long as you have consistency as far as encryption and decryption with respect to keys, the value for the key can be anything you like. The more arbitrary and random it is the better.
@againtry I tried to rewrite the code, but I get a wrong result in the ENCODE function, while in main() I get the correct value. It doesn't seem to me that the problem is a wrong type...?

#include <iostream>
#include <bitset>
#include <string>
#include <vector>
#include <iomanip>

unsigned LFSR2(unsigned& lfsr2, unsigned& start_state2, std::string& key2) {
    uint32_t state = 0b1111111111111111111111111111111;
    uint32_t start_state{ state };
    uint32_t period{ 0 };
    char bit2 { 0 };
    do {
        bit2 = ((state >> 31) ^ (state >> 30) ^ (state >> 29) ^ (state >> 25) ^ state);
        key2[period] = bit2;
        std::cout << std::bitset<1>(key2[period]) << " ";
        state = (state >> 1) | (bit2 << 31);

        std::cout << std::bitset<32>(state) << std::endl;
        period++;

    } while ((period != 15) && (start_state != state));

    std::cout << "Period: " << period << std::endl;
}

const int SIZE{8};
std::string ENCODE(std::string& data, std::string& key2) {
    std::string output = " ";
    for(int i=0; i < data.size(); ++i) {
        output[i] = data[i] ^ key2[i /*% key2.size()*/];
        std::cout << std::bitset<SIZE>(output[i]) << " ";
    }
    return output;
}

int main() {
    unsigned start_state2;
    std::string key2 = " ";
    unsigned lfsr2 = start_state2;

    LFSR2(lfsr2, start_state2, key2);

    std::string data;
    data = "Hello";

    std::cout << "\nData string: " << data << '\n';
    for(int i = 0; i < data.size(); i++) {
        std::bitset<SIZE> d(data[i]);
        std::cout << d << ' ';
    }
    std::cout << '\n';

    std::cout << "Key: ";
    for(int i=0; i < 8; ++i) {
        std::cout << std::bitset<1>(key2[i]);
    }
    std::cout << '\n';




//    std::cout << "TRUE VALUE 1:";
//    for(int i=0; i < 8; ++i) {
//        std::cout << std::bitset<1>((data[i] ^ (key2[i/*% key2.size()*/])));
//    }
//    std::cout << '\n';



    ENCODE(data, key2);

    std::cout << '\n';
    return 0;
}
You are setting output to " " (size 1), but the code assumes that it has the same size as data.

Have

std::string output = " ";

as:

std::string output;

Then the two lines in the loop become:

output += data[i] ^ key2[i];
std::cout << std::bitset<SIZE>(output.back()) << " ";

PS There is the same problem in LFSR2: key2 is also size 1, yet it is written with key2[period].
@seeplus
But I get the same result

Data string: Hello
01001000 01100101 01101100 01101100 01101111
Key: 01100100
TRUE:00101100
10001010 10000100 11011011 11110000 10100001
prog2222, are you trying to use an LFSR to generate a single byte to use in an xor encryption function?
@TheGreyWolf
I use only the last bits of the LFSR sequence. That is the key. Now it is 01100100...
As long as the bitset SIZEs are compatible, the source of the key is immaterial. Here's a hard-coded version that bears no relation to anything other than what I typed in.

All the jargon going on doesn't seem to be relevant at all to the problem you have in confusing types.

Of course you might have some sort of cyclic key in mind - the next Zodiac killer breakthrough - and good luck with that.

#include <bitset>
#include <string>
#include <iostream>
#include <vector>
#include <iomanip>

const unsigned SIZE{20}; // <--

int main()
{
    // DATA
    std::string data;
    //    std::cout << "Enter data string: ";
    //    std::cin >> data;
    data = "12345ABcdef";
    std::cout << "Data string: " << data << '\n';
    
    for(int i = 0; i < data.length(); i++)
    {
        std::bitset<SIZE> d(data[i]);
        std::cout << d << ' ';
    }
    std::cout << '\n';
    
    // KEY
    char key;
    //    std::cout << "Enter key string: ";
    //    std::cin >> key;
    
    //    key = '5';
    //    std::bitset<8> k(key);
    
    std::bitset<SIZE> k = 0b11011011001100111010; // A RANDOM HARD-CODED KEY
    std::cout
   // << "Key string: " << key << '\n'
    << k << '\n';
    
    // ENCODE
    std::vector<std::bitset<SIZE>> vec_bitset_data;
    
    for(int i = 0; i < data.length(); i++)
    {
        std::bitset<SIZE> d(data[i]);
        std::bitset<SIZE> encoded = d^k;
        vec_bitset_data.push_back(encoded);
        
        std::cout << encoded << ' ';
    }
    std::cout << '\n';
    
    // DECODE
    for(int i = 0; i < vec_bitset_data.size(); i++)
    {
        std::bitset<SIZE> decrypted = vec_bitset_data[i] ^ k;
        std::cout
        << std::setw(SIZE + 1) << std::left << decrypted;
    }
    std::cout << '\n';
    
    for(int i = 0; i < vec_bitset_data.size(); i++)
    {
        std::bitset<SIZE> decrypted = vec_bitset_data[i] ^ k;
        std::cout
        << std::setw(SIZE + 1) << std::left
        << static_cast<unsigned char>( decrypted.to_ulong() );
    }
    std::cout << '\n';
    
    return 0;
}


Data string: 12345ABcdef
00000000000000110001 00000000000000110010 00000000000000110011 00000000000000110100 00000000000000110101 00000000000001000001 00000000000001000010 00000000000001100011 00000000000001100100 00000000000001100101 00000000000001100110 
Key string: 
11011011001100111010
11011011001100001011 11011011001100001000 11011011001100001001 11011011001100001110 11011011001100001111 11011011001101111011 11011011001101111000 11011011001101011001 11011011001101011110 11011011001101011111 11011011001101011100 
00000000000000110001 00000000000000110010 00000000000000110011 00000000000000110100 00000000000000110101 00000000000001000001 00000000000001000010 00000000000001100011 00000000000001100100 00000000000001100101 00000000000001100110 
1                    2                    3                    4                    5                    A                    B                    c                    d                    e                    f                    
Program ended with exit code: 0

@againtry thank you very much.
But I have some other problems...
Now this code gives me key2 like 0110..., but when I use std::bitset<1>(key2) for a zero I get 0, while with std::bitset<8>(key2) for a zero I get 11000010. And the results of ^ are correct.

#include <iostream>
#include <bitset>
#include <string>
#include <vector>
#include <iomanip>

std::vector<char> foo(std::vector<char>& key2) {
    uint32_t state = 0b1111111111111111111111111111111;
    uint32_t start_state2{ state };
    uint32_t period {0};
    uint32_t bit2 {0};
    do {
        bit2 = ((state >> 31) ^ (state >> 30) ^ (state >> 29) ^ (state >> 25) ^ state);
        //bit2 = key2[period];
        (key2[period]) = (bit2);
        std::cout << std::bitset<8>(key2[period]) << " ";
        state = (state >> 1) | (bit2 << 31);

        std::cout << std::bitset<32>(state) << std::endl;
        period++;

    } while ((period != 15) && (start_state2 != state));

    std::cout << "Period: " << period << std::endl;
    return key2;
}

const int SIZE{8};
void ENCODE(std::string data, std::vector<char>& key2) {
    std::vector<std::bitset<SIZE>> vec_bitset_data;

        for(unsigned i=0; i < data.size(); ++i) {
            std::bitset<SIZE> d(data[i]);
            std::bitset<SIZE> k(key2[i]);

            std::bitset<SIZE> encoded = d^k;
           std::cout << encoded << ' ';
        }
        std::cout << " ";
}

int main() {
    std::vector<char> key2(32);


    foo(key2);

    std::string data;
    data = "Hello";

    std::cout << "\nData string: " << data << '\n';
    for(unsigned i = 0; i < data.size(); i++) {
        std::bitset<8> d(data[i]);
        std::cout << d << ' ';
    }
    std::cout << '\n';

    std::cout << "Key: ";
    for(unsigned i=0; i < data.size(); ++i) {
        std::cout << std::bitset<8>(key2[i]) << " ";
    }
    std::cout << '\n';

    ENCODE(data, key2);
    return 0;
}



OUTPUT
Data string: Hello
01001000 01100101 01101100 01101100 01101111          //SYMBOLS OF EACH LETTER 
Key: 11000010 11100001 10110111 10011100 11001110   //KEY FOR STD :: BITSET <8> 
10001010 10000100 11011011 11110000 10100001          //RESULT OF ^ 
prog2222 wrote:
But I have some other problems...
Now this code gives me key2 like 0110..., but when I use std::bitset<1>(key2) for a zero I get 0, while with std::bitset<8>(key2) for a zero I get 11000010. And the results of ^ are correct.
I'm not sure I understand what you are saying here...but from your earlier posts I don't think your foo() function is doing what you think it is.
#include <iostream>
#include <bitset>
#include <string>
#include <sstream>
#include <iomanip>
#include <cstddef>
//
//
using LFSR_t = uint8_t;
static LFSR_t shift_register{ 165 };
//
//-------------------------------------------------------------------
LFSR_t set_shift_register(uint8_t seed)
{
    if (seed == 0) return 0;
    shift_register = seed;
    return shift_register;
}
//
//-------------------------------------------------------------------
LFSR_t Step_LFSR()
{
    LFSR_t lsb = shift_register & 1 ;
   
    shift_register = (shift_register >> 1) |
                   ( ( ( (shift_register >> 0)
                       ^ (shift_register >> 4)
                       ^ (shift_register >> 5)
                       ^ (shift_register >> 6) )
                     & LFSR_t(1))   // mask the whole feedback term, not just the last tap
                       << 7);
    return lsb;
}
//
//-------------------------------------------------------------------
uint8_t test_LFSR()
{
    LFSR_t start_state{ shift_register };
    uint8_t period{ 0 };
    do
    {
        std::cout << static_cast<int>(Step_LFSR()) << ' ';
        period++;
        if (period % 32 == 0) std::cout << '\n';

    } while (start_state != shift_register);
    std::cout << '\n';
    return period;
}
//
//-------------------------------------------------------------------
std::string Encode(const std::string plainText, const std::string key)
{
    auto index{ 0 };
    const auto key_length = key.length();
    std::stringstream ss;
    for (const char c : plainText)
    {
        ss << static_cast<char>(c ^ key[index % key_length]);
        index++;
    }
    return ss.str();
}
//
//-------------------------------------------------------------------
int main() 
{
    // Test the LFSR.  This should step through all the values of
    // the LFSR until it returns to the starting state.
    // The maximum period for an 8-bit LFSR is 255.
    std::cout << "Testing LFSR...\n";
    int period{ test_LFSR() };
    std::cout << "8-bit LFSR period: " << period << '\n';
    //
    // Using the 'bits' from an LFSR requires that you store them in some way
    // 
    std::bitset<8> key_8{};
    for (int i{ 0 }; i < 8; i++)
    {
        key_8 <<= 1; 
        key_8 |= Step_LFSR();
    }
    for (int i = 0; i < 8; i++)
    {
        std::cout << "Bit " << i << " of " << key_8 << '\n';
        std::cout << std::right << std::setw(17 - i) << key_8.test(i) << '\n';
        std::cout << '\n';
    }
    
    // Set the shift register back to the initial value
    // for comparison with the first
    set_shift_register(165);
    //
    // Generate a key in a char
    char key_char{};
    for (int i{ 0 }; i < 8; i++)
    {
        key_char <<= 1;
        key_char |= Step_LFSR();
    }
    //
    //
    set_shift_register(165);
    //
    std::bitset<32> key_32{};
    for (int i = 0; i < 32; i++)
    {
        key_32 <<= 1;
        key_32 |= Step_LFSR();
    }
    // Horrible kludge to get a bitset to a char.
    // but it is only to show that the bits are the same 
    std::cout << "Generated key_8:    " << key_8 << " as char " << static_cast<char>(key_8.to_ulong()) << '\n';
    std::cout << "Generated key_char: " << std::bitset<8>(key_char) << " as char " << key_char << '\n';
    std::cout << "Generated key_32:   " << key_32 << '\n';

    return 0;
}

Testing LFSR...
1 0 1 0 0 1 0 1 0 0 0 1 0 0 1 0 1 1 0 1 0 0 0 1 1 0 0 1 1 1 0 0
1 1 1 1 0 0 0 1 1 0 1 1 0 0 0 0 1 0 0 0 1 0 1 1 1 0 1 0 1 1 1 1
0 1 1 0 1 1 1 1 1 0 0 0 0 1 1 0 1 0 0 1 1 0 1 0 1 1 0 1 1 0 1 0
1 0 0 0 0 0 1 0 0 1 1 1 0 1 1 0 0 1 0 0 1 0 0 1 1 0 0 0 0 0 0 1
1 1 0 1 0 0 1 0 0 0 1 1 1 0 0 0 1 0 0 0 0 0 0 0 1 0 1 1 0 0 0 1
1 1 1 0 1 0 0 0 0 1 1 1 1 1 1 1 1 0 0 1 0 0 0 0 1 0 1 0 0 1 1 1
1 1 0 1 0 1 0 1 0 1 1 1 0 0 0 0 0 1 1 0 0 0 1 0 1 0 1 1 0 0 1 1
0 0 1 0 1 1 1 1 1 1 0 1 1 1 1 0 0 1 1 0 1 1 1 0 1 1 1 0 0 1 0
8-bit LFSR period: 255
Bit 0 of 10100101
                1

Bit 1 of 10100101
               0

Bit 2 of 10100101
              1

Bit 3 of 10100101
             0

Bit 4 of 10100101
            0

Bit 5 of 10100101
           1

Bit 6 of 10100101
          0

Bit 7 of 10100101
         1

Generated key_8:    10100101 as char Ñ
Generated key_char: 10100101 as char Ñ
Generated key_32:   10100101000100101101000110011100
#include <iostream>
#include <bitset>
#include <string>
#include <vector>
#include <random>

const int SIZE{15};
typedef std::vector< std::bitset<SIZE> > BITVECTOR;

void display_bitvector(BITVECTOR&, std::string, int);
void encode_plain(BITVECTOR&, std::string);
void generate_key(BITVECTOR&);

BITVECTOR encrypt(BITVECTOR&, BITVECTOR& );
BITVECTOR decrypt(BITVECTOR&, BITVECTOR&);

void convert_bitset_to_char(BITVECTOR&, std::string);

int main()
{
    // PLAIN TEXT
    std::string plain_text;
    plain_text = "Hello to you! What are you doing today? 1234567890/#";
    std::cout << "Plain text:\n" << plain_text << "\n\n";
    
    
    // ENCODE TEXT
    BITVECTOR vec_plain;
    encode_plain(vec_plain, plain_text);
    display_bitvector(vec_plain, "Encoded plain text", 80/SIZE);
    
    
    // GENERATE ENCRYPTION KEY(s)
    BITVECTOR vec_key;
    generate_key(vec_key);
    display_bitvector(vec_key, "Key", 80/SIZE);
    
    // ENCRYPT ENCODED TEXT
    BITVECTOR vec_encrypted;
    vec_encrypted = encrypt(vec_plain, vec_key);
    display_bitvector(vec_encrypted, "Encrypted", 80/SIZE);
    
    
    // DECRYPT ENCRYPTED TEXT
    BITVECTOR vec_decrypted;
    vec_decrypted = encrypt(vec_encrypted, vec_key);
    display_bitvector(vec_decrypted, "Decrypted", 80/SIZE);
    
    convert_bitset_to_char(vec_encrypted, "Encrypted text");
    convert_bitset_to_char(vec_decrypted, "Plain text");
    
    return 0;
}

void display_bitvector(BITVECTOR& v, std::string msg, int cols)
{
    std::cout << msg << ":\n";
    int count{0};
    for(auto i: v)
    {
        std::cout << i << ' ';
        count++;
        if (count % cols == 0)
            std::cout << '\n';
    }
    std::cout << "\n\n";
}


void encode_plain(BITVECTOR& vec_plain, std::string plain_text)
{
    for(auto i: plain_text)
    {
        std::bitset<SIZE> d(i);
        vec_plain.push_back(d);
    }
}


void generate_key(BITVECTOR& vec_key)
{
    std::random_device rd;
    std::mt19937 gen(rd());
    std::uniform_int_distribution<> distrib(0, 1);
    
    std::bitset<SIZE> bits;
    for(int j = 0; j < SIZE; j++)
    {
        for (int n = 0; n < SIZE; ++n)
        {
            bits[n] = distrib(gen);
        }
        vec_key.push_back(bits);
    }
}


BITVECTOR encrypt(BITVECTOR& src, BITVECTOR& k )
{
    BITVECTOR bv;
    for(int i = 0; i < src.size(); i++)
    {
        bv.push_back(src[i] ^ k[i % SIZE]);
    }
    return bv;
}


BITVECTOR decrypt(BITVECTOR& src, BITVECTOR& k )
{
    BITVECTOR bv = encrypt( src, k );
    return bv;
}

void convert_bitset_to_char(BITVECTOR& src, std::string msg)
{
    std::cout << msg << ":\n";
    
    for(int i = 0; i < src.size(); i++)
    {
        std::cout << static_cast<unsigned char>( src[i].to_ulong() );
    }
    std::cout << "\n\n";
}



Plain text:
Hello to you! What are you doing today? 1234567890/#

Encoded plain text:
000000001001000 000000001100101 000000001101100 000000001101100 000000001101111 
000000000100000 000000001110100 000000001101111 000000000100000 000000001111001 
000000001101111 000000001110101 000000000100001 000000000100000 000000001010111 
000000001101000 000000001100001 000000001110100 000000000100000 000000001100001 
000000001110010 000000001100101 000000000100000 000000001111001 000000001101111 
000000001110101 000000000100000 000000001100100 000000001101111 000000001101001 
000000001101110 000000001100111 000000000100000 000000001110100 000000001101111 
000000001100100 000000001100001 000000001111001 000000000111111 000000000100000 
000000000110001 000000000110010 000000000110011 000000000110100 000000000110101 
000000000110110 000000000110111 000000000111000 000000000111001 000000000110000 
000000000101111 000000000100011 

Key:
110010001101011 010010110111000 110101001011001 100010010100001 111100000010101 
101110110001101 011001101010000 110000001101011 100001101101100 101000100000101 
100000011100001 001000110110101 010101011100011 111101111110011 001111000001011 


Encrypted:
110010000100011 010010111011101 110101000110101 100010011001101 111100001111010 
101110110101101 011001100100100 110000000000100 100001101001100 101000101111100 
100000010001110 001000111000000 010101011000010 111101111010011 001111001011100 
110010000000011 010010111011001 110101000101101 100010010000001 111100001110100 
101110111111111 011001100110101 110000001001011 100001100010101 101000101101010 
100000010010100 001000110010101 010101010000111 111101110011100 001111001100010 
110010000000101 010010111011111 110101001111001 100010011010101 111100001111010 
101110111101001 011001100110001 110000000010010 100001101010011 101000100100101 
100000011010000 001000110000111 010101011010000 111101111000111 001111000111110 
110010001011101 010010110001111 110101001100001 100010010011000 111100000100101 
101110110100010 011001101110011 

Decrypted:
000000001001000 000000001100101 000000001101100 000000001101100 000000001101111 
000000000100000 000000001110100 000000001101111 000000000100000 000000001111001 
000000001101111 000000001110101 000000000100001 000000000100000 000000001010111 
000000001101000 000000001100001 000000001110100 000000000100000 000000001100001 
000000001110010 000000001100101 000000000100000 000000001111001 000000001101111 
000000001110101 000000000100000 000000001100100 000000001101111 000000001101001 
000000001101110 000000001100111 000000000100000 000000001110100 000000001101111 
000000001100100 000000001100001 000000001111001 000000000111111 000000000100000 
000000000110001 000000000110010 000000000110011 000000000110100 000000000110101 
000000000110110 000000000110111 000000000111000 000000000111001 000000000110000 
000000000101111 000000000100011 

Encrypted text:
#\3355\315z\255$L|\216\300\302\323\\331-\201t\3775K&j\224\225
\207\234b\337y\325z\3511S%Ї\320\307>]\217a\230%\242s

Plain text:
Hello to you! What are you doing today? 1234567890/#

Program ended with exit code: 0
@The Grey Wolf
@againtry
Thanks!

But can I do:
LFSR_t Step_LFSR() {
    LFSR_t lsb = shift_register & 1 ;

    shift_register = (shift_register >> 1) | ( (shift_register 
                          ^ (shift_register >> 1)
                          ^ (shift_register >> 18)
                          ^ (shift_register >> 20) 
                          ^ (shift_register >> 39) & LFSR_t(1)) << 40);
    return lsb;
}


Because then I get:

Testing LFSR...
1 0 1 0 0 1 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
...
...

If I write
LFSR_t Step_LFSR() {
    LFSR_t lsb = shift_register & 1 ;

    shift_register = (shift_register >> 1) | ( (shift_register 
                          ^ (shift_register >> 1) 
                          ^ (shift_register >> 18)
                           ^ (shift_register >> 20) 
                          ^ (shift_register >> 39) & LFSR_t(1)) << 7);
    return lsb;
}

I have

Testing LFSR...
1 0 1 0 0 1 0