Lottery Problem

The code is written to solve a lottery problem. The user must guess a two-digit number, and:
1. If the user's input matches the lottery number in exact order, the award is $10,000.
2. If the user matches both digits but in a different order, the award is $3,000.
3. If the user matches one digit, the award is $1,000.
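For context, here is a minimal sketch of what the full check might look like. This is my own reconstruction, not the original program; the variable names (guess, guessDigit1, and so on) and the exact comparison order are assumptions.

#include <cstdlib>
#include <ctime>
#include <iostream>

int main()
{
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    int lottery = std::rand() % 100;   // random number in 0..99

    int guess;
    std::cout << "Enter your two-digit guess: ";
    std::cin >> guess;

    // Split both numbers into their two digits (hypothetical names).
    int lotteryDigit1 = lottery / 10, lotteryDigit2 = lottery % 10;
    int guessDigit1 = guess / 10, guessDigit2 = guess % 10;

    if (guess == lottery)
        std::cout << "Exact match: you win $10,000\n";
    else if (guessDigit1 == lotteryDigit2 && guessDigit2 == lotteryDigit1)
        std::cout << "Matched both digits: you win $3,000\n";
    else if (guessDigit1 == lotteryDigit1 || guessDigit1 == lotteryDigit2
          || guessDigit2 == lotteryDigit1 || guessDigit2 == lotteryDigit2)
        std::cout << "Matched one digit: you win $1,000\n";
    else
        std::cout << "Sorry, no match\n";
}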

So to generate the lottery number he uses

srand(time(0));
int lottery = rand() % 100;

Am I right that this generates a random number between 0 and 99, since % 100 keeps only the remainder after dividing by 100?
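A quick test I could run to convince myself of the range (my own snippet, not part of the original program):

#include <cstdlib>
#include <iostream>

int main()
{
    int lo = 99, hi = 0;
    for (int i = 0; i < 100000; ++i) {
        int n = std::rand() % 100;   // remainder is always in 0..99
        if (n < lo) lo = n;
        if (n > hi) hi = n;
    }
    std::cout << "min=" << lo << " max=" << hi << '\n';  // expect min=0 max=99
}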

OK, on to part 2, where we now find the digits of the lottery number:

int lotteryDigit1 = lottery / 10;  // tens digit
int lotteryDigit2 = lottery % 10;  // ones digit

Now this is where I am confused. I don't understand why you divide the lottery by 10 for the first digit but use a different operation (the remainder) for the second digit. Shouldn't the method be the same for both?
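Here is a worked trace I put together, assuming the lottery number happens to be 47 (an example value, since the real one is random). Integer division by 10 drops the ones digit, while % 10 keeps only it:

#include <iostream>

int main()
{
    int lottery = 47;                  // example value; the real one is random
    int lotteryDigit1 = lottery / 10;  // integer division drops the remainder: 47 / 10 == 4
    int lotteryDigit2 = lottery % 10;  // modulo keeps only the remainder:      47 % 10 == 7
    std::cout << lotteryDigit1 << ' ' << lotteryDigit2 << '\n';  // prints "4 7"
}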