Decltype.

When I run the following program, I can't figure out why the value of a changes when d is incremented. I think it has to be due to a reference in `decltype((b)) d = a;`, but I just can't work out how a is 'tied' to d in this statement.


#include <iostream>

int main()
{
    int a = 3;
    int b = 4;
    decltype(a) c = a;    // decltype(a) is int, so c is a copy of a
    decltype((b)) d = a;  // decltype((b)) is int&, so d refers to a
    std::cout << a << " " << b << " " << c << " " << d << std::endl; // Output 3 4 3 3
    ++c;
    std::cout << a << " " << b << " " << c << " " << d << std::endl; // Output 3 4 4 3
    ++d;
    std::cout << a << " " << b << " " << c << " " << d << std::endl; // Output 4 4 4 4
    return 0;
}
Algebra is for humans, not for computers. `decltype(a) c = a;` copies the value of a into c, which has its own memory address, so ++c leaves a alone. d, on the other hand, is a reference to a: it names the same memory as a, so ++d increments a as well.

decltype(a) c = a;    =>  int c = a;    i.e. decltype(a) is int

decltype((b)) d = a;  =>  int& d = a;   i.e. decltype((b)) is a reference to int
Thanks for the replies. Sorted.