a) nope, "int" has always been CPU-specific, and was 16 bits on 16-bit CPUs.
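   For the record, a quick sketch you can compile to check on your own
   box (nothing here beyond standard C):

   #include <stdio.h>

   int main(void)
   {
       /* sizes are implementation-defined: 2 on the old 16-bit
          compilers, 4 on most 32-bit targets */
       printf("int:  %u bytes\n", (unsigned)sizeof(int));
       printf("long: %u bytes\n", (unsigned)sizeof(long));
       return 0;
   }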
b) dunno, I can printf a (U)LONG just fine; what compiler are you using?
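   This prints fine here, assuming (U)LONG is just a typedef for
   (unsigned) long; the trick is the "l" length modifier:

   #include <stdio.h>

   int main(void)
   {
       long          sl = -123456789L;
       unsigned long ul = 3000000000UL;

       printf("%ld\n", sl);   /* signed long   -> %ld */
       printf("%lu\n", ul);   /* unsigned long -> %lu */
       return 0;
   }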
c) it seems that you (or the compiler) are interpreting an unsigned
int as a signed int. Have a real close look at your sources,
or post them here.
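   The usual culprit is a %d where a %u belongs; guessing at what your
   code might look like:

   #include <stdio.h>

   int main(void)
   {
       unsigned int u = 3000000000U;   /* fits in a 32-bit unsigned */

       printf("%d\n", u);   /* wrong: bits get read as signed, comes out negative */
       printf("%u\n", u);   /* right: prints 3000000000 */
       return 0;
   }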
d) you mean 1500000000+1000000000 doesn't end up negative?
With a 32-bit signed int it should: the sum (2,500,000,000) is past
INT_MAX (0x7fffffff), so the CPU just keeps the low 32 bits and the
result reads as negative. If you get a positive number, the value is
probably being handled (or printed) as unsigned.
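   A quick way to see both sides on a 32-bit int (signed overflow is
   formally undefined, but two's-complement CPUs typically just wrap):

   #include <stdio.h>

   int main(void)
   {
       int a = 1500000000;
       int b = 1000000000;

       /* 2,500,000,000 is past INT_MAX, wraps to -1794967296 */
       printf("signed:   %d\n", a + b);

       /* same bits as unsigned: 2,500,000,000 is below 2^32,
          so it prints as a plain positive number */
       printf("unsigned: %u\n", (unsigned)a + (unsigned)b);
       return 0;
   }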