I've got the following function, whose default argument values appear to be corrupted when it is called:
IError * SetSize( unsigned long nWd, unsigned long nHt,
long nEx1 = -1, long nEx2 = -1, long nEx3 = -1, long nEx4 = -1 );
When it is called like this:
// nWd=4028, nHt=1324, nExtra=20
SetSize( nWd, nHt, nExtra );
The actual values I then see inside the function are:
nWd=4028, nHt=1324, nEx1=20, nEx2=0, nEx3=0, nEx4=0
But nEx2, nEx3 and nEx4 should all be set to -1.
I have no idea why this is happening. The same code works fine when built with CodeWarrior, and on Windows with Visual Studio. Why do GCC (3.3) and Xcode (1.5) get it wrong?
Has anyone else seen anything like this? I tried creating a simple test case, but I cannot reproduce the problem with it. I suspect it's something really nasty, so I'm hoping someone has run into this before and can point me in the right direction.
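For reference, this is the shape of the test case I tried (a stand-in: the real SetSize returns an IError* and does more than print, but the signature and defaults are the same):

#include <cstdio>

struct IError;   // opaque here; only the pointer type matters

IError * SetSize( unsigned long nWd, unsigned long nHt,
                  long nEx1 = -1, long nEx2 = -1, long nEx3 = -1, long nEx4 = -1 )
{
    // Print what the function actually receives.
    std::printf( "nWd=%lu, nHt=%lu, nEx1=%ld, nEx2=%ld, nEx3=%ld, nEx4=%ld\n",
                 nWd, nHt, nEx1, nEx2, nEx3, nEx4 );
    return 0;   // stand-in body
}

int main()
{
    unsigned long nWd = 4028, nHt = 1324;
    long nExtra = 20;
    SetSize( nWd, nHt, nExtra );   // here this prints nEx2=nEx3=nEx4=-1, as expected
    return 0;
}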
What optimization level are you running at? If the code is optimized, the default assignments may not be made until the parameters are actually used inside the function, so the debugger can show stale values (such as those zeros) in the meantime, even though the generated code is correct.
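A quick way to separate real corruption from a debugger artifact (a sketch; SetSizeStub below is hypothetical, not your real SetSize): print the parameters inside the function, and call it once relying on the defaults and once with the trailing -1s written out. Default arguments are substituted by the compiler at the call site, so both calls must print the same thing at any optimization level; if they do, the zeros you see are just the optimized build's debug info misleading you.

#include <cstdio>

// Hypothetical stub with the same signature and defaults as SetSize.
void SetSizeStub( unsigned long nWd, unsigned long nHt,
                  long nEx1 = -1, long nEx2 = -1, long nEx3 = -1, long nEx4 = -1 )
{
    std::printf( "nEx1=%ld, nEx2=%ld, nEx3=%ld, nEx4=%ld\n",
                 nEx1, nEx2, nEx3, nEx4 );
}

int main()
{
    // The compiler fills in the defaults at the call site...
    SetSizeStub( 4028, 1324, 20 );
    // ...so this is exactly what the call above compiles to.
    SetSizeStub( 4028, 1324, 20, -1, -1, -1 );
    return 0;
}

Both calls print nEx1=20, nEx2=-1, nEx3=-1, nEx4=-1, regardless of -O level.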