Possible GCC float comparison bug? Or am I doing it wrong?
- Subject: Possible GCC float comparison bug? Or am I doing it wrong?
- From: David Rogers <email@hidden>
- Date: Tue, 19 Feb 2008 21:27:25 -0600
This may be completely obvious, but I can't tell what's going on. As far as I know, the first comparison should be true. I've tried it with Xcode 3.0 on both Intel and PPC and get the same results on both.
-0.001 is less than -0.1, right? Or does the compiler see things differently? Here's a sample app to illustrate the behavior that I'm seeing.
#include <iostream>

int main (int argc, char * const argv[]) {
    float a = -0.001f;
    if (a < -0.1f) {
        std::cout << "I should see this message";
    }
    if (a < 0.0f) {
        std::cout << "I should see this message too";
    }
    return 0;
}
And the output:
[Session started at 2008-02-19 21:19:57 -0600.]
I should see this message too
The Debugger has exited with status 0.
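In case it helps narrow things down, here's an even smaller check of the same two comparisons. This is just a minimal sketch, separate from the sample above; it prints the value of a and both comparison results using std::boolalpha:

#include <iostream>

int main() {
    float a = -0.001f;
    // Print the value and both comparison results as true/false,
    // so the two conditions can be seen side by side.
    std::cout << std::boolalpha
              << "a = " << a << "\n"
              << "(a < -0.1f) = " << (a < -0.1f) << "\n"
              << "(a < 0.0f)  = " << (a < 0.0f) << "\n";
    return 0;
}

On my machine this prints false for the first comparison and true for the second, which matches the output above.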
What am I doing wrong? I really doubt that this is a compiler bug, but I suppose it's possible.
Thanks in advance for any help,
Dave Rogers