Re: NSTimer or double to int cast (Bug?)
- Subject: Re: NSTimer or double to int cast (Bug?)
- From: Marco Binder <email@hidden>
- Date: Sat, 11 Jan 2003 15:35:11 +0100
I am using the same binary, but recompiling it on the iBook (both
machines have the December Dev Tools and gcc 3.1) doesn't change anything.
OK, I've just done some comparisons and calculations:
[timer timeInterval] < 1.0 returns NO on my G4 and YES on my iBook!
([timer timeInterval] - 1.0) * 10000 returns 0.00000 on my G4 and
-0.00150 on my iBook!
So on some machines the returned timeInterval is NOT equal to the value
passed at initialization. I still consider that a bug, although one with
easy workarounds.
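(For illustration, here is a minimal Objective-C sketch of one such easy
workaround: rounding the interval before truncating it to an int. The
dummy target and selector are placeholders, not from the original post,
and the timer is never scheduled.)

    #import <Foundation/Foundation.h>
    #include <math.h>

    int main(void)
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        // Placeholder target/selector; the timer never fires, it only
        // serves to show what -timeInterval returns.
        NSObject *dummy = [[[NSObject alloc] init] autorelease];
        NSTimer *timer = [NSTimer timerWithTimeInterval:1.0
                                                 target:dummy
                                               selector:@selector(description)
                                               userInfo:nil
                                                repeats:YES];

        NSTimeInterval ti = [timer timeInterval];

        // A plain (int) cast truncates toward zero, so a value just
        // below 1.0 becomes 0.  Rounding first absorbs the tiny
        // deviation measured above.
        int truncated = (int)ti;
        int rounded   = (int)rint(ti);

        NSLog(@"timeInterval = %.10f  truncated = %d  rounded = %d",
              ti, truncated, rounded);

        [pool release];
        return 0;
    }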
Somehow it seems to be correlated with the machine's speed: Cameron's
600 MHz iBook works as expected (that is, correctly), whereas my 500 MHz
iBook, my girlfriend's 400 MHz G3 iMac, and another developer's 400 MHz
iMac all show the deviation.
Strange.
marco
PS to Dietmar: as it turned out, you were right. The remaining question
is: why does the timer not return the same value I set?!
On Friday, January 10, 2003, at 09:03 PM, Chris Kane wrote:
On Friday, January 10, 2003, at 07:49 AM, Chris Ridd wrote:
You're passing a double to NSLog but only using the %f (float) format
specifier. That doesn't address the fact that the value does seem to be
less than 1, but it might be worth seeing if using %lf made a difference.
%f is for doubles, as are %e and %g. All floats are promoted to doubles
in C/ObjC when they appear in the varargs portion of a call to a variadic
function (as with printf()).
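(A quick C sketch of that promotion, added for illustration: %f prints
both a float and a double correctly in a variadic call.)

    #include <stdio.h>

    int main(void)
    {
        float  f = 0.999999f;
        double d = 0.999999;

        /* In a variadic call the float argument is promoted to double,
           so %f handles both; printf() has no separate conversion for
           float. */
        printf("%f %f\n", f, d);
        return 0;
    }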
Marco - have you tried using something other than NSLog() to do the
print-out? Also, try assigning the cast value to a local, then using
the local as an argument to print out, as in:
    int casted = (int)[timer timeInterval];
    // could also try: (int)([timer timeInterval] + 0.001)
    NSLog(......, casted);
and see if that makes a difference. You could also try to do some
arithmetic comparisons with the double value, like:
double ti = [timer timeInterval];
if (ti < 0.9) printf(...);
else if (ti < 1.0) printf(...);
else if (ti < 1.01) printf(...);
else printf(...);
and see what happens. It's possible that different processors have
different behavior with respect to converting, say, a 0.999999999 value
to an int.
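(An illustrative C sketch, not from the original mail, of what a
truncating cast does with such a value:)

    #include <stdio.h>

    int main(void)
    {
        double almostOne = 0.999999999;

        /* (int) truncates toward zero: anything below 1.0, however
           close, becomes 0 rather than 1. */
        printf("(int)%.9f = %d\n", almostOne, (int)almostOne);
        return 0;
    }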
It's also possible that such a difference, if true, is perfectly
legitimate (that is, within the bounds of undefined or unspecified
behavior in C). [I don't remember any more if you said whether or not
you are using the same compiler on the two machines, or the same
binary.]
Chris Kane
Cocoa Frameworks, Apple
--
|\ /| E-Mail: email@hidden WWW: www.marco-binder.de
| \/ | Telefon: 07531 / 94 19 94 Fax: 07531 / 94 19 92
| |ARCO Snail-Mail: Banater Str. 3 - 78467 Konstanz
BINDER _____________________________________________________
_______________________________________________
cocoa-dev mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/cocoa-dev
Do not post admin requests to the list. They will be ignored.