Re: double precision errors and numbers not adding correctly
- Subject: Re: double precision errors and numbers not adding correctly
- From: "Erik M. Buck" <email@hidden>
- Date: Sat, 23 Mar 2002 09:53:16 -0600
- Organization: EMB & Associates Inc.
First, this lame discussion has already taken place. Search the archives
please.
Second, precision is a Computer Science 200-level issue. Doesn't anyone
take a numerical methods course in school anymore?
----- Original Message -----
From: "Chad Armstrong" <email@hidden>
To: <email@hidden>
Sent: Saturday, March 23, 2002 2:50 AM
Subject: double precision errors and numbers not adding correctly
> I have two problems.
>
> With a calculator I'm designing, it works great for the most part,
> except for a small precision error. If the number typed in is between
> 64 and 99 and either a 1 or a 6 is typed in the tenths place, an
> unusual number comes up. For example, 65.1 shows up as
> 65.09999999999999 and 93.6 as 93.59999999999999. Is there any method
> which will round to the closest tenth?
>
> But another weird problem: I've even tried to put the number 65.1
> into the display by hard-coding it. It still comes up as the
> 65.09999... number. When I added 65.0 + 0.1, it came up with 65.1,
> and if I did a printf() statement, 65.1 would print out; but if I
> raised the precision, say to %2.14f, then the 65.099999... answer
> would appear. Is there a way to set the float/double precision using
> Objective-C or C? This value is actually a double, not a float.
>
> Another odd part is that numbers like 169 are not affected, but any
> number from 64.0 to 99 is affected (including their negative
> counterparts).
>
> // -----------------------------------------------------------------------
>
> My second problem involves the changing (or not changing) of
> variables in a class. In my EGView class, which deals with the
> information drawn to a custom NSView, I have one method which is
> called by my controller class. The EGView variables are incremented
> like so:
>
>     myVar += 1.0;
>
> I then print out the variable, and it starts out at 0.0, BUT when
> myVar is initialized it starts out at a value other than 0.0. So when
> I print out the value of myVar from the method which is incrementing,
> it shows it is being incremented. However, when I look at the value
> from the drawRect: method, it sees the original initialized value. If
> I change the myVar value in the drawRect: method, then it increments
> fine, but this does not affect the 'other' myVar. What is going on
> here? Things which should be fairly straightforward are becoming very
> irritating.
>
> On one bright note, I did figure out how to get 'external' variables
> of the drawRect: method to work properly. I needed to call self =
> [super initWithFrame:frame]; in my initialization method for EGView.
>
> Anyway, these seem to be some very odd and irritating errors, and I
> hope someone can figure out how they might be corrected, if possible.
>
> Chad Armstrong
> email@hidden
_______________________________________________
cocoa-dev mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/cocoa-dev
Do not post admin requests to the list. They will be ignored.