

double precision errors and numbers not adding correctly


  • Subject: double precision errors and numbers not adding correctly
  • From: Chad Armstrong <email@hidden>
  • Date: Sat, 23 Mar 2002 01:50:55 -0700

I have two problems.

The calculator I'm designing works well for the most part, except for a small precision error. If the number typed in is between 64 and 99 and the first decimal digit is a 1 or a 6, an unusual number comes up. For example, 65.1 shows up as 65.09999999999999 and 93.6 as 93.59999999999999. Is there a method that will round to the closest tenth?

Another weird problem: I've tried hard-coding the number 65.1 into the display, and it still comes up as 65.09999... When I added 65.0 + 0.1, the result came out as 65.1, and a printf() statement printed 65.1, but if I set the precision higher, like %2.14f, then the 65.099999... answer appeared again. Is there a way to set the floating/double precision using Objective-C or C? This value is actually a double, not a float.

Another odd part is that numbers like 169 are not affected, but any number from 64.0 to 99 is affected (including the negative counterparts).

// -----------------------------------------------------------------------

My second problem involves variables in a class that change (or fail to change). In my EGView class, which handles the information drawn in a custom NSView, I have one method that is called by my controller class. The EGView variables are incremented like so:

myVar += 1.0;

I then print out the variable, and it starts at 0.0, BUT when myVar is initialized it starts at a different value than 0.0. When I print the value of myVar from the method doing the incrementing, it shows the value being incremented. However, when I look at the value from the drawRect: method, it sees only the original initialized value. If I change myVar inside drawRect:, it increments fine there, but that does not affect the 'other' myVar. What is going on here? Things that should be fairly straightforward are becoming very irritating.

On one bright note, I did figure out how to get 'external' variables of the drawRect: method to work properly. I needed to call self = [super initWithFrame:frame]; in my initialization method for EGView.

Anyway, these seem to be some very odd and irritating errors, and I hope someone can figure out how they might be corrected, if possible.

Chad Armstrong
email@hidden
_______________________________________________
cocoa-dev mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/cocoa-dev
Do not post admin requests to the list. They will be ignored.

  • Follow-Ups:
    • Re: double precision errors and numbers not adding correctly
      • From: "Erik M. Buck" <email@hidden>
    • Re: double precision errors and numbers not adding correctly
      • From: Jonathan Feinberg <email@hidden>