I am stuck rounding a number to one decimal; something strange is happening. I have this AppleScript in Xcode 3.1.4:
set aantalvragen to contents of text field "aantalvragenveld" as integer -- line 1: aantalvragenveld is a field where I can enter the number of questions on a test (for my students)
set aantalgoed to contents of text field "aantalgoedveld" as integer -- line 2: aantalgoedveld is a field where I can enter the number of correct answers
set resultaat to (aantalgoed * 90 / aantalvragen) + 10 as integer -- line 3: now the result is between 10 and 100, but it should be between 1 and 10 (in Holland we like this ;)
set contents of text field "Resultaatveld" to resultaat / 10 -- line 4: and now it's between 1 and 10 with one decimal
But here is the strange thing: when aantalvragen is 10 and aantalgoed is 8, resultaat in line 3 is 82 (that's okay), but in line 4 the field shows 8,19999999999999 (it should be 8,2).
So AppleScript thinks that 82 / 10 = 8,1999999999…
Trying to round the result does not fix it either.
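One way I thought of to sidestep this (just a sketch; the variable names geheel, decimaal, and cijferTekst are made up for illustration) is to avoid the floating-point division entirely and build the display text from integer parts with div and mod:

```applescript
-- Hypothetical workaround: stay in integers and assemble the string by hand,
-- so no binary floating-point division can introduce 8,1999999…
set resultaat to 82 -- the integer result from line 3
set geheel to resultaat div 10 -- whole part: 8
set decimaal to resultaat mod 10 -- tenths digit: 2
set cijferTekst to (geheel as string) & "," & (decimaal as string) -- "8,2"
```

But I would rather understand why the division itself goes wrong.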
As I said, I am stuck.
Hope you can help!