Re: Broken date math on AppleScript 2.0
- Subject: Re: Broken date math on AppleScript 2.0
- From: Philip Aker <email@hidden>
- Date: Mon, 18 Feb 2008 08:36:31 -0800
On 2008-02-18, at 08:02:59, Doug McNutt wrote:
Top-posted because this answers mail directed to me with only
a Cc: to the list.
Forgiven. Pax vobiscum.
AppleScript Language Guide (For AppleScript 1.3.7) near page 66
The largest value that can be expressed as an integer in
AppleScript is ±536870909, which is equal to ±(2^29 - 3). Larger
integers (positive or negative) are converted to real numbers
(expressed in exponential notation) when scripts are compiled.
Thanks.
Perhaps it has changed but not for this 8500 running OS 9.1. A
change like that would surely be covered in a release note or
updated version of the Language Guide. Wouldn't it?
Should have been.
I do believe the limit is in Script Editor and not in the actual
events sent to and from applications.
Right. I've never experienced the limit there. And since a lot of
what I do at that level is transcribing to/from other languages
(which are not integer-representation gelded), I guess I've never had
occasion to check in AppleScript proper.
I remember a dialog on the list involving Chris Nebel that resulted
in a conclusion that lots of languages used the upper bits as
flags. I think OS neXt arrived before he became involved.
Well, my conclusion is that the AppleScript team should institute the
full complement of number types -- as detailed in XML Schema datatypes
<http://www.w3.org/TR/xmlschema-2/#built-in-datatypes> and augmented
by the extra C99 types. And then by hook or by crook put the missing
types into CFNumber.h. In my (always candid and frank) opinion, it's
inexcusable of Apple that this hasn't been done. Hard to fathom that
the VP of Technology (aka BS) could OK a core API set that has no
unsigned integer types.
Philip Aker
echo email@hidden@nl | tr a-z@. p-za-o.@
Begin copied text
At 18:51 -0800 2/17/08, Philip Aker wrote:
On 2008-02-15, at 19:46:58, Doug McNutt wrote:
At 16:25 -0800 2/15/08, Scott Babcock wrote:
It appears that date math in AppleScript 2.0 (shipped with
Leopard) is not working correctly with large values. If you have
an integer whose value exceeds 0x1FFFFFFF and you try to add it
to or subtract it from a date, you will not get the expected
value.
Integers in AppleScript have always been limited to 29 bits.
Above that everything gets coerced to floating point.
I think I've missed some obvious documentation. Where did you get
this info? In AEDataModel.h the codes are assigned as follows:
/* Preferred numeric Apple event descriptor types */
enum {
    typeSInt16                 = 'shor',
    typeSInt32                 = 'long',
    typeUInt32                 = 'magn',
    typeSInt64                 = 'comp',
    typeIEEE32BitFloatingPoint = 'sing',
    type128BitFloatingPoint    = 'ldbl',
    typeDecimalStruct          = 'decm'
};
These codes have been in effect for quite some time -- possibly as
early as OS X 10.0. So I can only understand this limitation as
being a side effect of trying to contain the sign in the high 2
bits for representation in AppleScript. But I don't see how this
would be necessary on any Mac OS since PPC (i.e., post-24-bit 68K).
Where I have problems with OS X (in general, not just limited to
AppleScript) is the inexplicable omission of unsigned integer
types from CFNumber.h. Like half its math brain is deliberately
elided…
_______________________________________________
AppleScript-Users mailing list (email@hidden)
Archives: http://lists.apple.com/archives/applescript-users