Re: Broken date math on AppleScript 2.0
- Subject: Re: Broken date math on AppleScript 2.0
- From: Doug McNutt <email@hidden>
- Date: Mon, 18 Feb 2008 09:02:59 -0700
Top-posted because it is an answer to mail directed to me with only a Cc: to the list.
AppleScript Language Guide (For AppleScript 1.3.7) near page 66
The largest value that can be expressed as an integer in AppleScript is
±536870909, which is equal to ±(2^29 - 3). Larger integers (positive or negative)
are converted to real numbers (expressed in exponential notation) when scripts
are compiled.
Perhaps it has changed, but not for this 8500 running OS 9.1. A change like that would surely be covered in a release note or an updated version of the Language Guide. Wouldn't it?
I do believe the limit is in Script Editor and not in the actual events sent to and from applications.
I remember a dialog on the list involving Chris Nebel that resulted in a conclusion that lots of languages used the upper bits as flags. I think OS neXt arrived before he became involved.
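The figures above can be sanity-checked outside AppleScript. A minimal Python sketch of the arithmetic (the constants come from the Language Guide quote and from Scott's report; this is plain arithmetic, not AppleScript behavior):

```python
# Check the documented AppleScript integer limit against the threshold
# from the original bug report. Pure arithmetic; no AppleScript involved.
limit = 2 ** 29 - 3          # the Language Guide's +/-536870909 bound
threshold = 0x1FFFFFFF       # the value Scott said triggers the date bug

print(limit)                 # 536870909
print(threshold)             # 536870911, i.e. 2**29 - 1
print(threshold - limit)     # the two figures differ by 2
```

So the Language Guide's bound and the 0x1FFFFFFF threshold from the bug report are two units apart; both sit at the edge of a 29-bit magnitude.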
Doug
>>>> Begin copied text
At 18:51 -0800 2/17/08, Philip Aker wrote:
>On 2008-02-15, at 19:46:58, Doug McNutt wrote:
>
>> At 16:25 -0800 2/15/08, Scott Babcock wrote:
>>> It appears that date math on AppleScript 2.0 (shipped with Leopard) is not working correctly with large values.
>
>>> If you have an integer whose value exceeds 0x1FFFFFFF, and you try to add or subtract this integer from a date, you will not get the expected value.
>
>> Integers in AppleScript have always been limited to 29 bits. Above that everything gets coerced to floating point.
>
>I think I've missed some obvious documentation. Where did you get this info? In AEDataModel.h the codes are assigned as follows:
>
>/* Preferred numeric Apple event descriptor types */
>enum {
> typeSInt16 = 'shor',
> typeSInt32 = 'long',
> typeUInt32 = 'magn',
> typeSInt64 = 'comp',
> typeIEEE32BitFloatingPoint = 'sing',
> type128BitFloatingPoint = 'ldbl',
> typeDecimalStruct = 'decm'
>};
>
>These codes have been in effect for quite some time -- possibly as early as OS X 10.0. So I can only understand this limitation as a side effect of trying to contain the sign in the high 2 bits for representation in AppleScript. But I don't see how this would be necessary on any Mac OS since PPC (i.e., post-24-bit 68K).
>
>Where I have problems with OS X (in general, not just limited to AppleScript) is the inexplicable omission of unsigned integer types (as defined in CFNumber.h). It's as if half its math brain were deliberately elided.
>
>
>Philip Aker
>echo email@hidden@nl | tr a-z@. p-za-o.@
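For readers unfamiliar with the 'shor'/'long' codes in the header excerpt above: they are four-character OSType codes packed big-endian into 32-bit integers. A small Python sketch of that packing (the helper name `fourcc` is mine, not Apple's):

```python
def fourcc(code):
    """Pack a 4-character OSType code (e.g. 'long') into a 32-bit
    integer, big-endian: first character in the high byte."""
    assert len(code) == 4
    value = 0
    for ch in code:
        value = (value << 8) | ord(ch)
    return value

print(hex(fourcc("long")))   # typeSInt32 -> 0x6c6f6e67
print(hex(fourcc("comp")))   # typeSInt64 -> 0x636f6d70
```

The codes are just compact type tags; nothing about them constrains the integer payload of a descriptor to 29 bits, which supports the reading that the limit lives in the AppleScript language layer rather than in the Apple Event data model.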
--
--> From the U S of A, the only socialist country that refuses to admit it. <--
_______________________________________________
AppleScript-Users mailing list (email@hidden)
Archives: http://lists.apple.com/archives/applescript-users