Re: Inconsistent results from Finder and System Events
- Subject: Re: Inconsistent results from Finder and System Events
- From: Christopher Nebel <email@hidden>
- Date: Thu, 20 Nov 2003 10:21:28 -0800
On Nov 20, 2003, at 7:14 AM, Arthur Knapp wrote:
> From: Christopher Nebel <email@hidden>
> Subject: Re: Inconsistent results from Finder and System Events
> Date: Wed, 19 Nov 2003 14:04:42 -0800
>
>> ... AppleScript integers are 30-bit two's-complement integers, which
>> means you've got 29 bits of magnitude to play with, or about 500
>> million. Any operations that produce results outside that range get
>> turned into reals. (At least, they're supposed to. There have been
>> bugs with that in the past.)
>
> This has always struck me as being "conceptually buggy." Why are the
> integers 30 bits rather than the expected 32 bits?
It's an old LISP implementation trick to save memory. Internally,
everything is a 32-bit typed "pointer". In most cases, the 32-bit word
really is a pointer, and the bottom few bits (3, in AppleScript's case)
tell you what it's pointing to -- a cons cell, a data block, etc. (You
mask off the bottom bits to get the real address. Since everything has
to be 8-byte-aligned anyway, this works well.) Certain data, however,
such as integers, are small enough to fit into the "pointer" itself, so
they're stored there as what's called "immediate data" in the remaining
29 bits. The original AppleScript implementors got extra clever at this
point and defined two different tags for odd and even integers: the
integer's low bit is encoded in the choice of tag, so the 29 payload
bits only have to hold the value's upper bits, which effectively gives
you 30 bits.
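To make that concrete, here's a rough C sketch of how such a tagging
scheme could work. The tag values, macro names, and the add-with-fallback
helper are all invented for illustration; this is one plausible
arrangement, not AppleScript's actual source.

#include <stdio.h>
#include <inttypes.h>
#include <assert.h>

/* Bottom 3 bits of every 32-bit word are the type tag.  These tag
 * values are hypothetical. */
enum {
    TAG_PTR      = 0,  /* word points at an 8-byte-aligned block */
    TAG_INT_EVEN = 1,  /* immediate integer whose low bit is 0 */
    TAG_INT_ODD  = 2   /* immediate integer whose low bit is 1 */
    /* ... further tags for other immediate types ... */
};

#define TAG_BITS 3
#define TAG_MASK ((1u << TAG_BITS) - 1u)

/* 30-bit two's-complement range */
#define INT30_MAX ((1 << 29) - 1)   /*  536870911 */
#define INT30_MIN (-(1 << 29))      /* -536870912 */

typedef uint32_t asval;  /* a 32-bit typed "pointer" */

/* Pack a 30-bit integer: bits 1..29 (plus sign) become the payload,
 * and the low bit is encoded in the choice of tag.  That's how two
 * tags buy the 30th bit. */
static asval box_int(int32_t v)
{
    assert(v >= INT30_MIN && v <= INT30_MAX);
    return (((uint32_t)v >> 1) << TAG_BITS)
         | ((v & 1) ? TAG_INT_ODD : TAG_INT_EVEN);
}

/* Unpack: clear the tag, sign-extend the 29-bit payload (the masked
 * word is a multiple of 8, so the division is exact and portable),
 * then restore the low bit from the tag. */
static int32_t unbox_int(asval w)
{
    uint32_t tag = w & TAG_MASK;
    assert(tag == TAG_INT_EVEN || tag == TAG_INT_ODD);
    int32_t hi = (int32_t)(w & ~TAG_MASK) / (1 << TAG_BITS);
    return hi * 2 + (tag == TAG_INT_ODD ? 1 : 0);
}

/* Addition with the overflow behavior the earlier message describes:
 * results that leave the 30-bit range get turned into reals. */
static void as_add(int32_t a, int32_t b)
{
    int64_t r = (int64_t)a + b;
    if (r < INT30_MIN || r > INT30_MAX)
        printf("%d + %d = %.1f (promoted to real)\n", a, b, (double)r);
    else
        printf("%d + %d = %d (boxed as 0x%08" PRIx32 ")\n",
               a, b, (int32_t)r, box_int((int32_t)r));
}

int main(void)
{
    int32_t samples[] = { 0, 1, -1, 42, INT30_MAX, INT30_MIN };
    for (size_t i = 0; i < sizeof samples / sizeof samples[0]; i++) {
        asval w = box_int(samples[i]);
        printf("%11d -> 0x%08" PRIx32 " -> %11d\n",
               samples[i], w, unbox_int(w));
    }
    as_add(INT30_MAX, 1);  /* just past the range -> real */
    as_add(100, 200);      /* stays an immediate integer */
    return 0;
}

Note that a pointer-tagged word is decoded the other way around: mask
off the tag with (w & ~TAG_MASK) and you're left with the real 8-byte-
aligned address, which is exactly why the three tag bits were free in
the first place.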
--Chris Nebel
AppleScript Engineering
_______________________________________________
applescript-users mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/applescript-users
Do not post admin requests to the list. They will be ignored.