Re: deprecated FOUR_CHAR_CODE and macintel
- Subject: Re: deprecated FOUR_CHAR_CODE and macintel
- From: Gen Kiyooka <email@hidden>
- Date: Thu, 12 Jan 2006 22:39:13 -0800
On Jan 12, 2006, at 3:21 PM, Eric Albert wrote:
On Jan 12, 2006, at 1:28 PM, Lawrence Gold wrote:
On Jan 12, 2006, at 11:46 AM, Eric Albert wrote:
This is a very common misconception. FOUR_CHAR_CODE has never
done anything useful on the Mac at all. If you look carefully at
ConditionalMacros.h (or whichever header defines it), you'll see
that it only does anything on MIPS or something like that -- some
architecture for which Apple has no products. I don't even know
why the macro exists.
In other words, everything that you write as FOUR_CHAR_CODE
('ABCD') is exactly equivalent to 'ABCD', and has always been
that way. And that's just a 32-bit integer. We've worked hard
to make nearly all APIs in the system take native-endian
arguments, so in just about every case you don't need to byte-
swap four-char codes. The exceptions to this are documented in
the Universal Binary Programming Guidelines, as are the APIs you
can use to byte-swap 32-bit integers.
This is somewhat off-topic, but also somewhat related: We've made
liberal use of such four-character constants in our code, which
assumes that 'ABCD' is represented in big-endian order on a Mac
and little-endian order on a Windows PC. This assumption has
worked fine for CodeWarrior and Xcode, but we've begun
transitioning to Visual C++ for our Windows builds, and that
compiler represents such constants in big-endian order.
Are you sure that's right? I hate to doubt you here, but this is
the first I've heard of this. Presumably anyone working with
QuickTime for Windows would've hit this a long time ago, but I
haven't heard the QuickTime team or any developers using QT on
Windows mention it. Maybe I'm just not talking to the right
developers....
In VC4.2 for the Macintosh (circa 1995) and elsewhere, I have
seen FOUR_CHAR_CODE definitions that broke each character into
its own 8-bit value and placed it in the "right endian spot" in
the 32-bit int.
Can't recall exactly where - so much code, so few remaining brain cells.
Gen
-Eric
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Xcode-users mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden