Getting the size of type in the preprocessor.
- Subject: Getting the size of type in the preprocessor.
- From: Tim Little <email@hidden>
- Date: Tue, 2 Sep 2008 18:04:10 -0500
In my portability header I define new types used by my apps, like int32,
int64, char8, uchar16, etc., so the code always knows exactly what it is
dealing with when moving between different platforms.
Given that the GCC internal defines like __SIZEOF_SHORT__ (apparently
added in GCC 4.3.x) aren't defined with Xcode 3.1, is there a good way
to handle this at compile time? The __CHAR_BIT__ define does seem to
exist, so I can take care of that one easily enough.
Is there an easy way to determine the GCC version that Xcode 3.1 uses?
I can't find it in the help docs. I'm new to the Mac and Xcode, but by
initializing an int to the __GNUC*__ defines it empirically seems to be
4.0.1.
I can derive the sizes from a program and then add the logic to the .h
with hard-coded assumptions, but I'd prefer a cleaner way to handle
this. Does anyone know the "correct way"?
Thanks,
Tim