Re: Getting the size of type in the preprocessor.
- Subject: Re: Getting the size of type in the preprocessor.
- From: "Clark Cox" <email@hidden>
- Date: Tue, 2 Sep 2008 16:48:58 -0700
On Tue, Sep 2, 2008 at 4:04 PM, Tim Little <email@hidden> wrote:
> In my portability header I define new types used by my apps, like int32,
> int64, char8, uchar16, etc., so the code always knows exactly what it is
> dealing with when moving to and from different platforms.
Well, instead of relying on GCC internals, one thing you can do on the
Mac (since GCC on the Mac supports C99) is just use the appropriate
C99 fixed-width types:
#if __APPLE__
#include <stdint.h>    /* C99 fixed-width integer types */
typedef int32_t int32; /* exactly 32 bits on every platform */
typedef int64_t int64; /* exactly 64 bits on every platform */
#endif
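
The character types from the original message can be mapped the same
way; a minimal sketch, assuming char8 is meant as an 8-bit character
type and uchar16 as a 16-bit unsigned unit such as a UTF-16 code unit
(both mappings are assumptions, not something stated above):

#if __APPLE__
#include <stdint.h>
typedef char     char8;   /* assumed: 8-bit character type (CHAR_BIT is 8 here) */
typedef uint16_t uchar16; /* assumed: 16-bit unsigned unit, e.g. a UTF-16 code unit */
#endif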
> Given that it seems the GCC internal defines for things like
> __SIZEOF_SHORT__ (apparently added with GCC 4.3.x) aren't defined with
> Xcode 3.1, is there a good way to handle this at compilation time? The
> __CHAR_BIT__ define does seem to exist, so I can take care of that one
> easily enough.
>
> Is there an easy way to determine the GCC version number that is being
> used in Xcode 3.1? I can't find it in the help docs. I'm new to the Mac
> and Xcode, but by initializing an int to the __GNUC*__ defines, it
> empirically seems to be 4.0.1.
>
> I can derive the sizes from a program and then just add the logic to
> the .h with hard-coded assumptions, but I'd prefer a cleaner way to
> handle this. Does anyone know the "correct way"?
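
As for the version question: GCC exposes its version through the
__GNUC__ and __GNUC_MINOR__ predefined macros (and __GNUC_PATCHLEVEL__
since GCC 3.0), so you can test it directly in the preprocessor. A
minimal sketch of gating on GCC 4.3 or later, where the __SIZEOF_*__
macros first appeared (the HAVE_SIZEOF_MACROS name is just illustrative):

/* True only for GCC 4.3 and later, which predefine __SIZEOF_SHORT__ etc. */
#if defined(__GNUC__) && (__GNUC__ > 4 || (__GNUC__ == 4 && __GNUC_MINOR__ >= 3))
#define HAVE_SIZEOF_MACROS 1
#endif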
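
And since sizeof isn't available to the preprocessor at all, the usual
fallback for verifying type sizes at compile time is a static assertion
that breaks the build when an assumption is wrong. A sketch using the
classic negative-array-size trick (the macro and typedef names are just
illustrative):

/* Compilation fails with an illegal negative array size if cond is false */
#define STATIC_ASSERT(cond, name) typedef char static_assert_##name[(cond) ? 1 : -1]

STATIC_ASSERT(sizeof(int32) == 4, int32_is_32_bits);
STATIC_ASSERT(sizeof(int64) == 8, int64_is_64_bits);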
--
Clark S. Cox III
email@hidden