Re: Specifying "enum" data size
- Subject: Re: Specifying "enum" data size
- From: Andreas Grosam <email@hidden>
- Date: Tue, 27 Sep 2005 13:50:06 +0200
On 27.09.2005, at 02:00, Rush Manbert wrote:
Mark Wagner wrote:
On 9/24/05, Andreas Grosam <email@hidden> wrote:
On 24.09.2005, at 02:17, Chris Espinosa wrote:
On Sep 23, 2005, at 5:04 PM, Mark Wagner wrote:
How do I specify the size of an "enum" value? I've got a program that links to a number of static libraries. In one of them, an "enum" for the typedef'd "BoolEnum" type is one byte, while in another, it's 32 bits. Needless to say, this causes problems when passing structs containing BoolEnums back and forth.
Does GCC recognize the "#pragma enumsalwaysint" pragma? Might the
presence of this in some of the source files be causing problems?
I'm using XCode 1.5 and GCC 3.3, and I'm writing in C.
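(For illustration - the names below are hypothetical - this is the kind of declaration that breaks when two libraries disagree about the enum's width:)
/* illustration.h - hypothetical reconstruction of the problem */
typedef enum { BE_FALSE = 0, BE_TRUE = 1 } BoolEnum;
struct Flags {
    BoolEnum a;
    BoolEnum b;
};
/* Compiled without -fshort-enums, sizeof(struct Flags) is 8
   (two 4-byte enums); compiled with -fshort-enums it is
   typically 2 (two 1-byte enums). A struct Flags passed
   between the two builds is read with the wrong layout. */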
The only control available is -fshort-enums ("Short Enumeration
Constants") which you can apply on a target or on a single file.
There's no pragma to make this apply only to specific enums within a
file. This will make enums as small as possible for the range of
enums given.
Notice that Mac OS X frameworks often declare structs that contain enumerated values, so changing the size of enums might make your code unable to call Mac OS X system functions.
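(A quick way to see the effect - a sketch; the exact sizes are implementation-defined, but with gcc on Mac OS X this prints 4 by default and 1 when built with -fshort-enums:)
#include <stdio.h>
enum TwoValues { NO_VALUE, YES_VALUE };  /* range 0..1 fits in one byte */
int main(void)
{
    printf("sizeof(enum TwoValues) = %u\n",
           (unsigned) sizeof(enum TwoValues));
    return 0;
}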
Are you sure? I thought enums were only used to declare constants and never used as types. Thus, enums do not have any effect on the layout or sizeof of structs.
Otherwise, in the C language this would be a very fragile usage. In C, I would never ever use enums as types in interfaces, since the size of the enum type is compiler "implementation defined" and may depend on optimization flags, architecture, etc.
Note that binary compatibility (here, the size of enums) may only be guaranteed when you use the same C compiler.
I also wouldn't use the switch for enums used in interfaces. The documentation says this:
"Warning: the -fshort-enums switch causes GCC to generate code that is not binary compatible with code generated without that switch. Use it to conform to a non-default application binary interface."
In your case, it looks as if your static library has been compiled with this switch on, or with another compiler, or with other compiler flags that change the size of the enum - thus it is no longer binary compatible.
Well, to work around your problem, re-compile the static libraries using *your* compiler (gcc 3.3) in order to get ABI compatibility.
Everything was compiled using gcc 3.3, with the -fshort-enums switch
disabled. That's why I'm asking about "#pragma enumsalwaysint" --
it's the only thing I can think of that would be causing the problems.
For the Apple ABI, how long should an enum type with two possible values be, anyway? Is the library that thinks it's 32 bits correct, or is the library that thinks it's one byte correct? Or are they both wrong?
There is no "Apple ABI"; there is an "Itanium ABI", which does not apply to your problem, though - it's only for C++.
If you compile with the same compiler and the same switches, it should be binary compatible. If you still get errors, there might be another problem - possibly a compiler switch set unintentionally.
I just double checked this to be sure, and I'm just going to quote
from Stroustrup:
"The sizeof an enumeration is the sizeof some integral type that can
hold its range and not larger than sizeof(int), unless an enumerator
cannot be represented as an int or an unsigned int. For example,
sizeof(e1) could be 1 or maybe 4 but not 8 on a machine where
sizeof(int)==4."
I think that if you want to force the minimum size, you will need a "dummy" value that you specify as something like:
dummyEnumVal = 0xffff, which should force sizeof the enum to be 2 (16 bits). It sounds like the compiler that is using a single byte is compliant, while the one that uses 32 bits has just defaulted to using sizeof(int), and really doesn't comply. In this case, it sounds like you might need to specify a dummy value of 0xffffffff, so that the enum size is 32 bits on each machine. Of course, then you might have endian issues to deal with... :-)
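(A sketch of the dummy-value trick Rush describes, using the hypothetical BoolEnum from above; 0x7fffffff is used rather than 0xffffffff so that the value still fits in a signed int:)
typedef enum {
    BE_FALSE = 0,
    BE_TRUE  = 1,
    /* dummy value: widens the range so that even with
       -fshort-enums the compiler must choose a 32-bit type */
    BE_FORCE_WIDTH = 0x7fffffff
} BoolEnum;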
Hope it helps,
Rush
Please note that regarding enums there are fundamental differences between C and C++!
Originally, Mark mentioned that he is programming in C.
The current C++ Standard and the C (C99) Standard, respectively, are the only references:
In C++, the underlying type of an enumeration, and thus its size, is "implementation defined". This means it is defined by the compiler vendor. It shall be an integral type and not larger than an int, unless an enumerator's value cannot fit in an int or unsigned int.
(C++ Standard, 7.2/5)
In C99, the type of an enumeration constant is int.
Each *enumerated type* shall be compatible with an integer type. The choice of type is implementation-defined, but shall be capable of representing the values of all the members of the enumeration.
(C99, 6.7.2.2)
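(A small sketch of that difference; with default flags both lines typically print 4, but with -fshort-enums only the first stays at sizeof(int):)
#include <stdio.h>
enum Color { RED, GREEN };
int main(void)
{
    /* in C, the constant RED has type int */
    printf("sizeof(RED)        = %u\n", (unsigned) sizeof(RED));
    /* the size of the enumerated type is implementation-defined */
    printf("sizeof(enum Color) = %u\n", (unsigned) sizeof(enum Color));
    return 0;
}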
Depending on the implementation-defined size of an enum type or an enumeration constant in an interface is bad programming style:
In C:
/* File: mylib.h */
typedef enum Boolean { FALSE = 0, TRUE = 1 } Boolean;
void foo(Boolean yesNo); /* worse than bad !! */
struct X {
    Boolean on;
};
void bar(struct X v); /* tests Apple's CrashReporter app */
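(One common workaround - a sketch, and only one of several options - is to keep the enum for the constants but declare the interface with a plain integer type, whose size the -fshort-enums switch does not touch:)
/* File: mylib.h - hypothetical safer variant */
enum { MYLIB_FALSE = 0, MYLIB_TRUE = 1 };  /* constants only */
typedef int Boolean;   /* parameter/field size pinned to sizeof(int) */
void foo(Boolean yesNo);
struct X {
    Boolean on;
};
void bar(struct X v);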
Using compiler switches to change the default enum size can be harmful, like changing the default size of int or wchar_t. In C++ this may break ABI conformance; in C it breaks binary compatibility as well. Please be **extremely** careful when using such switches on the command line!
Using a "#pragma enumsalwaysint" or the like wouldn't be a good solution to your problem, since this pragma is implementation defined as well - and sizeof(int) is also implementation defined. You can use it safely in non-interface, non-portable code, of course. But simply commenting it out in existing code to solve your problem may break that code and requires more investigation.
To summarize this:
For C:
typedef enum Color { RED, GREEN } Color;
The following is true:
1) sizeof (RED) == sizeof(int)
2) type_of (RED) == signed int
3) Color is a distinct type, IFF typedef is used, like above.
4) sizeof(Color) == ? (may be different on different compilers)
5) sizeof(enum1) may be different than sizeof(enum2) (see the sketch after this list)
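(Point 5, concretely - a sketch: with -fshort-enums these two enums get different sizes because their ranges need different widths; without the switch, both are sizeof(int) here:)
enum Small { SMALL_A, SMALL_B };      /* range 0..1: one byte suffices */
enum Large { LARGE_A = 0x7fffffff };  /* needs a 32-bit type */
/* with -fshort-enums: sizeof(enum Small) == 1, sizeof(enum Large) == 4 */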
For C++:
enum Color { RED, GREEN };
The following is true:
1) sizeof (RED) == sizeof(Color)
2) type_of (RED) == Color
3) Color is a distinct type
4) sizeof(Color) == k, where k is specific to Color (if the compiler conforms to the ABI)
5) sizeof(enum1) may be different than sizeof(enum2)
Regards
Andreas