Re: gdb & BOOL
- Subject: Re: gdb & BOOL
- From: Jonathan del Strother <email@hidden>
- Date: Wed, 30 Nov 2005 13:19:16 +0000
On 30 Nov 2005, at 10:58, Thomas Engelmeier wrote:

>> gdb seems to be interpreting BOOL types as pointers - in both the
>> console and the variable viewer, they're being given as (for
>> instance) 0x1000000.
>> bool types (note the lowercase) are interpreted normally, and
>> appear as 'true' or 'false'.
>
> What happens is pretty apparent if you look at the definitions of
> BOOL (defined in <objc/objc.h>) and Boolean (in <CoreFoundation/
> CFBase.h>). Both are typedef'd as {un}signed char, so the debugger
> has no chance to display anything but the contents of a char.
Except it's not being displayed as the contents of a char.
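(For reference, the definitions he's pointing at look roughly like
this - paraphrased from memory, so the exact forms may differ between
SDK versions:)

typedef signed char BOOL;       /* from <objc/objc.h> (paraphrased) */
#define YES (BOOL)1
#define NO  (BOOL)0

typedef unsigned char Boolean;  /* from <CoreFoundation/CFBase.h> (paraphrased) */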
Here's a code snippet:
char testChar = 'a';
BOOL testBOOL = YES;
bool testBool = true;
and here's some gdb output:
(gdb) p testChar
$1 = 97 'a'
(gdb) p testBOOL
$2 = 0x1612000
(gdb) p testBool
$3 = true
Based on what you're telling me (and what I see for the BOOL
typedef), I would've expected this from gdb:
(gdb) p testBOOL
$1 = 1 '\001'
Instead I'm seeing a hex value that appears to bear little relation
to the actual value of the BOOL.
So, is there something odd going on here, or am I missing something?
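One way I could sanity-check this (a sketch, assuming gdb's standard
format specifiers and cast syntax - I haven't verified what they print
in this situation) would be:

(gdb) p/d testBOOL
(gdb) p (int)testBOOL

If the byte really does hold 1, both of those should print 1 no matter
how gdb has decided to interpret the type.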
Jon