Re: [Obj-C] if (self) vs. if (self != nil)
- Subject: Re: [Obj-C] if (self) vs. if (self != nil)
- From: Oleg Krupnov <email@hidden>
- Date: Fri, 24 Feb 2012 21:56:53 +0200
Yes, I'm aware that in practice it doesn't matter, but when I write code I still have to choose one way or the other. My current choice is (self != nil), for the sake of explicitness.
Thanks everyone for an interesting discussion, especially Kyle for such an exhaustive reference :)
P.S. Sorry for posting the question to the wrong list; I'm still a bit shaky on how Apple's mailing lists are organized.
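
For the record, here is the explicit style I've settled on, fleshed out with a made-up class and ivar purely for illustration:

#import <Foundation/Foundation.h>

@interface Document : NSObject {
    NSMutableArray *_items;   // illustrative ivar, not from the discussion
}
@end

@implementation Document

- (id)init
{
    self = [super init];
    if (self != nil)          // explicit comparison, rather than plain if (self)
    {
        _items = [[NSMutableArray alloc] init];
    }
    return self;
}

@end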
On Friday, February 24, 2012 at 9:47 PM, Kyle Sluder wrote:
> On Fri, Feb 24, 2012 at 6:50 AM, Oleg Krupnov <email@hidden> wrote:
> > An interesting question. The following samples are equivalent in terms
> > of compiled code, but which one is more correct from the language's
> > point of view?
> >
> > self = [super init];
> > if (self)
> > {
> > }
> > return self;
> >
> > self = [super init];
> > if (self != nil)
> > {
> > }
> > return self;
> >
> > The Xcode samples promote the first variant, but I'm wondering if the
> > second one is more correct?
> >
> > The "nil" is defined as follows (via Jump to Definition in Xcode):
> >
> > #define nil NULL
> > #define NULL ((void*)0)
> >
> > So basically, nil is of type "void*", so the expression "self != nil"
> > compares two pointers and the result is a boolean, which is perfect
> > for testing in the "if" statement. But "self" alone is of type
> > "pointer", so when it is tested by the "if" statement it is
> > implicitly converted to a boolean.
> >
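> > For example (a sketch in plain C, using the definitions above; the
> > function and variable names are arbitrary):
> >
> > #include <stddef.h>
> > #define nil NULL            /* per the definition quoted above */
> >
> > void demo(void *self)
> > {
> >     int r = (self != nil);  /* pointer comparison; r is 0 or 1 */
> >     if (self) { }           /* "self" itself, implicitly tested against zero */
> >     (void)r;
> > }
> >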
> > I have also heard that NULL is not necessarily equal to 0 on all
> > architectures.
> >
>
> §6.3.2.3 ¶3 defines the null pointer constant as "an integer constant
> expression with the value 0, or such an expression cast to type void
> *". A null pointer is one that has been assigned the value of the null
> pointer constant, or has been assigned the value of another null
> pointer.
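>
> For example, each of these pointers is a null pointer, because the
> initializer is a null pointer constant (or a macro that expands to
> one, per the definitions quoted above; the variable names are
> arbitrary):
>
> #include <stddef.h>
>
> char *a = 0;          /* an integer constant expression with value 0 */
> char *b = (void *)0;  /* such an expression cast to void * */
> char *c = NULL;       /* Oleg's quoted definition: ((void*)0) */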
>
> But you are correct that conversion of a null pointer to integer is
> NOT defined to compare equal to an integer expression with a value of
> zero. §6.3.2.3 ¶6 makes that clear: "Any pointer type may be converted
> to an integer type. Except as previously specified, the result is
> implementation-defined." However, footnote 56 states that "The mapping
> functions for converting a pointer to an integer or an integer to a
> pointer are intended to be consistent with the addressing structure of
> the execution environment."
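>
> That implementation-defined conversion is the one you'd hit with, e.g.
> (an illustrative sketch, not code from the standard):
>
> void convert(void)
> {
>     char *p = 0;
>     long n = (long)p;   /* result is implementation-defined per ¶6 */
>     (void)n;
> }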
>
> But I'm not sure the integer conversion is necessarily relevant. The
> semantics of the if statement are defined by §6.8.4.1 ¶2: "the first
> substatement is executed if the expression compares unequal to 0." It
> is left unspecified whether '0' is an integer constant expression or an
> arithmetic expression with an integer value of zero. If the former,
> then the pointer is compared against the null pointer constant per
> §6.3.2.3 ¶4. If the latter, the pointer is converted to integer per
> implementation-defined behavior and the comparison is performed, which
> might itself result in undefined behavior per §6.5 ¶5 since the
> conversion is not guaranteed to produce a value within range of any
> integer type.
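>
> Spelled out, the three tests under discussion look like this (p is
> just an illustrative pointer, not anything from the standard):
>
> #include <stddef.h>
>
> void test(char *p)
> {
>     if (p) { }          /* executes the body if p compares unequal to 0 */
>     if (p != 0) { }     /* 0 here is a null pointer constant */
>     if (p != NULL) { }  /* same comparison, via the NULL macro */
> }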
>
> In practice, it's a moot point, since all architectures that Apple's
> Objective-C runtime runs on use the same bit pattern for a null
> pointer as for integer zero.
>
> But absent any information I'm unaware of, the explicit comparison is
> "more correct" in the sense that it doesn't leave any room for
> implementation-defined behavior.
>
> --Kyle Sluder