Re: Strange problem when not declaring the functions in interface
- Subject: Re: Strange problem when not declaring the functions in interface
- From: "Ivan S. Kourtev" <email@hidden>
- Date: Mon, 18 Apr 2005 17:16:31 -0400
As far as I understand it, the runtime can tell whether an object
responds to a particular message before sending the message to the
object. But the runtime doesn't match the return type (or check the
type) against what you expect it to be (not to mention that it would
probably be hard to tell what you expect).
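
I haven't verified this against the generated code, but the odd numbers
in your last two statements look consistent with pointer arithmetic.
The selector is resolved at run time, but the argument and return types
are fixed by the compiler at compile time, and with no declaration in
sight it falls back to the default, (id). Since (id) is a pointer type,
"i + [self foo1]" is compiled as pointer-plus-integer, which scales the
integer by the size of the pointed-to type. A rough sketch of what the
compiler effectively sees (the tmp variables are only for illustration,
and I'm assuming a 32-bit build where a bare struct objc_object is 4
bytes):

  // With no declaration, [self foo1] is assumed to return id (a pointer)
  id tmp1 = [self foo1];   // pointer whose value happens to be 10
  i = (int)(i + tmp1);     // pointer arithmetic: 10 + 100 * 4 = 410

  id tmp2 = [self foo2];   // pointer whose value happens to be 20
  i = (int)(i + tmp2);     // pointer arithmetic: 20 + 410 * 4 = 1660

That would also explain why the plain assignments and the %d logs still
print 10 and 20: copying or printing the pointer value doesn't scale
anything. Casting the result to (int) before the addition, as below,
forces ordinary integer arithmetic.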
--
ivan
On Apr 18, 2005, at 5:06 PM, Pradeep Kumar wrote:
BTW, your workaround works fine. But it still doesn't explain why it is
assuming that the values are id. Isn't that supposed to be decided at
run time?
Thanks
prady
On Tue, 19 Apr 2005, Ivan S. Kourtev wrote:
>The default return type is (id). If you don't declare the functions,
>you should do:
>
>i = i + (int)[self foo1];
>
>--
>ivan
>
>On Apr 18, 2005, at 4:08 PM, Pradeep Kumar wrote:
>
>> Hi All
>>
>> I am not sure if I am missing something really fundamental or it is
>> really a bug. But I have this strange problem that occurs when
>> functions are not declared in the interface. Thought of broadcasting
>> this in case someone has any insight on why this is happening.
>>
>> Please review the following code snippet. Assume that the functions
>> foo1 and foo2 are declared in the interface declaration of MyObject.
>>
>> @implementation MyObject
>>
>> - (void)awakeFromNib
>> {
>>     NSLog(@"[self foo1] returned %d", [self foo1]);
>>     NSLog(@"[self foo2] returned %d", [self foo2]);
>>     int foo1 = [self foo1];
>>     int foo2 = [self foo2];
>>     NSLog(@"Variables foo1 = %d\tfoo2 = %d", foo1, foo2);
>>     int i = 100;
>>     NSLog(@"i = %d", i);
>>     i = i + [self foo1];
>>     NSLog(@"Executing i = i+[self foo1] = %d", i);
>>     i = i + [self foo2];
>>     NSLog(@"Executing i = i+[self foo2]; = %d", i);
>> }
>>
>> - (int)foo1
>> {
>>     return 10;
>> }
>>
>> - (int)foo2
>> {
>>     return 20;
>> }
>>
>> @end
>>
>> The result you get in the log is
>>
>> [self foo1] returned 10
>> [self foo2] returned 20
>> Variables foo1 = 10 foo2 = 20
>> i = 100
>> Executing i = i+[self foo1] = 110
>> Executing i = i+[self foo2]; = 130
>>
>> The results are perfect.
>>
>> Now remove the declarations of foo1 and foo2 from the interface file
>> of MyObject. After doing this, here's what I get.
>>
>> [self foo1] returned 10
>> [self foo2] returned 20
>> Variables foo1 = 10 foo2 = 20
>> i = 100
>> Executing i = i+[self foo1] = 410
>> Executing i = i+[self foo2]; = 1660
>>
>> See the results of the last two statements: 410 and 1660. Why is
>> this happening?
>>
>> I am using Xcode 1.5 with component versions Xcode IDE: 389.0, Xcode
>> Core: 387.0, and ToolSupport: 372.0, on 10.3.9 (7W98). I know it is
>> recommended that functions be declared in the interface. But can not
>> declaring the functions really cause such a huge difference in the
>> way undeclared functions behave?
>>
>> Thanks
>> prady
>>
--
prady