Re: Re: Strange problem when not declaring the functions in interface
- Subject: Re: Re: Strange problem when not declaring the functions in interface
- From: "Pradeep Kumar" <email@hidden>
- Date: 18 Apr 2005 21:06:33 -0000
Ok, I did notice the warnings. But isn't Cocoa/Obj-C supposed to be fully dynamic?
Even if the return type is assumed to be id, why should it return wrong values, and why only
when it is used in an expression? Why doesn't it return those wrong values when assigning to a variable?
I tried declaring the variables foo1 and foo2 as id instead of int. The values that got printed
were the correct ones, i.e. 10 and 20. But as soon as they are used in an expression I get
incorrect values again.
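Roughly what that id attempt looked like (a sketch from memory; only the variable types
differ from the code quoted below):

    id foo1 = [self foo1];
    id foo2 = [self foo2];
    NSLog(@"Variables foo1 = %d\tfoo2 = %d", foo1, foo2);   // still logs 10 and 20
    int i = 100;
    i = i + foo1;   // used in an expression -- the incorrect value shows up again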
BTW, your workaround works fine. But it still doesn't explain why the compiler assumes the
return values are id. Isn't that supposed to be decided at run time? (My best guess at what
the odd numbers mean is in the P.S. at the bottom.)
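For reference, the declarations I removed were along these lines; putting them back in the
interface is what makes all the results come out right again (the NSObject superclass here is
just my assumption about how MyObject is defined):

    @interface MyObject : NSObject
    - (int)foo1;
    - (int)foo2;
    @end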
Thanks
prady
On Tue, 19 Apr 2005, Ivan S. Kourtev wrote:
>The default return type is (id). If you don't declare the functions you should do:
>
>i = i + (int)[self foo1];
>
>--
>ivan
>
>On Apr 18, 2005, at 4:08 PM, Pradeep Kumar wrote:
>
>> Hi All
>>
>> I am not sure if I am missing something really fundamental or whether it is really a bug,
>> but I have this strange problem that occurs when functions are not declared in the interface.
>> Thought of broadcasting this in case someone has any insight on why this is happening.
>>
>> Please review the following code snippet. Assume that the functions foo1 and foo2 are
>> declared in the interface declaration of MyObject.
>>
>> @implementation MyObject
>>
>> - (void)awakeFromNib
>> {
>>     NSLog(@"[self foo1] returned %d", [self foo1]);
>>     NSLog(@"[self foo2] returned %d", [self foo2]);
>>     int foo1 = [self foo1];
>>     int foo2 = [self foo2];
>>     NSLog(@"Variables foo1 = %d\tfoo2 = %d", foo1, foo2);
>>     int i = 100;
>>     NSLog(@"i = %d", i);
>>     i = i + [self foo1];
>>     NSLog(@"Executing i = i+[self foo1] = %d", i);
>>     i = i + [self foo2];
>>     NSLog(@"Executing i = i+[self foo2]; = %d", i);
>> }
>>
>> - (int)foo1
>> {
>>     return 10;
>> }
>>
>> - (int)foo2
>> {
>>     return 20;
>> }
>>
>> @end
>>
>> The result you get in the log is
>>
>> [self foo1] returned 10
>> [self foo2] returned 20
>> Variables foo1 = 10 foo2 = 20
>> i = 100
>> Executing i = i+[self foo1] = 110
>> Executing i = i+[self foo2]; = 130
>>
>> The results are perfect.
>>
>> Now remove the declarations of foo1 and foo2 from the interface file of MyObject. After
>> doing this here's what I get.
>>
>> [self foo1] returned 10
>> [self foo2] returned 20
>> Variables foo1 = 10 foo2 = 20
>> i = 100
>> Executing i = i+[self foo1] = 410
>> Executing i = i+[self foo2]; = 1660
>>
>> See the results of the last two statements. 410 and 1660. Why is this happening?
>>
>> I am using Xcode 1.5 (component versions Xcode IDE: 389.0, Xcode Core: 387.0,
>> ToolSupport: 372.0) on 10.3.9 (7W98). I know it is recommended that functions be declared
>> in the interface, but can not declaring them really make such a huge difference in the way
>> non-declared functions can be used?
>>
>> Thanks
>> prady
>>
>>
>>
--
prady
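P.S. My best guess (and it is only a guess -- the 4-byte pointer size is an assumption about
the 32-bit runtime here): since the compiler takes the undeclared methods to return id, a
pointer type, i + [self foo1] gets compiled as pointer arithmetic instead of integer addition,
so i gets scaled by the size of what id points to:

    10 + 100 * 4 = 410
    20 + 410 * 4 = 1660

which matches the two odd results exactly. Casting back to int, or declaring the methods,
avoids that.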
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Cocoa-dev mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden