Re: Very strange Cocoa application issue.
- Subject: Re: Very strange Cocoa application issue.
- From: Cláudio Rodrigues <email@hidden>
- Date: Mon, 12 Feb 2007 01:07:28 -0500
- Thread-topic: Very strange Cocoa application issue.
Many thanks for the explanation Sherm.
The funny thing is that it does work in Debug! Unbelievable. Note that I tried
the same code on three different Macs (two Intel, one PPC), and in Debug it
always worked.
After using 'A' of course everything works as expected.
That is what happens when an assembly/VB/RB developer who hasn't developed
anything in at least five years tries to start learning Objective-C and Cocoa
*without* reading any real documentation, just by looking at examples
out there. :-)
Again, thanks all! The issue is now fixed, and that's another lesson learned
today. Gotta go out and buy a good C book, I guess. :-)
CR
> From: Sherm Pendley <email@hidden>
> Date: Mon, 12 Feb 2007 00:45:10 -0500
> To: Cláudio Rodrigues <email@hidden>
> Cc: <email@hidden>
> Subject: Re: Very strange Cocoa application issue.
>
> On Feb 12, 2007, at 12:04 AM, Cláudio Rodrigues wrote:
>
>> I was assigning key1 = (int) "A" for example. This worked in debug but
>> not in release. As soon as I replaced it with key1 = 65; then it worked.
>> So if I assign the number directly to the variable instead of (int)
>> letter, then it works.
>
> Are you certain it works in debug? It shouldn't, unless the constant
> string "A" happens to be stored at memory address 65 by some miracle
> of coincidence.
>
> A double-quoted string is a *pointer* to an array of const chars. So
> what you're typecasting to int isn't the ASCII value of the first
> character in the string, it's the memory address that's the target of
> the pointer.
>
> If you want an int with the ASCII code for a single literal
> character, use single quotes, like this:
>
> int key1 = 'A';
>
> No typecasting is required - a single-quoted character constant
> already has an integer type.
>
> Strictly speaking, standard ANSI C does allow multi-character
> constants, but their value is implementation-defined; Apple's
> compilers pack up to four characters into an int. If memory serves,
> that's what makes declaring type and creator codes as character
> constants work.
>
> If you have a pointer variable to a string - char *aString = "A" -
> you still wouldn't want to typecast the pointer. Instead, you want
> the value stored at the memory address it points to; you get that by
> dereferencing it:
>
> int key1 = *aString;
>
> Again, no typecast is needed; aString is declared as a pointer to
> char, and char is already an integer type.
>
> In general, if you're getting "assigning a pointer to an integer type
> without a typecast" warnings, it's a good idea to think hard about
> whether that assignment is what you really want to do. It's tempting
> to simply add a typecast to silence the warning, but the warning
> exists for a good reason; quite often such an assignment is a mistake.
>
> sherm--
>
> Web Hosting by West Virginians, for West Virginians: http://wv-www.net
> Cocoa programming in Perl: http://camelbones.sourceforge.net
>
>
_______________________________________________
Cocoa-dev mailing list (email@hidden)
Do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden