Re: Initializing unichar variable with a human readable letter
- Subject: Re: Initializing unichar variable with a human readable letter
- From: vincent habchi <email@hidden>
- Date: Sun, 18 Jul 2010 10:38:32 +0200
Hi Ken,
> Doing this will probably work:
> unichar foo = L'é';
Thanks, that's perfect.
> With modern compilers, it should be possible to do (roughly) what you want if the source file is UTF-8 encoded. However, note that "é" is often represented as "e" followed by U+0301 COMBINING ACUTE ACCENT. That is, the single grapheme is two characters. So, no matter the encoding, that won't fit into a unichar. There is a single precomposed "é" character, U+00E9 LATIN SMALL LETTER E WITH ACUTE, but it's not always clear which you get with any given input mechanism.
It seems Xcode stores the 'é' as the single precomposed character rather than the combination, so that's fine for me. Besides, to answer your question, I use the latest clang/llvm pair from MacPorts (I found on a site which file to hack so that Xcode uses these instead of the Apple-provided clang/llvm, which are always a bit out of phase).
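For anyone reading along later, here is a minimal sketch of how to check the difference Ken describes, assuming a UTF-8 source file and Foundation; the variable names are just for illustration:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // Precomposed form: U+00E9 LATIN SMALL LETTER E WITH ACUTE
            unichar precomposed = 0x00E9;
            NSString *single = [NSString stringWithCharacters:&precomposed length:1];

            // Decomposed form: "e" followed by U+0301 COMBINING ACUTE ACCENT
            NSString *decomposed = @"e\u0301";

            NSLog(@"precomposed unichar count: %lu", (unsigned long)[single length]);     // 1
            NSLog(@"decomposed unichar count:  %lu", (unsigned long)[decomposed length]); // 2

            // The two are canonically equivalent once normalized to the
            // precomposed (NFC) form, even though the unichar counts differ.
            BOOL equal = [[decomposed precomposedStringWithCanonicalMapping]
                             isEqualToString:single];
            NSLog(@"equal after normalization: %d", equal); // 1
        }
        return 0;
    }

If the decomposed count ever shows up as 2 for input you receive at run time, the single-unichar comparison will miss it, which is exactly Ken's caveat.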
Since I use the unichar for comparisons, I could also have initialized a custom NSCharacterSet with "é", but, as we say here in France, "it's like using a bulldozer to crack a nutshell".
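For completeness, a quick sketch of both approaches; the sample string and names are hypothetical:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSString *word = @"caf\u00E9"; // hypothetical sample input
            unichar last = [word characterAtIndex:[word length] - 1];

            // Plain unichar comparison against the precomposed code point
            if (last == 0x00E9) {
                NSLog(@"matched by unichar comparison");
            }

            // The NSCharacterSet "bulldozer" route
            NSCharacterSet *eAcute =
                [NSCharacterSet characterSetWithCharactersInString:@"\u00E9"];
            if ([eAcute characterIsMember:last]) {
                NSLog(@"matched by NSCharacterSet");
            }
        }
        return 0;
    }

For a single character the two are equivalent; NSCharacterSet only starts to pay off when testing membership in a whole set of characters at once.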
Thanks again, enjoy your Sunday
Vincent