Re: NSDecimal localization
- Subject: Re: NSDecimal localization
- From: Clark Cox <email@hidden>
- Date: Mon, 27 Dec 2004 07:23:33 -0500
On Fri, 24 Dec 2004 00:10:54 -0800, Mark Dawson <email@hidden> wrote:
> Is there ever a case where NSDecimal is NOT a unichar? It's defined as
> a string, but the only two separators that I know of are the point and
> the comma.
Why assume? Just treat it as a string, and you're automatically insulated
against future changes, etc. Besides, I just tested it: System
Preferences allows me to enter decomposed characters as the separator
characters, and those take more than one unichar to represent.
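The same principle holds outside Cocoa. As a sketch in Python (standing in for the thread's Objective-C, since the C locale machinery makes the same design choice), the standard library reports the locale's decimal separator as a string of arbitrary length, never as a single character:

```python
import locale

# Use the portable "C" locale so the example runs anywhere;
# real user locales may carry multi-codepoint separators.
locale.setlocale(locale.LC_NUMERIC, "C")

# localeconv() exposes the separator as a *string*, not a char:
# a locale is free to use a separator longer than one codepoint.
separator = locale.localeconv()["decimal_point"]
print(repr(separator), "length:", len(separator))

# Code that assumed exactly one character would break on such
# locales; string-based handling works regardless of length.
```

Any code that stored the separator in a single `unichar` would silently truncate a multi-codepoint separator; keeping it as a string costs nothing and avoids the bug.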
In general, when dealing with Unicode, you should never work with
individual characters or codepoints; always work with strings, because
there is no one-to-one mapping between what the user sees as a single
character and what the programmer sees as a codepoint.
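To make that mismatch concrete, here is a small illustration in Python (an assumption of this reply, not code from the thread): a single user-perceived character can decompose into several codepoints, so counting unichars or codepoints misleads.

```python
import unicodedata

# "é" as one precomposed codepoint (U+00E9)...
composed = "\u00e9"

# ...and the same user-visible character decomposed (NFD) into
# "e" (U+0065) plus a combining acute accent (U+0301).
decomposed = unicodedata.normalize("NFD", composed)

print(len(composed))    # 1 codepoint
print(len(decomposed))  # 2 codepoints, yet the user sees one character

# The two strings are canonically equivalent; compare after
# normalizing rather than codepoint-by-codepoint.
print(unicodedata.normalize("NFC", decomposed) == composed)  # True
```

This is why string-level APIs (and, in Cocoa, `NSString` comparison methods) are the right tool: they can account for canonical equivalence, where per-codepoint logic cannot.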
--
Clark S. Cox III
email@hidden
http://www.livejournal.com/users/clarkcox3/
http://homepage.mac.com/clarkcox3/
Cocoa-dev mailing list (email@hidden)