re: Any way to get 16-bit string literals?
- Subject: re: Any way to get 16-bit string literals?
- From: Jean-Denis Muys <email@hidden>
- Date: Sat, 03 Oct 2009 02:12:23 +0200
I am currently working in a very similar setting, writing a plug-in
for a host app which uses 16-bit strings in its API.
The plain answer to your question is that there is no way I know of to
express a 16-bit string literal.
Also, copying a char[] array into a uint16_t[] array is unlikely to do
what you expect.
I guess the destination string is supposed to be UTF-16, while the
source string is likely to be UTF-8. Converting from UTF-8 to UTF-16
is a non-trivial process. In my code I packaged that conversion in a
lightweight subclass of the standard C++ std::string class, using code
written a decade or so ago by Mark Davis, which can be found on the
Unicode Consortium web site.
With the proper operator overloading I can then do things such as send
my string class to the console, or even to the host app's 16-bit string
API, provided the value is typecast to my string class first.
A probably better alternative is to replicate what std::wstring does.
It would start with:

typedef std::basic_string< uint16_t > u16string; // 2 bytes per code unit

but that alone isn't quite enough, and I haven't done it [yet].
Jean-Denis
Xcode-users mailing list (email@hidden)