Any way to get 16-bit string literals?
- Subject: Any way to get 16-bit string literals?
- From: Jens Alfke <email@hidden>
- Date: Fri, 2 Oct 2009 11:44:49 -0700
I'm looking at how to optimize creation of WebCore::StringImpl objects
from string literals, which use a 16-bit Unicode character type.
Currently the constructor has to loop through the 8-bit ASCII literal,
copying each character into a 16-bit character array.
C++ has a 'wide string literal' syntax that looks like L"some string",
which creates an array of wchar_t. Unfortunately wchar_t is 32 bits
wide, not 16. It's possible to use a compiler flag (GCC's
-fshort-wchar) to make wchar_t 16 bits, but that breaks compatibility
with all the standard library calls that operate on wide strings.
So: Is there any clever way to get the compiler to generate a 16-bit
string literal? (I've looked at the CFSTR() macro, which has a very
similar purpose, but it seems to use a special-purpose hack in GCC
that only generates CFString objects.)
Failing that, is there some optimal way to copy a char[] array to a
uint16_t[] array? I would imagine this is the kind of thing the SSE
extensions would do really well; is there a library call for it?
—Jens
Xcode-users mailing list (email@hidden)