Re: Tiger Decoding problem
- Subject: Re: Tiger Decoding problem
- From: Dustin Voss <email@hidden>
- Date: Mon, 30 May 2005 19:37:07 -0700
On 30 May 2005, at 6:54 PM, Nicholas Crosbie wrote:

> Thanks Dustin. I think now I'll be going places. But before I go away
> and play with this, note that my data was produced for a Mac and is in
> the form of "$BYTEORD\4,3,2,1".
I have no idea what that notation means...
> Also, the data I'm trying to read is "a 10-bit number.....stored in
> 16-bit space.......leaving 6 empty bits"
...but this means it is a 2-byte number less than 1024. I correctly
guessed little-endian.
> Does this mean that I need to include a bitmask? Do you know how to do
> that?
You do not need to use a bit-mask here. A bit-mask simply removes
certain bits from consideration by setting them to 0. In your case,
the irrelevant bits are already "empty," meaning 0. But...
If you ever do need to construct a bit-mask, you simply put a 1 in
every relevant bit position and a 0 in every irrelevant one. The
bit-mask for the low 10 of 16 bits, as in your file, is:

0b0000001111111111, or 0x03FF
You apply the bit-mask, setting the irrelevant bits to 0, with the &
operator:
#define BITMASK 0x03ff
maskedValue = rawValue & BITMASK;
You alter the relevant bits of a value like this: first clear them with
the inverted mask, then OR in the replacement bits:
newValue = rawValue & ~BITMASK;
newValue |= /*a bit to set*/ | /*another bit to set*/;
If you know your Boolean logic, you can see how this works.
_______________________________________________
Cocoa-dev mailing list (email@hidden)