Determining the screen depth using [NSScreen depth] method gives weird results
- Subject: Determining the screen depth using [NSScreen depth] method gives weird results
- From: Brant Sears <email@hidden>
- Date: Wed, 17 Sep 2003 14:36:52 -0700
Hi. I have the following code:
int depth = NSBitsPerPixelFromDepth([[NSScreen mainScreen] depth]);
printf("depth = %d\n", depth);
It always prints 12, whether the display is set to Thousands or Millions of colors. I think it should be 16 and 24, or maybe 16 and 32. Since my video hardware seems to be in 555 mode at 16 bits, I could even understand 15 as a reasonable answer, but 12? That seems broken.
I'm using 10.2.6. If this isn't a bug, can someone please explain this
behavior?
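In case it helps anyone reproduce this, here is a slightly fuller diagnostic along the lines of what I'm running. It's just a sketch meant to be called from inside an existing Cocoa app (LogMainScreenDepth is a name I made up); it prints the related AppKit depth queries next to NSBitsPerPixelFromDepth() so I can see whether the odd number shows up everywhere or only in the bits-per-pixel value:

#import <AppKit/AppKit.h>

// Log the various per-depth queries for the main screen's current depth.
static void LogMainScreenDepth(void)
{
    NSWindowDepth screenDepth = [[NSScreen mainScreen] depth];

    NSLog(@"bits per pixel  = %d", (int)NSBitsPerPixelFromDepth(screenDepth));
    NSLog(@"bits per sample = %d", (int)NSBitsPerSampleFromDepth(screenDepth));
    NSLog(@"color space     = %@", NSColorSpaceFromDepth(screenDepth));
    NSLog(@"planar          = %s", NSPlanarFromDepth(screenDepth) ? "YES" : "NO");
}

The results are the same in my hands: the bits-per-pixel line is where the 12 comes from.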
Thanks. I posted this to a different list (macosx-dev) a couple of days ago,
but I didn't get any response.
Brant Sears