I've changed my approach: I'm now creating an NSImage instead of
calling NSRectFill() directly. Oddly, I have ended up needing two
different compositing operations when I draw the image, depending
on the situation.
In one situation I'm drawing the NSImage into an overlay view
(a transparent window), where I have to use NSCompositeSourceIn in
order to see the view underneath properly; in another situation,
where I'm compositing this same NSImage over another NSImage, I have
to use NSCompositeSourceOver in order to see underneath properly.
Does this sound right?
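It does sound right, and the Porter-Duff equations behind those constants show why. The sketch below models one premultiplied-alpha pixel as an (r, g, b, a) tuple; the helper names are my own, but the arithmetic is what NSCompositeSourceOver and NSCompositeSourceIn perform per pixel:

```python
# Porter-Duff compositing on single premultiplied-alpha pixels (r, g, b, a).
# Helper names are hypothetical; AppKit's NSCompositeSourceOver and
# NSCompositeSourceIn apply the same per-pixel maths.

def source_over(src, dst):
    """SourceOver: src blended on top of dst; result = src + dst*(1 - src.alpha)."""
    k = 1.0 - src[3]
    return tuple(s + d * k for s, d in zip(src, dst))

def source_in(src, dst):
    """SourceIn: src kept only where dst has coverage; result = src * dst.alpha."""
    da = dst[3]
    return tuple(s * da for s in src)

fill = (0.0, 0.0, 0.0, 0.5)          # 50%-opaque black fill, premultiplied
opaque_white = (1.0, 1.0, 1.0, 1.0)  # pixel of an opaque base NSImage
empty = (0.0, 0.0, 0.0, 0.0)         # pixel of a transparent overlay window

print(source_over(fill, opaque_white))  # (0.5, 0.5, 0.5, 1.0): white, dimmed
print(source_in(fill, empty))           # (0.0, 0.0, 0.0, 0.0): fill clipped away
print(source_over(fill, empty))         # (0.0, 0.0, 0.0, 0.5): fill also covers
                                        # the transparent area
```

Over an opaque image, SourceOver blends the fill with what's underneath, which is what you want. Over a transparent window, SourceOver would paint the fill everywhere, while SourceIn clips it to the destination's existing coverage, so only the content underneath shows through tinted.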
NSRectFill uses NSCompositeCopy, which blows away the pixels in
the destination. Try using NSRectFillUsingOperation() with
NSCompositeSourceOver instead.
I'm drawing into an NSImage by drawing another NSImage and then using
NSRectFill() to overlay a semi-transparent colour. The image is
created fine if I don't draw the semi-transparent fill, but if I
do, then the fill covers the image and the image is not visible.
Why is the transparent fill not transparent?
Quartz-dev mailing list (email@hidden)