Drawing to NSView with OpenGL questions
- Subject: Drawing to NSView with OpenGL questions
- From: Konstantin Anoshkin <email@hidden>
- Date: Sun, 10 Feb 2008 19:11:01 +0300
Hello.
Is it possible to somehow mix CoreGraphics and OpenGL drawing commands
using a common NSView? What I'd like to do is render my own vertex/
fragment shaders into a CGContext.
1. Currently, I'm drawing into an NSOpenGLPixelBuffer with OpenGL, then
copying the result with glReadPixels() into a scratch buffer, from which I
copy byte rows into an NSBitmapImageRep, which I can then draw into my
CGContext or NSGraphicsContext (a condensed sketch of this path follows
below). That's a lot of code moving pixels from VRAM to RAM and back to
the GPU, though I admit it works. Is there a more straightforward way to
draw with OpenGL into a CGContext or, maybe, with CoreGraphics into an
NSOpenGLView?
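
For reference, the current path looks roughly like this (heavily condensed;
error handling, memory management, and the exact pixel-format attributes are
omitted, and width, height, and drawMyOpenGLScene() are placeholders for my
own sizes and rendering code):

    #import <Cocoa/Cocoa.h>
    #import <OpenGL/gl.h>
    #import <OpenGL/glext.h>

    // An NSOpenGLContext whose drawable is an NSOpenGLPixelBuffer.
    NSOpenGLPixelFormatAttribute attrs[] = {
        NSOpenGLPFAPixelBuffer,
        NSOpenGLPFAColorSize, 24,
        NSOpenGLPFAAlphaSize, 8,
        0
    };
    NSOpenGLPixelFormat *format =
        [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];
    NSOpenGLContext *context =
        [[NSOpenGLContext alloc] initWithFormat:format shareContext:nil];
    NSOpenGLPixelBuffer *pbuffer =
        [[NSOpenGLPixelBuffer alloc] initWithTextureTarget:GL_TEXTURE_RECTANGLE_EXT
                                     textureInternalFormat:GL_RGBA
                                     textureMaxMipMapLevel:0
                                                pixelsWide:width
                                                pixelsHigh:height];
    [context setPixelBuffer:pbuffer
                cubeMapFace:0
                mipMapLevel:0
       currentVirtualScreen:[context currentVirtualScreen]];
    [context makeCurrentContext];

    // Render with OpenGL, then copy the pixels back to main memory.
    drawMyOpenGLScene();
    GLubyte *scratch = malloc(width * height * 4);
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, scratch);

    // Copy the rows into an NSBitmapImageRep (flipping them, since
    // glReadPixels() returns them bottom-up) and draw it in -drawRect:.
    NSBitmapImageRep *rep =
        [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                                pixelsWide:width
                                                pixelsHigh:height
                                             bitsPerSample:8
                                           samplesPerPixel:4
                                                  hasAlpha:YES
                                                  isPlanar:NO
                                            colorSpaceName:NSCalibratedRGBColorSpace
                                               bytesPerRow:width * 4
                                              bitsPerPixel:32];
    unsigned char *dst = [rep bitmapData];
    for (NSInteger row = 0; row < height; row++)
        memcpy(dst + row * width * 4,
               scratch + (height - 1 - row) * width * 4,
               width * 4);
    [rep drawInRect:[self bounds]];
    free(scratch);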
2. I've written and tested several shaders, both in OpenGL Shader Builder
and in a test app inside an NSOpenGLView, and they work just fine (the test
setup is sketched below). However, I don't have any NSOpenGLViews in my main
app, so I guess it's the absence of glutCreateWindow() that causes
glCreateShader() to return 0 without reporting any error. If I call
glutCreateWindow(), I get "GLUT Fatal Error: redisplay needed for window 1,
but no display callback." in the console, so I would have to add display
callbacks, but then I'm moving away from my NSView. What confuses me is why
simple OpenGL commands work while shaders don't.
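
To be concrete, the shader setup in the test app is just the standard
OpenGL 2.0 calls, roughly the following (the GLSL string is a stand-in for
my real shaders, and this runs inside the NSOpenGLView, so a context is
already current when it executes):

    #import <OpenGL/gl.h>

    // Runs in the test app, e.g. from the NSOpenGLView's -prepareOpenGL.
    const GLchar *fragmentSource =
        "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }";

    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);  // returns 0 in my main app
    glShaderSource(shader, 1, &fragmentSource, NULL);
    glCompileShader(shader);

    GLint compiled = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled); // GL_TRUE in the test app

    GLuint program = glCreateProgram();
    glAttachShader(program, shader);
    glLinkProgram(program);
    glUseProgram(program);

In the main app the very first call, glCreateShader(), already returns 0,
and glGetError() reports nothing.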
What is the best practice for mixing CoreGraphics and OpenGL? Ideally, I'd
like to avoid copying pixel data back to RAM, because that slows down my
animations a lot.
Thanks in advance.
Regards.
Konstantin.