Re: Pointers and NSImages
- Subject: Re: Pointers and NSImages
- From: Erik Buck <email@hidden>
- Date: Fri, 3 Nov 2006 08:54:42 -0800 (PST)
I am glad to hear that the issue is resolved. As it happens, my company produces an OpenGL 3D game/visualization engine, so your more detailed description of the problem is close to my heart. Our engine is used for architectural visualization, including terrain, and for some games.
Our solution to texture management was to create a very _thin_ class that has the sole purpose of storing raw texture data for efficient use with the graphics card and associating the texture data with an OpenGL texture ID.
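For illustration, here is a minimal sketch of what such a thin texture class might look like. The class name, instance variables, and RGBA pixel-format choice are my assumptions for the example, not our actual code:

#import <Foundation/Foundation.h>
#import <OpenGL/gl.h>

@interface EBTexture : NSObject
{
    GLuint   _textureID;   // OpenGL texture name associated with this object
    NSData  *_pixelData;   // raw RGBA bytes kept around for re-upload/update
    GLsizei  _width;
    GLsizei  _height;
}
- (id)initWithPixelData:(NSData *)data width:(GLsizei)width height:(GLsizei)height;
- (GLuint)textureID;
@end

@implementation EBTexture

- (id)initWithPixelData:(NSData *)data width:(GLsizei)width height:(GLsizei)height
{
    self = [super init];
    if (self != nil)
    {
        _pixelData = [data copy];
        _width = width;
        _height = height;

        // Create the OpenGL texture once and upload the raw data.
        glGenTextures(1, &_textureID);
        glBindTexture(GL_TEXTURE_2D, _textureID);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _width, _height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, [_pixelData bytes]);
    }
    return self;
}

- (GLuint)textureID
{
    return _textureID;
}

- (void)dealloc
{
    glDeleteTextures(1, &_textureID);
    [_pixelData release];
    [super dealloc];
}

@end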
We then have subclasses that load, manage, and update texture data using specialized techniques: playing QuickTime movies frame by frame into texture data, playing Quartz Composer compositions, loading simple image data in a variety of formats, converting back and forth to NSBitmapImageRep, using Core Image, and so on. We have Windows versions that similarly use DirectX image loading/manipulation features.
The objects that are drawn in a scene are managed in a scene graph and a quad-tree that is view-frustum culled. The objects in the scene, such as meshes and boned models, only need the OpenGL texture ID(s) and texture coordinates. They don't need a pointer to the object that stores the texture data, and the way the texture data gets updated in RAM or on the graphics card is of no consequence to the objects that use the textures. We use reference counting when a scene object acquires a texture ID. When the last object using a texture ID stops using it, the texture data associated with that ID can be recycled for some other use. In practice, we never actually de-allocate texture storage until the program exits.
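A rough sketch of that texture-ID reference counting, with made-up class and method names purely for illustration, might look like this:

#import <Foundation/Foundation.h>
#import <OpenGL/gl.h>

@interface EBTextureManager : NSObject
{
    NSMutableDictionary *_useCounts;      // texture ID -> number of scene objects using it
    NSMutableArray      *_recyclableIDs;  // IDs whose storage may be reused for other textures
}
- (void)acquireTextureID:(GLuint)textureID;
- (void)relinquishTextureID:(GLuint)textureID;
@end

@implementation EBTextureManager

- (id)init
{
    self = [super init];
    if (self != nil)
    {
        _useCounts     = [[NSMutableDictionary alloc] init];
        _recyclableIDs = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)acquireTextureID:(GLuint)textureID
{
    NSNumber *key = [NSNumber numberWithUnsignedInt:textureID];
    unsigned count = [[_useCounts objectForKey:key] unsignedIntValue];
    [_useCounts setObject:[NSNumber numberWithUnsignedInt:count + 1] forKey:key];
}

- (void)relinquishTextureID:(GLuint)textureID
{
    NSNumber *key = [NSNumber numberWithUnsignedInt:textureID];
    unsigned count = [[_useCounts objectForKey:key] unsignedIntValue];
    if (count <= 1)
    {
        // Last user gone: mark the texture storage as recyclable rather than
        // deleting it, since we never de-allocate storage until the program exits.
        [_useCounts removeObjectForKey:key];
        [_recyclableIDs addObject:key];
    }
    else
    {
        [_useCounts setObject:[NSNumber numberWithUnsignedInt:count - 1] forKey:key];
    }
}

- (void)dealloc
{
    [_useCounts release];
    [_recyclableIDs release];
    [super dealloc];
}

@end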
The nice thing is that the artists can freely map dynamic textures such as QuickTime movies to any object. As a hypothetical example, the Predator's shimmering translucency could be accomplished via a texture that is a short looping movie. In our products, rather than creating particle fire and smoke, we use billboards displaying filmed fire loops and a movie of smoke. Weapon impact effects can be a combination of particles and Core Image effects textured onto objects.
We actually use GOES8 satellite weather clips for sky box effects to produce variation and a perception of time-lapse.
In spite of using cross-platform Objective-C and Python, we achieve very competitive render speeds even for very complex scene graphs using multi-texturing. We measure only about 2% of our CPU time spent in the Objective-C runtime and associated framework overhead such as -retain, -release, and -autorelease. The reason is that the ratio of OpenGL calls to Objective-C messages is very high.
For comparison, an earlier C++ version of our engine spent 2% of its time in constructors alone. The reason is that objects were used for small data types like vector3D, and the objects were almost always passed by value rather than by reference. Every function call had to construct/copy all of its object arguments. Every local variable had to be constructed when it entered scope and destroyed when it left. Every assignment called a copy constructor. Using const references for function arguments instead would have helped, but you just can't avoid a lot of temporary object construction ...
In the Objective-C version, we use C structures and have macros like EBMakeVector3D(). This approach is much more efficient in my experience. Of course, we could have foregone classes and used the macro approach in C++ too, with care to avoid implicit constructors…
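For example, the vector struct and creation macro can be plain C along these lines. The field layout and the helper function are assumptions for illustration; the macro here expands to a C99 compound literal, so no constructor or message send is involved:

// Plain-C vector type; copying a 12-byte struct by value is cheap and
// involves no runtime dispatch, constructors, or destructors.
typedef struct
{
    float x;
    float y;
    float z;
} EBVector3D;

#define EBMakeVector3D(x_, y_, z_) ((EBVector3D){ (x_), (y_), (z_) })

// Plain functions (or further macros) operate on the structs directly.
static inline EBVector3D EBVector3DAdd(EBVector3D a, EBVector3D b)
{
    return EBMakeVector3D(a.x + b.x, a.y + b.y, a.z + b.z);
}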