Re: Core Data OpenGL
- Subject: Re: Core Data OpenGL
- From: Erik Buck <email@hidden>
- Date: Thu, 22 Feb 2007 12:01:41 -0800 (PST)
In my mesh editing application, I am performing computational hydrology. For example, if you dig a ditch over here, will the water tend to pool more or less over there? What will happen if you pave an area so that it becomes impermeable to water? What erosion pattern can be expected? Where will standing water reach after a 100-year rain or a 50-year rain?
When the terrain mesh is edited, significant recomputation is needed. Providing a more or less real-time update, at a worst case of 2 Hz, is a market-leading "wow" factor in this field.
More generally though, I use entities for scene graph elements. Core Data is good at expressing relationships in a graph. Some scene elements contain just a single 3D vertex. Other elements contain complex meshes or models imported from other tools.
I use a single NSData instance to store an indexed collection of vertexes that may include multiple texture coordinates and substantial meta-data. The NSData is just an archive of the relevant Objective-C objects. I store it in Core Data as a big blob because there is a lot of data and it seldom changes.
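In rough outline, the round trip looks something like the sketch below. VertexCollection and the "vertexData" attribute name are just placeholders here, and NSKeyedArchiver is one obvious way to do the flattening:

#import <Cocoa/Cocoa.h>

// Placeholder for an NSCoding-compliant container; any archivable
// object graph can be flattened the same way.
@interface VertexCollection : NSObject <NSCoding>
// indexed vertexes, texture coordinates, and meta-data live here
@end

static void StoreVertexes(VertexCollection *collection,
                          NSManagedObject *sceneElement)
{
    // Flatten the object graph into one blob; Core Data treats the
    // result as an opaque binary attribute.
    NSData *blob =
        [NSKeyedArchiver archivedDataWithRootObject:collection];
    [sceneElement setValue:blob forKey:@"vertexData"];
}

static VertexCollection *LoadVertexes(NSManagedObject *sceneElement)
{
    // Unarchiving is the mirror image of archiving.
    NSData *blob = [sceneElement valueForKey:@"vertexData"];
    return [NSKeyedUnarchiver unarchiveObjectWithData:blob];
}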
Mesh-based scene graph elements store an NSData attribute for the mesh itself too. Again, the NSData is an archived Objective-C object (usually just one object...) that identifies points within the mesh by index into the single NSData instance which stores the indexed collection of vertexes. This works well because the NSData is essentially immutable. Mesh objects can also use point displacements that are defined in relation to other points. For example, when calculating erosion or soil removal, new points are calculated at progressively lower elevations; if soil collects, progressively higher elevations are generated.
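To make the indexing concrete, here is a rough sketch; the names are illustrative, and the real classes carry considerably more meta-data:

#import <Foundation/Foundation.h>

// The mesh stores indexes into the document-wide vertex collection
// rather than copying vertex data, so the big blob stays immutable.
@interface TerrainMesh : NSObject <NSCoding>
{
    unsigned *triangleIndexes; // 3 indexes per triangle
    unsigned  triangleCount;
}
@end

// A point defined in relation to another point: erosion lowers the
// derived elevation, deposition raises it.
typedef struct {
    unsigned baseIndex; // index of the reference vertex
    float    deltaZ;    // signed elevation change relative to it
} RelativePoint;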
There are obviously situations in which individual float values are stored as attributes. As I think I mentioned earlier, sometimes colors are stored as four separate float attributes that together compose an RGBA value, but I just don't store that many colors in any particular document. This may be atypical of data sets intended for use with OpenGL. My application calculates RGBA values on demand instead.
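For what it's worth, rebuilding an OpenGL color from four float attributes is trivial. A minimal sketch, with hypothetical attribute names:

#import <CoreData/CoreData.h>
#import <OpenGL/gl.h>

static void ApplyColor(NSManagedObject *colorEntity)
{
    GLfloat rgba[4];
    rgba[0] = [[colorEntity valueForKey:@"red"]   floatValue];
    rgba[1] = [[colorEntity valueForKey:@"green"] floatValue];
    rgba[2] = [[colorEntity valueForKey:@"blue"]  floatValue];
    rgba[3] = [[colorEntity valueForKey:@"alpha"] floatValue];
    glColor4fv(rgba); // fixed-function pipeline
}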
I don't think the granularity at which data is stored in Core Data will have as much impact as the number of entities, but that is just a hunch. It was convenient for me to archive individual objects or small collections of objects and store them as data. It was also convenient to represent the overall scene graph and the relationships between entities in Core Data, in part because the scene graph changes much more often than the data within the objects it contains. Different needs might drive a different solution.
Finally, objects like viewing volumes (frustums), pre-selected views (camera positions and attributes), time lapse sequence definitions, thumbnails, textures, QuartzComposer-based dynamic textures, etc. are stored in Core Data. This is very handy because if, for example, a scene graph element upon which a camera position depends is modified, Core Data keeps all of the relationships synchronized for me. Automatic undo/redo and revert provide more "wow" experiences for users who have never had those features before.
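One straightforward way to find out that a scene graph element changed, so that dependent objects can refresh, is the standard context change notification. A minimal sketch (the handler name and the managedObjectContext instance variable are my assumptions, in some controller's @implementation):

#import <Cocoa/Cocoa.h>

- (void)awakeFromNib
{
    [[NSNotificationCenter defaultCenter]
        addObserver:self
           selector:@selector(sceneGraphChanged:)
               name:NSManagedObjectContextObjectsDidChangeNotification
             object:managedObjectContext];
}

- (void)sceneGraphChanged:(NSNotification *)notification
{
    NSSet *updated =
        [[notification userInfo] objectForKey:NSUpdatedObjectsKey];
    // Walk 'updated' and invalidate camera framings, thumbnails,
    // and anything else that depends on the changed objects.
}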
As a more real-time example, time lapse sequences can be played at 30 Hz (frames per second) or faster because most of the data is pre-calculated. Most of the work for time lapse playback is just updating and swapping double-buffered vertex arrays and updating dynamic textures.
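The swap itself is simple. A rough sketch, in an NSOpenGLView subclass's @implementation, assuming drawBuffer and fillBuffer instance variables and a hypothetical fillNextTimeLapseFrame() function that writes the next pre-calculated frame:

#import <Cocoa/Cocoa.h>
#import <OpenGL/gl.h>

- (void)advanceTimeLapseFrame
{
    fillNextTimeLapseFrame(fillBuffer); // next frame's vertexes

    GLfloat *swap = drawBuffer;         // swap the two arrays
    drawBuffer = fillBuffer;
    fillBuffer = swap;

    // Point the fixed-function pipeline at the freshly filled array
    // and ask for a redraw; a timer drives this at 30 Hz or faster.
    glVertexPointer(3, GL_FLOAT, 0, drawBuffer);
    [self setNeedsDisplay:YES];
}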