Re: QT/OpenGL
- Subject: Re: QT/OpenGL
- From: Ken Tabb <email@hidden>
- Date: Wed, 29 Aug 2001 11:34:08 +0100
And so it was that Brent Gulanowski said on 26/8/01 6:25 am:
> [Newbie]
>
> My feeling after looking into the OpenGL part of this is that these media
> interfaces will always be a layer which operate procedurally, since the only
> way they can offer performance is to be essentially one-way data pipes: you
> provide a stream of data and they process it into images/sounds as quickly
> and/or as smoothly/efficiently as possible. Messaging and 2-way
> communication only slow down the media presentation. So I figure that you
> can model your data any way you want, but for presentation, convert it to
> the array or struct preferred by the API, fire off your pointer, and then go
> back to managing the model. OpenGL seems to me like a monolithic object
> which is best thought of as a bottomless well. You can only judge if it's
> working right by actually seeing what's on screen. Heck, just dropping
> OpenGL from fullscreen to windowed mode apparently kills performance, and is
> a prime example of the one-way street of OpenGL dataflow.
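
For what it's worth, that "convert it to the array the API wants and fire
off your pointer" step is pretty much literally what the GL 1.1 vertex-array
calls look like. A throwaway sketch (the function name and its arguments are
invented, purely for illustration):

    #include <OpenGL/gl.h>

    /* 'verts' is whatever flat float array you converted your model into:
       x,y,z triples, one per vertex. */
    static void drawModel(const GLfloat *verts, GLsizei vertexCount)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, verts);     /* hand GL the bare pointer */
        glDrawArrays(GL_TRIANGLES, 0, vertexCount);
        glDisableClientState(GL_VERTEX_ARRAY);
    }

You shovel data in, GL rasterizes it, and nothing much comes back out - which
is exactly the one-way pipe being described.
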
>
> Unfortunately this works only for games and other media where the user is
> primarily an observer operating a few limited controls. With a 3D modelling
> app, say, you need to layer on a custom view that is sensitive to the user's
> actual interactions with models, not just keyboard and mouse polling, so you
> need to reassociate the 2D transform with the original 3D model: not easy to
> do without repeating a lot of the work already done to get the 2D
> rasterization made in the first place.
>
> I'll be very happy if/when someone (might have to be me in a couple years)
> writes a QD3D/Quesa-like library for Cocoa. But the 3D feedback will still
> be stupid and one-way, I bet, because by the time the 3D models are
> rasterized, the graphics card can't associate them with the mouse (or
> whatever) any more. I'm wondering what the solution will be to this one-way
> interactivity problem... I'll be even more happy when that comes around.
> Seemingly hardware acceleration does not mix well with OOP. Aren't OpenGL,
> Quicktime, MIDI etc just hardware abstractions?
>
> For now, I'm guessing that the trick is to minimize the number of objects in
> an app which are used to communicate with the media APIs -- perhaps even
> using a single class with a unique instance for each media type:
> MyOpenGLController, MyQuicktimeController, MyMIDIController (?), or
> what-have-you, perhaps sharing a protocol to do whatever they would all have
> to be able to do for each API, to let me query the presence/version and
> installed capabilities of the pipeline before starting to feed it a steady
> diet of un-encapsulated data.
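
I read that as something like the following (the protocol and method names
are invented, just to make the idea concrete - one shared controller object
per media API, all answering the same "is the pipeline there and what can it
do" questions before the app starts feeding it raw data):

    #import <Foundation/Foundation.h>

    @protocol MediaPipelineController
    + (id)sharedController;                  /* the one instance per media type */
    - (BOOL)pipelineIsAvailable;             /* e.g. is QuickTime installed at all? */
    - (NSString *)pipelineVersion;           /* e.g. QT version, GL renderer string */
    - (NSDictionary *)installedCapabilities; /* codecs, texture limits, whatever applies */
    @end

    @interface MyQuicktimeController : NSObject <MediaPipelineController>
    @end

    @interface MyOpenGLController : NSObject <MediaPipelineController>
    @end
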
>
> Then again, I'm sure there are more gnarly issues from the viewpoint of the
> experienced Cocoa developers.
Brent,
I totally agree with what you're saying here. Perhaps I didn't make my
original point clearly enough; it's not the non-object-orientedness of
the QT/OpenGL APIs which grates on me when using them in Cocoa, it's the
fact that they (especially QT) rely so heavily on the Carbon API. For
instance, QuickTime uses GWorlds exclusively to do its drawing. While
this is fine in itself, it makes things a PITA for Cocoa programmers
wishing to do anything other than play simple movies in their app - in
other words, for programmers who want to actually use QuickTime as a
means of *generation*, rather than just using it to show pre-rendered
(no doubt in Carbon/Windows!) movies.
Unless it was fixed in 10.0.3 or later (I gave up trying at that point),
the NSMovieController is a great example of how much of a PITA this
Carbon-dependency is... even Apple couldn't (maybe they still can't?) get
their Movie Controller displaying properly in a Cocoa app without it
vanishing in parts, forgetting to move the time-indicator-slider-thing or
having random noise drawn in its screen space etc. No offence to the
Apple Cocoa folks who at least tried to do this... it was IMHO the fact
that QuickTime is Carbon-based which made it screw up all the time.
The fact QT/OpenGL are not object-oriented is less of a pain... for one,
I doubt Apple see it as their task (and rightly so) to make the
cross-platform and independent OpenGL API object-oriented. But that's not
to say they shouldn't spend some time sprucing up their own QuickTime API
for Cocoa-compatibility. Or at the very least (which would result in the
same performance knock I'm experiencing by doing it myself) they could
provide a set of utility functions for (for example) getting a movie
frame as an NSImage. I've been lucky enough to bump into Vince deMarco,
who helped me write code to do this, but the fact that this sort of
functionality isn't documented just shows how difficult it is to view
QuickTime as anything like a first-class citizen from Cocoa at the moment.
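
For anyone else fighting the same battle, the general shape of it is to point
the movie at an offscreen GWorld, let it draw the current frame, then copy
the pixels across into an NSBitmapImageRep. What follows is a rough,
from-memory sketch rather than our actual code (error handling omitted, names
my own, and the ARGB-to-RGBA shuffle is the ugly bit):

    #import <Cocoa/Cocoa.h>
    #import <QuickTime/QuickTime.h>

    /* Assumes EnterMovies() has been called and 'movie' is already open. */
    - (NSImage *)imageFromCurrentFrameOfMovie:(Movie)movie
    {
        Rect box;
        GWorldPtr gworld = NULL;
        PixMapHandle pixMap;
        NSBitmapImageRep *rep;
        NSImage *image;
        unsigned char *src, *dst;
        long srcRowBytes;
        int width, height, x, y;

        GetMovieBox(movie, &box);
        OffsetRect(&box, -box.left, -box.top);
        width  = box.right;
        height = box.bottom;

        /* Give the movie an offscreen GWorld to draw into, and make it draw. */
        if (QTNewGWorld(&gworld, k32ARGBPixelFormat, &box, NULL, NULL, 0) != noErr)
            return nil;
        SetMovieGWorld(movie, (CGrafPtr)gworld, NULL);
        SetMovieBox(movie, &box);
        UpdateMovie(movie);
        MoviesTask(movie, 0);

        pixMap = GetGWorldPixMap(gworld);
        LockPixels(pixMap);
        src = (unsigned char *)GetPixBaseAddr(pixMap);
        srcRowBytes = GetPixRowBytes(pixMap);

        rep = [[NSBitmapImageRep alloc]
                  initWithBitmapDataPlanes:NULL pixelsWide:width pixelsHigh:height
                  bitsPerSample:8 samplesPerPixel:4 hasAlpha:YES isPlanar:NO
                  colorSpaceName:NSDeviceRGBColorSpace
                  bytesPerRow:4 * width bitsPerPixel:32];
        dst = [rep bitmapData];

        /* GWorld pixels are ARGB, the rep wants RGBA, so shuffle as we copy. */
        for (y = 0; y < height; y++) {
            unsigned char *s = src + y * srcRowBytes;
            unsigned char *d = dst + y * 4 * width;
            for (x = 0; x < width; x++, s += 4, d += 4) {
                d[0] = s[1];  d[1] = s[2];  d[2] = s[3];  d[3] = s[0];
            }
        }

        UnlockPixels(pixMap);
        DisposeGWorld(gworld);

        image = [[[NSImage alloc] initWithSize:NSMakeSize(width, height)] autorelease];
        [image addRepresentation:[rep autorelease]];
        return image;
    }

None of that is hard once you know it, but you only find out by bumping into
the right person - which is rather my point.
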
It seems to me Apple should offer the following advice to new programmers:
* programmers wishing to use QuickTime should use Carbon
* programmers wishing to use WebObjects should use Cocoa (and Java)
* programmers wishing to mix QuickTime and WebObjects technologies should
think again
... unless they take seriously that multi-layered diagram Steve Jobs uses
(with Darwin at the bottom, Aqua at the top, and Quartz / QuickTime /
OpenGL / Carbon / Cocoa / Java in the middle layers), in which all the
layers talk together happily.
Just a thought.
Ken
----------------------------------------------
Ken Tabb
Mac & UNIX Propellerhead & Network Bloke (Health & Human Sciences)
Computer Vision / Neural Network researcher (Computer Science)
University of Hertfordshire
e-mail: email@hidden
http://www.health.herts.ac.uk/ken/
Certified non-Microsoft Solution Provider