Re: Best drawing technology for audio waveforms, envelopes, etc.?
- Subject: Re: Best drawing technology for audio waveforms, envelopes, etc.?
- From: Alastair Houghton <email@hidden>
- Date: Sat, 23 Jun 2007 23:31:26 +0100
On 23 Jun 2007, at 20:35, Hans Kuder wrote:
> I'm just starting a project along the lines of an audio editor that
> will be displaying lots of audio waveforms, user-drawn envelopes,
> frequency spectra, and so on. The central thrust of the project
> involves the concept of "overlays" - both visual and auditory - on
> sections of the sound file. So my waveform displays have to show
> semi-transparent layers and so on.
>
> My question is - what would be the best technology to accomplish
> this? Quartz 2D, an NSView subclass, or something else?
>
> I am familiar with subclassing NSViews for drawing but I find them
> pretty inflexible for transparency layers, adding and removing
> layers, and so on.
My guess from your remarks is that you were trying to make each layer
into a separate NSView, but I don't think that's a very good idea,
not least because overlapping views are not officially supported.
My personal take on this is that you probably want to make your
display into a *single* NSView, and then make your own objects
representing the layers. Then in your -drawRect: you might do
something like this:
- (void)drawRect:(NSRect)rect
{
    /* Ask each layer object in the layers array to draw itself, in order */
    [layers makeObjectsPerformSelector:@selector(draw)];
}
Each of your layer objects (in your "layers" NSArray, which should be
a member variable) can then implement a -draw method to render
whatever data it pleases. If a layer is particularly expensive to
render, you could make it draw its content to an image or a CGLayer,
then have the -draw method just render that.
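To make that concrete, here's a rough sketch of what one of those layer
objects might look like (the class name, the cached NSImage and the
fixed 0.5 alpha are all just made up for illustration):

#import <Cocoa/Cocoa.h>

@interface WaveformLayer : NSObject
{
    NSImage *cachedImage;   /* expensive content, rendered once and reused */
    NSRect   bounds;        /* the area of the view this layer occupies */
}
- (void)draw;
- (void)invalidateCache;
@end

@implementation WaveformLayer

- (void)draw
{
    if (!cachedImage) {
        /* Render the expensive content into an offscreen image once */
        cachedImage = [[NSImage alloc] initWithSize:bounds.size];
        [cachedImage lockFocus];
        /* ... draw the waveform/envelope/spectrum here ... */
        [cachedImage unlockFocus];
    }

    /* Composite the cached image into the view, semi-transparently */
    [cachedImage drawInRect:bounds
                   fromRect:NSZeroRect
                  operation:NSCompositeSourceOver
                   fraction:0.5];
}

- (void)invalidateCache
{
    [cachedImage release];
    cachedImage = nil;
}

@end

Whenever the underlying audio changes, you'd call -invalidateCache on
the affected layer and -setNeedsDisplay: on the view; the -drawRect:
above does the rest.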
You might also consider making your layer objects subclasses of
NSResponder, so that they can handle events. If you do that, you can
make your layer objects part of the responder chain, which can be
quite useful for handling commands and events.
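Again, purely as a sketch (the class names and the way the chain is
wired below are just one possibility): an event-handling layer derived
from NSResponder, plus view code that stacks the layers and gives the
frontmost one first refusal on events, might look something like this:

#import <Cocoa/Cocoa.h>

@interface EnvelopeLayer : NSResponder
- (void)draw;
@end

@implementation EnvelopeLayer

- (void)draw
{
    /* ... render this layer's content ... */
}

- (void)mouseDown:(NSEvent *)theEvent
{
    /* Handle clicks that belong to this layer here; calling super lets
       NSResponder pass anything we don't handle to the next responder. */
    [super mouseDown:theEvent];
}

@end

@interface WaveformView : NSView
{
    NSMutableArray *layers;   /* back to front; frontmost is lastObject */
}
- (void)addLayer:(EnvelopeLayer *)layer;
@end

@implementation WaveformView

- (void)addLayer:(EnvelopeLayer *)layer
{
    if (!layers)
        layers = [[NSMutableArray alloc] init];

    /* The new layer goes on top; the previous top layer becomes its next
       responder, so unhandled events trickle down through the stack. */
    [layer setNextResponder:[layers lastObject]];
    [layers addObject:layer];
    [self setNeedsDisplay:YES];
}

- (void)drawRect:(NSRect)rect
{
    [layers makeObjectsPerformSelector:@selector(draw)];
}

- (void)mouseDown:(NSEvent *)theEvent
{
    /* Give the frontmost layer first refusal on the event */
    if ([layers count] > 0)
        [[layers lastObject] mouseDown:theEvent];
    else
        [super mouseDown:theEvent];
}

@end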
Other options you might consider are Core Animation (for Leopard) or
OpenGL (which will work on any system), though in both cases you're
going to have to be a little careful: audio waveforms, when fully
zoomed in, can be extremely wide, and I believe there are size limits
on layers and textures, so you might have to resort to some trickery
there (tiling, for instance) if you use either of these options.
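To give a sense of scale (the 2048-pixel tile width below is just an
assumed figure; real limits vary by GPU and OS release):

#include <math.h>

/* How many fixed-width tiles does it take to cover a strip of the given
   total width?  Back-of-envelope only. */
static unsigned TileCountForWidth(double pixelWidth, double maxTileWidth)
{
    return (unsigned)ceil(pixelWidth / maxTileWidth);
}

/* A three-minute, 44.1 kHz file drawn at one sample per pixel is
   3 * 60 * 44100 = 7,938,000 pixels wide, so with 2048-pixel tiles
   TileCountForWidth(7938000.0, 2048.0) comes out at 3876 tiles. */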
BTW, please remember not to make the classic mistake when drawing
audio waveforms, which is to assume that you can zoom out by simply
drawing every Nth point. To zoom out without drawing every point, at
a minimum you must find the minimum and maximum sample values at each
horizontal co-ordinate. I'm sure you wouldn't make this mistake, but
it's so common that I thought I'd mention it here.
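A rough sketch of that reduction (the float sample format, the function
name and the simple bucketing are just assumptions for the example; it
also assumes at least one sample per pixel):

#include <float.h>
#include <stddef.h>

/* For each horizontal pixel, find the minimum and maximum sample over
   the range of samples that pixel covers; the view then draws a vertical
   line from outMin[x] to outMax[x].  Assumes sampleCount >= pixelWidth. */
static void
ReduceMinMax(const float *samples, size_t sampleCount, size_t pixelWidth,
             float *outMin, float *outMax)
{
    for (size_t x = 0; x < pixelWidth; ++x) {
        size_t start = x * sampleCount / pixelWidth;
        size_t end   = (x + 1) * sampleCount / pixelWidth;
        float lo = FLT_MAX, hi = -FLT_MAX;

        for (size_t i = start; i < end; ++i) {
            if (samples[i] < lo) lo = samples[i];
            if (samples[i] > hi) hi = samples[i];
        }

        outMin[x] = lo;
        outMax[x] = hi;
    }
}

Drawing one min-to-max line per pixel column keeps the peaks visible
however far out you zoom, which drawing every Nth sample does not.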
Kind regards,
Alastair.
--
http://alastairs-place.net