Re: highlight a nsview on click-drag?
- Subject: Re: highlight a nsview on click-drag?
- From: Graham Cox <email@hidden>
- Date: Wed, 18 Mar 2015 08:54:59 +1100
> On 17 Mar 2015, at 10:28 pm, Sandor Szatmari <email@hidden> wrote:
>
> I like the simplicity of your design, but considering point b, how does a playhead that spans multiple tracks work in a multi-track editing interface? Presumably we're talking about each view managing a single channel of audio. If there were many channels, would managing the playhead in a separate, superimposed view (or layer) be easier, or would you expect the playheads from each of the audio channel views to be easily synchronised without any tear or drift?
Well, the original discussion didn't mention multiple channels, but if a single, synchronised playhead were needed, I'd probably do it with a stack of views, one per channel, with the playhead hosted by a single container view that holds each channel as a subview and draws the playhead on top of the stack. However, synchronising a playhead across multiple views is also feasible without tearing - remember that all views are flushed to the screen at once at the end of each pass through the event loop, so provided the playhead position is updated for every view at the same time, you'd never see them out of step with each other.
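Here's a rough sketch of that second approach (names hypothetical throughout - ChannelView, TrackStackController and playheadPosition aren't from any real project). Because AppKit flushes every dirty view to the window in the same event-loop pass, setting the position on all the views from one place is enough to keep them in step:

#import <Cocoa/Cocoa.h>

@interface ChannelView : NSView
@property (nonatomic) CGFloat playheadPosition; // x position, view coords
@end

@implementation ChannelView

- (void)setPlayheadPosition:(CGFloat)position
{
    _playheadPosition = position;
    [self setNeedsDisplay:YES]; // redrawn before the next screen flush
}

- (void)drawRect:(NSRect)dirtyRect
{
    // ... draw this channel's waveform here ...
    [[NSColor redColor] setStroke];
    [NSBezierPath strokeLineFromPoint:NSMakePoint(self.playheadPosition, NSMinY(self.bounds))
                              toPoint:NSMakePoint(self.playheadPosition, NSMaxY(self.bounds))];
}

@end

@interface TrackStackController : NSObject
@property (nonatomic, strong) NSArray *channelViews; // of ChannelView
@end

@implementation TrackStackController

// Called from a timer or playback callback on the main thread. All the
// views are marked dirty within one event-loop pass, so their playheads
// always reach the screen together - no tearing between channels.
- (void)advancePlayheadTo:(CGFloat)position
{
    for (ChannelView *view in self.channelViews)
        view.playheadPosition = position;
}

@end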
In a trace analyser I've recently completed for a digital logic simulator, which has many of the same requirements, I use a stack of views, with all the common elements drawn by the stack itself. Interestingly, my first straightforward design used high-level objects such as NSBezierPath to draw the "traces", but it was excruciatingly slow once things got busy (i.e. beyond a flat line). In the end, by writing a custom object that draws the actual traces directly with Core Graphics, and by putting a lot of effort into ensuring that the bare minimum is drawn when needed, performance is now excellent - it no longer slows down the simulation, which has to wait for the traces to update before it can proceed to the next simulation cycle. Further gains could probably be had by backing the traces with CALayers, but as it stands that probably isn't needed.
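To give a flavour of the minimal-drawing idea (a sketch only - the flat array of segment endpoints is a hypothetical storage scheme, not my actual one): cull everything outside the dirty rect yourself, then stroke what's left in a single Core Graphics call, with no path objects at all:

#import <Cocoa/Cocoa.h>
#include <stdlib.h>

@interface TraceView : NSView
{
    CGPoint *_segments;      // endpoint pairs: [p0,p1, p2,p3, ...]
    size_t   _segmentCount;  // number of segments (half the point count)
}
@end

@implementation TraceView

- (BOOL)isOpaque { return NO; } // the stack view draws the grid behind us

- (void)drawRect:(NSRect)dirtyRect
{
    // -CGContext requires 10.10; use -graphicsPort on earlier systems.
    CGContextRef ctx = [[NSGraphicsContext currentContext] CGContext];
    CGFloat minX = NSMinX(dirtyRect), maxX = NSMaxX(dirtyRect);

    // Gather only the segments that can intersect the dirty rect. Doing
    // the cull here, rather than handing everything to a path object,
    // is where most of the win over NSBezierPath comes from.
    CGPoint *visible = malloc(sizeof(CGPoint) * _segmentCount * 2);
    size_t n = 0;
    for (size_t i = 0; i < _segmentCount; i++) {
        CGPoint a = _segments[2 * i], b = _segments[2 * i + 1];
        if (MAX(a.x, b.x) >= minX && MIN(a.x, b.x) <= maxX) {
            visible[n++] = a;
            visible[n++] = b;
        }
    }

    CGContextSetStrokeColorWithColor(ctx, [[NSColor greenColor] CGColor]);
    CGContextSetLineWidth(ctx, 1.0);
    CGContextStrokeLineSegments(ctx, visible, n); // one call for everything
    free(visible);
}

@end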
In my design, the "stack" view draws a timebase grid behind all of the other views (which are all transparent), and also event markers - simple vertical lines associated with ruler markers. Each trace view is a subview that renders its traces using CG functions (mostly CGContextStrokeLineSegments). Screenshot here: http://apptree.net/images/LB_trace_analyser.png
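The container could look something like this (gridSpacing and eventMarkerXs are hypothetical properties, just to show the shape of it). A superview's -drawRect: runs before its subviews draw, so the grid naturally ends up behind the transparent trace views:

#import <Cocoa/Cocoa.h>

@interface TraceStackView : NSView
@property (nonatomic) CGFloat gridSpacing;            // points per division
@property (nonatomic, strong) NSArray *eventMarkerXs; // NSNumbers, x coords
@end

@implementation TraceStackView

- (BOOL)isOpaque { return YES; } // we fill the whole background

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor blackColor] setFill];
    NSRectFill(dirtyRect);

    // Timebase grid: vertical lines at regular intervals, clipped to the
    // dirty rect. The trace subviews draw on top of this.
    if (self.gridSpacing > 0) {
        [[NSColor darkGrayColor] setStroke];
        CGFloat x = floor(NSMinX(dirtyRect) / self.gridSpacing) * self.gridSpacing;
        for (; x <= NSMaxX(dirtyRect); x += self.gridSpacing)
            [NSBezierPath strokeLineFromPoint:NSMakePoint(x, NSMinY(self.bounds))
                                      toPoint:NSMakePoint(x, NSMaxY(self.bounds))];
    }

    // Event markers: simple vertical lines tied to the ruler markers.
    [[NSColor yellowColor] setStroke];
    for (NSNumber *marker in self.eventMarkerXs) {
        CGFloat mx = marker.doubleValue;
        if (mx >= NSMinX(dirtyRect) && mx <= NSMaxX(dirtyRect))
            [NSBezierPath strokeLineFromPoint:NSMakePoint(mx, NSMinY(self.bounds))
                                      toPoint:NSMakePoint(mx, NSMaxY(self.bounds))];
    }
}

@end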
In some ways an audio waveform view would be simpler, because it doesn't have to be updated in real time - once the audio is loaded, the waveform can be rendered once to an image and then blitted. If the timebase is changeable by the user, the image would have to be regenerated to show more or less detail, but general repaints and motion of the playhead don't require many tricks beyond figuring out which part of the image to copy.
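A minimal sketch of that caching scheme (the rendering step is left as a comment): render the waveform into an NSImage the same size as the view, regenerate it only when the audio or the timebase changes, and let -drawRect: just copy the dirty portion across:

#import <Cocoa/Cocoa.h>

@interface WaveformView : NSView
@property (nonatomic, strong) NSImage *waveformImage; // cached rendering
@end

@implementation WaveformView

- (void)drawRect:(NSRect)dirtyRect
{
    // The image is the same size as the view, so source and destination
    // rects match and this is a straight blit with no scaling.
    [self.waveformImage drawInRect:dirtyRect
                          fromRect:dirtyRect
                         operation:NSCompositeSourceOver
                          fraction:1.0];
}

// Call when the audio data changes, or when the user alters the timebase
// and the waveform needs re-rendering at a different level of detail.
- (void)rebuildWaveformImage
{
    NSImage *image = [[NSImage alloc] initWithSize:self.bounds.size];
    [image lockFocus];
    // ... render the full waveform here, e.g. with
    //     CGContextStrokeLineSegments as in the trace view above ...
    [image unlockFocus];
    self.waveformImage = image;
    [self setNeedsDisplay:YES];
}

@end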
--Graham