Question about CoreAudio and threading
- Subject: Question about CoreAudio and threading
- From: Per Bull Holmen <email@hidden>
- Date: Sun, 08 May 2011 12:27:56 -0700 (PDT)
Hi

I have made a standalone Cocoa application that does some audio analysis and displays it in a Cocoa view. It doesn't actually change the audio; it just analyzes and displays it. I would like to turn it into an AudioUnit with a Cocoa view, so I can plug it into any host application. I have read Apple's Audio Unit Programming Guide and read through the source code of the SonogramViewDemo sample, which does approximately what I want to do. I don't know too much about Core Audio, and I'm not so good at C++ (I'm an Objective-C man, but I can cope), so keep that in mind.
Anyway, the AU Programming Guide is a little light on how Core Audio handles threading. From what I understand, Core Audio rendering happens on a separate thread from the UI. So, when overriding AUEffectBase::GetProperty, must I assume that it might be called concurrently with AUEffectBase::Render or AUEffectBase::ProcessBufferLists?
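To make the scenario I'm worried about concrete, here is a standalone sketch using only plain std::thread and std::mutex (none of these names come from the AU SDK; FakeAU, Render, and GetProperty are just stand-ins for the real calls) of a render callback and a property getter hitting the same object from two threads, with a lock making the overlap safe:

```cpp
#include <mutex>
#include <thread>
#include <vector>

// Stand-in for an AUEffectBase subclass (names are mine, not the SDK's):
// Render() is called on the audio thread, GetProperty() on the UI thread,
// and the two calls may overlap in time.
struct FakeAU {
    std::mutex m;
    std::vector<float> fft = std::vector<float>(8, 0.0f);

    void Render(float v) {                    // audio thread
        std::lock_guard<std::mutex> lock(m);
        for (float& bin : fft) bin = v;       // write a whole "FFT frame"
    }
    std::vector<float> GetProperty() {        // UI thread
        std::lock_guard<std::mutex> lock(m);  // copy under the same lock
        return fft;
    }
};

// Hammer the object from two threads; returns true if every snapshot the
// "UI" thread took was internally consistent (no torn frames).
inline bool RunConcurrent(FakeAU& au) {
    bool consistent = true;
    std::thread audio([&] {
        for (int i = 1; i <= 2000; ++i) au.Render(static_cast<float>(i));
    });
    std::thread ui([&] {
        for (int i = 0; i < 2000; ++i) {
            std::vector<float> snap = au.GetProperty();
            for (float bin : snap)
                if (bin != snap.front()) consistent = false;
        }
    });
    audio.join();
    ui.join();
    return consistent;
}
```

Without the mutex, the copy in GetProperty could observe a half-written frame; my question is whether the real AU call pattern actually allows that overlap.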
The reason I'm asking is that SonogramViewDemo sends the FFT data produced by the audio unit to the view via a property called kAudioUnitProperty_SonogramOverview. Whenever the view needs to update, it calls AudioUnitGetProperty (from the UI thread), which in turn calls SonogramViewDemo::GetProperty, which copies the FFT data to a buffer owned by the view (SonogramViewDemo is a subclass of AUEffectBase). If GetProperty can be called concurrently with the audio rendering code, it would be a concern if the AU were producing new FFT data while GetProperty is copying from that same FFT buffer to the view's buffer.
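If it turns out the calls can overlap, one pattern I'm considering is a try-lock on the render side, so the audio thread never blocks: if the UI is mid-copy, the render thread simply skips publishing that frame. A standalone sketch under that assumption (plain C++; FFTHandoff and its method names are hypothetical, not AU SDK API):

```cpp
#include <cstddef>
#include <mutex>
#include <vector>

// Hypothetical handoff buffer between the render code (writer) and a
// GetProperty override (reader). The writer never blocks: if the reader
// holds the lock, this frame's FFT snapshot is simply dropped.
class FFTHandoff {
public:
    explicit FFTHandoff(std::size_t bins) : shared_(bins, 0.0f) {}

    // Render/audio thread: publish a finished FFT frame if possible.
    bool TryPublish(const std::vector<float>& frame) {
        std::unique_lock<std::mutex> lock(m_, std::try_to_lock);
        if (!lock.owns_lock())
            return false;          // UI is copying; drop this frame
        shared_ = frame;           // note: assignment may allocate; a real
                                   // AU would preallocate and copy in place
        return true;
    }

    // UI thread, e.g. from inside GetProperty: copy the latest frame.
    std::vector<float> Snapshot() {
        std::lock_guard<std::mutex> lock(m_);
        return shared_;
    }

private:
    std::mutex m_;
    std::vector<float> shared_;
};
```

Even a try-lock isn't fully real-time clean (the brief lock hold during assignment can still collide with a page fault or allocation), which is why I've also seen lock-free single-producer/single-consumer FIFOs recommended for audio-to-UI handoff; I'd welcome advice on which approach the SDK expects.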
Per
Coreaudio-api mailing list (email@hidden)