Re: Moving data around...
- Subject: Re: Moving data around...
- From: Michael Thornburgh <email@hidden>
- Date: Tue, 27 Aug 2002 14:49:37 -0700
hi mike.
assuming that it's not critical to keep and process every buffer
that's recorded (that is, you won't be writing the audio to disk
or doing analysis of anything more than "the most recent buffer"),
i would keep a buffer of your own, protected by a lock. your IOProc
locks the buffer, copies the audio data into it, increments a serial
number, and unlocks. your UI thread has an NSTimer or something
that wakes it up every buffer-size-in-seconds, locks the lock, and
checks the serial number to make sure it's different from last time.
if so, it copies the data to a work buffer, unlocks, sets its
private serial number to the one associated with the buffer it just
copied, then does the spectrum analysis, graph rendering and
screen updating. if the serial number was not updated, it just
unlocks and returns. time spent holding the lock in either thread
should be kept to a minimum.
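something like this, as a rough sketch (the names kNumFrames,
SpectrumController, MyIOProc and analyzeAndDrawSpectrum: are just
placeholders i'm making up here -- size the buffer and wire things up
however fits your app; the IOProc signature is the standard
AudioDeviceIOProc from <CoreAudio/AudioHardware.h>):

#import <Foundation/Foundation.h>
#import <CoreAudio/CoreAudio.h>

#define kNumFrames 512                 // assumed frames per device buffer

typedef struct {
    NSLock *lock;                      // create ([[NSLock alloc] init]) before
                                       // installing the IOProc with AudioDeviceAddIOProc()
    float   samples[kNumFrames];       // most recent buffer of mono input
    UInt32  serial;                    // bumped each time the IOProc writes
} SharedAudio;

static SharedAudio gShared;

// IOProc: copy the freshest input into the shared buffer under the lock.
// keep the time spent holding the lock as short as possible.
static OSStatus MyIOProc(AudioDeviceID inDevice,
                         const AudioTimeStamp *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp *inInputTime,
                         AudioBufferList *outOutputData,
                         const AudioTimeStamp *inOutputTime,
                         void *inClientData)
{
    const AudioBuffer *buf = &inInputData->mBuffers[0];
    UInt32 bytes = MIN(buf->mDataByteSize, (UInt32)sizeof(gShared.samples));

    [gShared.lock lock];
    memcpy(gShared.samples, buf->mData, bytes);
    gShared.serial++;                  // mark that new data has arrived
    [gShared.lock unlock];

    return kAudioHardwareNoError;
}

@interface SpectrumController : NSObject
{
    UInt32 lastSerial;                 // serial of the last buffer we consumed
    float  workBuffer[kNumFrames];     // private copy for analysis/rendering
}
- (void)pollAudio:(NSTimer *)timer;
- (void)analyzeAndDrawSpectrum:(float *)samples frameCount:(unsigned)count;
@end

@implementation SpectrumController

// installed with an NSTimer firing every buffer-size-in-seconds, e.g.
// [NSTimer scheduledTimerWithTimeInterval:... target:self
//     selector:@selector(pollAudio:) userInfo:nil repeats:YES]
- (void)pollAudio:(NSTimer *)timer
{
    [gShared.lock lock];
    if (gShared.serial == lastSerial) {    // nothing new since last time
        [gShared.lock unlock];
        return;
    }
    memcpy(workBuffer, gShared.samples, sizeof(workBuffer));
    lastSerial = gShared.serial;
    [gShared.lock unlock];

    // heavy lifting happens outside the lock
    [self analyzeAndDrawSpectrum:workBuffer frameCount:kNumFrames];
}

- (void)analyzeAndDrawSpectrum:(float *)samples frameCount:(unsigned)count
{
    // FFT, graph rendering and display updates go here
}

@end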
a buncha folks on this list don't like locking in the IOProc,
especially if the lock-mate is not in a real-time thread.
i don't think it should be an issue for your application,
though.
if you want to do your analysis, rendering, etc. in a separate
thread from the UI thread, i would use an NSConditionLock
instead of an NSLock and an NSTimer. the IOProc would
unconditionally lock the buffer, copy the data into it, then
unlockWithCondition:kHaveNewData. your analysis thread would
sit in a loop doing [lock lockWhenCondition:kHaveNewData], and
once it has copied the buffer to your work buffer, it would
unlockWithCondition:kOldStaleData.
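roughly like this (again, kNumFrames, the globals and the class name
are placeholders of mine; only the lock / unlockWithCondition: dance
is the part i'm describing above, and the thread would be spawned
with +[NSThread detachNewThreadSelector:toTarget:withObject:]):

#import <Foundation/Foundation.h>
#import <CoreAudio/CoreAudio.h>

#define kNumFrames 512

enum { kOldStaleData = 0, kHaveNewData = 1 };

// create with [[NSConditionLock alloc] initWithCondition:kOldStaleData]
static NSConditionLock *gCondLock;
static float gSamples[kNumFrames];

// IOProc side: lock unconditionally, copy, then signal "new data".
static OSStatus MyIOProc(AudioDeviceID inDevice,
                         const AudioTimeStamp *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp *inInputTime,
                         AudioBufferList *outOutputData,
                         const AudioTimeStamp *inOutputTime,
                         void *inClientData)
{
    const AudioBuffer *buf = &inInputData->mBuffers[0];
    UInt32 bytes = MIN(buf->mDataByteSize, (UInt32)sizeof(gSamples));

    [gCondLock lock];                                  // unconditional lock
    memcpy(gSamples, buf->mData, bytes);
    [gCondLock unlockWithCondition:kHaveNewData];      // wakes the worker thread
    return kAudioHardwareNoError;
}

@interface SpectrumWorker : NSObject
- (void)analysisThread:(id)unused;
@end

@implementation SpectrumWorker

- (void)analysisThread:(id)unused
{
    float work[kNumFrames];
    while (1) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        [gCondLock lockWhenCondition:kHaveNewData];    // sleeps until fresh data
        memcpy(work, gSamples, sizeof(work));
        [gCondLock unlockWithCondition:kOldStaleData]; // mark the buffer consumed

        // analysis and rendering happen outside the critical section;
        // hand results back to the UI however you prefer.

        [pool release];
    }
}

@end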
that's what i would do, anyway.
-mike
On Tuesday, August 27, 2002, at 01:29 PM, Michael Beam wrote:
Hello Everyone,
I'm building a little spectrum analyzer application that will show a
frequency spectrum in real-time, and I've reached somewhat of a
sticking point in the design of the application. My question has to do
with the best way of getting the input audio data out of the CoreAudio
IOProc callback and into the user interface. Is it best to "push" the
data from the callback to an interface controller object, or would it
be better to drop the data into some intermediary buffer (maybe a
pipe?) and have the interface controller object "pull" it in response
to some notification? I'm not too well versed in the intricacies of
real-time app design, so any resources and tips would be greatly
appreciated. Thanks in advance!
Regards,
Mike Beam
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.