Hello all,
I'm working on a digital audio editor in Swift. So far I handle file import/export with AudioToolbox, but playback is makeshift: I send the entire buffer contents at once, so there's a ridiculous amount of lag before playback starts. Now I'm at the point where I want to get proper playback sorted. As part of my research, I tried porting the examples from "Learning Core Audio" to Swift. Mostly this went okay; however, the most interesting examples for playback are currently broken on Apple's end.*
Can anyone describe the most appropriate way to do real-time playback of a buffer (floats normalized to ±1.0) from Swift?

Something callback-based would suit me well, as I'll need to update the playhead position in my GUI, and I was planning to do that by posting a Notification each time the system invokes my callback. I also want to mix the channels together on my end, to ensure that what the user hears is precisely the same as what they get upon export to file. Is there some particular method in AVAudioEngine appropriate for this? Something like the sketch below is roughly what I have in mind.
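To make the question concrete, here is a rough sketch of the approach I'm imagining, built on AVAudioPlayerNode's scheduleBuffer(_:completionHandler:). The ChunkedPlayer class, the playheadNotification name, and the 4096-frame chunk size are all placeholders I've invented for illustration, not anything from the book. The idea: do my own mixdown into small mono chunks, schedule them on the player node, and post a Notification from each chunk's completion handler.

import AVFoundation

final class ChunkedPlayer {
    // Placeholder notification name; my GUI would observe this to move the playhead.
    static let playheadNotification = Notification.Name("PlayheadDidAdvance")

    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let format: AVAudioFormat
    private let chunkFrames = 4096  // arbitrary chunk size for this sketch

    init?(sampleRate: Double) {
        // Mono output: I do the mixdown myself (below) so playback matches
        // the mix my export path writes.
        guard let mono = AVAudioFormat(standardFormatWithSampleRate: sampleRate,
                                       channels: 1) else { return nil }
        format = mono
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
    }

    // `channels` is the edited audio: one [Float] (normalized to ±1.0) per channel.
    func play(channels: [[Float]]) throws {
        guard let totalFrames = channels.first?.count else { return }
        try engine.start()
        var start = 0
        while start < totalFrames {
            let count = min(chunkFrames, totalFrames - start)
            guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                                frameCapacity: AVAudioFrameCount(count))
            else { break }
            buffer.frameLength = AVAudioFrameCount(count)
            let out = buffer.floatChannelData![0]
            // Mix the channels on my side, the same way export does, so the
            // user hears exactly what gets written to file.
            let gain = 1.0 / Float(channels.count)
            for frame in 0..<count {
                var sum: Float = 0
                for channel in channels { sum += channel[start + frame] }
                out[frame] = sum * gain
            }
            let chunkEnd = start + count
            // The completion handler fires on an internal thread once the
            // chunk has been consumed; hop to main before touching the GUI.
            player.scheduleBuffer(buffer) {
                DispatchQueue.main.async {
                    NotificationCenter.default.post(
                        name: ChunkedPlayer.playheadNotification,
                        object: nil,
                        userInfo: ["frame": chunkEnd])
                }
            }
            start = chunkEnd
        }
        player.play()
    }
}

I realize the completion handler fires when a chunk has been consumed by the engine rather than exactly when it's heard, so the playhead would only be approximate; that's partly why I'm asking whether there's a better-suited hook in AVAudioEngine.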
Thanks,
Charles
* These projects all rely either on incorrectly bridged structs or on C function pointers:
6) CARecorder
7) CAPlayer
8) CAConverter
12) SimpleSineWavePlayer
13) AUGraph Play Through
14) AUGraph Play Through II
17) iOS Backgrounding Tone
18) iOS AU Pass-Through