Re: Real-time buffer playback in Swift?
- Subject: Re: Real-time buffer playback in Swift?
- From: Charles Constant <email@hidden>
- Date: Tue, 16 Sep 2014 13:15:10 -0700
Re: "real time", "render callback"
It's very possible I'm misunderstanding the terminology here. What I need for my project isn't latency low enough for someone to monitor vocals live, etc.
All I need to do is mix some buffers together from memory on the fly, and update the position of the playhead. The WWDC video for Session 502 makes this seem possible (I could be wrong) by
1) repeatedly enqueuing buffer segments (presumably playback starts after you send it the first one)
and then
2) only updating some variable that represents the playhead frame (a rough sketch of both steps follows below).
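For what it's worth, here's a rough, untested sketch of what I think those two steps look like with AVAudioEngine and AVAudioPlayerNode. The helper names (enqueue, currentPlayheadFrame) and the stereo 44.1 kHz format are placeholders of mine, not anything straight out of the session:

import AVFoundation

// Engine graph: one player node feeding the main mixer.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

// Placeholder format; in practice you'd match whatever your in-memory buffers use.
guard let format = AVAudioFormat(standardFormatWithSampleRate: 44_100,
                                 channels: 2) else {
    fatalError("could not create format")
}

engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: format)

do {
    try engine.start()
} catch {
    fatalError("engine failed to start: \(error)")
}
player.play()

// Step 1: enqueue buffer segments on the fly. The completion handler runs
// once the player has consumed the buffer, which seems like the natural
// place to schedule the next segment.
func enqueue(_ segment: AVAudioPCMBuffer, thenScheduleNext next: @escaping () -> Void) {
    player.scheduleBuffer(segment, completionHandler: next)
}

// Step 2: read the playhead as a frame position in the player's own
// timeline, by converting the node's last render time.
func currentPlayheadFrame() -> AVAudioFramePosition? {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else {
        return nil
    }
    return playerTime.sampleTime
}

If "updating the playhead" means seeking, my guess is you'd stop the player and re-schedule from the new frame offset rather than just writing to a variable, but I haven't verified that.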
This is just my impression. It's my first day with this API, so I'm by no means positive I understand it, and I never did get callbacks working from Swift, so I don't claim that part should work.
You definitely seem more confident than I am here; I'd be interested to get your take on that session.