Re: Real-time buffer playback in Swift?


  • Subject: Re: Real-time buffer playback in Swift?
  • From: Charles Constant <email@hidden>
  • Date: Sat, 27 Sep 2014 15:36:41 -0700

Thank you all for the advice, though I admit I don't understand much of it.

I thought I'd update the list as a courtesy, and also to summarize my own thoughts:

The short story is that I have decided to move back to C-based callbacks, using the old API, for anything that is time sensitive. 

It was actually even easier than I had expected to write my class using the new API in Swift (it took an afternoon). This is how I proceeded (a rough sketch in code follows the steps below):

1) Create and configure (attachNode, connect) a simple graph of AVAudioEngine, AVAudioPlayerNode, and AVAudioMixerNode, and start the player node with .play().

2) Set up a block with dispatch_async so that we don't block the main thread.

3) Start iterating over my audio buffers (floats), repeatedly creating an AVAudioPCMBuffer, copying samples into it via buffer.floatChannelData.memory, and scheduling it to play with player.scheduleBuffer (atTime: nil).
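
In case it is useful, here is roughly what those three steps look like. This is a sketch, not my exact code: it uses the current API spellings (attach(_:), DispatchQueue) rather than the attachNode/dispatch_async spellings above, it assumes mono float samples, and the chunk size and names are illustrative.

import AVFoundation
import Foundation

// Rough sketch of steps 1-3. The caller must keep the returned engine alive
// for as long as playback should continue (e.g. store it in a property).
func playSamples(_ samples: [Float]) throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)!

    // 1) Build and start a simple graph: player -> main mixer -> output.
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: format)
    try engine.start()
    player.play()

    // 2) Keep the copy/schedule work off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        // 3) Chop the floats into AVAudioPCMBuffers and schedule each one
        //    right away (at: nil means "play as soon as possible").
        let chunk = 4096
        var index = 0
        while index < samples.count {
            let frames = min(chunk, samples.count - index)
            guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                                frameCapacity: AVAudioFrameCount(chunk)) else { break }
            buffer.frameLength = AVAudioFrameCount(frames)
            let channel = buffer.floatChannelData![0]   // mono, so channel 0 only
            for i in 0..<frames { channel[i] = samples[index + i] }
            player.scheduleBuffer(buffer, at: nil, options: [], completionHandler: nil)
            index += frames
        }
    }
    return engine
}

(In my real code the samples come from elsewhere, but the shape of the loop is the same.)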

The result was initially encouraging: playback started reasonably quickly and was not choppy.

So why am I abandoning this? 

Because, as I am guessing some of you were trying to warn me, I can't think of a simple way to control the exact time at which playback starts. I obviously can't do this work on the main thread, so I'm stuck with dispatch_async, and the system may not execute my blocks as quickly as I would like.

While the lag I see before playback begins may be acceptable for the task at hand (< 1 second), I don't want to continue down a road where the lag changes outside my control. The only information I see on developer.apple.com about real-time scheduling is in their Kernel Programming Guide. Assuming anything in that guide even works with the current version of Swift, it strikes me as obvious that it will be easier to give up, use C, and let the system handle the scheduling for me via callbacks.
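
To make that concrete, this tiny illustrative snippet (not from my project) shows the issue: the delay between asking GCD to run a block and the block actually running is whatever the system decides, and player.scheduleBuffer can only be called once the block is running.

import Foundation

// Illustrative only: measure how long the system takes to start an async block.
// Anything scheduled from inside the block inherits this delay, and GCD makes
// no real-time promises about how large it is.
let enqueued = DispatchTime.now()
DispatchQueue.global(qos: .userInitiated).async {
    let lagMs = Double(DispatchTime.now().uptimeNanoseconds - enqueued.uptimeNanoseconds) / 1_000_000
    print(String(format: "block started %.2f ms after dispatch", lagMs))
}
Thread.sleep(forTimeInterval: 0.5)   // keep this sample program alive long enough to see the print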

Maybe this will be useful for someone else. I'm eager to hear further comments if anyone has any.

On Wed, Sep 17, 2014 at 1:03 PM, Paul Davis <email@hidden> wrote:


On Wed, Sep 17, 2014 at 1:09 PM, Zack Morris <email@hidden> wrote:
  Like context switches still take on the order of milliseconds.

12 usecs or less, actually, on a modern Linux kernel on not-so-fast hardware. But that doesn't include the effect of the TLB flush, which is variable, as I mentioned.
 

  I think on some level that the way computers perform context switching is wrong.  It probably won’t be fixed in typical UNIX kernels anytime soon, so I’m wondering if there’s a way to sidestep the kernel and somehow queue everything up to iron out the kinks of sleeping for so long.  I realize it probably isn’t possible when we need < 30 millisecond latencies for realtime audio processing.  But if CoreAudio can do it, then why have the separation between it and normal kernel processes at all?  I think it’s an honest question.

this is old research (like 1990s stuff). the answer is to use a single 64-bit address space and hardware that can mark page ranges as protected. the idea is to stop using address spaces as a means of providing protection from cross-process memory access. Look up (for example) the Opal operating system "concept".

when you do this, the whole notion of address spaces goes away (a pointer with address 0xfeedface can be passed around between processes, but they may not all be able to read/write it), which removes the need for VM hardware which speeds up context switching like crazy.

as i said, this is all old stuff that ran aground thanks to Microsoft and then Linux/Unix's utter domination of the OS marketplace. not much room left for deeply experimental kernels, and the ideas behind things like Opal can't easily be grafted onto the contemporary kernel of Windows, Darwin or Linux (or FreeBSD or whatever floats your boat)
 


References:
  • Re: Real-time buffer playback in Swift? (From: Ian Kemmish <email@hidden>)
  • Re: Real-time buffer playback in Swift? (From: Zack Morris <email@hidden>)
  • Re: Real-time buffer playback in Swift? (From: Paul Davis <email@hidden>)
  • Re: Real-time buffer playback in Swift? (From: Zack Morris <email@hidden>)
  • Re: Real-time buffer playback in Swift? (From: Paul Davis <email@hidden>)
