
Re: Real-time buffer playback in Swift?


  • Subject: Re: Real-time buffer playback in Swift?
  • From: Jamie Bullock <email@hidden>
  • Date: Tue, 16 Sep 2014 08:30:18 +0100

Hi Charles,

Take a look at AVAudioFile and AVAudioPCMBuffer as described in WWDC 2014 Session 501 https://developer.apple.com/videos/wwdc/2014/
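As a minimal sketch of what those two classes give you (written in current Swift syntax rather than the 2014 beta syntax; the file path is hypothetical):

```swift
import AVFoundation

// Read an entire audio file into an AVAudioPCMBuffer.
let url = URL(fileURLWithPath: "/path/to/audio.caf") // hypothetical path
let file = try AVAudioFile(forReading: url)

// processingFormat is the deinterleaved float format AVAudioEngine works in.
guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                    frameCapacity: AVAudioFrameCount(file.length)) else {
    fatalError("Could not allocate buffer")
}
try file.read(into: buffer)
// buffer.floatChannelData now points at the normalized float samples.
```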

The only way I have found to access the raw floating-point sample data is to wrap the channel pointers in UnsafeBufferPointer (called UnsafeArray in the early Swift betas), like so:

// Where buffer is an AVAudioPCMBuffer instance
let channels = UnsafeBufferPointer(start: buffer.floatChannelData,
                                   count: Int(buffer.format.channelCount))
let samples = UnsafeBufferPointer(start: channels[0],
                                  count: Int(buffer.frameLength)) // Just the zeroth channel

for i in 0..<Int(buffer.frameLength) {
    let sample = samples[i]
    // etc...
}

If anyone knows a better way I'd also be interested to know it.

Cheers,

Jamie


On 16 Sep 2014, at 07:11, Charles Constant <email@hidden> wrote:

Hello all,

I'm working on a digital audio editor in Swift. So far I handle file import/export using AudioToolbox. I only have makeshift playback currently, where I send the whole buffer contents at once (so there's a ridiculous amount of lag before playback starts). Now I'm at the point where I want to get proper playback sorted. As part of my research, I tried to port the examples in "Learning Core Audio" to Swift. Mostly this went okay; however, the most interesting examples for playback are currently broken on Apple's end*.

Can anyone describe to me the most appropriate way to do real-time playback of a buffer (floats normalized to ±1.0) using Swift? 

Something using callbacks would suit me well, as I'll need to update the playhead position in my GUI and I was planning on doing so by calling a Notification each time the system executes my callback. Also, I want to mix the channels together on my end, to ensure that what the user hears is precisely the same as what they get upon export to file. Is there some particular method in AVAudioEngine appropriate for this?
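One candidate for the above, sketched loosely (this assumes `buffer` is an AVAudioPCMBuffer already filled with the mixed-down samples; AVAudioPlayerNode's scheduleBuffer invokes a completion handler when the buffer has been consumed, which could drive the GUI notification):

```swift
import AVFoundation

// Hedged sketch: play a pre-mixed buffer through AVAudioEngine and get a
// callback on completion.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: buffer.format)
try engine.start()

player.scheduleBuffer(buffer) {
    // Called when the buffer has been played; e.g. post a Notification here
    // to advance the playhead in the GUI.
}
player.play()
```

Note the completion handler fires on an internal audio thread, so any GUI update would need to be dispatched back to the main thread.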

Thanks,

Charles

* These projects all rely either on incorrectly bridged structs or on C function pointers:
6) CARecorder
7) CAPlayer
8) CAConverter
12) SimpleSineWavePlayer
13) AUGraph Play Through
14) AUGraph Play Through II
17) iOS Backgrounding Tone
18) iOS AU Pass-Through
  • Follow-Ups:
    • Re: Real-time buffer playback in Swift?
      • From: Charles Constant <email@hidden>
  • References:
    • Real-time buffer playback in Swift?
      • From: Charles Constant <email@hidden>
