I am in the process of converting an old project to the AVFoundation APIs. I have two goals:
1) Read the data into an Int16, interleaved-format buffer.
2) Manipulate the data in the buffer directly (not in real time).
In testing, I found that I am getting “error -10868”, but I haven’t been able to figure out what is wrong.
The buffer is created and no error is reported, yet I am unable to play the audio. How do I overcome this?
My second question is how to access the buffer directly, or more specifically the stereo channels. When I was using the Core Audio APIs (pre-Swift) I understood this, but now I’m lost. In Core Audio I previously used an interleaved buffer. Would someone be kind enough to show me how to access the channels in both the interleaved and non-interleaved cases?
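To make the question concrete, below is my rough guess at the access pattern, based on the int16ChannelData, stride, and interleaved properties I found in the headers. It is untested (and presumably wrong somewhere), and halving the volume just stands in for whatever manipulation I end up doing:

// A sketch of the access pattern I think I'm supposed to use --
// corrections welcome. Assumes a 2-channel Int16 PCM buffer.
func halveVolume(buffer: AVAudioPCMBuffer)
{
    let channelCount = Int(buffer.format.channelCount)
    let frameCount = Int(buffer.frameLength)
    let data = buffer.int16ChannelData  // only valid when the common format is Int16

    if buffer.format.interleaved
    {
        // Interleaved: data[0] points to one block of L R L R ... samples,
        // and buffer.stride is the number of interleaved channels (2 here).
        let samples = data[0]
        for frame in 0..<frameCount
        {
            for ch in 0..<channelCount
            {
                samples[frame * buffer.stride + ch] /= 2
            }
        }
    }
    else
    {
        // Deinterleaved: one pointer per channel, each channel's samples contiguous.
        for ch in 0..<channelCount
        {
            let samples = data[ch]
            for frame in 0..<frameCount
            {
                samples[frame] /= 2
            }
        }
    }
}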
My test input file is an Apple Lossless (ALAC) file, if that matters.
Thank you,
W.
/*
Test result
bufferExists = true
readErr = nil
audioFileBuffer.format = <AVAudioFormat 0x6080008968a0: 2 ch, 44100 Hz, Int16, inter>
File URL: Optional("file:///Users/wedwards/Desktop/tempAudioOutput.m4a")
File format: <AVAudioFormat 0x608000892ed0: 2 ch, 44100 Hz, 'alac' (0x00000001) from 16-bit source, 4096 frames/packet>
File format descr: <AVAudioFormat 0x608000892ed0: 2 ch, 44100 Hz, 'alac' (0x00000001) from 16-bit source, 4096 frames/packet>
Processing format: <AVAudioFormat 0x60800089dce0: 2 ch, 44100 Hz, Int16, inter>
Length: 184320 frames, 4.17959183673469 seconds
2015-04-13 15:47:36.911 SWIFT - Manipulate Audio Data[2993:250205] 15:47:36.911 ERROR: AVAudioNode.mm:521: AUSetFormat: error -10868
2015-04-13 15:47:36.912 SWIFT - Manipulate Audio Data[2993:250205] error -10868
*/
import Foundation
import AVFoundation
import Cocoa
var audioEngine: AVAudioEngine = AVAudioEngine()
var audioFilePlayer: AVAudioPlayerNode = AVAudioPlayerNode()
func displayAudioFormatInfo(fileURL: NSURL, audioFile: AVAudioFile)
{
    let fileLength = AVAudioFramePosition(audioFile.length)
    let lengthInSeconds = Float64(fileLength) / audioFile.fileFormat.sampleRate

    println("File URL: \(fileURL.absoluteString)")
    println("File format: \(audioFile.fileFormat)")
    println("File format descr: \(audioFile.fileFormat.description)")
    println("Processing format: \(audioFile.processingFormat.description)")
    println("Length: \(fileLength) frames, \(lengthInSeconds) seconds")
}
func readAudioDataTEST(fileURL: NSURL)
{
    var readErr: NSError? = nil
    let isInterleaved = true
    let format = AVAudioCommonFormat.PCMFormatInt16  // the desired working format

    // Open the file so that it decodes to interleaved Int16 PCM.
    let audioFile = AVAudioFile(forReading: fileURL, commonFormat: format,
                                interleaved: isInterleaved, error: nil)

    // Build a matching buffer large enough to hold the whole file.
    let audioFormat = AVAudioFormat(commonFormat: .PCMFormatInt16, sampleRate: 44100.0,
                                    channels: AVAudioChannelCount(2), interleaved: true)
    let audioFrameCount = AVAudioFrameCount(audioFile.length)
    let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)

    let bufferExists = audioFile.readIntoBuffer(audioFileBuffer, error: &readErr)
    println("bufferExists = \(bufferExists)")
    println("readErr = \(readErr)")
    println("audioFileBuffer.format = \(audioFileBuffer.format)")
    displayAudioFormatInfo(fileURL, audioFile)

    // Wire the player into the engine; the -10868 error appears in the log
    // during this setup (see the test result above).
    let mainMixer = audioEngine.mainMixerNode
    audioEngine.attachNode(audioFilePlayer)
    audioEngine.connect(audioFilePlayer, to: mainMixer, format: audioFileBuffer.format)
    audioEngine.startAndReturnError(nil)

    audioFilePlayer.play()
    audioFilePlayer.scheduleBuffer(audioFileBuffer, atTime: nil, options: nil, completionHandler: nil)
}
func getUrlFromNavDialog() -> NSURL?
{
    let openPanel = NSOpenPanel()
    openPanel.allowsMultipleSelection = false
    openPanel.canChooseDirectories = false
    openPanel.canCreateDirectories = false
    openPanel.canChooseFiles = true
    openPanel.runModal()
    return openPanel.URL
}
let fileURL = getUrlFromNavDialog()
if fileURL != nil { readAudioDataTEST(fileURL!) }