AVAudioEngine - Multichannel output does not play on channel 1 and 2

  • Subject: AVAudioEngine - Multichannel output does not play on channel 1 and 2
  • From: Dave Gibson <email@hidden>
  • Date: Wed, 22 Feb 2017 15:05:22 +0000

We have an existing iOS application that uses the Core Audio C API and achieves multichannel output using the Matrix Mixer. I’m currently rewriting the audio engine using AVAudioEngine but I am having problems getting multichannel output working correctly.


In terms of hardware, I am running an iPad Air 2 (iOS 10.2) connected to an iConnectAudio 4+, which receives 8 channels of audio from the iPad via USB. In my audio graph, I have a single custom AVAudioUnit which is outputting 8 channels of audio to the outputNode. I am setting up my audio format as follows:



        // Multi-channel format
        let numChannels = UInt32(8)
        let multiChannelLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_DiscreteInOrder | numChannels)

        // stereoFormat is the app's existing 2-channel format (defined elsewhere);
        // we reuse its sample format, sample rate and interleaving
        let multiChannelFormat = AVAudioFormat(commonFormat: stereoFormat.commonFormat, sampleRate: stereoFormat.sampleRate, interleaved: stereoFormat.isInterleaved, channelLayout: multiChannelLayout)


I have confirmed that my custom AVAudioUnit is continuously sending all 8 channels of audio to the outputNode. However, on the iConnectAudio 4+ control panel, I can see that the device is only receiving audio on channels 3-8. For some reason, it is not receiving any audio from the app on the first 2 channels. I have confirmed this in listening tests too.


However, the audio device IS receiving all of the system sounds on channels 1 and 2. For example, the click sounds when editing text in a UITextField can be heard on channels 1 and 2. It seems as though either:

* iOS is blocking all audio on channels 1 and 2, or

* the streams for audio channels 1 and 2 are getting zeroed inside the AVAudioOutputNode.


There seems to be no issue with the audio hardware itself, because I am able to output audio on all 8 channels using our existing iOS application and other apps such as Cubasis. This suggests that the issue lies in the AVAudioOutputNode. If I set the AVAudioFormat to a 2-channel configuration, audio is output on channels 1 and 2 (mixed with the system sounds), but any configuration with more than 2 channels results in no audio on channels 1 and 2 other than the system sounds.


It seems to me that 'kAudioChannelLayoutTag_DiscreteInOrder | numChannels' is the correct layout to use in this case (please correct me if I am wrong; there is very little information on how AVAudioEngine handles this). In essence, I want to output each channel independently to the audio interface: the outputNode should output the audio channels exactly as it receives them. This is a studio application, not a surround-sound/positional-audio scenario. I have also tried setting the AVAudioChannelLayout to values other than kAudioChannelLayoutTag_DiscreteInOrder, but to no avail, resulting in either silence or 2-channel audio.
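For what it's worth, layout tags pack the channel count into their low 16 bits (if memory serves, kAudioChannelLayoutTag_DiscreteInOrder is defined as (147 << 16) in CoreAudioTypes.h), so the tag value itself can be sanity-checked even without AVFoundation:

```swift
// Assumed value of kAudioChannelLayoutTag_DiscreteInOrder, per CoreAudioTypes.h
let kDiscreteInOrder: UInt32 = 147 << 16

let tag = kDiscreteInOrder | 8

// The low 16 bits of a layout tag carry the channel count.
print(tag & 0xFFFF) // 8
```

So the tag itself does describe an 8-channel discrete layout; the question is why the outputNode drops the first two channels of it.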


Has anyone had this or a similar issue? Or could anybody suggest what's happening here and how I might fix it?


Many thanks,


Dave

 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      (email@hidden)
Help/Unsubscribe/Update your Subscription:

This email sent to email@hidden
