Re: Why does the AudioBuffer structure contain mNumberChannels?
  • Subject: Re: Why does the AudioBuffer structure contain mNumberChannels?
  • From: James McCartney <email@hidden>
  • Date: Mon, 23 Feb 2015 10:11:00 -0800


You’re right that it is a bit of redundant information. The correct value of mNumberChannels for an AudioBuffer can be derived from mChannelsPerFrame and the interleaved flag.
For non-interleaved formats, mNumberChannels is always 1. For interleaved formats, mNumberChannels is equal to mChannelsPerFrame.
The reason for mNumberChannels’s existence may be that AudioBufferList slightly preceded AudioStreamBasicDescription in CoreAudio’s development; it has a slightly earlier check-in date. These structs are 15 years old by now.
The channel assignments for the buffers are determined by an AudioChannelLayout for multichannel formats. In the case of stereo, the order is fixed: buffer 0 is left and buffer 1 is right.

On Feb 22, 2015, at 10:50 AM, Ilya Konstantinov <email@hidden> wrote:

On Sun, Feb 22, 2015 at 8:35 PM, Paul Davis <email@hidden> wrote:
Because an AudioBuffer is not related to an AudioStream. AudioBuffers can be passed into something that will feed them into a stream,

... or read them from a stream,

and the buffer format and stream format may be different.

The buffer doesn't specify any other format property, though, e.g. whether it's 32-bit floating point or 16-bit signed.

Consider the trivial case of merging two single channel AudioBuffers into a two channel stream.

How will it know which channel (e.g. left or right) is in each buffer? Seems arbitrary.

Besides, suppose the stream is stereo interleaved. I know non-trivial conversion and resampling can occur within the AU to compensate for format differences between its input and output, but for that to work, the input must receive data in its expected format (i.e. according to the Stream Format). For example, I cannot configure an AU to expect signed 16-bit on its input scope but somehow feed it 32-bit float buffers.
James McCartney
Apple CoreAudio
email@hidden



 _______________________________________________
Coreaudio-api mailing list      (email@hidden)

References:
  • Why does the AudioBuffer structure contain mNumberChannels? (From: Ilya Konstantinov <email@hidden>)
  • Re: Why does the AudioBuffer structure contain mNumberChannels? (From: Paul Davis <email@hidden>)
  • Re: Why does the AudioBuffer structure contain mNumberChannels? (From: Ilya Konstantinov <email@hidden>)
