
Re: Confusion with AudioStreamBasicDescription


  • Subject: Re: Confusion with AudioStreamBasicDescription
  • From: "Stephen F. Booth" <email@hidden>
  • Date: Wed, 15 Sep 2010 21:04:34 -0700

> I'm new to programming with Core Audio, and am having a bit of trouble
> wrapping my head around some nuances of the fields in an
> AudioStreamBasicDescription when hosting AUs.  One of the developer
> docs on the topic has the following example:
>
> size_t bytesPerSample = sizeof (AudioUnitSampleType);
> AudioStreamBasicDescription stereoStreamFormat = {0};
>
> stereoStreamFormat.mFormatID          = kAudioFormatLinearPCM;
> stereoStreamFormat.mFormatFlags       = kAudioFormatFlagsAudioUnitCanonical;
> stereoStreamFormat.mBytesPerPacket    = bytesPerSample;
> stereoStreamFormat.mBytesPerFrame     = bytesPerSample;
> stereoStreamFormat.mFramesPerPacket   = 1;
> stereoStreamFormat.mBitsPerChannel    = 8 * bytesPerSample;
> stereoStreamFormat.mChannelsPerFrame  = 2;           // 2 indicates stereo
> stereoStreamFormat.mSampleRate        = graphSampleRate;
>
> My confusion is this: Since mChannelsPerFrame is set to 2, this will
> be a stereo stream and therefore interleaved.  So each frame should
> contain 2 samples, one for each channel.  Why then are mBytesPerPacket
> and mBytesPerFrame not 2 * bytesPerSample, as they each contain two
> samples?

The key is the format flags: kAudioFormatFlagsAudioUnitCanonical is a combination of several other flags, depending on whether you are using fixed-point samples (on iOS) or floating-point samples (on OS X).  The flags common to both OSes are:

kAudioFormatFlagsNativeEndian | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved

(for the full definition see CoreAudioTypes.h)
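
If you want to convince yourself on your platform, a quick check along these lines should do it (untested sketch; the flag constants are compile-time enums from CoreAudioTypes.h, so nothing needs to be linked):

#include <CoreAudio/CoreAudioTypes.h>
#include <stdio.h>

int main(void)
{
    // Tests whether the canonical Audio Unit format on this platform
    // includes the non-interleaved flag.
    if (kAudioFormatFlagsAudioUnitCanonical & kAudioFormatFlagIsNonInterleaved)
        printf("canonical AU format: non-interleaved\n");
    else
        printf("canonical AU format: interleaved\n");
    return 0;
}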

Since kAudioFormatFlagIsNonInterleaved is set, mChannelsPerFrame specifies the total number of channels, but the size fields (mBytesPerFrame, mBytesPerPacket, mBitsPerChannel) describe a single channel's data.  Each channel lives in its own buffer, so one frame of one channel is exactly one sample, which is why mBytesPerFrame is bytesPerSample and not 2 * bytesPerSample.  (Your premise is also backwards: stereo does not imply interleaved.  This canonical format is stereo and non-interleaved.)
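
For contrast, an interleaved stereo description would look something like this (untested sketch using plain Float32 samples rather than the canonical AU type; graphSampleRate assumed defined as in your snippet):

#include <CoreAudio/CoreAudioTypes.h>

// Untested sketch: an *interleaved* stereo Float32 linear PCM format.
// The channels share one buffer, so each frame holds one sample per
// channel and the sizes scale with mChannelsPerFrame.
size_t bytesPerSample = sizeof(Float32);
AudioStreamBasicDescription interleavedFormat = {0};

interleavedFormat.mFormatID         = kAudioFormatLinearPCM;
interleavedFormat.mFormatFlags      = kAudioFormatFlagIsFloat
                                    | kAudioFormatFlagsNativeEndian
                                    | kAudioFormatFlagIsPacked;   // no kAudioFormatFlagIsNonInterleaved
interleavedFormat.mChannelsPerFrame = 2;
interleavedFormat.mBitsPerChannel   = 8 * bytesPerSample;
interleavedFormat.mFramesPerPacket  = 1;
interleavedFormat.mBytesPerFrame    = bytesPerSample * interleavedFormat.mChannelsPerFrame;  // 2 samples per frame
interleavedFormat.mBytesPerPacket   = interleavedFormat.mBytesPerFrame;
interleavedFormat.mSampleRate       = graphSampleRate;            // assumed defined elsewhere

The same distinction shows up in the AudioBufferList a render callback hands you: a non-interleaved stereo stream gives you two AudioBuffers with mNumberChannels == 1 each, while an interleaved one gives you a single AudioBuffer with mNumberChannels == 2.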

HTH,
Stephen
 
 
> I'm sure I'm overlooking something rather straightforward here, but
> can't seem to see what.  Where am I going wrong?
>
> Thanks in advance!
> -Arshan

References:
  • Confusion with AudioStreamBasicDescription (From: Arshan Gailus <email@hidden>)
