Re: Completely not getting it with AudioBufferList and CASpectralProcessor


  • Subject: Re: Completely not getting it with AudioBufferList and CASpectralProcessor
  • From: William Stewart <email@hidden>
  • Date: Wed, 3 Dec 2008 17:51:35 -0800


On Dec 3, 2008, at 4:16 PM, David Preece wrote:

Thanks for this Bill,

On 3/12/2008, at 7:43 AM, William Stewart wrote:
[snip]
Audio Units use a canonical format of de-interleaved audio data - we wanted to use one standard layout so it would be trivial to pass audio data from one audio unit to the next. So, all effects will only generally deal with de-interleaved data.

Makes sense. A quick aside: Can I take it as read that float32 is the "preferred" format for audio unit chains?

Expected is a better word. The only audio units that may take other formats at this point are converter and output units for the desktop.




Have a look at AUOutputBL in Public Utility - that is robust for creating an ABL that will represent different "layouts" of linear PCM in an audio stream basic description

Right, so I can construct one of these using an AudioStreamBasicDescription (actually a CAStreamBasicDescription that's cast)

(You don't even have to cast it, as a CAStreamBasicDescription is an AudioStreamBasicDescription.)


which is the same stream description I'm setting as the client format on the source file (via ExtAudioFileSetProperty with kExtAudioFileProperty_ClientDataFormat). However, whenever I try to do this with a non-interleaved stream description (streamDescription.mFormatFlags = kAudioFormatFlagIsNonInterleaved | kAudioFormatFlagIsFloat), the call fails with error 1718449215, which does not appear to be documented.


1718449215 == 'fmt?'

bad format error - there's something wrong with the format you are trying to set. In the flags above, you probably need to be using the packed flag (kAudioFormatFlagIsPacked) as well - if you use CAStreamBasicDescription to construct your ASBD, does that work (it should)?


Am I to take it that I can only extract audio in the interleaved format? Does this apply to writing as well? Am I to take it that I should de-interleave manually? - in itself not a problem except for my dislike of reinventing the wheel.

ExtAudioFile should deal with either, I think.

Bill



Thanks,
Dave

