

Completely not getting it with AudioBufferList and CASpectralProcessor


  • Subject: Completely not getting it with AudioBufferList and CASpectralProcessor
  • From: David Preece <email@hidden>
  • Date: Tue, 2 Dec 2008 16:30:00 +1300

Hi,

I'm trying to feed samples from a file into CASpectralProcessor and having no luck. I started by writing a simple "extract the samples" app using the ExtAudioFile API. That declared a client data format of 44.1 kHz, LinearPCM, floating point, 2 channels per frame, 32 bits per channel and 1 frame per packet. By setting this stream description on both the source file and a WAVE file created through ExtAudioFileCreateNew, I was able to transcode from one file to the other by looping round an ExtAudioFileRead/ExtAudioFileWrite pair. For that I used an AudioBufferList containing a single AudioBuffer with two channels, reading 1024 frames at a time into an 8192-byte malloc'd block.
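For what it's worth, the setup looks roughly like this (a stripped-down sketch of what I described above, error checking omitted; srcFile and dstFile just stand in for the ExtAudioFileRefs):

#include <AudioToolbox/ExtendedAudioFile.h>
#include <cstdlib>

static const UInt32 kFramesPerRead = 1024;

// interleaved client format: both channels packed into one buffer
AudioStreamBasicDescription clientFmt = {0};
clientFmt.mSampleRate       = 44100.0;
clientFmt.mFormatID         = kAudioFormatLinearPCM;
clientFmt.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
clientFmt.mChannelsPerFrame = 2;
clientFmt.mBitsPerChannel   = 32;
clientFmt.mFramesPerPacket  = 1;
clientFmt.mBytesPerFrame    = 2 * sizeof(Float32);            // 8 bytes per frame (L+R side by side)
clientFmt.mBytesPerPacket   = clientFmt.mBytesPerFrame;

// one AudioBuffer carrying both channels
AudioBufferList abl;
abl.mNumberBuffers = 1;
abl.mBuffers[0].mNumberChannels = 2;
abl.mBuffers[0].mDataByteSize   = kFramesPerRead * clientFmt.mBytesPerFrame;  // 8192 bytes
abl.mBuffers[0].mData           = malloc(abl.mBuffers[0].mDataByteSize);

// srcFile/dstFile have this client format set on them via
// kExtAudioFileProperty_ClientDataFormat, then the loop is just:
// UInt32 frames = kFramesPerRead;
// ExtAudioFileRead(srcFile, &frames, &abl);
// ExtAudioFileWrite(dstFile, frames, &abl);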

I'm now trying to use this same code to feed a CASpectralProcessor via the ProcessForwards method call. However, I'm getting a crash in CASpectralProcessor::CopyInput on this loop:

for (UInt32 i=0; i<mNumChannels; ++i) {
    memcpy(mChannels[i].mInputBuf + mInputPos, inInput->mBuffers[i].mData, numBytes);
}


Here inInput is an AudioBufferList, and since mNumChannels==2 the loop seems to be written with the assumption that I've passed two separate AudioBuffers, one for each channel.
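If I'm reading that right, it wants a buffer list shaped something like this (just a sketch; since AudioBufferList only declares mBuffers[1], I'd have to malloc room for the second AudioBuffer myself):

#include <CoreAudio/CoreAudioTypes.h>
#include <cstddef>
#include <cstdlib>

// two AudioBuffers, one per channel -- what CopyInput appears to index
AudioBufferList *ablNI = (AudioBufferList *)
    malloc(offsetof(AudioBufferList, mBuffers) + 2 * sizeof(AudioBuffer));
ablNI->mNumberBuffers = 2;
for (UInt32 ch = 0; ch < 2; ++ch) {
    ablNI->mBuffers[ch].mNumberChannels = 1;                        // one channel per buffer
    ablNI->mBuffers[ch].mDataByteSize   = 1024 * sizeof(Float32);   // 4096 bytes per channel
    ablNI->mBuffers[ch].mData           = malloc(ablNI->mBuffers[ch].mDataByteSize);
}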

So, have I inadvertently been reading and writing this interleaved audio I keep hearing about but not really understanding? Do I get non-interleaved audio by creating two separate AudioBuffers (under the auspices of just one AudioBufferList) and setting mChannelsPerFrame=1? Would that likely fix my problem? Does CASpectralProcessor only work with non-interleaved audio?
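Here's what I'm planning to try next, in case I've got the wrong end of the stick. I'm assuming ExtAudioFile will honour a non-interleaved client format (keep mChannelsPerFrame at 2, set kAudioFormatFlagIsNonInterleaved, and let the byte counts describe a single channel), and that ProcessForwards takes a frame count plus the buffer list, going by CASpectralProcessor.h:

// non-interleaved client format: still 2 channels per frame, but the
// non-interleaved flag means the byte counts describe ONE channel,
// and ExtAudioFileRead fills one mBuffers[] entry per channel
AudioStreamBasicDescription clientFmt = {0};
clientFmt.mSampleRate       = 44100.0;
clientFmt.mFormatID         = kAudioFormatLinearPCM;
clientFmt.mFormatFlags      = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked
                            | kAudioFormatFlagIsNonInterleaved;
clientFmt.mChannelsPerFrame = 2;
clientFmt.mBitsPerChannel   = 32;
clientFmt.mFramesPerPacket  = 1;
clientFmt.mBytesPerFrame    = sizeof(Float32);                // 4 bytes: one channel only
clientFmt.mBytesPerPacket   = clientFmt.mBytesPerFrame;

// ExtAudioFileSetProperty(srcFile, kExtAudioFileProperty_ClientDataFormat,
//                         sizeof(clientFmt), &clientFmt);
// UInt32 frames = 1024;
// ExtAudioFileRead(srcFile, &frames, ablNI);            // fills both mBuffers[]
// spectralProcessor->ProcessForwards(frames, ablNI);    // signature assumed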

I *will* get there :)

TIA,
Dave



