RE: AudioUnitRender Errors


  • Subject: RE: AudioUnitRender Errors
  • From: "Ware, Pete" <email@hidden>
  • Date: Thu, 26 Sep 2002 09:44:14 -0600

Hi Bill,

Thanks for the insight into this. I'm up and running now! I
guess timing IS everything!

Pete Ware
email@hidden

-----Original Message-----
From: Bill Stewart [mailto:email@hidden]
Sent: Tuesday, September 24, 2002 7:52 PM
To: CoreAudio API
Cc: Ware, Pete
Subject: Re: AudioUnitRender Errors


on 24/9/02 11:50 AM, Ware, Pete wrote:

> Hi Bill,
>
> I really can't find examples which call AudioUnitRender. Should
> the AudioBufferList be filled out based on the Audio Unit's
> stream format description?
>
>
> ComponentDescription cd;
> Component auComp;
> ComponentResult result;
> AudioUnit reverbUnit;
> AudioBufferList auBufferList;
> AudioUnitRenderActionFlags ioActionFlags=0;
> UInt32 inNumberFrames=0;
>
>
> cd.componentFlags = 0;
> cd.componentFlagsMask = 0;
> cd.componentType = kAudioUnitType_Effect;
> cd.componentSubType = kAudioUnitSubType_MatrixReverb;
> cd.componentManufacturer = kAudioUnitManufacturer_Apple;
>
> auComp = FindNextComponent (NULL, &cd);
> result= OpenAComponent (auComp, &reverbUnit);
> result= AudioUnitInitialize (reverbUnit);

When an AudioUnit is opened it will have a default stream format that it expects on input and output. Generally, this will default to stereo, 44.1KHz sample rate, float 32, and de-interleaved channels for a V2 unit.

If you are not providing information in that format, then you will need to set the property accordingly.

So, the first step is to set the StreamFormat property on the output scope (and element 0) to the format that you're asking for data in.

For more info on this see the docs - also, setting stream formats is covered in a number of the SDK examples.
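For example, a rough sketch of asking for mono, 44.1KHz, Float32 on the reverb's output bus might look like this (the ASBD field values here are an illustration for the mono case - double-check the format flag names against CoreAudioTypes.h):

AudioStreamBasicDescription fmt;
fmt.mSampleRate = 44100.0;
fmt.mFormatID = kAudioFormatLinearPCM;
fmt.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked
                   | kAudioFormatFlagIsNonInterleaved;
fmt.mFramesPerPacket = 1;
fmt.mChannelsPerFrame = 1;               // mono, to match the AudioBufferList below
fmt.mBitsPerChannel = 32;
fmt.mBytesPerFrame = sizeof (Float32);   // per channel, since non-interleaved
fmt.mBytesPerPacket = fmt.mBytesPerFrame;

result = AudioUnitSetProperty (reverbUnit,
                               kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Output,
                               0,          // element 0
                               &fmt,
                               sizeof (fmt));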

> inNumberFrames = 512;
> auBufferList.mNumberBuffers = 1;
> auBufferList.mBuffers[0].mNumberChannels = 1;
> auBufferList.mBuffers[0].mDataByteSize= inNumberFrames *
> sizeof(Float32);
> auBufferList.mBuffers[0].mData= NULL;

mData is null because the ultimate source will give you a buffer back - this is how the V2 units work. (This provides the ability for one buffer to travel a series of connections from the source, where those connections can operate on the supplied data in place, without having to copy. However, if they can't operate in place, then a copy can occur.)

(There are some more intricate things you can do with external buffers - see the docs on the external buffer property.)
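For example, if you would rather render into memory that you own, a sketch along these lines would do it (the malloc here is purely illustrative):

Float32 *myBuffer = (Float32 *) malloc (inNumberFrames * sizeof (Float32));

auBufferList.mNumberBuffers = 1;
auBufferList.mBuffers[0].mNumberChannels = 1;
auBufferList.mBuffers[0].mDataByteSize = inNumberFrames * sizeof (Float32);
auBufferList.mBuffers[0].mData = myBuffer;   // non-NULL: render/copy into my memory

whereas with mData == NULL, the pointer you get back in mData after AudioUnitRender is whatever buffer the source supplied.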

Then you call it:

- You have to supply time stamp information - this is extremely important, because many audio units will use the sample count (and the sample rate that is provided by the stream format) to calculate timing information. If only the sample count is valid, then you indicate that with the timing flags.

result = AudioUnitRender (reverbUnit,
                          &ioActionFlags,
                          &myTimeStamp,
                          0,               // inOutputBusNumber
                          inNumberFrames,
                          &auBufferList);

(If this is the first time you're calling this, then the sample count of myTimeStamp could be zero - then the next time you call it the sample count would be 512, then 1024, etc...)

When this call returns, you should have 1 channel's worth of 512 frames of Float32 audio data in auBufferList.mBuffers[0].mData.
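Putting it together, a sketch of the whole render loop might look like this (numSlices is just a stand-in for however many 512-frame slices you want to pull):

AudioTimeStamp myTimeStamp = { 0 };                    // zero everything first
myTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;   // only the sample count is valid
myTimeStamp.mSampleTime = 0;

UInt32 slice;
for (slice = 0; slice < numSlices; ++slice) {
    ioActionFlags = 0;
    auBufferList.mBuffers[0].mData = NULL;             // let the source hand a buffer back
    auBufferList.mBuffers[0].mDataByteSize = inNumberFrames * sizeof (Float32);

    result = AudioUnitRender (reverbUnit,
                              &ioActionFlags,
                              &myTimeStamp,
                              0,               // output bus
                              inNumberFrames,
                              &auBufferList);

    // ... use the inNumberFrames frames of Float32 at auBufferList.mBuffers[0].mData ...

    myTimeStamp.mSampleTime += inNumberFrames;         // 0, 512, 1024, ...
}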

Bill

> result= AudioUnitRender (reverbUnit, &ioActionFlags, 0,0,
> inNumberFrames, &auBufferList);
>
> I may be way off base here, that's why I'm looking for guidance
> :-)
> Do you call AudioUnitRender for each output?
>
> Thanks,
> Pete Ware
> email@hidden
>
> -----Original Message-----
> From: Bill Stewart [mailto:email@hidden]
> Sent: Tuesday, September 24, 2002 11:59 AM
> To: Ware, Pete; CoreAudio API
> Subject: Re: AudioUnitRender Errors
>
>
> Can you post an example of how you're calling this?
>
> A -50 error is indicating that one of the parameters you're
> passing in is
> invalid
>
> Bill
>
> on 24/9/02 8:51 AM, Ware, Pete wrote:
>
>> Hi everyone,
>>
>> I keep getting a -50 result error when calling the
>> AudioUnitRender function. I'm using the v2 reverb unit as an
>> example. I guess I'm not passing in valid data in the
>> AudioBufferList.
>>
>> Can anyone provide details on what this audio unit expects to
>> see?
>>
>> Thanks!
>>
>> Pete Ware
>> email@hidden

--
mailto:email@hidden
tel: +1 408 974 4056

__________________________________________________________________________
"Much human ingenuity has gone into finding the ultimate Before.
The current state of knowledge can be summarized thus:
In the beginning, there was nothing, which exploded" - Terry Pratchett
__________________________________________________________________________
