Re: Where should CAStreamBasicDescription be instantiated?
- Subject: Re: Where should CAStreamBasicDescription be instantiated?
- From: Jim Griffin <email@hidden>
- Date: Thu, 28 Feb 2013 15:50:47 -0500
Hello Jeff,
I am subclassing the public Audio Unit AUEffectBase class and have overridden the Render method to try to use the PullInput method and GetInput(0)->GetBufferList() to retrieve more than one input buffer.
I'm trying to implement the PICOLA algorithm to control the time scale and pitch of an audio stream. The algorithm computes a pitch-period value that determines which parts of the audio stream can be removed while keeping the stream intelligible. I want to minimize the chipmunk-voice effect when the audio stream is sped up several times.
The pitch-period stage of the PICOLA algorithm needs about 1500 - 2000 samples to begin its calculations, and the default buffer size of 512 frames isn't enough to start with.
I've tried calling PullInput and GetInput(0)->GetBufferList() in a do … while loop to collect 3 or 4 buffers of audio data, but the methods don't seem to return new data; I just get the same buffer contents 3 or 4 times in a row.
I am looking for a way to have the Audio Unit give me more than 512 float samples per audio channel at a time.
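
For reference, here is a minimal sketch of the accumulation approach I have in mind, assuming the CoreAudio SDK's AUEffectBase / AUInputElement interfaces (PullInput, GetBufferList); the class name, member names, and kAnalysisWindow constant are purely illustrative, and the PICOLA processing itself is left as a placeholder pass-through:

    // Illustrative sketch only: pull one slice per Render call and
    // accumulate it into an internal FIFO until enough samples exist
    // for the PICOLA pitch-period analysis.
    #include "AUEffectBase.h"   // CoreAudio SDK
    #include <vector>
    #include <algorithm>

    class PicolaEffect : public AUEffectBase {
    public:
        // Constructor signature follows the SDK version in use.
        PicolaEffect(AudioComponentInstance inUnit) : AUEffectBase(inUnit) {}

        virtual OSStatus Render(AudioUnitRenderActionFlags &ioActionFlags,
                                const AudioTimeStamp &inTimeStamp,
                                UInt32 inFramesToProcess)
        {
            // Pull exactly the slice the host is offering; pulling again in a
            // loop with the same timestamp just hands back the same data.
            AUInputElement *input = GetInput(0);
            OSStatus err = input->PullInput(ioActionFlags, inTimeStamp, 0, inFramesToProcess);
            if (err) return err;

            const AudioBufferList &inBufs = input->GetBufferList();
            const float *in = static_cast<const float *>(inBufs.mBuffers[0].mData);

            // Accumulate this slice (first channel only, for brevity) in a FIFO.
            mFifo.insert(mFifo.end(), in, in + inFramesToProcess);

            AudioBufferList &outBufs = GetOutput(0)->GetBufferList();
            float *out = static_cast<float *>(outBufs.mBuffers[0].mData);

            if (mFifo.size() < kAnalysisWindow) {
                // Not enough history yet for the pitch-period estimate:
                // emit silence for now (this is where the latency comes from).
                std::fill(out, out + inFramesToProcess, 0.0f);
                return noErr;
            }

            // Enough samples buffered: run the PICOLA analysis on mFifo here,
            // write inFramesToProcess output samples, then drop consumed input.
            std::copy(mFifo.begin(), mFifo.begin() + inFramesToProcess, out); // placeholder
            mFifo.erase(mFifo.begin(), mFifo.begin() + inFramesToProcess);
            return noErr;
        }

    private:
        static const size_t kAnalysisWindow = 2048;  // ~1500-2000 samples needed by PICOLA
        std::vector<float> mFifo;                    // internal accumulation buffer
    };

Since the host's pull model only delivers one slice per Render call, the extra samples would have to come from buffering across calls like this, at the cost of roughly kAnalysisWindow frames of latency. Is there a better way to get larger slices directly?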