
Re: Coreaudio-api Digest, Vol 10, Issue 71


  • Subject: Re: Coreaudio-api Digest, Vol 10, Issue 71
  • From: Jim Griffin <email@hidden>
  • Date: Fri, 01 Mar 2013 16:29:48 -0500

Jeff,

Thanks for the suggestions.  The audio component I am working on will be controlled by my host application, so I take it I can have the host increase the buffers to accommodate the component's needs?
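(Concretely, something like this on the host side -- the property and the AudioUnitSetProperty call are the standard AudioUnit C API, but the helper name and the 4096 figure are just placeholders:)

#include <AudioUnit/AudioUnit.h>

// Host-side helper (name and frame count are placeholders).  Setting
// kAudioUnitProperty_MaximumFramesPerSlice before AudioUnitInitialize() raises
// the ceiling on how many frames the unit may be asked for per render call;
// how many frames actually arrive is still decided by the host's render loop
// and I/O buffer size.
static OSStatus SetMaxFramesPerSlice(AudioUnit unit, UInt32 maxFrames)
{
    return AudioUnitSetProperty(unit,
                                kAudioUnitProperty_MaximumFramesPerSlice,
                                kAudioUnitScope_Global,
                                0,                 // element 0
                                &maxFrames,
                                sizeof(maxFrames));
}

// e.g. SetMaxFramesPerSlice(myUnit, 4096);   // before AudioUnitInitialize(myUnit)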

Increasing the latency of the component at the start of my computations sounds pretty good also.  I will need to experiment with what happens when a user changes the playback speed in the middle of playback.
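Roughly what I have in mind for the buffering (an untested per-channel sketch with made-up names; the straight copy stands in for the real PICOLA processing): keep accepting whatever slice size the host hands me, park it in a pre-allocated FIFO, put out silence until about 2000 samples have accumulated, and report that delay as latency.

#include <vector>
#include <cstring>

// Per-channel accumulator: collects input across render calls until enough
// samples exist for the analysis window, emitting silence while it primes.
class AnalysisFifo
{
public:
    explicit AnalysisFifo(size_t minAnalysisFrames)            // e.g. 2000
        : mMinFrames(minAnalysisFrames)
    {
        mFifo.reserve(mMinFrames * 4);   // pre-allocate; avoid growing on the render thread
    }

    // Called once per render slice with exactly 'frames' samples in and out.
    void Process(const float* in, float* out, size_t frames)
    {
        mFifo.insert(mFifo.end(), in, in + frames);            // park the new input

        if (mFifo.size() < mMinFrames) {
            std::memset(out, 0, frames * sizeof(float));       // still priming: silence
            return;
        }

        // Enough history: emit the oldest 'frames' samples.  A real unit would
        // run the PICOLA analysis over mFifo here instead of a straight copy.
        std::memcpy(out, mFifo.data(), frames * sizeof(float));
        mFifo.erase(mFifo.begin(), mFifo.begin() + frames);
    }

    // Roughly the delay this introduces, i.e. what the latency property should report.
    size_t LatencyFrames() const { return mMinFrames; }

private:
    size_t mMinFrames;
    std::vector<float> mFifo;
};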


Jim Griffin
Macintosh Software Developer
email@hidden



Message: 2
Date: Thu, 28 Feb 2013 19:30:15 -0800
From: Jeff Moore <email@hidden>
To: "email@hidden" <email@hidden>
Subject: Re: Where should CAStreamBasicDescription be instantiated?
Message-ID: <email@hidden>
Content-Type: text/plain; charset=windows-1252

AudioUnits don't get to control the buffer size. That belongs to the host application. Further, as it says in <AudioUnit/AUComponent.h>, all AUs, with a few exceptions, are expected to work in real time and thus can only request the same amount of audio input as they are being asked to produce for output.

That said, there is no restriction on the amount of latency an AU can introduce, provided that this amount is published through the appropriate properties. This allows you to buffer up the data a bit. For example, if the algorithm needs X frames, the AU would return silence for the first X frames it gets pulled while still pulling on its input. Then, once the X frames have been accumulated, the AU would start putting out actual data. This is how you would do a look-ahead limiter, for example.
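For instance, the reporting half might look something like this in an AUEffectBase subclass (a bare sketch with made-up names; the priming and buffering themselves are left out):

#include "AUEffectBase.h"   // CoreAudio SDK public utility classes

class MyLookAheadUnit : public AUEffectBase     // hypothetical subclass
{
public:
    // ... constructor, kernel factory, etc. omitted ...

    // AUBase publishes this value through kAudioUnitProperty_Latency, so
    // hosts that do latency compensation can line the audio back up.
    virtual Float64 GetLatency()
    {
        // X frames of look-ahead, expressed in seconds at the current rate.
        return (Float64)kLookAheadFrames / GetSampleRate();
    }

private:
    enum { kLookAheadFrames = 2048 };           // made-up look-ahead amount
};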

--

Jeff Moore
Core Audio
Apple




On Feb 28, 2013, at 12:50 PM, Jim Griffin <email@hidden> wrote:

Hello Jeff,

I am subclassing the Audio Unit public AUEffectBase class and have overridden the Render method to try to use the PullInput method and GetInput(0)->GetBufferList() to retrieve more than one input buffer.

I'm trying to implement the PICOLA algorithm to control the time-scale and pitch of an audio stream.  This algorithm computes a pitch period used to determine which parts of the audio stream can be removed while still keeping the audio understandable.  I want to minimize the chipmunk voice effect when the audio stream is sped up a few times.

The pitch period of the PICOLA algorithm needs about 1500-2000 data points to begin its calculations, and the default buffer size of 512 isn't enough to start with.
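(For reference, a simplified, untested autocorrelation-style sketch of the pitch-period estimate -- not the exact PICOLA code, and the names are placeholders.  To find a period of up to maxLag samples the analysis window has to be at least twice that long, which is why a couple of thousand samples are needed up front.)

#include <vector>
#include <cstddef>

// Simplified pitch-period estimate via normalized autocorrelation (a stand-in
// for the real PICOLA similarity measure).  The buffer must hold at least
// 2 * maxLag samples before the analysis can start.
static size_t EstimatePitchPeriod(const std::vector<float>& x,
                                  size_t minLag, size_t maxLag)
{
    size_t bestLag   = minLag;
    float  bestScore = -1.0f;

    for (size_t lag = minLag; lag <= maxLag && 2 * lag <= x.size(); ++lag) {
        float corr = 0.0f, energy = 0.0f;
        for (size_t i = 0; i < lag; ++i) {
            corr   += x[i] * x[i + lag];
            energy += x[i] * x[i] + x[i + lag] * x[i + lag];
        }
        float score = (energy > 0.0f) ? (2.0f * corr / energy) : 0.0f;
        if (score > bestScore) {
            bestScore = score;
            bestLag   = lag;
        }
    }
    return bestLag;     // pitch period in samples
}

// e.g. at 44.1 kHz, EstimatePitchPeriod(buffer, 100, 900) covers roughly 50-440 Hz
// and needs about 1800 samples of audio in 'buffer'.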

I've tried using the PullInput method and the GetInput(0)->GetBufferList() method in a do … while loop to get 3 or 4 buffers of audio data, but the methods don't seem to get new data.  I just get the same buffer data 3 or 4 times in a row.

I am looking for a way to have the Audio Unit give me more than 512 float data points per audio channel at a time.





