On Feb 4, 2013, at 2:28 PM, Chris Adamson <email@hidden> wrote:

Well, I'm away from my source for the moment, but I think this begs the question of how you would use an AudioQueueProcessingTap to apply an AUEffect, given that the AUEffect likely uses a different PCM format than what is being received from the tap (and given that the format received from the tap's stream cannot be changed). In my experience, the AUEffects require specific floating-point formats, and I've never been able to set the stream format of an AUEffect without getting -50 (paramErr).

You are not applying an effect, you are applying AUNewTimePitch, which is a format converter.

Okay James, but given this, what would be the proper way to construct an AUGraph that reads and plays a file in any common format, time-stretches it under the control of UIControls, and outputs to RemoteIO?
So, given that I cannot change the format I get from the queue, nor can I change the format supplied to the AUEffect, it seems like the only reasonable option is to use an AUConverter.
I should note by the way -- I don't know if you looked at the code -- the actual use of the converter and effect is in an offline AUGraph. In the tap callback, I pull the provided buffer through an AUGraph that looks like this:
[render callback] -> AUConverter -> AUEffect -> AUConverter -> AUGenericOutput
The converters go to and from the AUEffect's default format, and the render callback exists only to provide the buffer that was originally passed to the tap callback. It looks unnecessarily elaborate, but since the format issues prevented me from just calling AudioUnitRender() in place on a single AUEffect, it was the best solution I could get working.
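In case it's useful to anyone following along, here is roughly the shape of that setup. This is a sketch rather than the exact code from the project: the low-pass filter is just a placeholder effect subtype, TapContext and TapSourceRenderProc are illustrative names, and all error checking is stripped out.

#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

typedef struct {
    AudioBufferList *sourceBuffers;  // filled in by the tap callback before rendering
} TapContext;

// Render proc on the first converter: hands back the buffer the tap received.
static OSStatus TapSourceRenderProc(void *inRefCon,
                                    AudioUnitRenderActionFlags *ioActionFlags,
                                    const AudioTimeStamp *inTimeStamp,
                                    UInt32 inBusNumber,
                                    UInt32 inNumberFrames,
                                    AudioBufferList *ioData)
{
    TapContext *ctx = (TapContext *)inRefCon;
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        memcpy(ioData->mBuffers[i].mData,
               ctx->sourceBuffers->mBuffers[i].mData,
               ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}

static AUGraph CreateOfflineGraph(TapContext *tapContext, AudioUnit *outGenericOutput)
{
    AudioComponentDescription convDesc   = { kAudioUnitType_FormatConverter,
                                             kAudioUnitSubType_AUConverter,
                                             kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription effectDesc = { kAudioUnitType_Effect,
                                             kAudioUnitSubType_LowPassFilter, // placeholder effect
                                             kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription outDesc    = { kAudioUnitType_Output,
                                             kAudioUnitSubType_GenericOutput,
                                             kAudioUnitManufacturer_Apple, 0, 0 };
    AUGraph graph;
    AUNode convInNode, effectNode, convOutNode, outNode;

    NewAUGraph(&graph);
    AUGraphAddNode(graph, &convDesc,   &convInNode);
    AUGraphAddNode(graph, &effectDesc, &effectNode);
    AUGraphAddNode(graph, &convDesc,   &convOutNode);
    AUGraphAddNode(graph, &outDesc,    &outNode);
    AUGraphOpen(graph);

    // Feed the first converter from the tap's buffer.
    AURenderCallbackStruct cb = { TapSourceRenderProc, tapContext };
    AUGraphSetNodeInputCallback(graph, convInNode, 0, &cb);

    // [render callback] -> AUConverter -> AUEffect -> AUConverter -> AUGenericOutput
    AUGraphConnectNodeInput(graph, convInNode,  0, effectNode,  0);
    AUGraphConnectNodeInput(graph, effectNode,  0, convOutNode, 0);
    AUGraphConnectNodeInput(graph, convOutNode, 0, outNode,     0);

    // (Not shown: the outward-facing formats of the two converters are set to the
    // tap's processing format, so only the effect ever sees its own float format.)
    AUGraphInitialize(graph);

    AUGraphNodeInfo(graph, outNode, NULL, outGenericOutput);
    return graph;
}

Then, inside the tap callback, after AudioQueueProcessingTapGetSourceAudio() fills the buffer list, tapContext->sourceBuffers is pointed at it and a single AudioUnitRender() on the generic output unit (bus 0, same frame count and timestamp) pulls the audio through the whole chain in place.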
--Chris
Sent from my iPad

On Feb 4, 2013, at 4:54 PM, James McCartney <email@hidden> wrote:
You should not put a format converter (kAudioUnitType_FormatConverter) into an AudioQueueProcessingTap; the tap is only meant for effects. Unlike effects, format converters can pull unpredictable amounts of audio. The only exception might be an AUConverter that is not doing any sample rate conversion. So it is expected that this will not work.
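To make that concrete, the tap callback has this shape (an illustrative sketch, not code from any shipping project): the queue decides the frame count, and AudioQueueProcessingTapGetSourceAudio() gives you at most that many frames, so there is nowhere for a rate-changing unit to pull extra input from.

static void MyTapCallback(void *inClientData,
                          AudioQueueProcessingTapRef inAQTap,
                          UInt32 inNumberFrames,
                          AudioTimeStamp *ioTimeStamp,
                          AudioQueueProcessingTapFlags *ioFlags,
                          UInt32 *outNumberFrames,
                          AudioBufferList *ioData)
{
    // Fetch exactly the frames the queue is asking for...
    AudioQueueProcessingTapGetSourceAudio(inAQTap, inNumberFrames, ioTimeStamp,
                                          ioFlags, outNumberFrames, ioData);

    // ...and process them 1-in/1-out, in place. An effect fits this model;
    // a unit that changes the amount of audio it needs (a format converter
    // such as AUNewTimePitch) does not.
}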
On Jan 31, 2013, at 8:36 AM, Chris Adamson <email@hidden> wrote:

Yeah, I'm looking into the silence issue now. I'm worried that 6.1 may break the Audio Queue Processing Tap (or at least how I'm using it). That demo still works on the 6.0 simulator, but produces silence on 6.1 (simulator and device).
The AUNewTimePitch demos from that CocoaConf Portland blog still work on 6.1.
--Chris
On Jan 31, 2013, at 11:26 AM, Tony Kirke <email@hidden> wrote:

I've also had the same issue, based on Chris's demo project. In fact I just upgraded to 6.1 and am now getting silence instead of distortion. Were you on 6.0 or 6.1?
On Thu, Jan 31, 2013 at 3:27 AM, Jont Olof Lyttkens <email@hidden> wrote:
Hi!
I am building an app for iOS which depends to a great extent on time-stretching, and I was thrilled to find the NewTimePitch Audio Unit. I am configuring a simple AUGraph:
kAudioUnitSubType_AudioFilePlayer -> kAudioUnitSubType_NewTimePitch -> kAudioUnitSubType_RemoteIO
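For reference, the graph is built roughly like this. It is a sketch: scheduling the file on the AUFilePlayer and all error checking are left out, and rateFromSlider just stands in for the value coming from the UI.

#include <AudioToolbox/AudioToolbox.h>

static void BuildTimePitchGraph(AUGraph *outGraph, AudioUnit *outPitchUnit)
{
    AudioComponentDescription playerDesc = { kAudioUnitType_Generator,
                                             kAudioUnitSubType_AudioFilePlayer,
                                             kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription pitchDesc  = { kAudioUnitType_FormatConverter,
                                             kAudioUnitSubType_NewTimePitch,
                                             kAudioUnitManufacturer_Apple, 0, 0 };
    AudioComponentDescription ioDesc     = { kAudioUnitType_Output,
                                             kAudioUnitSubType_RemoteIO,
                                             kAudioUnitManufacturer_Apple, 0, 0 };
    AUGraph graph;
    AUNode playerNode, pitchNode, ioNode;

    NewAUGraph(&graph);
    AUGraphAddNode(graph, &playerDesc, &playerNode);
    AUGraphAddNode(graph, &pitchDesc,  &pitchNode);
    AUGraphAddNode(graph, &ioDesc,     &ioNode);
    AUGraphOpen(graph);

    // AudioFilePlayer -> NewTimePitch -> RemoteIO
    AUGraphConnectNodeInput(graph, playerNode, 0, pitchNode, 0);
    AUGraphConnectNodeInput(graph, pitchNode,  0, ioNode,    0);

    AUGraphNodeInfo(graph, pitchNode, NULL, outPitchUnit);
    AUGraphInitialize(graph);
    AUGraphStart(graph);
    *outGraph = graph;
}

// The slider then drives the rate (roughly 0.7 .. 1.3):
//   AudioUnitSetParameter(pitchUnit, kNewTimePitchParam_Rate,
//                         kAudioUnitScope_Global, 0, rateFromSlider, 0);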
The "rate" parameter of the NewTimePitch is set to values from around 0.7 to 1.3. The problem is that the NewTimePitch unit distorts the audio quite badly. I found this post on the coreaudio-api mailing list which relates to the problem:
Unfortunately I don't come to the same conclusion. I don't hear any phase problems, nor any collapse of the signal to mono; what I hear is just really bad distortion. I've tested with all sorts of material, down to input at -24 dBFS, and the distortion seems to be unrelated to volume.
Finally I found this blog post from CocoaConf by Chris Adamson, where he bundles the demo project from the conference:
I must admit that he pointed out a couple of things I had missed, but even after fixing these I still had the distortion. What's more, the compiled demo project from CocoaConf also distorts the audio, so I am beginning to wonder whether the NewTimePitch AU is simply broken.
A printout of my CAShow():
AudioUnitGraph 0xBD33001:
  Member Nodes:
    node 1: 'auou' 'rioc' 'appl', instance 0x815e940 O I
    node 2: 'aufc' 'nutp' 'appl', instance 0xa85ade0 O I
    node 3: 'augn' 'afpl' 'appl', instance 0x8298510 O I
  Connections:
    node 3 bus 0 => node 2 bus 0 [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
    node 2 bus 0 => node 1 bus 0 [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
  CurrentState: mLastUpdateError=0, eventsToProcess=F, isRunning=F
I also have:
- Ensured that the stream format of the AUNewTimePitch is set identically on all the other units (roughly as sketched after this list)
- Set up the RemoteIO bus
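By "set identically" I mean roughly this (a sketch; playerUnit, pitchUnit, and ioUnit are the AudioUnits pulled out of the graph with AUGraphNodeInfo(), and error checking is removed):

// Ask NewTimePitch what format it wants, then impose that same format on its
// neighbours so every connection in the graph agrees.
AudioStreamBasicDescription asbd;
UInt32 size = sizeof(asbd);
AudioUnitGetProperty(pitchUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, 0, &asbd, &size);

// File player output -> NewTimePitch input
AudioUnitSetProperty(playerUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output, 0, &asbd, sizeof(asbd));
AudioUnitSetProperty(pitchUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));

// NewTimePitch output -> RemoteIO input (bus 0)
AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Input, 0, &asbd, sizeof(asbd));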
Is there any magic trick that I need to know about or does anyone have a non-distorting sample of a graph that works with the NewTimePitch Audio Unit?
Best regards
/Jont Olof
_____________________________________
Co-founder and Senior Cocoa Developer centCode AB
Address: Frödingsgatan 25, Uppsala, Sweden
Zip: SE-75421