Shouldn't I be able to do the following to just create a passthru AUv3?
- (AUInternalRenderBlock)internalRenderBlock {
    // Capture in locals to avoid ObjC member lookups.
    // If "self" is captured in render, we're doing it wrong. See sample code.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp       *timestamp,
                              AVAudioFrameCount           frameCount,
                              NSInteger                   outputBusNumber,
                              AudioBufferList            *outputData,
                              const AURenderEvent        *realtimeEventListHead,
                              AURenderPullInputBlock      pullInputBlock) {
        // Do event handling and signal processing here.
        pullInputBlock(actionFlags, timestamp, frameCount, outputBusNumber, outputData);
        return noErr;
    };
}

Before the call to pullInputBlock I can see that the buffers are all 0.0 samples, and afterwards they contain reasonable audio samples. I can connect from a host app with no complaints anywhere, but I still get silence.
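For what it's worth, this is roughly how I'm peeking at the samples around the pull. It's a debug-only sketch that lives inside the render block above; it assumes the standard non-interleaved 32-bit float format, and the NSLog is not realtime-safe and is only there for illustration:

// Debug-only check inside the render block (NSLog is not realtime-safe).
// Assumes non-interleaved 32-bit float buffers on bus 0.
float *ch0 = (float *)outputData->mBuffers[0].mData;
if (ch0 != NULL && pullInputBlock != NULL) {
    float before = ch0[0];   // reads as 0.0 here
    AUAudioUnitStatus err = pullInputBlock(actionFlags, timestamp, frameCount,
                                           outputBusNumber, outputData);
    float after = ch0[0];    // non-zero once input has been pulled
    NSLog(@"pull status=%d sample[0] before=%f after=%f", (int)err, before, after);
}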