Re: Seeking advice for modifying audio input source


  • Subject: Re: Seeking advice for modifying audio input source
  • From: Leo Thiessen <email@hidden>
  • Date: Wed, 7 Jun 2017 14:57:56 -0500

Hello W.,

To answer your question about “driving audio source(1) using audio source(2)”: have you tried AudioUnitAddRenderNotify()?

Recently, as a test, I successfully subclassed AVAudioUnitEffect and used it with the following init:

- (instancetype)init {
    AudioComponentDescription component = VEComponentDescriptionMake(kAudioUnitManufacturer_Apple,
                                                                     kAudioUnitType_Effect,
                                                                     kAudioUnitSubType_PeakLimiter);
    if ((self = [super initWithAudioComponentDescription:component])) {
        _audioUnit = self.audioUnit;
        _didAddRenderNotify = VECheckOSStatus(AudioUnitAddRenderNotify(_audioUnit,
                                                                       VEAVLimiterRenderCallback,
                                                                       (__bridge void * _Nullable)(self)),
                                              "AudioUnitAddRenderNotify");
    }
    return self;
}


…with a render function like:

OSStatus VEAVLimiterRenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) {
    if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) && inRefCon) {
        __unsafe_unretained VEAVLimiter *THIS = (__bridge VEAVLimiter *)inRefCon;
        // This is essentially a "tap": here you could do some calculations,
        // such as determining the peak amplitude, then store those values and
        // use them to, e.g., "modulate" the "pan" on the source(1) AVAudioNode.
    }
    return noErr;
}

If I understand you correctly, I think this could work.

Kind regards,
Leo Thiessen
http://visionsencoded.com 




On Jun 6, 2017, at 2:00 PM, email@hidden wrote:


Message: 1
Date: Tue, 6 Jun 2017 16:59:05 +0000
From: Waverly Edwards <email@hidden>
To: "email@hidden" <email@hidden>
Subject: Seeking advice for modifying audio input source

I've been reworking an existing project that uses Core Audio to now use AVAudioEngine, and I am in need of some assistance.
I am currently forcing AVAudioEngine to render offline, using AudioUnitRender on the output node; the setup is non-standard, but it works.
I am using Swift and macOS, not iOS. It's been a bit of a roller-coaster, but I'm making (slow) progress.

I am seeking direction on how you can drive audio source(1) using audio source(2). My specific need is to perform a panning effect back and forth at a specific frequency.
I believe I could also perform other audio magic, such as fade-in and fade-out, using the same basic idea. It would be so much better if there were a high-level way to manage this; however, I will go low-level if needed.

Here are the ideas that I've investigated or attempted:

Add an audio unit effect.
Add a tap and use its output as a source to make adjustments to the other source.
Add a render callback to an existing mixer node.
Add an AVAudioIONode?
Use an AUAudioUnit to access the buffer-list data (the latest idea being investigated).

I looked at AudioUnitV3Example, considering writing an audio effect, but was completely out of my depth.
Unfortunately, I can't use a tap to monitor and drive the source, because when working offline the tap doesn't get data (I tried). Even if it did, I would also need to decrease the buffer size significantly for granularity's sake.
I've been attempting to access the underlying AudioUnit of one of the mixer nodes, or to see whether it is possible to add or access a render callback; no success yet.
It took me a couple of weeks to get a render callback on the input node working, so I thought I could do the same on a mixer node; that battle is still being fought.
In researching AVAudioIONode, I haven't determined how to get access to the input stream.

Here is the latest idea being investigated: AUAudioUnit.
I looked into AUAudioUnit and have been unable to determine how to create a node that accepts input.
I'm not getting the input, but do I need it, since I have access to the output? Is access to the input provider necessary?

Q1: Should the componentType be kAudioUnitType_Mixer to get input and output?
Q2: Can this be non-interleaved on macOS?
Q3: How do you get the input provider working? Is it necessary, since I have access to the output provider?
Q4: How do you create a component and insert it between nodes, say between a mixer node and a distortion node, not just at the end?

Thank you,


W.

do {

    let audioComponentDescription = AudioComponentDescription(
        componentType: kAudioUnitType_Output,                // Q1: should this be kAudioUnitType_Mixer to get input and output?
        componentSubType: kAudioUnitSubType_HALOutput,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0)

    if auAudioUnit == nil {

        auAudioUnit = try AUAudioUnit(componentDescription: audioComponentDescription)

        let upstreamBus = auAudioUnit.inputBusses[0]

        let audioFormat = AVAudioFormat(
            commonFormat: AVAudioCommonFormat.pcmFormatFloat32,
            sampleRate: Double(sampleRate),
            channels: AVAudioChannelCount(2),
            interleaved: false)                              // Q2: can this be non-interleaved?

        try upstreamBus.setFormat(audioFormat)               // (presumably) apply the format to the upstream bus

        auAudioUnit.isInputEnabled = true
        auAudioUnit.isOutputEnabled = true

        auAudioUnit.inputHandler = { (actionFlags, timestamp, frameCount, inputBusNumber) in
            print("handling input, calling method to fill audioBufferList")   // Q3: not working, but is this necessary, since I have access to the output?
        }

        auAudioUnit.outputProvider = { (actionFlags, timestamp, frameCount, inputBusNumber, inputData) -> AUAudioUnitStatus in
            print("handling output, calling method to fill audioBufferList")
            return noErr
        }
    }
} catch {
    print("AUAudioUnit setup failed: \(error)")
}

