Re: AUGraph deprecation
- Subject: Re: AUGraph deprecation
- From: Bartosz Nowotny <email@hidden>
- Date: Wed, 11 Jul 2018 16:15:11 +0200
Arshia,
Thank you for clearing that up.
On Wed, Jul 11, 2018 at 4:10 PM, Arshia Cont <email@hidden>
wrote:
> Bartosz,
>
> Laurent was referring to installTapOnBus. Your posted code would not need
> that; you are just playing MIDI. You would only need it if you had to do
> custom real-time audio processing on the audio output of your MIDI
> device (such as FFT analysis).
>
> Arshia
>
> On 11 Jul 2018, at 16:04, Bartosz Nowotny <email@hidden>
> wrote:
>
> Laurent,
>
> What you said about not being able to achieve latency lower than 100 ms is
> worrisome. I need a realtime MIDI synth; low latency is absolutely crucial.
> Does the limitation you mention apply only to signal processing, or to other
> applications of the API as well, in particular MIDI synthesis?
>
> Regards,
> Bartosz
>
>
> On Wed, Jul 11, 2018 at 3:30 PM, Laurent Noudohounsi <
> email@hidden> wrote:
>
>> Thanks, Benjamin, for the clarification. I thought that `installTapOnBus`
>> was the successor of `RenderCallback`.
>> It did not feel natural to me to mix an old API like
>> `kAudioUnitProperty_SetRenderCallback` into AVAudioEngine.
>>
>> So as Arshia said, I'm also looking for a way to use real-time processing
>> with AVAudioEngine.
>>
>> On Wed, Jul 11, 2018 at 3:05 PM, Arshia Cont <email@hidden>
>> wrote:
>>
>>> Interesting thread here!
>>>
>>> Has anyone achieved low-latency processing with AVAudioEngine?
>>>
>>> The RenderCallback seems natural to me (it is the good “old” way of doing
>>> it with AUGraph), but I’m curious to hear whether anyone has achieved real
>>> results with real-time processing in AVAudioEngine, and how.
>>>
>>>
>>> Arshia
>>>
>>>
>>> On 11 Jul 2018, at 15:00, Benjamin Federer <email@hidden> wrote:
>>>
>>> Laurent,
>>>
>>> `installTapOnBus` is not intended for realtime processing, as a tap only
>>> provides the current frame buffer but does not pass it back into the signal
>>> chain. The documentation reads: `Installs an audio tap on the bus to record,
>>> monitor, and observe the output of the node`.
>>>
>>> Although I have not done that myself yet, my understanding is that for
>>> realtime processing you can still retrieve the underlying audio unit from
>>> an AVAudioNode (or at least from some nodes) and attach an input render
>>> callback via AudioUnitSetProperty with
>>> kAudioUnitProperty_SetRenderCallback.
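>>>
>>> In code, I imagine it would look roughly like this (untested sketch on my
>>> part; the delay effect is just a stand-in for whatever AVAudioUnit-backed
>>> node you actually use, and the callback body is left empty on purpose):
>>>
>>> import AVFoundation
>>> import AudioToolbox
>>>
>>> let engine = AVAudioEngine()
>>> let delay = AVAudioUnitDelay()          // any AVAudioUnit exposes .audioUnit
>>> engine.attach(delay)
>>> engine.connect(delay, to: engine.mainMixerNode, format: nil)
>>>
>>> // C-style render callback; this runs on the realtime audio thread.
>>> let renderCallback: AURenderCallback = { (_, _, _, _, _, ioData) -> OSStatus in
>>>     // Fill or process the buffers in `ioData` here.
>>>     return noErr
>>> }
>>>
>>> var callbackStruct = AURenderCallbackStruct(inputProc: renderCallback,
>>>                                             inputProcRefCon: nil)
>>> let status = AudioUnitSetProperty(delay.audioUnit,
>>>                                   kAudioUnitProperty_SetRenderCallback,
>>>                                   kAudioUnitScope_Input,
>>>                                   0,                  // input bus 0
>>>                                   &callbackStruct,
>>>                                   UInt32(MemoryLayout<AURenderCallbackStruct>.size))
>>> if status != noErr { print("SetRenderCallback failed:", status) }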
>>>
>>> I assume the other way would be to subclass AUAudioUnit and wrap that
>>> into an AVAudioUnit which is a subclass of AVAudioNode. Yes, it confuses
>>> me, too. Random Google result with further information:
>>> https://forums.developer.apple.com/thread/72674
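>>>
>>> Sketched very roughly, that wrapping route might look like this (also
>>> untested; MyProcessorAU is just a placeholder that would have to override
>>> internalRenderBlock to do any actual work, and the four-char codes are
>>> arbitrary):
>>>
>>> import AVFoundation
>>> import AudioToolbox
>>>
>>> // Placeholder subclass; a real one overrides internalRenderBlock
>>> // (and usually the input/output bus arrays).
>>> final class MyProcessorAU: AUAudioUnit { }
>>>
>>> let desc = AudioComponentDescription(componentType: kAudioUnitType_Effect,
>>>                                      componentSubType: 0x64656d6f,      // 'demo'
>>>                                      componentManufacturer: 0x44656d6f, // 'Demo'
>>>                                      componentFlags: 0,
>>>                                      componentFlagsMask: 0)
>>>
>>> AUAudioUnit.registerSubclass(MyProcessorAU.self, as: desc,
>>>                              name: "Demo: MyProcessor", version: 1)
>>>
>>> AVAudioUnit.instantiate(with: desc, options: []) { avUnit, _ in
>>>     guard let avUnit = avUnit else { return }
>>>     // avUnit is an AVAudioNode, so it can be attached to an AVAudioEngine
>>>     // and connected like any other node.
>>>     print("wrapped node:", avUnit)
>>> }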
>>>
>>> Benjamin
>>>
>>>
>>> On 11 Jul 2018, at 14:34, Laurent Noudohounsi <email@hidden>
>>> wrote:
>>>
>>> Hi all,
>>>
>>> I'm interested in this topic since I've not found any information about
>>> it yet.
>>>
>>> Correct me if I'm wrong, but AVAudioEngine is not able to go lower than
>>> 100 ms latency. That is what I see in the header file of `AVAudioNode` for
>>> its method `installTapOnBus`:
>>>
>>> @param bufferSize the requested size of the incoming buffers in sample
>>> frames. Supported range is [100, 400] ms.
>>>
>>> Maybe I'm wrong, but I don't see any other way to get lower-latency audio
>>> processing in an AVAudioNode.
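>>>
>>> For reference, this is the call I mean (a sketch; at 44.1 kHz the 100 ms
>>> minimum already corresponds to 4410 frames, so even the smallest request
>>> ends up in that range):
>>>
>>> import AVFoundation
>>>
>>> let engine = AVAudioEngine()
>>> let mixer = engine.mainMixerNode
>>> let format = mixer.outputFormat(forBus: 0)
>>>
>>> // Observation only: buffers handed to this block never go back into
>>> // the signal chain, and bufferSize is merely a request.
>>> mixer.installTap(onBus: 0, bufferSize: 4410, format: format) { buffer, when in
>>>     print("tap received \(buffer.frameLength) frames at \(when.sampleTime)")
>>> }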
>>>
>>> Best,
>>> Laurent
>>>
>>> On Wed, Jul 11, 2018 at 1:57 PM, Arshia Cont <email@hidden>
>>> wrote:
>>>
>>>> Benjamin and list,
>>>>
>>>> I second Benjamin’s request. It would be great if someone from the
>>>> CoreAudio team could respond to the question.
>>>>
>>>> Two years ago, after basic tests, I realised that AVAudioEngine was not
>>>> ready for low-latency audio analysis on iOS, so we used AUGraph. I have a
>>>> feeling that this is no longer the case and that we can now move to
>>>> AVAudioEngine for low-latency audio processing. Can anyone share experience
>>>> here? We do real-time spectral analysis and resynthesis of sound and go as
>>>> low as 64 samples per cycle if the device allows.
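>>>>
>>>> (For context, 64 frames at 44.1 kHz is only about 1.45 ms of audio. On iOS
>>>> a small hardware IO buffer can be requested roughly like this, although the
>>>> device is free to hand back a larger one, so the actual value has to be
>>>> checked afterwards:)
>>>>
>>>> import AVFoundation
>>>>
>>>> let session = AVAudioSession.sharedInstance()
>>>> do {
>>>>     // Ask for roughly 64 frames per IO cycle at 44.1 kHz.
>>>>     try session.setPreferredIOBufferDuration(64.0 / 44100.0)
>>>>     try session.setActive(true)
>>>>     print("actual IO buffer duration:", session.ioBufferDuration)
>>>> } catch {
>>>>     print("audio session configuration failed:", error)
>>>> }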
>>>>
>>>> Thanks in advance.
>>>>
>>>>
>>>> Arshia
>>>>
>>>>
>>>> PS: I actually brought up the deprecation of AUGraph at a local Apple Dev
>>>> meeting where the EU director of developer relations was present. According
>>>> to him, when Apple announces a deprecation, it WILL happen. My
>>>> interpretation of the conversation is that AUGraph is no longer maintained
>>>> but is provided as is.
>>>>
>>>> On 11 Jul 2018, at 12:36, Benjamin Federer <email@hidden> wrote:
>>>>
>>>> Since it was mentioned in another email thread, I’m giving this topic a
>>>> bump. It would be great if someone at Apple, or anyone else in the know,
>>>> could take the time to respond. The documentation at the link cited below
>>>> still has no indication of deprecation. Will it come with one of the next
>>>> Xcode beta releases?
>>>>
>>>> On another note, I am really interested in how the transition to
>>>> AVAudioEngine is working out for everyone. I know AVAudioEngine on iOS;
>>>> what I am interested in is any macOS specifics or hardships.
>>>>
>>>> In my experience AVAudioEngine is relatively robust in handling multiple
>>>> graphs, i.e. separate chains of audio units. I had some issues with
>>>> AVAudioPlayerNode connecting to multiple destinations in that scenario.
>>>> Also, connect:toConnectionPoints:fromBus:format: did not work for me, as it
>>>> only connected to one of the destination points (see the sketch below). Has
>>>> anyone else experienced problems in that regard?
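>>>>
>>>> Roughly the kind of setup I mean (simplified sketch of what I tried; the
>>>> mixer nodes stand in for my actual destinations):
>>>>
>>>> import AVFoundation
>>>>
>>>> let engine = AVAudioEngine()
>>>> let player = AVAudioPlayerNode()
>>>> let mixerA = AVAudioMixerNode()
>>>> let mixerB = AVAudioMixerNode()
>>>>
>>>> [player, mixerA, mixerB].forEach { engine.attach($0) }
>>>> engine.connect(mixerA, to: engine.mainMixerNode, format: nil)
>>>> engine.connect(mixerB, to: engine.mainMixerNode, format: nil)
>>>>
>>>> // Fan one source bus out to two destinations; in my tests only one of
>>>> // these connection points actually received audio.
>>>> let points = [AVAudioConnectionPoint(node: mixerA, bus: 0),
>>>>               AVAudioConnectionPoint(node: mixerB, bus: 0)]
>>>> engine.connect(player, to: points, fromBus: 0, format: nil)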
>>>>
>>>> Thanks
>>>>
>>>> Benjamin
>>>>
>>>>
>>>> On 8 Jun 2018, at 16:59, Benjamin Federer <email@hidden> wrote:
>>>>
>>>> Last year at WWDC it was announced that AUGraph would be deprecated in
>>>> 2018. I just browsed the documentation
>>>> (https://developer.apple.com/documentation/audiotoolbox?changes=latest_major)
>>>> but found Audio Unit Processing Graph Services not marked for deprecation.
>>>> The AUGraph header files rolled out with Xcode 10 beta also have no mention
>>>> of a deprecation in 10.14. I searched for audio-specific sessions at this
>>>> year’s WWDC but wasn’t able to find anything relevant. Has anyone come
>>>> across new information regarding this?
>>>>
>>>> Judging by how many changes and features Apple seems to be holding back
>>>> until next year, I dare ask: has the AUGraph API deprecation been moved to
>>>> a later date?
>>>>
>>>> Benjamin
>>>>
>>>>
>>>
>>>
>>>
>>
>>
>
>
>
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden