Re: Channel Specific MIDI Reverb with an AUGraph
- Subject: Re: Channel Specific MIDI Reverb with an AUGraph
- From: Sven Thoennissen <email@hidden>
- Date: Thu, 22 Feb 2018 10:20:00 +0100
Timothy,
One minute definitely sounds WAY too long.
I cannot imagine the V3 API has a new implementation from scratch; it probably
encapsulates the V2 C API.
What you could do is write a quick-and-dirty test app that builds your graph
with AVAudioEngine and AVAudioUnitSampler and loads your SoundFonts there, to
see if there is a difference.
I don’t know your code, but I have two unfounded suspicions. Maybe there is a
glitch in your code that causes the problem; those C data types can be tricky
in Swift. Or you have a SoundFont issue; I remember the AUSampler having
difficulties with complex SoundFonts. Either way, a test project with
AVAudioEngine could reveal something.
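For reference, this is the kind of V2 call I mean when I say the C data types are tricky; a rough sketch for an AUSampler unit pulled out of a graph, with placeholder bank and preset numbers:

```swift
import AudioToolbox

// Loads one SF2 preset into an AUSampler unit via the V2 C API.
// `samplerUnit` would be the AudioUnit obtained from your AUGraph node.
func loadSF2Preset(into samplerUnit: AudioUnit, url: URL, presetID: UInt8) -> OSStatus {
    // The struct holds an unretained CFURL, so `url` must stay alive for the call.
    var instrumentData = AUSamplerInstrumentData(
        fileURL: Unmanaged.passUnretained(url as CFURL),
        instrumentType: UInt8(kInstrumentType_SF2Preset),
        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
        bankLSB: UInt8(kAUSampler_DefaultBankLSB),
        presetID: presetID)

    return AudioUnitSetProperty(samplerUnit,
                                AudioUnitPropertyID(kAUSamplerProperty_LoadInstrument),
                                AudioUnitScope(kAudioUnitScope_Global),
                                0,
                                &instrumentData,
                                UInt32(MemoryLayout<AUSamplerInstrumentData>.size))
}
```

If that one call returns noErr and comes back quickly, the minute is going somewhere else in your graph setup.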
If you like, you can send your test project to me and I will help troubleshoot the issue.
Best,
Sven
> Am 21.02.2018 um 22:54 schrieb Timothy Erdmer <email@hidden>:
>
> With 8 tracks and the 2 MB soundfont, my AUGraph is taking about a minute to
> initialize, and about 10 minutes for the 31 MB soundfont. This would be
> doubled if 16 tracks were implemented.
>
> If your assumption about the single preset being loaded for the
> AVAudioUnitSampler is correct, I may experiment with moving my project from
> the AUGraph and AudioUnit MIDISynth to an AVAudioEngine and AVAudioUnitSampler
> configuration, especially if MIDI functionality beyond note on/off messages is
> unsupported.
>
> Would it be safe to assume that AVAudioEngine is simply faster than the
> soon-to-be-deprecated AUGraph?