Re: Overview of Sequencing, MIDI & Audio Units in iOS 4.3
- Subject: Re: Overview of Sequencing, MIDI & Audio Units in iOS 4.3
- From: Gregory Wieber <email@hidden>
- Date: Mon, 11 Apr 2011 15:16:34 -0700
Hi Charlie,
You seem to have it all correct so far. There was a fairly extensive conversation on here a while back about counting samples to arrive at a steady clock; you'll want to use that as your means of keeping time. Don't allocate any memory in your callbacks, and stick to C++ for all of the data structures you're going to reference from within your render callbacks. Develop your UI so that it is not dependent on the render thread for information, and vice versa. In other words, stay away from using any Objective-C in your callbacks. Aaron Mullholland and Michael Tyson have written some detailed accounts of their own dealings with Core Audio. Those entries, and everything on this thread, should get you started.
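A minimal sketch of that sample-counting clock idea, assuming a fixed sample rate and tempo; the `SampleClock` type and its members are illustrative, not part of any Apple API. Everything here is plain arithmetic on plain-old-data, so the render callback never allocates or touches Objective-C:

```cpp
#include <cstdint>

// Hypothetical sample-counting transport clock. The render callback
// calls advance() once per buffer; musical position is derived from
// the running frame count rather than from a wall-clock timer.
struct SampleClock {
    double   sampleRate;   // e.g. 44100.0
    double   tempoBPM;     // e.g. 120.0
    uint64_t sampleCount;  // total frames rendered so far

    // Call from the render callback with the buffer's frame count.
    void advance(uint32_t inNumberFrames) { sampleCount += inNumberFrames; }

    // Frames per quarter note at the current tempo.
    double samplesPerBeat() const { return sampleRate * 60.0 / tempoBPM; }

    // Current transport position in quarter notes.
    double beats() const { return sampleCount / samplesPerBeat(); }
};
```

The UI thread can read `sampleCount` (atomically, in a real app) to display the transport position without ever blocking the render thread.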
best.
Greg
On Mon, Apr 11, 2011 at 2:52 PM, Charlie Macchia
<email@hidden> wrote:
Hi guys, I'm looking to write a simple app that does basic MIDI sequencing and generates tones on iOS. After digging through some sample code (and experiencing shock and awe at the powers of GarageBand), I thought I'd run what I discovered past this group to see if I've got it about right.
/*******************/
Obvious Rule #1 - Core Audio on OS X is more expansive than Core Audio on iOS.
( well of course, but specifics are important )
Some potential key ingredients NOT in iOS:
1 - The GM MIDI sound set (licensed from Roland), and thus Audio Units like kAudioUnitSubType_DLSSynth
(taken from the OS X sample code PlaySoftMIDI).
There is no default synth AU in iOS.
2 - Music Player Services - not on iOS.
Music Player Services is nice because it provides MIDI playback and the ability to play while you accept new input; it also synchronizes audio playback with MIDI playback.
The OS X sample code for this is "PlaySequence".
Note - CoreMIDI was added as of iOS 4.2. It looks very cool; however, it's primarily a way to route and manipulate MIDI data - it does not provide a means to play or record it.
The only options for MIDI sequencing as of 4.3, as far as I can see, are third-party software.
Other stuff:
I've run through a fair bit of sample code on the web, having fun playing with rendering sine waves and futzing with callback functions in Core Audio.
However, that's a long way from writing a simple sequencer that sends MIDI information to an audio unit.
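For reference, the per-buffer work of such a callback can be sketched in plain C++; the `SineGenerator` type here is illustrative, not an Apple API, and in a real app the buffer would come from the `AudioBufferList` handed to the render callback:

```cpp
#include <cmath>

// Illustrative sketch of what a sine-wave render callback does each
// buffer: fill the output with samples, carrying phase across calls
// so the tone is continuous from one buffer to the next.
struct SineGenerator {
    double phase      = 0.0;
    double sampleRate = 44100.0;

    void render(float* out, unsigned frames, double hz, double amplitude) {
        const double kTwoPi = 6.283185307179586;
        const double inc    = kTwoPi * hz / sampleRate;  // radians per sample
        for (unsigned i = 0; i < frames; ++i) {
            out[i] = static_cast<float>(amplitude * std::sin(phase));
            phase += inc;
            if (phase >= kTwoPi) phase -= kTwoPi;  // keep phase bounded
        }
    }
};
```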
Here's some thoughts - tell me if I'm crazy.
I know MIDI really well, and my guess is it's probably not too hard to translate MIDI information into parameters sent to a waveform render proc: just write a simple wrapper that translates pitch values and velocity into whatever native values your proc understands, and pass those along as parameters in your render proc's signature. For now the generator can remain monophonic, so this is fairly simple. (Famous last words …)
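That wrapper might look something like the following; the helper names are made up for illustration, with pitch mapped by the standard equal-temperament formula (A4 = MIDI note 69 = 440 Hz) and velocity mapped linearly to amplitude:

```cpp
#include <cmath>

// Hypothetical MIDI-to-render-proc translation helpers (not Core Audio
// API). A monophonic render proc only needs a frequency (or phase
// increment) and an amplitude per note-on.

// Equal-tempered frequency from a MIDI note number (A4 = 69 = 440 Hz).
double midiNoteToHz(int note) {
    return 440.0 * std::pow(2.0, (note - 69) / 12.0);
}

// Linear amplitude from MIDI velocity (0-127 mapped onto 0.0-1.0).
double velocityToAmplitude(int velocity) {
    return velocity / 127.0;
}

// Phase increment per sample for an oscillator at the given frequency;
// the render proc simply adds this to its phase once per frame.
double phaseIncrement(double hz, double sampleRate) {
    const double kTwoPi = 6.283185307179586;
    return kTwoPi * hz / sampleRate;
}
```

A note-on for (note 60, velocity 100) would then hand the render proc `midiNoteToHz(60)` and `velocityToAmplitude(100)` as its two parameters.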
However, I'm a little intimidated at the idea of writing even a simple MIDI sequencer - perhaps that's unfounded. Regardless, I'm obviously no Gerhard Lengeling, so if there's something already out there I should look at, please let me know - otherwise, if somebody can point me in the right direction, that would be great.
Thanks guys,
Charlie
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden