Overview of Sequencing, MIDI & Audio Units in iOS 4.3
- Subject: Overview of Sequencing, MIDI & Audio Units in iOS 4.3
- From: Charlie Macchia <email@hidden>
- Date: Mon, 11 Apr 2011 17:52:50 -0400
- Thread-topic: Overview of Sequencing, MIDI & Audio Units in iOS 4.3
Hi guys, I'm looking to write a simple app that does basic MIDI sequencing and generates tones on iOS. After digging through some sample code ( and experiencing shock and awe at the powers of GarageBand ), I thought I'd run what I discovered past this group to see if I've got it about right.
/*******************/
Obvious Rule #1 - Core Audio on OSX is more expansive than Core Audio on iOS.
( well of course, but specifics are important )
Some potential key ingredients NOT in iOS.
1 - The GM MIDI sound set ( licensed from Roland ), and thus Audio Units like kAudioUnitSubType_DLSSynth
( taken from sample code OSX PlaySoftMIDI )
There is no default synth AU in iOS.
2 - Music Player Services - not on iOS.
MP services is nice because it provides MIDI playback and the ability to play while you accept new input; it also synchronizes audio playback with MIDI playback.
OSX Sample Code for this is "PlaySequence"
Note - CoreMIDI was added as of iOS 4.2. It looks very cool; however, it's primarily a way to route and manipulate MIDI data, and it does not provide a means to play or record it.
As of 4.3, the only options I can see for MIDI sequencing are third-party software.
Other stuff:
I've run through a fair bit of sample code on the web, having fun playing with rendering sine waves and futzing with callback functions in Core Audio.
However, that's a long way from writing a simple sequencer that sends MIDI information to an audio unit.
Here's some thoughts - tell me if I'm crazy.
I know MIDI really well, and my guess is it's probably not too hard to translate MIDI information into parameters sent to a waveform render proc: just write a simple wrapper that translates pitch and velocity values into whatever native values your proc understands, and pass those along as parameters in your render proc's signature. For now the generator can remain monophonic, so this is fairly simple. ( famous last words … )
However, I'm a little intimidated at writing a simple MIDI sequencer. Perhaps that's unfounded; regardless, I'm obviously no Gerhard Lengeling, so if there's something already out there I should look at, please let me know. Otherwise, if somebody can point me in the right direction, that would be great.
Thanks guys,
Charlie
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden