Re: What now ???
- Subject: Re: What now ???
- From: Chris Reed <email@hidden>
- Date: Mon, 27 Oct 2003 10:45:09 -0600
On Oct 25, 2003, at 4:18 PM, Mark's Studio wrote:
I have almost finished a standalone synth (it's making controllable noise :). I just use "MIDIDestinationCreate" to get some MIDI into my callback, and "OpenDefaultAudioOutput" and "AudioUnitInputCallback" to set up the callback for the output.
The interface is Cocoa, and I'm using Objective-C.
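A minimal C sketch of that kind of setup, for reference only. It uses the v2 render-callback property rather than the older AudioUnitInputCallback struct named above; the "MySynth" names are placeholders and error checking is omitted.

#include <CoreServices/CoreServices.h>
#include <CoreMIDI/CoreMIDI.h>
#include <AudioUnit/AudioUnit.h>

/* Render callback: the default output unit pulls audio from the synth. */
static OSStatus MySynthRender(void *inRefCon,
                              AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp *inTimeStamp,
                              UInt32 inBusNumber,
                              UInt32 inNumberFrames,
                              AudioBufferList *ioData)
{
    /* Fill ioData with inNumberFrames frames of synth output here. */
    return noErr;
}

/* MIDI read proc: CoreMIDI delivers incoming packets here. */
static void MySynthMIDIRead(const MIDIPacketList *pktlist,
                            void *readProcRefCon,
                            void *srcConnRefCon)
{
    /* Walk pktlist and hand the events to the synth engine. */
}

static void SetUpSynthIO(void *synthState)
{
    MIDIClientRef   client = 0;
    MIDIEndpointRef dest   = 0;
    AudioUnit       outputUnit;
    ComponentDescription desc = { kAudioUnitType_Output,
                                  kAudioUnitSubType_DefaultOutput,
                                  kAudioUnitManufacturer_Apple, 0, 0 };
    AURenderCallbackStruct cb = { MySynthRender, synthState };

    /* Virtual MIDI destination that other apps can send to. */
    MIDIClientCreate(CFSTR("MySynth"), NULL, NULL, &client);
    MIDIDestinationCreate(client, CFSTR("MySynth In"),
                          MySynthMIDIRead, synthState, &dest);

    /* Default output unit, located through the Component Manager. */
    OpenAComponent(FindNextComponent(NULL, &desc), &outputUnit);

    /* Register the render callback and start pulling audio. */
    AudioUnitSetProperty(outputUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));
    AudioUnitInitialize(outputUnit);
    AudioOutputUnitStart(outputUnit);
}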
So my problem is: which of these options would be best, or even possible?
1. Would it be possible to leave it as a standalone app and provide the output to a host app? Is there something for that?
Using Jack, you can route audio between apps. There isn't much support for this API on the Mac yet, though. See
http://jackit.sourceforge.net/
Alternatively, you can add ReWire support to your app. See
http://www.propellerheads.se/products/rewire/frame.html
A third option is to simply let the user choose an output audio device
and let the driver mix your output.
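A rough sketch of what that third option can look like, assuming you use the AUHAL output unit (kAudioUnitSubType_HALOutput) so the device can be changed, and the AudioHardwareGetProperty calls to list devices. Error checking is omitted and the function name is a placeholder.

#include <stdlib.h>
#include <CoreAudio/CoreAudio.h>
#include <AudioUnit/AudioUnit.h>

/* Point an AUHAL output unit at the device the user picked from a list;
   the HAL/driver then mixes our output with whatever else is playing on
   that device. */
static void UseChosenDevice(AudioUnit halOutputUnit, UInt32 chosenIndex)
{
    UInt32        size = 0;
    Boolean       writable = false;
    AudioDeviceID *devices = NULL;
    UInt32        count = 0;

    /* Enumerate every audio device known to the HAL. */
    AudioHardwareGetPropertyInfo(kAudioHardwarePropertyDevices,
                                 &size, &writable);
    devices = (AudioDeviceID *)malloc(size);
    AudioHardwareGetProperty(kAudioHardwarePropertyDevices, &size, devices);
    count = size / sizeof(AudioDeviceID);

    if (chosenIndex < count) {
        AudioUnitSetProperty(halOutputUnit,
                             kAudioOutputUnitProperty_CurrentDevice,
                             kAudioUnitScope_Global, 0,
                             &devices[chosenIndex], sizeof(AudioDeviceID));
    }
    free(devices);
}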
2. Make a MusicDevice using Cocoa and Objective-C. Is that possible, and will anyone be able to use it?
Yes, you can build an interface for a MusicDevice AU using the Panther SDK. No, no one will be able to use it, since there aren't any hosts (yet, as far as I know) that support Cocoa editors.
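To make the host-support point concrete: a Cocoa editor is advertised to the host through the kAudioUnitProperty_CocoaUI property, and the host has to ask for it and then load the view bundle itself. A rough host-side sketch of that check in C, with a placeholder function name:

#include <AudioUnit/AudioUnit.h>

/* The AU publishes a view bundle URL and a factory class name through
   kAudioUnitProperty_CocoaUI; the host has to load that bundle and build
   the NSView itself. That host-side step is the part that's missing today. */
static Boolean HostSeesCocoaEditor(AudioUnit unit)
{
    UInt32   size = 0;
    Boolean  writable = false;
    OSStatus err = AudioUnitGetPropertyInfo(unit, kAudioUnitProperty_CocoaUI,
                                            kAudioUnitScope_Global, 0,
                                            &size, &writable);
    return (err == noErr && size >= sizeof(AudioUnitCocoaViewInfo));
}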
3. Convert my code and interface to Carbon and use the SDK to make a
MusicDevice?
This is currently the way to go.
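Roughly what a host does with a MusicDevice once it is packaged as a component: the synth no longer opens an output device itself; the host pushes MIDI in and pulls audio out. A hedged host-side sketch; the subtype and manufacturer codes are placeholders and error checking is omitted.

#include <CoreServices/CoreServices.h>
#include <AudioUnit/AudioUnit.h>
#include <AudioUnit/MusicDevice.h>

static void DriveMusicDevice(const AudioTimeStamp *ts, UInt32 inFrames,
                             AudioBufferList *ioBuffers)
{
    ComponentDescription desc = { kAudioUnitType_MusicDevice,
                                  'Xmpl',   /* placeholder subtype */
                                  'Xmpl',   /* placeholder manufacturer */
                                  0, 0 };
    AudioUnit synth;
    AudioUnitRenderActionFlags flags = 0;

    /* Find and open the synth like any other component. */
    OpenAComponent(FindNextComponent(NULL, &desc), &synth);
    AudioUnitInitialize(synth);

    /* Push MIDI in: note-on, middle C, velocity 100, at the start of the
       next buffer. */
    MusicDeviceMIDIEvent(synth, 0x90, 60, 100, 0);

    /* Pull audio out; no output device is involved on the synth's side. */
    AudioUnitRender(synth, &flags, ts, 0, inFrames, ioBuffers);
}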
-chris
References:
- What now ??? (From: "Mark's Studio" <email@hidden>)