Re: here I am
- Subject: Re: here I am
- From: Michael Thornburgh <email@hidden>
- Date: Mon, 25 Aug 2003 16:54:00 -0700
for the problem domain of "objective-c", "input samples" and "output
samples", my completely unbiased opinion is that MTCoreAudio.framework
is where it's at! ;) the AudioMonitor.app example application reads and
buffers samples from an input device, processes them (sample rate
conversion and amplitude scaling), and sends them to an output device.
it should be straightforward to see where to put your own processing
and synthesis.
http://aldebaran.armory.com/~zenomt/macosx/MTCoreAudio/
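
(for reference, below is a rough sketch of the lower-level route, using
the plain CoreAudio HAL C API of that era rather than MTCoreAudio
itself. it assumes a single full-duplex default device; the names
kGain and ioProc are just illustrative, and a real program would check
the OSStatus results.)

/* sketch: pull samples from the default device's input buffers, apply
   a simple gain (amplitude scaling), and write them to its output
   buffers via one HAL IOProc. */
#include <CoreAudio/CoreAudio.h>
#include <unistd.h>

static const float kGain = 0.5f;   /* example amplitude scaling */

static OSStatus ioProc(AudioDeviceID device,
                       const AudioTimeStamp *now,
                       const AudioBufferList *inputData,
                       const AudioTimeStamp *inputTime,
                       AudioBufferList *outputData,
                       const AudioTimeStamp *outputTime,
                       void *clientData)
{
    /* copy input to output, scaling each Float32 sample; real code
       would handle differing buffer and channel layouts */
    UInt32 buf, i;
    for (buf = 0; buf < outputData->mNumberBuffers; buf++) {
        float *out = outputData->mBuffers[buf].mData;
        UInt32 n = outputData->mBuffers[buf].mDataByteSize / sizeof(float);
        if (inputData && buf < inputData->mNumberBuffers) {
            const float *in = inputData->mBuffers[buf].mData;
            for (i = 0; i < n; i++)
                out[i] = in[i] * kGain;  /* your processing/synthesis here */
        } else {
            for (i = 0; i < n; i++)
                out[i] = 0.0f;           /* silence if no matching input */
        }
    }
    return noErr;
}

int main(void)
{
    AudioDeviceID device;
    UInt32 size = sizeof(device);

    /* one full-duplex device (e.g. built-in audio); separate input and
       output devices would need two IOProcs and a ring buffer */
    AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice,
                             &size, &device);

    AudioDeviceAddIOProc(device, ioProc, NULL);  /* register callback */
    AudioDeviceStart(device, ioProc);            /* start the device */

    sleep(10);                                   /* run for ten seconds */

    AudioDeviceStop(device, ioProc);
    AudioDeviceRemoveIOProc(device, ioProc);
    return 0;
}

(build with something like "cc passthrough.c -framework CoreAudio";
the file name is hypothetical. MTCoreAudio wraps this same machinery
in objective-c objects, which is why AudioMonitor.app is a good place
to start.)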
-mike
On Monday, August 25, 2003, at 04:18 PM, Massimiliano Viel wrote:
Hi,
thank you for the very fast answer...
> generating sound or generating midi/note events?
generating sound... I'm also interested in generating MIDI, but that
is very much secondary right now...
> this is very ambitious. You need some advanced knowledge of digital
> signal processing (audio + 2d) for that.
I can manage it; I'm actually experimenting with waveform
synthesis...
The problem I have is mainly finding OS X routines for capturing
real-time audio input, getting at the samples, and outputting
(buffered) samples in real time...
Thank you for the help!
massimiliano viel
References:
- Re: here I am (From: Massimiliano Viel <email@hidden>)