Re: New Member
- Subject: Re: New Member
- From: "tahome izwah" <email@hidden>
- Date: Tue, 1 Aug 2006 22:38:08 +0200
Hi Greg,
Since no one has replied to your question yet, let me tell you what I
know (which is not much, since I just got started myself).
The input and output callback procedures are called whenever the audio
system needs data. After installing the callback procs there's nothing
for the program to do except run its main event loop (via
RunApplicationEventLoop; I guess this is what NSApplicationMain
ultimately does), occasionally firing timers and picking up events.
You could install a new EventLoopTimer to call your code every now and
then (though that timer might not fire if the program doesn't get idle
time), or run your code in a separate thread. The keywords to look up
here are InstallEventLoopTimer for the former and MPCreateTask for the
latter.
Hope that helps.
--th
2006/7/30, Greg Berchin <email@hidden>:
Greetings;
I just joined this forum a few days ago because I find myself in a difficult
situation. To make a long story short: all of my (limited) experience
programming audio applications has been with ASIO under Windows. But just
the other day a Mac Mini showed up in my office, along with orders to port my
Windows ASIO application to OS X Core Audio.
The last time that I even touched a Mac was 1995, and then I had only a
passing familiarity with it. I know C, but not C++. (Granted, ASIO is
written in C++, but it's so "C-like" that I was able to figure out enough to
get by. Core Audio is very different, so I've been taking a crash course in
C++ just to try to understand the syntax.) I have never used Xcode before.
And I'm an engineer, not a computer scientist, so a lot of this is unfamiliar
to me anyway.
Now for my plea for help: I downloaded the sample code for Complex PlayThru
from
http://developer.apple.com/samplecode/ComplexPlayThru/index.html
because it seems like the closest thing to my 2-in/6-out ASIO application. I
think that I have managed to figure out where the input data are received
(ComplexPlayThru::InputProc) and where the output data are sent
(ComplexPlayThru::OutputProc), and I am assuming that these functions are
called as part of the interrupt callback. But I have not been able to figure
out where the application "waits" while it's not servicing the I/O
interrupts. My application processes data in much larger blocks than the
size of the I/O buffers, so I need to maintain three sets of buffers: one
being filled, one being processed, and one being emptied. I need to put my
processing code in the section of code that "waits" for the interrupts.
ANY guidance would be most appreciated, not only for ComplexPlayThru, but for
Core Audio, as well. (At this point I have no understanding of the Core
Audio framework; I'm just trying to hijack a similar program to at least get
things going -- I'll clean it up later.)
Many thanks,
Greg
=========================
Greg Berchin
email@hidden
The information contained in this communication is confidential and is
intended only for use by the addressee(s) named above. Any other
distribution, copying, disclosure or unauthorized use is strictly
prohibited and may be unlawful. If you have received this
communication in error, please notify me immediately. Thank you.
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden