Re: Audio Capture (Java)
- Subject: Re: Audio Capture (Java)
- From: Gerd Castan <email@hidden>
- Date: Wed, 31 Jul 2002 22:37:50 +0200
Bill,
sorry for the delay, I've been offline for a few days.
Bill Stewart <email@hidden> wrote:
Message: 1
Date: Thu, 25 Jul 2002 18:39:22 -0700
Subject: Re: Audio Capture (Java)
From: Bill Stewart <email@hidden>
To: Zachary Crockett <email@hidden>, CoreAudio API
<email@hidden>
on 7/22/02 10:03 PM, Zachary Crockett wrote:
(I'm glad to hear someone else say the docs for CoreAudio aren't the
best.. I've had a heck of a time trying to learn the API.)
They are lousy. Or at least bad enough that I fear that whoever made
the Java API made a big mistake that makes it impossible to write a
clean Java app with sound input...
After several attempts, I'm stumped -- I'm using the HAL for
capture. I
get the default input AudioDevice using
AudioHardware.getDefaultInputDevice(), and register an
This is exactly the point where I gave up for the moment. Clean code
must get the AHConstants.kAudioDevicePropertyStreamConfiguration
property and store the result in a com.apple.audio.util.AudioBufferList.
In the time I spent, I came to the conclusion that there is a
design bug in the Java API: I can't instantiate such an object
and therefore can't store the result of that call.
(There is a second bug: the javadocs aren't generated with class-use,
so I can't be sure about that. And there is no source code, so
I can't generate the javadocs myself.)
I've been wondering about that -- why on earth are there no public
constructors for what seems like over half of the CoreAudio classes?
AudioBufferList being a HUGE one, and AudioBuffer and
AudioValueTranslation are two other biggies... how in the world is one
supposed to use these objects if she can't instantiate them? I was
assuming I just didn't understand it well enough...
You get these objects instantiated for you by the Core Audio API. You can't
copy an object that you create into the ones given to you by the API calls, so
there's no need for a public constructor.
Please give us at least two lines of code.
The second line is probably (according to the spec) something like:
inputDevice.getInputProperty(AHConstants.kAudioPropertyWildcardChannel,
    AHConstants.kAudioDevicePropertyStreamConfiguration, audioBufferList);
Since Java doesn't allow audioBufferList to be instantiated inside
getInputProperty using this syntax, I need to get an instance of
audioBufferList before I make this call. What is the recommended way
of getting this instance?
In fact, considerable efficiency is gained by doing things this way
that would be lost if you were to create these objects yourself (aside from
the fact that the underlying API doesn't support the replacement semantics
anyway).
The same applies to the I/O objects in the CoreMIDI API...
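Bill's point can be illustrated with a short, plain-Java sketch of the pattern (hypothetical names, not the actual com.apple.audio classes): a class with no public constructor whose instances wrap memory the native layer already owns, so the library hands you instances rather than letting you construct and copy your own.

```java
// Illustration only: a hypothetical native-backed buffer list in the style
// Bill describes. The real CoreAudio Java classes differ; this just shows
// why a public constructor can be deliberately absent.
final class BufferList {
    private final long nativePtr;           // memory owned by the native layer

    private BufferList(long nativePtr) {    // no public constructor
        this.nativePtr = nativePtr;
    }

    // The library wraps memory it already owns; no copy is ever needed.
    static BufferList wrap(long nativePtr) {
        return new BufferList(nativePtr);
    }

    long pointer() { return nativePtr; }
}

class Demo {
    public static void main(String[] args) {
        // Client code never calls "new BufferList(...)"; it receives one
        // from the API, which avoids copying data into native memory.
        BufferList list = BufferList.wrap(0xCAFEL);
        System.out.println(list.pointer() == 0xCAFEL);
    }
}
```

The design choice mirrors the efficiency argument above: because the native layer owns the buffers, handing out wrappers is cheap, while a client-constructed object would have to be copied in.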
Or is there some hidden static method somewhere which returns an
instance of these objects? The Javadocs don't have any "See"
references, which makes finding such things incredibly difficult... for
instance, I was baffled that there was no public constructor for
com.apple.audio.units.AudioDeviceOutputUnit, but after several hours of
perusing the javadocs, I finally found the static method
com.apple.audio.units.AUComponent.getDefaultOutput() which returns an
instance.
This is due to the way that components work. Basically, you find the
component that describes the audio unit you want, then you create an
instance of that through the open call...
There are examples in the /Developer/Examples/CoreAudio directory that show
you how to use these.
The design patterns used here are based on how these objects are used
and how the system represents them. A constructor is
not the only design pattern that can be applied.
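The find-a-component-then-open pattern Bill describes can also be sketched in plain Java (again with hypothetical names, not the real AUComponent API): you look up a component that describes what you want, then obtain a usable instance through an open call rather than a constructor.

```java
import java.util.HashMap;
import java.util.Map;

// Illustration only: a hypothetical component registry in the style of the
// find-then-open pattern. The real AUComponent API differs.
final class Component {
    private final String description;
    Component(String description) { this.description = description; }

    // Opening a component yields a usable instance; client code never
    // invokes the instance constructor directly.
    Instance open() { return new Instance(this); }
    String description() { return description; }
}

final class Instance {
    private final Component source;
    Instance(Component source) { this.source = source; }
    String describe() { return "instance of " + source.description(); }
}

final class Registry {
    private static final Map<String, Component> COMPONENTS = new HashMap<>();
    static {
        COMPONENTS.put("defaultOutput", new Component("default output unit"));
    }
    // Analogous in spirit to a static getter like AUComponent.getDefaultOutput().
    static Component find(String key) { return COMPONENTS.get(key); }
}

class ComponentDemo {
    public static void main(String[] args) {
        // 1. Find the component that describes the unit you want.
        Component c = Registry.find("defaultOutput");
        // 2. Create an instance of it through the open call.
        Instance unit = c.open();
        System.out.println(unit.describe());
    }
}
```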
You need to generate javadocs with class-use for all of the other design
patterns, which wasn't the case when this thread started.
The spec says a clean application has to use
kAudioDevicePropertyStreamConfiguration. None of the examples mentioned
does it this way.
And I didn't find a way to do it.
AudioDeviceIOProc... However, the inInputData parameter of the IOProc
doesn't seem to actually contain anything but near-silence even if I
... are you sure you know which device you are listening to?
Is it the same one you are screaming into? Have you tried inserting an audio CD?
Good point, and on that note, also in response to Mike Thornburgh who
wrote:
CDs are not drivers on X (they were on 9)
if you're not getting anything other than 1-1.5 bits
of noise even when a microphone is connected to the
sound input port, then i bet your source is set to
"Zoomed Video".
you can change the input source by turning on speech
recognition and choosing from the "Microphone" pulldown
in the "Listening" tab. or, of course, you can set
it yourself in your program.
In Jaguar, you can set this in either the Sound Prefs pane or in Audio MIDI
Setup - and if you run Daisy you'll see this there as well.
I looked in the speech preferences and the internal mic was selected;
however, Speakable Items was off. When I turned it on, my app was able
to capture audio. So my question is: how can I "set it [my]self in [my]
program"? (i.e., wrest control of audio input from the SpeechManager)
Is this set using kAudioDevicePropertyDataSource? If so, how do I deal
with getting the NameForID property since the IDs mean nothing to me & I
can't instantiate an AudioValueTranslation?
This oversight should be fixed in Jaguar - but from an API point of view you
can do this already on a 10.1 system. The code for this is in Daisy.
Thanks much... (and does anyone know if Apple ever plans to
implement/support the Java Sound API?)
Ask about this on the java dev list.
Bill
Best regards,
Gerd
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.