Getting AU to produce sound output
- Subject: Getting AU to produce sound output
- From: Antonio Nunes <email@hidden>
- Date: Mon, 5 May 2008 10:26:16 +0100
Hi,
I have started looking at Core Audio in the last few days. To get a
test project going I used some code from the AUViewTest example.
AUViewTest itself works fine when I build and run it, and so does my
own project when I use the borrowed sample code as is.
The problem starts when instead of using the standard DLS synthesizer
(componentSubType = kAudioUnitSubType_DLSSynth) I use the third party
Garritan Steinway grand piano audio unit:
// Ask for the Garritan Steinway
componentDescription.componentType = kAudioUnitType_MusicDevice;
componentDescription.componentSubType = 'GARS';
componentDescription.componentManufacturer = 'GaRR';
componentDescription.componentFlags = 0;
componentDescription.componentFlagsMask = kAnyComponentFlagsMask;
// Create the graph and the synth node
AUNode synthNode, delayNode, outputNode;
err = NewAUGraph(&_graph);
err = AUGraphAddNode(_graph, &componentDescription, &synthNode);
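For completeness, the rest of the graph setup follows the AUViewTest
pattern; roughly like this (a simplified sketch, with error handling
and the delay and output node setup omitted):

err = AUGraphOpen(_graph);
err = AUGraphConnectNodeInput(_graph, synthNode, 0, delayNode, 0);
err = AUGraphConnectNodeInput(_graph, delayNode, 0, outputNode, 0);
err = AUGraphInitialize(_graph);
err = AUGraphStart(_graph);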
When I use the above instead of:
componentDescription.componentType = kAudioUnitType_MusicDevice;
componentDescription.componentSubType = kAudioUnitSubType_DLSSynth;
componentDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
…there is no sound.
I verified that the Garritan Steinway component is correctly found and
returned when requested, but it is not clear to me why it won't produce
any sound. Since I am totally new to Core Audio, I'm sure I'm missing
something basic here. Any pointers as to what it might be?
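Would checking the individual OSStatus results and dumping the graph
state be the right first step? Something along these lines (a sketch;
as I understand it, CAShow just prints the graph's nodes and
connections to the console):

if (err != noErr)
    printf("AUGraph setup failed with error %d\n", (int)err);
CAShow(_graph);  // dump the graph's nodes and connections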
In the pre-built sequence I removed the program change messages and
set the volume to max, but it makes no difference:
MIDIChannelMessage chmsg;
// Set channel 1 volume (controller 7) to maximum
chmsg.status = 0xB0;
chmsg.data1 = 0x07;
chmsg.data2 = 127;
chmsg.reserved = 0;
err = MusicTrackNewMIDIChannelEvent(track, 0., &chmsg);
// Pan channel 1 (controller 10) hard left
chmsg.status = 0xB0;
chmsg.data1 = 10;
chmsg.data2 = 0;
chmsg.reserved = 0;
err = MusicTrackNewMIDIChannelEvent(track, 0., &chmsg);
// Pan channel 2 (controller 10) hard right
chmsg.status = 0xB1;
chmsg.data1 = 10;
chmsg.data2 = 127;
chmsg.reserved = 0;
err = MusicTrackNewMIDIChannelEvent(track, 0., &chmsg);
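For reference, I would expect a plain note event like the following
(a sketch, with arbitrary values) to sound on either synth:

MIDINoteMessage note;
note.channel         = 0;    // MIDI channel 1
note.note            = 60;   // middle C
note.velocity        = 100;
note.releaseVelocity = 0;
note.duration        = 1.0;  // in beats
err = MusicTrackNewMIDINoteEvent(track, 0., &note);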
Another matter the documentation doesn't clear up for me: I would like
to be able to drive the audio units (synths, samplers) directly,
without going through MIDI. From the documentation I get the impression
that this is possible, but there is very little information and I could
not find any sample code that shows how to do it. Is this possible? If
so, does it work for all AU synths/samplers, or does it depend on the
particular synth/sampler used?
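To make the question concrete: is something along these lines the
supported way to talk to the synth unit directly, i.e. bypassing the
MusicSequence? (A sketch; I am guessing that AUGraphNodeInfo and
MusicDeviceMIDIEvent are the right calls for this.)

AudioUnit synthUnit;
err = AUGraphNodeInfo(_graph, synthNode, NULL, &synthUnit);
// Note-on for middle C on channel 1, scheduled immediately...
err = MusicDeviceMIDIEvent(synthUnit, 0x90, 60, 100, 0);
// ...and later the matching note-off
err = MusicDeviceMIDIEvent(synthUnit, 0x80, 60, 0, 0);

I have also come across MusicDeviceStartNote, which appears to take
pitch and velocity as floats rather than raw MIDI bytes. Is that the
intended non-MIDI route, and do third-party units generally support it?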
Kind Regards,
António Nunes
SintraWorks
-----------------------------------------
Accepting others as they are
brings a wonderful freedom
to your own mind.
--The Peace Formula
-----------------------------------------