
A simple 'Device Through' app (need some help).


  • Subject: A simple 'Device Through' app (need some help).
  • From: "Glenn McCord" <email@hidden>
  • Date: Fri, 27 Jun 2008 18:44:10 +1200

Hi. I'm trying to make a simple application that just copies the input
buffers of my soundcard to the output buffers. It's a 6-in, 6-out card,
and the input channels are presented as a separate device from the
output channels. Essentially the goal is to grab some audio from the
soundcard at 64 frames per callback, do something with the audio, then
return it to the soundcard. At the moment I'm omitting any processing
and just passing the audio straight through.
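
Just to be concrete about what 'straight through' means: once the
buffers are in hand, the pass-through itself should be nothing more
than copying each input buffer into the matching output buffer,
roughly like the sketch below (assuming Float32 samples and that the
input and output buffers arrive together in one callback, which is
exactly what my split input/output devices don't seem to give me):

#include <string.h>
#include <CoreAudio/CoreAudio.h>

// Sketch only: copy each input buffer into the matching output buffer,
// clamping to the smaller of the two byte sizes.
static void CopyThrough(const AudioBufferList *inputBuffer,
                        AudioBufferList *outputBuffer)
{
	for (UInt32 i = 0; i < inputBuffer->mNumberBuffers &&
	                   i < outputBuffer->mNumberBuffers; ++i)
	{
		UInt32 bytes = inputBuffer->mBuffers[i].mDataByteSize;
		if (bytes > outputBuffer->mBuffers[i].mDataByteSize)
			bytes = outputBuffer->mBuffers[i].mDataByteSize;
		memcpy(outputBuffer->mBuffers[i].mData,
		       inputBuffer->mBuffers[i].mData, bytes);
	}
}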

Unfortunately Core Audio is proving a lot more difficult than ASIO
and PortAudio, so here I am.

I've read some of the Core Audio documentation and played around with
the sample code from:
http://developer.apple.com/samplecode/CAPlayThrough/

The following post was also quite useful.
http://www.wodeveloper.com/omniLists/macosx-dev/2000/September/msg00262.html

I have played around with the CAPlayThrough sample code but have
failed to strip it down and make it simpler for my needs. I had a bit
of success with the code from that post (from the link above) and was
at least able to get the callback invoked, albeit with a buffer size
of zero.


Basically I have two questions.
1. Is there some nice, simple, run-it-from-the-command-line style
sample code out there that I can peruse?
2. My understanding is that AudioUnits are quite high-level
abstractions. If that's true, what should I be doing to make a through
application that sits right down at the HAL layer? Although starting
simple is fine by me.


Going by...
http://developer.apple.com/technotes/tn2002/tn2091.html
...I'm of the understanding that I need two AudioUnits, one to handle
the input device and one for the output device (despite them
physically being the same piece of hardware), each with its input and
output enabled or disabled appropriately.
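
If I've understood the tech note, the setup would look roughly like
the sketch below. I'm guessing at the exact calls, and inputDeviceID /
outputDeviceID are just placeholders for my card's two devices:

#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>

// My reading of TN2091: two AUHAL instances, one bound to the input
// device (input bus 1 enabled, output bus 0 disabled) and one bound to
// the output device (which is apparently the default configuration, so
// it needs no EnableIO calls).
static OSStatus MakeHALUnits(AudioDeviceID inputDeviceID,
                             AudioDeviceID outputDeviceID,
                             AudioUnit *inputUnit,
                             AudioUnit *outputUnit)
{
	ComponentDescription desc = { kAudioUnitType_Output,
	                              kAudioUnitSubType_HALOutput,
	                              kAudioUnitManufacturer_Apple, 0, 0 };
	Component comp = FindNextComponent(NULL, &desc);
	if (comp == NULL) return -1;

	OpenAComponent(comp, inputUnit);
	OpenAComponent(comp, outputUnit);

	// On the input unit, enable input (element 1) and disable output
	// (element 0) before assigning the device.
	UInt32 enable = 1, disable = 0;
	AudioUnitSetProperty(*inputUnit, kAudioOutputUnitProperty_EnableIO,
	                     kAudioUnitScope_Input, 1, &enable, sizeof(enable));
	AudioUnitSetProperty(*inputUnit, kAudioOutputUnitProperty_EnableIO,
	                     kAudioUnitScope_Output, 0, &disable, sizeof(disable));

	// Point each unit at the right physical device.
	AudioUnitSetProperty(*inputUnit, kAudioOutputUnitProperty_CurrentDevice,
	                     kAudioUnitScope_Global, 0,
	                     &inputDeviceID, sizeof(inputDeviceID));
	AudioUnitSetProperty(*outputUnit, kAudioOutputUnitProperty_CurrentDevice,
	                     kAudioUnitScope_Global, 0,
	                     &outputDeviceID, sizeof(outputDeviceID));

	OSStatus err = AudioUnitInitialize(*inputUnit);
	if (err == noErr)
		err = AudioUnitInitialize(*outputUnit);
	return err;
}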

At this point I'm kind of guessing. Does the input AudioUnit receive
the incoming data via a callback? And how does it pass that data on to
the other AudioUnit: via an AUGraph?
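
My best guess, going by CAPlayThrough, is that the input unit delivers
its data via an input callback, inside which you call AudioUnitRender
on the input unit yourself and stash the result, and the output unit
pulls audio through its own render callback; the two ends seem to be
joined by a ring buffer rather than an AUGraph connection, since the
two devices run on separate I/O threads. Something like the sketch
below, where gInputUnit, gInputBufferList and the ring-buffer steps
are placeholders of my own:

#include <AudioUnit/AudioUnit.h>

// Placeholders: the input AUHAL, and a buffer list allocated elsewhere
// to match the input stream format.
static AudioUnit        gInputUnit;
static AudioBufferList *gInputBufferList;

// Called by the input AUHAL when fresh device input is available.
// ioData is unused here; we pull the audio ourselves with AudioUnitRender.
static OSStatus InputProc(void *inRefCon,
                          AudioUnitRenderActionFlags *ioActionFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBusNumber, UInt32 inNumberFrames,
                          AudioBufferList *ioData)
{
	OSStatus err = AudioUnitRender(gInputUnit, ioActionFlags, inTimeStamp,
	                               inBusNumber, inNumberFrames,
	                               gInputBufferList);
	// ...store gInputBufferList's samples in a ring buffer here...
	return err;
}

// Called by the output AUHAL when it wants audio to play.
static OSStatus OutputProc(void *inRefCon,
                           AudioUnitRenderActionFlags *ioActionFlags,
                           const AudioTimeStamp *inTimeStamp,
                           UInt32 inBusNumber, UInt32 inNumberFrames,
                           AudioBufferList *ioData)
{
	// ...fill ioData from the ring buffer (or zero it if we run short)...
	return noErr;
}

static void HookUpCallbacks(AudioUnit inputUnit, AudioUnit outputUnit)
{
	gInputUnit = inputUnit;

	AURenderCallbackStruct input = { InputProc, NULL };
	AudioUnitSetProperty(inputUnit, kAudioOutputUnitProperty_SetInputCallback,
	                     kAudioUnitScope_Global, 0, &input, sizeof(input));

	AURenderCallbackStruct output = { OutputProc, NULL };
	AudioUnitSetProperty(outputUnit, kAudioUnitProperty_SetRenderCallback,
	                     kAudioUnitScope_Input, 0, &output, sizeof(output));
}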

Despite the docs being somewhat helpful, it would be nice if someone
broke the Core Audio structure for a 'device through' down into
layman's terms for me. I would really appreciate it.


I have appended some code in which I try to at least receive some
audio data, but I get a buffer size of zero.


#include <CoreAudio/CoreAudio.h>
#include <AudioUnit/AudioUnit.h>

static AudioUnit InputUnit;

static int bufferSamples = 64;

void CCoreAudioHost::Start()
{
	AudioDeviceID outputDeviceID;

	// Ask the HAL for the default output device.
	UInt32 propertySize = sizeof(outputDeviceID);
	OSStatus status = AudioHardwareGetProperty(kAudioHardwarePropertyDefaultOutputDevice,
						&propertySize,
						&outputDeviceID);

	if (status)
	{
		DEBUGSTR("AudioHardwareGetProperty returned %d\n", (int)status);
	}

	if (outputDeviceID == kAudioDeviceUnknown)
	{
		DEBUGSTR("AudioHardwareGetProperty: outputDeviceID is kAudioDeviceUnknown\n");
	}

	// Ask for a 64-frame I/O buffer (assuming two channels of Float32 samples).
	int audioIOBufferByteCount = bufferSamples * 2 * sizeof(float);
	propertySize = sizeof(audioIOBufferByteCount);
	status = AudioDeviceSetProperty(outputDeviceID,
						NULL,
						0,
						false,
						kAudioDevicePropertyBufferSize,
						propertySize,
						&audioIOBufferByteCount);

	if (status)
	{
		DEBUGSTR("AudioDeviceSetProperty returned %d when setting kAudioDevicePropertyBufferSize\n", (int)status);
	}

	// Register the IOProc on the output device and start it running.
	status = AudioDeviceAddIOProc(outputDeviceID,
						AudioDeviceIOProc,
						NULL);

	if (status)
	{
		DEBUGSTR("AudioDeviceAddIOProc returned %d\n", (int)status);
	}

	status = AudioDeviceStart(outputDeviceID, AudioDeviceIOProc);
	if (status)
	{
		DEBUGSTR("AudioDeviceStart returned %d\n", (int)status);
	}
}

OSStatus CCoreAudioHost::AudioDeviceIOProc(AudioDeviceID inDevice,
						const AudioTimeStamp *inNow,
						const AudioBufferList *inputBuffer,
						const AudioTimeStamp *inInputTime,
						AudioBufferList *outputBuffer,
						const AudioTimeStamp *inOutputTime,
						void *userData)
{
	// This is where I expected to see the device's input, but
	// inputBuffer reports zero buffers on the output device.
	DEBUGSTR("Callback has been called\n");
	DEBUGSTR("There are %d buffers\n", (int)inputBuffer->mNumberBuffers);
	return noErr;
}
