

Re: Coreaudio-api Digest, Vol 6, Issue 419


  • Subject: Re: Coreaudio-api Digest, Vol 6, Issue 419
  • From: Niel Warren <email@hidden>
  • Date: Thu, 5 Nov 2009 20:48:37 -0800

Jeff is correct. While all audio engines running in the built-in audio complex are derived from the same crystal, we do allow them to be at different sample rates.

Best Regards,
Niel Warren

On Nov 5, 2009, at 12:05 PM, email@hidden wrote:


Message: 8
Date: Thu, 05 Nov 2009 09:24:08 -0800
From: Jeff Moore <email@hidden>
Subject: Re: Built-In devices on single thread
To: CoreAudio API <email@hidden>

The solution you seem to be looking for is to create an aggregate
device that contains all of the built-in devices. There has been a
lot of discussion about using aggregate devices on this list
previously.
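For context, a minimal sketch of what Jeff describes, with caveats: it uses AudioHardwareCreateAggregateDevice from AudioHardware.h, which was added in OS X 10.9 (after this thread; at the time, aggregates were created by setting the kAudioPlugInCreateAggregateDevice property on the HAL plug-in object). The aggregate UID, device name, and the helper name CreateBuiltInAggregate are illustrative; the sub-device UIDs must be obtained from the real devices via kAudioDevicePropertyDeviceUID.

```c
// Sketch: build one aggregate device from a built-in input and a
// built-in output device, so a single IOProc can drive both.
// macOS only; requires linking against the CoreAudio framework.
#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

static AudioObjectID CreateBuiltInAggregate(CFStringRef inputUID,
                                            CFStringRef outputUID) {
    // Each sub-device is described by a small dictionary keyed by its UID.
    CFStringRef uids[2] = { inputUID, outputUID };
    CFMutableArrayRef subDevices =
        CFArrayCreateMutable(kCFAllocatorDefault, 2, &kCFTypeArrayCallBacks);
    for (int i = 0; i < 2; ++i) {
        const void *key = CFSTR(kAudioSubDeviceUIDKey);
        const void *val = uids[i];
        CFDictionaryRef sub = CFDictionaryCreate(
            kCFAllocatorDefault, &key, &val, 1,
            &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
        CFArrayAppendValue(subDevices, sub);
        CFRelease(sub);
    }

    // Top-level description: a UID and name of our choosing, plus the
    // sub-device list.
    const void *keys[3] = { CFSTR(kAudioAggregateDeviceUIDKey),
                            CFSTR(kAudioAggregateDeviceNameKey),
                            CFSTR(kAudioAggregateDeviceSubDeviceListKey) };
    const void *vals[3] = { CFSTR("com.example.builtin-aggregate"),
                            CFSTR("Built-In Aggregate"),
                            subDevices };
    CFDictionaryRef desc = CFDictionaryCreate(
        kCFAllocatorDefault, keys, vals, 3,
        &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

    AudioObjectID aggregate = kAudioObjectUnknown;
    OSStatus err = AudioHardwareCreateAggregateDevice(desc, &aggregate);
    CFRelease(desc);
    CFRelease(subDevices);
    if (err != noErr) {
        fprintf(stderr, "AudioHardwareCreateAggregateDevice failed: %d\n",
                (int)err);
        return kAudioObjectUnknown;
    }
    // A single AudioDeviceCreateIOProcID/AudioDeviceStart on the returned
    // aggregate now services all sub-devices from one IO thread.
    return aggregate;
}
```

The key point for the original question: the IOProc is installed on the aggregate, not on any individual sub-device, so the HAL hands the callback valid input and output AudioBufferLists for every sub-device on one thread.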

On Nov 5, 2009, at 8:02 AM, audioboy 77 wrote:

In my application (a pro-audio effect unit/synthesizer), I want to
show the Built-In devices as one physical device, with different
channels representing the Microphone, Input and Output.

To do this, I use the "related devices" property to find all
subdevices and group them together.

I then tried starting an IOProc on the first subdevice (which is the
microphone on my system) with the idea that I would read/write the
buffers for all the sub-devices in that callback.  However, I'm
getting mData that is always NULL.  I'm not sure if this is
expected, but I'm also not sure how to create the buffer.  (There is
a property to set the buffer size, but no property to get or set the
address.)

I also tried another approach.  I registered each sub-device with
the same callback, hoping that registering the proc might cause the
buffers to get created.  When doing this, I discovered that each
subdevice is running on a separate thread.  This is strange given
that they should be on the same clock, and seems wrong given that I
want the latency to be as low as possible.

I've been messing around with this on and off for a couple weeks
now, and have basically run out of ideas.  Could somebody please
explain the correct strategy on how to approach this?


--

Jeff Moore
Core Audio
Apple


