
Re: CoreAudio driver's buffer management


  • Subject: Re: CoreAudio driver's buffer management
  • From: Tommy Schell <email@hidden>
  • Date: Wed, 22 Dec 2004 16:20:25 -0700

I mean to say that I'd like to do a kernel-space CoreAudio driver.
My dilemma is that I already have a user-space FireWire driver which pulls audio (and video) off of FireWire.
I need to bring the audio over to CoreAudio, but I don't want to write a user-space CoreAudio driver, as there
is little documentation for it.
So I want to do a kernel-space CoreAudio driver, with the data provided not directly by the hardware, but by the
user-space FireWire driver, which talks to the hardware (via the FireWire device interface).
Is this feasible?
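
To make the shape of this a bit more concrete, here is the kind of thing I have in mind (only a sketch, not working code -- the class names, the memory-type constant, and the buffer size are made up, and all of the stream/format setup is left out). The idea is that the kernel engine owns the ring buffer, and a small user client maps that same buffer into the user-space FireWire driver's task so the FireWire code can write decoded audio straight into it:

// Sketch only: MyBridgeEngine / MyBridgeUserClient / kBridgeSampleBuffer are
// made-up names, and the usual OSDefineMetaClassAndStructors, error handling,
// and IOAudioStream/format setup are omitted.

#include <IOKit/audio/IOAudioEngine.h>
#include <IOKit/IOUserClient.h>
#include <IOKit/IOBufferMemoryDescriptor.h>

enum { kBridgeSampleBuffer = 0 };    // memory type passed to clientMemoryForType

class MyBridgeEngine : public IOAudioEngine
{
    OSDeclareDefaultStructors(MyBridgeEngine)

public:
    IOBufferMemoryDescriptor *sampleBufferDesc;    // the ring buffer, shared with user space

    virtual bool initHardware(IOService *provider)
    {
        if (!IOAudioEngine::initHardware(provider))
            return false;

        // Allocate the ring buffer in the kernel so the same memory can be
        // handed to IOAudioStream::setSampleBuffer() and mapped into the
        // user-space firewire driver's address space.
        sampleBufferDesc = IOBufferMemoryDescriptor::withOptions(
                               kIODirectionInOut, 16384 /* bytes, arbitrary */, PAGE_SIZE);

        // ... create the IOAudioStreams on this memory, set the sample rate,
        //     frames per buffer, formats, etc. (not shown)
        return (sampleBufferDesc != NULL);
    }
};

class MyBridgeUserClient : public IOUserClient
{
    OSDeclareDefaultStructors(MyBridgeUserClient)

public:
    MyBridgeEngine *engine;    // set up when the user client is created (not shown)

    // Let the user-space firewire driver map the engine's ring buffer into its
    // own task and write decoded audio into it directly.
    virtual IOReturn clientMemoryForType(UInt32 type, IOOptionBits *options,
                                         IOMemoryDescriptor **memory)
    {
        if (type != kBridgeSampleBuffer || !engine || !engine->sampleBufferDesc)
            return kIOReturnBadArgument;

        engine->sampleBufferDesc->retain();    // the caller releases it after mapping
        *memory = engine->sampleBufferDesc;
        return kIOReturnSuccess;
    }
};

On the user-space side I'd presumably open the user client with IOServiceOpen() and map the buffer with IOConnectMapMemory(); how the timing information gets back down is the part I'm least sure about.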


Also, I can't seem to get the PhantomAudioDriver to provide dummy data to the app. I write data into the input buffer
passed to setSampleBuffer, but neither the app nor the HAL ever asks for it.
How do I actually get audio data flowing up to the app?
I saw a reference in the documentation to startAudioEngine. The docs say this must be overridden in a subclass -- is that true?
PhantomAudioDriver doesn't override it.
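
In case it helps, this is roughly what I have been trying, based on my (possibly wrong) reading of the sample code: since there is no hardware interrupt, a timer has to stand in for the "buffer wrapped" interrupt and call takeTimeStamp(), and it looks like what gets overridden is performAudioEngineStart() rather than startAudioEngine() itself. Everything here is simplified, and the names other than the IOAudioEngine/IOTimerEventSource calls are made up:

// Sketch of a hardware-less engine: a timer pretends to be the "ring buffer
// wrapped" interrupt. MyPhantomEngine and usPerBuffer are made-up names; the
// timer is created with IOTimerEventSource::timerEventSource(this, timerFired)
// and added to the work loop in initHardware() (not shown).

#include <IOKit/audio/IOAudioEngine.h>
#include <IOKit/IOTimerEventSource.h>

class MyPhantomEngine : public IOAudioEngine
{
    OSDeclareDefaultStructors(MyPhantomEngine)

public:
    IOTimerEventSource *timer;
    UInt32              usPerBuffer;    // microseconds per trip around the ring buffer

    static void timerFired(OSObject *owner, IOTimerEventSource *sender)
    {
        MyPhantomEngine *engine = OSDynamicCast(MyPhantomEngine, owner);
        if (!engine)
            return;

        // Pretend the "hardware" just wrapped: take a time stamp so the HAL's
        // clock keeps running, then rearm the timer for the next wrap.
        engine->takeTimeStamp();
        sender->setTimeoutUS(engine->usPerBuffer);
    }

    virtual IOReturn performAudioEngineStart()
    {
        takeTimeStamp(false);               // initial stamp, don't bump the loop count
        timer->setTimeoutUS(usPerBuffer);   // schedule the first fake wrap
        return kIOReturnSuccess;
    }

    virtual IOReturn performAudioEngineStop()
    {
        timer->cancelTimeout();
        return kIOReturnSuccess;
    }

    virtual UInt32 getCurrentSampleFrame()
    {
        // Supposed to report how far through the buffer the "hardware" is;
        // a real version would derive this from the time since the last stamp.
        return 0;
    }
};

My expectation was that once an app records from the device, convertInputSamples() would start getting called with the data I wrote into the buffer, but I never see that happen.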


Thanks a lot,
Tommy Schell


On Dec 21, 2004, at 1:04 PM, email@hidden wrote:


Message: 1
Date: Mon, 20 Dec 2004 13:03:51 -0800
From: Jeff Moore <email@hidden>
Subject: Re: CoreAudio driver's buffer management
To: CoreAudio API <email@hidden>
Message-ID: <email@hidden>
Content-Type: text/plain; charset=US-ASCII; format=flowed

You'll need to be a bit more specific. If you are writing a user-space
driver, you don't have the IOAudio family's services available to you.
You are completely on your own with respect to buffer management. The
only constraints are the semantics that the HAL's API imposes.

On Dec 20, 2004, at 7:38 AM, Tommy Schell wrote:

Hi,

Suppose I have a CoreAudio driver which, instead of pulling data
directly from the hardware, receives (and sends) audio data via
a user-space FireWire driver. That is, incoming data comes from user
space down to the CoreAudio driver, and outgoing data goes from the
CoreAudio driver up to the user-space FireWire driver. Would that
negatively impact buffer creation or management?

And then how exactly is a CoreAudio driver's ring buffer managed? How
does the data get wrapped around?
Does it all happen automatically behind the scenes?
If the data came from user space, would the ring buffer be affected?
How would I trigger an interrupt when the ring buffer wraps around (to
take a time stamp), if the CoreAudio driver isn't dealing directly
with hardware?

Thanks a lot,
Tommy Schell

--

Jeff Moore
Core Audio
Apple
