Re: Datasources and digital audio output


  • Subject: Re: Datasources and digital audio output
  • From: Jeff Moore <email@hidden>
  • Date: Mon, 14 Nov 2005 12:23:46 -0800


On Nov 12, 2005, at 4:11 PM, Derk-Jan Hartman wrote:

On 11-nov-2005, at 20:55, Jeff Moore wrote:
On Nov 11, 2005, at 10:50 AM, Derk-Jan Hartman wrote:

I have a G5 now, so I can finally start work on supporting the digital out of the G5.
Can anyone explain the concept of "Datasources" for output devices?
I tried looking in the documentation, but couldn't find anything useful.

The data source control of an output provides a means of telling the hardware which destination, from a set of mutually exclusive choices (analog or digital in the G5's case), to send the data to. I'm not sure how to explain it more simply than that.
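[Editor's note: a minimal sketch, not from the original message, of reading and setting a device's data source selector. It assumes the AudioDeviceGetProperty/AudioDeviceSetProperty calls that were current when this thread was written; error handling is trimmed and the AudioDeviceID is assumed to come from elsewhere.]

#include <CoreAudio/CoreAudio.h>

// Read the currently selected data source (a four-char code published by the
// driver, e.g. one code for analog out and another for digital on a G5).
static OSStatus GetOutputDataSource(AudioDeviceID device, UInt32 *outDataSourceID)
{
    UInt32 size = sizeof(*outDataSourceID);
    return AudioDeviceGetProperty(device,
                                  0,        // master channel
                                  false,    // output section
                                  kAudioDevicePropertyDataSource,
                                  &size,
                                  outDataSourceID);
}

// Select a different data source, redirecting the output to that port.
static OSStatus SetOutputDataSource(AudioDeviceID device, UInt32 dataSourceID)
{
    return AudioDeviceSetProperty(device,
                                  NULL,     // apply immediately
                                  0,        // master channel
                                  false,    // output section
                                  kAudioDevicePropertyDataSource,
                                  sizeof(dataSourceID),
                                  &dataSourceID);
}

[The set of available choices can be enumerated with kAudioDevicePropertyDataSources and their names looked up with kAudioDevicePropertyDataSourceNameForIDCFString, if you want to build a selection UI around them.]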

So I should consider it as a "hint" to the driver about the destination output port of the audio? There is no implied meaning at the application level?

Other than it changes the physical port the audio is coming out of, no, it has no bearing on application functioning in general.


On the G5, the two data sources are presented to the user as two different devices in the Sound PrefPane.

Yes, but the Sound Prefs pane is presenting a greatly simplified view of the devices on the system. Its example should not, in general, be followed by applications that are implementing their own device selection UI.

Right, OK... But how else am I going to know in my application whether the user wants to make use of his digital output port?
I mean, selecting cac3 when you only have plain "Line Out" makes no sense, and will result in no audible output at all, be it stereo or encoded digital.

I'm no UI designer but if I had an app that wanted to allow the user to control whether the app put out AC-3 data to a digital port or to output regular linear PCM, then I'd just have a single checkbox that said "Encoded output" on my device selection UI. The checkbox would only be enabled if the currently selected device supports it.


Note that figuring out whether a given device has a digital port and whether it supports AC-3 are totally separate operations. There are cases where a device supports AC-3 but has no digital output port, and vice versa, where the device has a digital output port but doesn't support AC-3, in addition to the devices that support both or neither.
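[Editor's note: one illustrative way to do the AC-3 half of that check, not spelled out in the thread, is to walk the device's output streams and look for a physical format whose format ID is kAudioFormat60958AC3 ('cac3'). Detecting a digital port is a separate question (e.g. inspecting stream terminal types or the published data sources) and isn't shown here. Same era-specific API assumptions as the earlier sketch; error handling is minimal.]

#include <CoreAudio/CoreAudio.h>
#include <stdlib.h>

static Boolean DeviceSupportsAC3(AudioDeviceID device)
{
    UInt32 size = 0;
    Boolean supportsAC3 = false;

    // Fetch the device's output streams.
    if (AudioDeviceGetPropertyInfo(device, 0, false,
                                   kAudioDevicePropertyStreams, &size, NULL) != noErr)
        return false;
    UInt32 streamCount = size / sizeof(AudioStreamID);
    AudioStreamID *streams = (AudioStreamID *)malloc(size);
    if (AudioDeviceGetProperty(device, 0, false,
                               kAudioDevicePropertyStreams, &size, streams) != noErr) {
        free(streams);
        return false;
    }

    // Walk each stream's physical formats looking for 'cac3'.
    for (UInt32 i = 0; i < streamCount && !supportsAC3; ++i) {
        if (AudioStreamGetPropertyInfo(streams[i], 0,
                                       kAudioStreamPropertyPhysicalFormats,
                                       &size, NULL) != noErr)
            continue;
        UInt32 formatCount = size / sizeof(AudioStreamBasicDescription);
        AudioStreamBasicDescription *formats =
            (AudioStreamBasicDescription *)malloc(size);
        if (AudioStreamGetProperty(streams[i], 0,
                                   kAudioStreamPropertyPhysicalFormats,
                                   &size, formats) == noErr) {
            for (UInt32 j = 0; j < formatCount; ++j) {
                if (formats[j].mFormatID == kAudioFormat60958AC3) {
                    supportsAC3 = true;
                    break;
                }
            }
        }
        free(formats);
    }

    free(streams);
    return supportsAC3;
}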

Selecting one of them changes absolutely nothing in the setup of the device, from what I can see.

Incorrect. It has changed the value of the data source selector and has redirected the audio output to the chosen port. I think some of our built-in hardware knows whether or not something is actually plugged into the optical jack, so it may still get sent to the analog output when there isn't anything plugged in. I forget and I don't have a G5 in front of me to check.

I believe the G5 is one of those.

In both modes, all the LPCM and cac3 stream formats are available. So should I see this as some sort of "preference"? What are these "datasources" an abstraction of?

You are making connections where none are implied. The data source and the available formats don't necessarily have anything to do with each other.

So if there is no connection, then I am unable to automate the selection of the digital format from within an application? My only option is providing an application-specific preference that says "Output encoded audio", and having the user select this if he wants to use it? I had anticipated that a user could select this system-wide in one form or another, but apparently that's not the case.

When you enable AC-3 output, you also have to hog the device so that no other app can use it (if they did, they would corrupt the output and defeat the decoder). Consequently, the decision to fully support encoded output, by definition, implies that it is done on a per-application basis and makes no sense on a system-wide basis. If you support it, you have to have UI that allows your user to tell your app to use it or not, so you can do the right thing as far as being a proper HAL client in this area goes.
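[Editor's note: a minimal sketch, not from the original post, of taking and releasing hog mode via kAudioDevicePropertyHogMode, which holds the pid of the owning process or -1 when the device is free. Same API-era assumptions as the sketches above.]

#include <CoreAudio/CoreAudio.h>
#include <unistd.h>

// Request exclusive access by writing our own pid into the hog mode property.
static OSStatus TakeHogMode(AudioDeviceID device)
{
    pid_t owner = getpid();
    return AudioDeviceSetProperty(device, NULL, 0, false,
                                  kAudioDevicePropertyHogMode,
                                  sizeof(owner), &owner);
}

// Release exclusive access; -1 means "no owner".
static OSStatus ReleaseHogMode(AudioDeviceID device)
{
    pid_t nobody = -1;
    return AudioDeviceSetProperty(device, NULL, 0, false,
                                  kAudioDevicePropertyHogMode,
                                  sizeof(nobody), &nobody);
}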


Speaking of which, when dealing with hog mode and encoded output, the polite thing for an application to do is to save the device's state prior to taking hog mode and changing the format, and to restore that state when you are done.
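[Editor's note: a sketch of that "be polite" pattern under the same assumptions as the earlier examples: remember the stream's physical format before hogging and switching to 'cac3', and put it back afterwards. The struct and function names are illustrative only.]

#include <CoreAudio/CoreAudio.h>

typedef struct {
    AudioStreamBasicDescription savedFormat;
} SavedDeviceState;

// Capture the stream's current physical format before changing it.
static OSStatus SaveFormat(AudioStreamID stream, SavedDeviceState *state)
{
    UInt32 size = sizeof(state->savedFormat);
    return AudioStreamGetProperty(stream, 0,
                                  kAudioStreamPropertyPhysicalFormat,
                                  &size, &state->savedFormat);
}

// Put the saved format back once encoded playback is finished
// (and before releasing hog mode, so other clients see the device as it was).
static OSStatus RestoreFormat(AudioStreamID stream, const SavedDeviceState *state)
{
    return AudioStreamSetProperty(stream, NULL, 0,
                                  kAudioStreamPropertyPhysicalFormat,
                                  sizeof(state->savedFormat),
                                  &state->savedFormat);
}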


--

Jeff Moore
Core Audio
Apple

