Re: Cannot change device stream format
- Subject: Re: Cannot change device stream format
- From: Brian Willoughby <email@hidden>
- Date: Mon, 04 Jan 2016 10:31:54 -0800
On Jan 3, 2016, at 11:18 PM, Yue Wang <email@hidden> wrote:
> Thanks Brian! This sounds very reasonable to me. What about the volume setting? Will that affect the result (since multiplying by a floating point number may narrow the dynamic range)?
It depends upon where the volume setting is implemented. If your code does the multiply, then it's not bit perfect. If the audio driver does a multiply, it's also not bit perfect. But if a custom driver talks to the hardware in such a way as to attenuate the signal in the analog domain, then it would be bit perfect. I am not aware of a driver that does this.
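To illustrate the narrowing that Yue mentioned, here's a toy example in plain C (nothing CoreAudio-specific, just arithmetic): once a fractional gain is applied and the result is re-quantized, two different source samples can land on the same output value, so the original bits are gone and the stream is no longer bit perfect.

/* Toy illustration: apply a 0.5 gain (about -6 dB) to two adjacent 16-bit
 * samples and re-quantize.  Both collapse to the same output value, so the
 * attenuated stream cannot be bit perfect with respect to the source. */
#include <math.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    float gain = 0.5f;                       /* a typical volume setting   */
    int16_t a = 100, b = 101;                /* two distinct source samples */

    int16_t outA = (int16_t)lrintf((float)a * gain);   /* 50               */
    int16_t outB = (int16_t)lrintf((float)b * gain);   /* also 50: 50.5 rounds to even */

    printf("a -> %d, b -> %d\n", outA, outB);           /* a -> 50, b -> 50 */
    return 0;
}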
I switched from the oldest MobileIO interface to one of the newest in order to take advantage of the digitally controlled analog pots that they added (in the ULN-8 and LIO-8). Unfortunately, the Metric Halo Labs FireWire audio driver does not present this volume control to CoreAudio, so I must use their software (or the physical knob / encoder) to control the volume setting. All of the Apple stuff is set to full volume, 0 dB attenuation.
> There are also rumors that adjusting iTunes' (or Foobar2000's) volume to max will give a bit perfect result. I have no idea if that's the case.
For iTunes, it's not a rumor. I've confirmed bit perfect performance. However, you must manually set the interface to match the file's sample rate before starting iTunes to avoid Sample Rate Conversion, and then the volume must be set to max with all sound processing options turned off. iTunes has a bad habit of fixing its output sample rate, so if you change the interface's sample rate without restarting iTunes then there will be SRC. Even at max volume, SRC will not be bit perfect. Newer versions of iTunes may perform better with regard to SRC, but this bug was prevalent for many major releases of iTunes, so I operate on the assumption that it's still an issue.
> I have another curious question, though. Can I trust the AudioOutputUnit to do all of that for me automatically (i.e., set the AudioDevice's physical format to match the file format, hog the device and disable mixing, set the stream format of the AudioOutputUnit to the file format, send raw PCM file data to the AudioOutputUnit, and expect the AudioOutputUnit to translate 16-bit integer to 32-bit float and back to 16-bit int without loss of precision before sending it to the device)? If that path is "bit perfect", then no extra work is needed. Unlike Android's audio stack, the AudioOutputUnit is a black box to me, so I have no knowledge of its behavior.
The AudioOutputUnit will not automatically change the physical device. Apple's philosophy is that the hardware is a shared resource that is potentially used by multiple pieces of software, and they've decreed that the user should set the sample rate, not the software. If you ignore Apple's guidelines and set the sample rate in software, you run the risk that another piece of software will also try setting the sample rate, and then the two (or more) pieces of software will fight each other to set a different sample rate. I've witnessed this happening, and it makes audio impossible!
If you want to automate this process, you'll need to find the physical device that the AudioOutputUnit is connected to and then send it messages to determine the available sample rates and to select the desired sample rate. This is certainly possible, and Logic Studio Pro does it when loading a Song that was authored at a particular sample rate. I've written CoreAudio code to do this in the past. If you implement such a thing, I recommend adding a Preference setting to turn off automatic sample rate changes, and having it default to Off until your end user purposely enables it.
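Roughly, and from memory (untested, error handling omitted, helper name is just for illustration), the HAL calls for an AUHAL output unit look something like this:

/* Sketch: find the device behind an AUHAL output unit, list its available
 * nominal sample rates, and request one of them. */
#include <AudioUnit/AudioUnit.h>
#include <CoreAudio/CoreAudio.h>
#include <stdio.h>
#include <stdlib.h>

static void SetDeviceRateForUnit(AudioUnit outputUnit, Float64 desiredRate)
{
    /* Which physical device is this output unit attached to? */
    AudioDeviceID device = kAudioObjectUnknown;
    UInt32 size = sizeof(device);
    AudioUnitGetProperty(outputUnit, kAudioOutputUnitProperty_CurrentDevice,
                         kAudioUnitScope_Global, 0, &device, &size);

    /* Ask the device which nominal sample rates it supports. */
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyAvailableNominalSampleRates,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    AudioObjectGetPropertyDataSize(device, &addr, 0, NULL, &size);
    AudioValueRange *ranges = (AudioValueRange *)malloc(size);
    AudioObjectGetPropertyData(device, &addr, 0, NULL, &size, ranges);
    UInt32 count = size / sizeof(AudioValueRange);
    for (UInt32 i = 0; i < count; i++)
        printf("supported: %g - %g Hz\n", ranges[i].mMinimum, ranges[i].mMaximum);
    free(ranges);

    /* Request the new rate.  The change is asynchronous, so listen for
       kAudioDevicePropertyNominalSampleRate notifications before trusting it. */
    addr.mSelector = kAudioDevicePropertyNominalSampleRate;
    AudioObjectSetPropertyData(device, &addr, 0, NULL,
                               sizeof(desiredRate), &desiredRate);
}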
It's particularly bad for audio software to change the physical sample rate when using the Default Audio Output as set in AudioMIDISetup, because that output is shared by far too many audio applications. Instead, if you add Preferences to your application that allow the user to select a different audio output than the default, then it makes more sense to directly control the sample rate. It's not really that hard to query the list of available audio devices, present a popup, and save the selection between runs of your application. Users who are interested in bit perfect audio are more likely to have their AudioMIDISetup default pointing to a different audio device for system and default output, because this keeps all other software from mixing unwanted audio into the bit perfect stream. If all professional audio applications had their own audio device selections, it would make bit perfect audio much easier.
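The enumeration itself is just a couple of property queries on the system object. Again a rough, untested sketch (the function name is only for illustration):

/* Sketch: enumerate the system's audio devices and print their names,
 * the raw material for a device-selection popup. */
#include <CoreAudio/CoreAudio.h>
#include <CoreFoundation/CoreFoundation.h>
#include <stdio.h>
#include <stdlib.h>

static void ListAudioDevices(void)
{
    AudioObjectPropertyAddress addr = {
        kAudioHardwarePropertyDevices,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    UInt32 size = 0;
    AudioObjectGetPropertyDataSize(kAudioObjectSystemObject, &addr, 0, NULL, &size);
    AudioDeviceID *devices = (AudioDeviceID *)malloc(size);
    AudioObjectGetPropertyData(kAudioObjectSystemObject, &addr, 0, NULL, &size, devices);
    UInt32 count = size / sizeof(AudioDeviceID);

    AudioObjectPropertyAddress nameAddr = {
        kAudioObjectPropertyName,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    for (UInt32 i = 0; i < count; i++) {
        CFStringRef name = NULL;
        UInt32 nameSize = sizeof(name);
        AudioObjectGetPropertyData(devices[i], &nameAddr, 0, NULL, &nameSize, &name);
        char buf[256] = "(unnamed)";
        if (name) {
            CFStringGetCString(name, buf, sizeof(buf), kCFStringEncodingUTF8);
            CFRelease(name);
        }
        printf("AudioDeviceID %u: %s\n", (unsigned)devices[i], buf);
    }
    free(devices);
}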
You can also request Hog mode when you attach to an audio interface, but I've had trouble where this still doesn't prevent other software from accessing the physical audio interface. Perhaps those were bugs in my code, or bugs in an older release of OSX. It's still a good idea to use Hog mode any time you want bit perfect audio, because as soon as two or more application audio streams are mixed, neither will be bit perfect any more.
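For completeness, requesting Hog mode is a single property write on the device, and some devices also expose a mixing switch. Another untested sketch, no error checking, helper name made up:

/* Sketch: request exclusive (hog) access to a device and, where the device
 * supports it, ask the HAL not to mix other clients into the stream. */
#include <CoreAudio/CoreAudio.h>
#include <unistd.h>

static void HogDevice(AudioDeviceID device)
{
    /* Hog mode: the property holds the pid of the owning process, or -1 if
       the device is free.  Writing our own pid requests exclusive access. */
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyHogMode,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    pid_t owner = getpid();
    AudioObjectSetPropertyData(device, &addr, 0, NULL, sizeof(owner), &owner);

    /* Optional: some devices expose a mixing switch; writing 0 asks the HAL
       not to mix.  Not every device has this property. */
    addr.mSelector = kAudioDevicePropertySupportsMixing;
    UInt32 mix = 0;
    if (AudioObjectHasProperty(device, &addr))
        AudioObjectSetPropertyData(device, &addr, 0, NULL, sizeof(mix), &mix);
}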
Brian