
Re: Logical lpcm format.


  • Subject: Re: Logical lpcm format.
  • From: Jeff Moore <email@hidden>
  • Date: Tue, 17 Feb 2004 12:43:02 -0800

The driver does the conversion from the physical format to the virtual format. It can be bypassed by taking hog mode (kAudioDevicePropertyHogMode) on the device and turning off mixing (kAudioDevicePropertySupportsMixing). That said, not every device supports hog mode or turning mixing off, and there are devices that don't support 16 bit samples at all, providing only 8 bit samples, fully packed 24 bit samples, or a host of other interesting integer sample formats. So if you're writing an app for other people to use, you'll probably need to deal with floats anyway.

Not to mention, the amount of CPU involved in converting from int to float isn't really all that much. It's way down in the noise in most of the profiling I've done, especially if the app is careful to disable the streams it isn't using. Plus, floating point numbers have other virtues that have nothing to do with quality. For instance, if you do signal processing with integers, you'll burn away the savings from skipping the float conversion on headroom management and the other hassles of fixed-point math.
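To see why the conversion is down in the noise, here is a minimal sketch (not from this thread) of the kind of scaling a 16-bit-int-to-float converter performs: one multiply per sample. The function name and the choice of a 1/32768 scale factor are illustrative assumptions, not the HAL's actual implementation.

```c
#include <stdint.h>
#include <stddef.h>

/* Convert 16 bit signed integer samples to 32 bit floats in [-1.0, 1.0).
 * One multiply per sample is all the "conversion cost" amounts to, which
 * is why it barely registers in profiles. */
void int16_to_float(const int16_t *in, float *out, size_t n)
{
    const float scale = 1.0f / 32768.0f;
    for (size_t i = 0; i < n; i++)
        out[i] = (float)in[i] * scale;
}
```

Note also that once the samples are floats, headroom management largely disappears: intermediate results can exceed full scale without wrapping, which is the fixed-point hassle Jeff alludes to.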

But if you are still determined to forge ahead with native mode in the HAL, then please make sure that your application restores the device to the state it found it in when it quits.
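The take-the-device-and-restore-it pattern Jeff describes might look something like the following sketch, using the HAL property API that was current in 2004 (AudioDeviceGetProperty / AudioDeviceSetProperty, later deprecated in favor of AudioObjectGetPropertyData). This is an untested, macOS-only illustration with error handling elided; the function names, globals, and the assumption that the AudioDeviceID was obtained elsewhere are all mine, not Jeff's.

```c
#include <CoreAudio/CoreAudio.h>
#include <unistd.h>

static pid_t  gOldHogPID;   /* previous hog-mode owner (-1 means the device was free) */
static UInt32 gOldMixing;   /* previous mixing setting, so we can put it back */

void TakeDeviceNative(AudioDeviceID device)
{
    UInt32 size = sizeof(gOldHogPID);
    /* Remember who (if anyone) owned the device before us. */
    AudioDeviceGetProperty(device, 0, true, kAudioDevicePropertyHogMode,
                           &size, &gOldHogPID);

    /* Take hog mode by setting the property to our own process ID. */
    pid_t pid = getpid();
    AudioDeviceSetProperty(device, NULL, 0, true, kAudioDevicePropertyHogMode,
                           sizeof(pid), &pid);

    /* Remember the mixing setting, then turn mixing off. */
    size = sizeof(gOldMixing);
    AudioDeviceGetProperty(device, 0, true, kAudioDevicePropertySupportsMixing,
                           &size, &gOldMixing);
    UInt32 mix = 0;
    AudioDeviceSetProperty(device, NULL, 0, true,
                           kAudioDevicePropertySupportsMixing,
                           sizeof(mix), &mix);
}

void RestoreDevice(AudioDeviceID device)
{
    /* Put the device back the way we found it, as Jeff asks. */
    AudioDeviceSetProperty(device, NULL, 0, true,
                           kAudioDevicePropertySupportsMixing,
                           sizeof(gOldMixing), &gOldMixing);
    pid_t pid = -1;  /* -1 releases hog mode */
    AudioDeviceSetProperty(device, NULL, 0, true, kAudioDevicePropertyHogMode,
                           sizeof(pid), &pid);
}
```

Calling RestoreDevice from the app's quit path (and ideally from a crash/signal handler) is what keeps other applications from finding the device hogged and unmixable after the app exits.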

On Feb 17, 2004, at 10:46 AM, Zachary Drew wrote:

The physical stream format of my line-in audio hardware is 16 bit signed
int lpcm. The HAL converts this to 32 bit float lpcm. My question:

Does the HAL use the CPU to perform this conversion? I'm writing a very
performance-sensitive application which is trying to get 16 bit signed
int samples from the line-in source, and I would like to avoid needless
conversions. Is there a way to bypass this conversion and get 16 bit
signed int format from the HAL? Is there any reason this shouldn't be done
(the captured samples will never be saved as audio data in my
application, so quality is not an issue)?

Thanks,

-Zach

--

Jeff Moore
Core Audio
Apple
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.

References: 
  • Logical lpcm format. (From: Zachary Drew <email@hidden>)
