Re: Which channels are used for what?


  • Subject: Re: Which channels are used for what?
  • From: Steve Checkoway <email@hidden>
  • Date: Tue, 23 May 2006 16:21:22 -0700

William Stewart wrote:
> Yes, both are variable-sized structs.
Good. I'm handling this correctly then.
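For reference, the pattern I'm using is roughly the following sketch: compute the size, then calloc. I'm showing it with AudioChannelLayout (the same idiom works for AudioBufferList); the helper name is mine, not from any header.

#include <CoreAudio/CoreAudio.h>
#include <stddef.h>
#include <stdlib.h>

/* AudioChannelLayout declares a one-element mChannelDescriptions array;
 * a layout describing N channels extends past the nominal struct size,
 * so it has to be allocated at a computed size. calloc also leaves the
 * tag at kAudioChannelLayoutTag_UseChannelDescriptions (0). */
static AudioChannelLayout *AllocateChannelLayout(UInt32 numDescriptions)
{
    size_t size = offsetof(AudioChannelLayout, mChannelDescriptions) +
                  numDescriptions * sizeof(AudioChannelDescription);
    AudioChannelLayout *layout = (AudioChannelLayout *)calloc(1, size);
    if (layout != NULL)
        layout->mNumberChannelDescriptions = numDescriptions;
    return layout; /* caller releases with free() */
}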

> BTW, the output unit handles all of this stuff for you.
I understand; that's what you said in your previous e-mail. You also said that the output unit does not provide as much information as the HAL. I really have three requirements:

1. I need to play audio with as little latency as possible.
2. I need to survive device format changes, similarly to how iTunes does when you change the device sample rate in Audio MIDI Setup.
3. I need to know when "now" is, as accurately as possible, multiple times per second and at each IOProc callback.

Using the output unit, it seems like I can easily get the first two (assuming its overhead doesn't introduce much latency). Talking directly to the HAL, I have working code that handles the first and third, at least on Tiger; on earlier systems there's nothing I can do to work around the timing bug I sent so much mail to the list about.
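For what it's worth, my HAL-side handling of the third requirement boils down to something like this sketch (the helper name is mine). AudioDeviceGetCurrentTime only succeeds while the device is running, and the IOProc additionally receives "now" as its inNow argument:

#include <CoreAudio/CoreAudio.h>
#include <stdio.h>
#include <string.h>

/* Ask the HAL where "now" falls on a device's timeline. */
static void PrintDeviceNow(AudioDeviceID device)
{
    AudioTimeStamp now;
    memset(&now, 0, sizeof(now));
    if (AudioDeviceGetCurrentTime(device, &now) == noErr &&
        (now.mFlags & kAudioTimeStampSampleTimeValid))
        printf("sample time %.0f at host time %llu\n",
               now.mSampleTime, (unsigned long long)now.mHostTime);
}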

Is there any documentation that answers the questions I posed? I'm willing to do the work to handle this correctly. If no such documentation exists and I had any other audio device to test with, I could poke and prod at it myself; but with just the built-in device in my G5 (I suppose I could try my G4 as well, though I suspect it behaves roughly the same), there's not a lot to go on. I have two channels and one stream, so the mapping is clear.
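The kind of probing I mean is along these lines; a sketch (helper name mine) that asks a device which of its channels it prefers for stereo:

#include <CoreAudio/CoreAudio.h>
#include <stdio.h>

/* Ask a device which of its channels it prefers for left and right. */
static void PrintStereoChannels(AudioDeviceID device)
{
    UInt32 channels[2];
    UInt32 size = sizeof(channels);
    OSStatus err = AudioDeviceGetProperty(device, 0, false,
        kAudioDevicePropertyPreferredChannelsForStereo, &size, channels);
    if (err == noErr)
        printf("preferred stereo channels: L=%u R=%u\n",
               (unsigned)channels[0], (unsigned)channels[1]);
}

On my hardware that only confirms what I already know; with one stream and two channels, there isn't much else to discover.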

> This is the right one.
Thank you. When I get home, I'll file a documentation bug on the other one.

> You only need to provide packet descriptions if the format is variable
> (either mBytesPerPacket or mFramesPerPacket of the ASBD would be zero).
> You won't see this for a linear PCM format; examples would be MP3 or AAC,
> or the variable-bit-rate version of AC-3 (which is rare). Even compressed
> formats that are constant bit rate (both of these fields are non-zero;
> that is, each packet has the same number of bytes, representing the same
> number of frames) do not require packet descriptions, because they are
> constant. (For example, IMA4 is a constant-bit-rate compressed format.)
Okay, this is good to know. I will file a documentation bug asking for this to be included as well.
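In code, that rule reduces to a one-line check on the ASBD; a sketch, with the helper name being mine:

#include <CoreAudio/CoreAudio.h>
#include <stdbool.h>

/* Packet descriptions are only needed for variable formats, i.e. when
 * either packet field of the ASBD is zero. Linear PCM and CBR codecs
 * such as IMA4 have both fields non-zero. */
static bool NeedsPacketDescriptions(const AudioStreamBasicDescription *asbd)
{
    return asbd->mBytesPerPacket == 0 || asbd->mFramesPerPacket == 0;
}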

I appreciate all the time you've spent helping me resolve my problems and answering my questions.

- Steve


