
Re: Ring buffer design for MP3 (et. al.) playback


  • Subject: Re: Ring buffer design for MP3 (et. al.) playback
  • From: "Andreas Falkenhahn" <email@hidden>
  • Date: Wed, 17 Oct 2007 13:56:05 +0200

On 16.10.2007 at 11:35 William Stewart wrote:

>I'd have a look at using the ScheduledSlice Player AU - we have an
>example for using this (MixMash) in the Leopard Developer tools. There
>is some documentation on the properties associated with this AU in
>AudioUnitProperties.h

Unfortunately, I don't belong to the inner circle that has access to the
Leopard developer tools, so I can't have a look at the MixMash example,
which would probably answer all of the questions below.

From the documentation in AudioUnitProperties.h alone, it is very hard
for me to use the ScheduledSlice Player AU, because it seems to follow a
totally different concept from my current implementation.

Currently, I'm using the stereo mixer, which is connected to the
default output unit. To start playback on a mixer bus, I install a
render callback on that bus. The render callback then uses an audio
converter to convert the source data to the output unit's stream
format.
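
For reference, the relevant part of my current setup looks roughly
like this (MyRenderProc, mixerUnit and myPlayerState are placeholder
names, error checking omitted):

#include <AudioUnit/AudioUnit.h>

// Render callback feeding one mixer bus; the actual MP3 decoding
// and AudioConverter work happens elsewhere (not shown).
static OSStatus MyRenderProc(void *inRefCon,
                             AudioUnitRenderActionFlags *ioActionFlags,
                             const AudioTimeStamp *inTimeStamp,
                             UInt32 inBusNumber,
                             UInt32 inNumberFrames,
                             AudioBufferList *ioData)
{
    // ... fill ioData with converted PCM here ...
    return noErr;
}

// ... later, during setup: install the callback on mixer input bus 0
AURenderCallbackStruct cb = { MyRenderProc, myPlayerState };
AudioUnitSetProperty(mixerUnit,
                     kAudioUnitProperty_SetRenderCallback,
                     kAudioUnitScope_Input,
                     0,                       // mixer input bus
                     &cb, sizeof(cb));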

Now, I don't see how I could fit the ScheduledSlice Player AU into this
scheme, because it does not seem to use a render callback at
all. Instead, it expects audio data to be passed via a call to

AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduleAudioSlice, ...)
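
Judging from the declarations in AudioUnitProperties.h alone, I would
guess that scheduling a single buffer looks roughly like this (just my
reading of the header; playerUnit, bufferList and numFrames are
placeholders):

ScheduledAudioSlice slice;
memset(&slice, 0, sizeof(slice));
slice.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
slice.mTimeStamp.mSampleTime = 0;      // relative to the start time
slice.mNumberFrames = numFrames;       // frames in bufferList
slice.mBufferList = bufferList;        // PCM data to play
// NB: the slice apparently has to stay valid until its completion
// proc fires, so it can't live on the stack in real code.
AudioUnitSetProperty(playerUnit,
                     kAudioUnitProperty_ScheduleAudioSlice,
                     kAudioUnitScope_Global, 0,
                     &slice, sizeof(slice));

// Playback seems to begin only once a start time is set:
AudioTimeStamp startTime;
memset(&startTime, 0, sizeof(startTime));
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = -1;            // -1 = start immediately
AudioUnitSetProperty(playerUnit,
                     kAudioUnitProperty_ScheduleStartTimeStamp,
                     kAudioUnitScope_Global, 0,
                     &startTime, sizeof(startTime));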

Maybe you could outline how the ScheduledSlice Player AU, the
StereoMixer AU, and the Default Output Unit are meant to be used in
combination. I really can't see how to bring these three together.

Should I connect the ScheduledSlice Player AU to the StereoMixer and
then the StereoMixer to the Default Output Unit? But this still does
not solve the problem that the ScheduledSlice Player AU expects
audio data through AudioUnitSetProperty() while the mixer pulls audio
data from a render callback. It's quite confusing to me.
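
If chaining them is indeed the intended design, I would guess the
graph setup goes something like this (a sketch using the AUGraph API,
error handling omitted):

#include <AudioToolbox/AudioToolbox.h>

AUGraph graph;
AUNode playerNode, mixerNode, outputNode;
ComponentDescription desc = { 0 };
desc.componentManufacturer = kAudioUnitManufacturer_Apple;

NewAUGraph(&graph);

desc.componentType    = kAudioUnitType_Generator;
desc.componentSubType = kAudioUnitSubType_ScheduledSoundPlayer;
AUGraphNewNode(graph, &desc, 0, NULL, &playerNode);

desc.componentType    = kAudioUnitType_Mixer;
desc.componentSubType = kAudioUnitSubType_StereoMixer;
AUGraphNewNode(graph, &desc, 0, NULL, &mixerNode);

desc.componentType    = kAudioUnitType_Output;
desc.componentSubType = kAudioUnitSubType_DefaultOutput;
AUGraphNewNode(graph, &desc, 0, NULL, &outputNode);

// player out 0 -> mixer in 0, mixer out 0 -> output in 0
AUGraphConnectNodeInput(graph, playerNode, 0, mixerNode, 0);
AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

AUGraphOpen(graph);
AUGraphInitialize(graph);
AUGraphStart(graph);

(If that's right, I suppose the mixer then pulls from the player
unit's output, and the scheduled slices simply replace my render
callback as the data source for that bus. But I'd appreciate
confirmation.)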

Thanks for any enlightenment on this issue!

Andreas


References:
  • Ring buffer design for MP3 (et. al.) playback (From: "Andreas Falkenhahn" <email@hidden>)
  • Re: Ring buffer design for MP3 (et. al.) playback (From: "Andreas Falkenhahn" <email@hidden>)
  • Re: Ring buffer design for MP3 (et. al.) playback (From: "Andreas Falkenhahn" <email@hidden>)
  • Re: Ring buffer design for MP3 (et. al.) playback (From: William Stewart <email@hidden>)
