Re: Core Audio beginner questions


  • Subject: Re: Core Audio beginner questions
  • From: Chris Reed <email@hidden>
  • Date: Sun, 22 Jun 2003 12:59:32 -0500

On Sunday, Jun 22, 2003, at 11:33 US/Central, David Scrève wrote:

Hello,

I'm trying to port my sound API to Core Audio, and I have several
questions that are not answered by the manual (which is also very hard to
find in the developer documentation pages):
* First of all, I would like to be able to play several buffered
files at the same time. My code is responsible for filling the buffers and
reading from the files, but the files all have different sound formats (number
of channels, bytes per sample, sample rate). Do I have to create one
AudioUnit per file, and can I have several AudioUnits with the same or
different sound formats? Also, am I forced to send floating-point values to
Core Audio?

You can create AudioConverters to convert the sound file formats to the canonical 32-bit float format used in CoreAudio.
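To make the "canonical format" point concrete, here is a minimal sketch of the kind of conversion an AudioConverter performs when going from 16-bit integer PCM to 32-bit floats in [-1.0, 1.0]. This is an illustration of the math only, not the real AudioConverter API (the function name here is made up):

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative only: the sample-format conversion an AudioConverter
   would do for you, 16-bit signed PCM -> canonical 32-bit float.
   Full-scale negative (-32768) maps to -1.0. */
static void int16_to_float32(const int16_t *in, float *out, size_t frames)
{
    for (size_t i = 0; i < frames; i++)
        out[i] = in[i] / 32768.0f;
}
```

In practice you would let AudioConverter handle this (plus sample-rate and channel-count conversion) rather than writing it by hand.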

I'd use a single output AU and manually mix the converted audio file data.
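A manual mix of the converted streams can be as simple as summing corresponding samples and clamping to [-1.0, 1.0]. A sketch (hard clipping chosen for brevity; real code might scale or soft-limit instead):

```c
#include <stddef.h>

/* Sketch: additively mix nstreams float buffers into one output
   buffer, hard-clipping so samples stay in [-1.0, 1.0]. */
static void mix_streams(const float **streams, size_t nstreams,
                        float *out, size_t frames)
{
    for (size_t i = 0; i < frames; i++) {
        float sum = 0.0f;
        for (size_t s = 0; s < nstreams; s++)
            sum += streams[s][i];
        if (sum > 1.0f)  sum = 1.0f;
        if (sum < -1.0f) sum = -1.0f;
        out[i] = sum;
    }
}
```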

There are definitely other ways to do it--a lot is personal preference.


* I saw severals sources that are using the AudioDevice directly.
What's the best way for my problem : Audio Device or AudioUnit ?

It's very much up to you. If you are doing fairly simple streaming, all you will probably ever need is handled by the output AUs. But if you are doing input as well, you'll need to work with AudioDevices.
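The "simple streaming" case with an output AU is a pull model: the output side invokes a callback you registered whenever it needs more frames. A rough sketch of that shape (the names and signature here are simplified stand-ins, not the real AURenderCallback signature):

```c
#include <stddef.h>

/* Simplified pull-model sketch: the output side calls a user-supplied
   render callback when it needs frames. Names are illustrative. */
typedef void (*render_cb)(void *user, float *buf, size_t frames);

/* Example callback: real code would fill buf with mixed file data;
   this one just writes silence. */
static void fill_silence(void *user, float *buf, size_t frames)
{
    (void)user;
    for (size_t i = 0; i < frames; i++)
        buf[i] = 0.0f;
}

/* Stand-in for the output AU's I/O thread asking for audio. */
static void output_pull(render_cb cb, void *user, float *buf, size_t frames)
{
    cb(user, buf, frames);
}
```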

On the other hand, if you are using MTCoreAudio for instance, using AudioDevices is just as easy as (or easier than) using AUs.


* Can I change the size of the buffers I send to Core Audio, and can I
know the current playback position in each buffer? (I also need to know
when a buffer begins playing.)


Not sure exactly what you mean here...

-chris
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.

  • Follow-Ups:
    • RE : Core Audio beginner questions
      • From: David Scrève <email@hidden>
  • References:
    • Core Audio beginner questions
      • From: David Scrève <email@hidden>
