
Re: Audiofile buffering


  • Subject: Re: Audiofile buffering
  • From: Philippe Wicker <email@hidden>
  • Date: Mon, 7 Jul 2003 21:24:46 +0200

On Monday, July 7, 2003, at 06:11 PM, Mark's Studio wrote:

I need some advice on how to implement buffering for an audiofile.

The app is a basic Cocoa soundfile editor in ObjC.

I use the AudioFile API to open the file and convert the samples to Float32, and then I just copy the whole thing into an NSData.

There are two things that need access to the data: the waveform display, and playback of the file.

How do I implement buffering of the sound file so I don't need to load it all at once?


You can find an example of reading an audio file while playing it in the AU SDK sample code CoreAudio/Services/PlayAudioFile (C++), or in PlayBufferSoundFile (Obj-C) written by Kurt Revis (www.snoize.com).

In both cases, the idea is to pre-read reasonably large chunks of the audio file in a high-priority thread (let's call it the feeder thread), convert these data to the native AU format (Float32), and then consume data from these chunks in your IOProc code (which executes in another high-priority thread, the rendering thread). Assume you just want to read your file from the first sample to the last "linearly" (that is, without looping). Assume also that the file chunk size is 32K frames (just to give a number). And last, assume that at a given time your program is in the following state (sketched in code right after this list):
- chunk n and n+1 are loaded in memory,
- the current IOProc (or AURenderCallback) requests audio data located at the very beginning of the chunk n+1,
- previous IOProcs have read data from chunk n (if you prefer, chunk n is older than n+1).
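
Just to fix ideas, that state could look something like this (hypothetical names, sketched with modern C++ atomics; this is not code taken from the samples above):

#include <atomic>
#include <vector>
#include <CoreAudio/CoreAudioTypes.h>      // Float32, SInt64, UInt32

// Hypothetical sketch of the double-buffer ("ping-pong") state.
const UInt32 kChunkFrames = 32 * 1024;     // 32K frames per chunk, as in the example above

struct ChunkBuffer {
    std::vector<Float32> samples;          // kChunkFrames samples (mono), already converted to Float32
    SInt64               firstFrame;       // position of this chunk in the file
    std::atomic<bool>    ready { false };  // set by the feeder thread once the chunk is loaded
};

struct FileStreamer {
    ChunkBuffer          chunks[2];        // chunk n and chunk n+1
    SInt64               playFrame { 0 };  // current read position, touched only by the render thread
    std::atomic<SInt64>  refillRequest { -1 };  // file position the feeder thread should load next,
                                                // written by the render callback, read by the feeder thread
};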

Because you are reading the file "linearly", you can assume that you won't need any more data from chunk n. On the other hand, you will soon need data from chunk n+2 (in about 0.74 sec if your audio is sampled at 44.1 kHz). So what you need to do is to "send" - from the render callback - a request to the feeder thread to read the data for chunk n+2. Because you no longer need the data from chunk n, your feeder thread can replace the chunk n data with the chunk n+2 data.

When the next render callback is called, your code determines that the requested data are not located at the very beginning of a chunk, and therefore just does the rendering without emitting any request to the feeder thread.

0.74 sec later, one of your render callbacks will request data located at the very beginning of chunk n+2. This is the same state as above, and your code will send the feeder thread a request to read the data for chunk n+3, which will replace chunk n+1 in memory. Your 2 buffers play ping-pong: while you are rendering from one of them, your feeder thread is reading the other from the disk.
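
To make that flow concrete, a render callback along those lines could look like this (the AURenderCallback signature is the real one from AudioUnit.framework; everything built on the FileStreamer/ChunkBuffer sketch above is hypothetical, mono, and assumes the chunk size is a multiple of the render slice size):

#include <AudioUnit/AudioUnit.h>
#include <atomic>
#include <cstring>

// Sketch: copy audio out of the in-memory chunks and, exactly when we enter a
// new chunk, post a non-blocking request for the following chunk, which will
// overwrite the oldest one. No locks, no allocation, no file I/O in this thread.
static OSStatus RenderFromChunks(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    FileStreamer *s = static_cast<FileStreamer *>(inRefCon);
    SInt64 playFrame = s->playFrame;

    // Ping-pong: even-numbered chunks live in chunks[0], odd-numbered in chunks[1].
    ChunkBuffer &cur = s->chunks[(playFrame / kChunkFrames) & 1];

    if (playFrame == cur.firstFrame) {
        // We have just entered this chunk: ask the feeder thread (without blocking)
        // to load the next chunk into the other buffer.
        s->refillRequest.store(cur.firstFrame + kChunkFrames, std::memory_order_release);
    }

    Float32 *out = static_cast<Float32 *>(ioData->mBuffers[0].mData);
    UInt32 offset = static_cast<UInt32>(playFrame - cur.firstFrame);

    if (cur.ready.load(std::memory_order_acquire) && offset + inNumberFrames <= kChunkFrames) {
        memcpy(out, cur.samples.data() + offset, inNumberFrames * sizeof(Float32));
    } else {
        memset(out, 0, inNumberFrames * sizeof(Float32));   // underrun: output silence
    }

    s->playFrame = playFrame + inNumberFrames;
    return noErr;
}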

The key point here is that rendering and loading from disk are executed in 2 different threads, both having real-time requirements (especially the rendering thread). It means that you must **not** use any communication mechanism between the rendering and feeder threads that could - potentially - block the IOProc. You must also set scheduling and priority properties on your feeder thread in order to guarantee that it will not be preempted when you move a window, launch another app, etc. The code for this is available in both samples mentioned earlier.
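
On Mac OS X, one way to get that kind of guarantee is the Mach time-constraint scheduling policy. A rough sketch (the period/computation/constraint numbers below are placeholders that would have to be tuned for the real chunk size):

#include <mach/mach.h>
#include <mach/thread_policy.h>
#include <mach/mach_time.h>
#include <cstdint>

// Sketch: put the calling thread (the feeder thread) into the Mach
// time-constraint scheduling class so ordinary UI/app activity does not
// starve it. Numbers are placeholders, not tuned values.
static bool MakeCurrentThreadTimeConstrained()
{
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);

    // Convert milliseconds to Mach absolute-time units.
    const double msToAbs = 1e6 * (double)tb.denom / (double)tb.numer;

    thread_time_constraint_policy_data_t policy;
    policy.period      = (uint32_t)(100.0 * msToAbs);  // expect to run roughly every 100 ms
    policy.computation = (uint32_t)(5.0   * msToAbs);  // needing ~5 ms of CPU each time
    policy.constraint  = (uint32_t)(20.0  * msToAbs);  // delivered within a 20 ms window
    policy.preemptible = true;

    kern_return_t kr = thread_policy_set(mach_thread_self(),
                                         THREAD_TIME_CONSTRAINT_POLICY,
                                         (thread_policy_t)&policy,
                                         THREAD_TIME_CONSTRAINT_POLICY_COUNT);
    return kr == KERN_SUCCESS;
}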

In your particular application, you may also need to access audio data from the UI thread (to draw the waveform). I don't know what data you want to display and when. The worst case is when data are being rendered, say, at the beginning of the file, while the user wants to examine - with a strong zoom - audio located somewhere else. The data loaded in memory for rendering are not the data you need for waveform drawing. BTW, I say "with a strong zoom" because this is the only situation where you need the actual audio data; otherwise, you can use a pre-computed waveform overview. Anyway, you are in a situation where there are 2 readers of the same file in the same process. Additionally, one of the "readers" a priori needs to access the data randomly: you cannot predict where the user will want to look.
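
For what it's worth, such an overview is usually nothing more than one (min, max) pair per fixed-size bucket of samples, computed once when the file is opened. A hypothetical sketch:

#include <vector>
#include <algorithm>
#include <CoreAudio/CoreAudioTypes.h>

// Sketch: reduce a mono Float32 signal to one (min, max) pair per bucket of
// framesPerBucket frames. Drawing the waveform then only touches this small
// array, except when the user zooms in far enough that one pixel covers fewer
// frames than a bucket - only then do you need the actual audio data.
struct OverviewPoint { Float32 minValue; Float32 maxValue; };

std::vector<OverviewPoint> BuildOverview(const Float32 *samples,
                                         size_t frameCount,
                                         size_t framesPerBucket)
{
    std::vector<OverviewPoint> overview;
    overview.reserve(frameCount / framesPerBucket + 1);

    for (size_t start = 0; start < frameCount; start += framesPerBucket) {
        size_t end = std::min(start + framesPerBucket, frameCount);
        OverviewPoint p = { samples[start], samples[start] };
        for (size_t i = start + 1; i < end; ++i) {
            p.minValue = std::min(p.minValue, samples[i]);
            p.maxValue = std::max(p.maxValue, samples[i]);
        }
        overview.push_back(p);
    }
    return overview;
}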

My memory fails me here: I don't remember if reading the same file from 2 threads is thread-safe. I think it isn't, but I'm not sure. Maybe someone else on the list could clarify this point. If it isn't thread-safe, requests coming from the render thread and requests coming from the UI thread must be put into some kind of queue and executed in a third thread (in this case the feeder thread). Of course, the queuing mechanism must not be potentially blocking. If it is thread-safe, then the solution is fairly simple: read the file in the UI thread and it's OK.
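
If you do need such a queue, one non-blocking way to build it is a small fixed-size single-producer/single-consumer ring buffer per requesting thread (one for the render thread, one for the UI thread), which the feeder thread drains. A sketch with hypothetical names:

#include <atomic>
#include <cstddef>
#include <CoreAudio/CoreAudioTypes.h>

// Sketch: lock-free single-producer / single-consumer queue of read requests.
// The producer (render thread or UI thread) never blocks and never allocates;
// the consumer is the feeder thread, which does the actual file reads.
struct ReadRequest {
    SInt64 firstFrame;   // where to start reading in the file
    UInt32 frameCount;   // how many frames to read
};

template <size_t Capacity>
class RequestQueue {
public:
    bool push(const ReadRequest &r) {            // called by exactly one producer thread
        size_t head = mHead.load(std::memory_order_relaxed);
        size_t next = (head + 1) % Capacity;
        if (next == mTail.load(std::memory_order_acquire))
            return false;                        // queue full: drop, never block
        mSlots[head] = r;
        mHead.store(next, std::memory_order_release);
        return true;
    }
    bool pop(ReadRequest &r) {                   // called by the feeder thread only
        size_t tail = mTail.load(std::memory_order_relaxed);
        if (tail == mHead.load(std::memory_order_acquire))
            return false;                        // empty
        r = mSlots[tail];
        mTail.store((tail + 1) % Capacity, std::memory_order_release);
        return true;
    }
private:
    ReadRequest          mSlots[Capacity];
    std::atomic<size_t>  mHead { 0 };            // next slot to write
    std::atomic<size_t>  mTail { 0 };            // next slot to read
};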

Hope these few words will give you, if not the whole solution, at least the beginning of one.

Regards,

Philippe Wicker
email@hidden
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.

  • Follow-Ups:
    • Re: Audiofile buffering
      • From: "Mark's Studio" <email@hidden>
  • References:
    • Audiofile buffering
      • From: "Mark's Studio" <email@hidden>
