Re: .mov into AudioFilePlayer Unit?
- Subject: Re: .mov into AudioFilePlayer Unit?
- From: Seth Willits <email@hidden>
- Date: Tue, 13 Nov 2012 12:34:02 -0800
First off: thank you for your book. I haven't read it all, or even most of it, but I jumped around as needed, and the AudioUnit section in particular was immensely helpful. Naturally I had to piece some things together and do a little experimenting, but I figured it out.
So right now I'm taking multiple .aif files -> AudioFilePlayers -> mixer -> [more units] -> generic output, looping through AudioUnitRender (hopefully correctly - this is the one part your book didn't cover*) to pull AudioBufferLists out. It then took me 45 minutes of searching and fiddling to figure out that turning an ABL into a CMSampleBuffer is easily done with CMSampleBufferSetDataBufferFromAudioBufferList, and from there I'm using an AVAssetWriter to mix this audio with video into a file.
(The AudioFilePlayer unit yields LPCM which goes through the graph. It does the conversion automatically.)
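In case it's useful to anyone else, here's roughly what that ABL-to-CMSampleBuffer-to-writer step looks like. This is only a sketch of my own reconstruction, not my exact code: asbd, abl, numFrames, startSampleTime, and writerInput are placeholders for my own setup, and the writer's session is assumed to have already started.

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch only: wrap one rendered AudioBufferList in a CMSampleBuffer and
// append it to an audio AVAssetWriterInput that is ready for more media data.
static BOOL AppendABLToWriterInput(const AudioStreamBasicDescription *asbd,
                                   const AudioBufferList *abl,
                                   UInt32 numFrames,
                                   Float64 startSampleTime,
                                   AVAssetWriterInput *writerInput)
{
    CMAudioFormatDescriptionRef fmt = NULL;
    if (CMAudioFormatDescriptionCreate(kCFAllocatorDefault, asbd, 0, NULL, 0, NULL,
                                       NULL, &fmt) != noErr)
        return NO;

    // One timing entry: per-frame duration plus a presentation time derived
    // from the running sample count.
    CMSampleTimingInfo timing;
    timing.duration = CMTimeMake(1, (int32_t)asbd->mSampleRate);
    timing.presentationTimeStamp = CMTimeMake((int64_t)startSampleTime, (int32_t)asbd->mSampleRate);
    timing.decodeTimeStamp = kCMTimeInvalid;

    CMSampleBufferRef sbuf = NULL;
    OSStatus err = CMSampleBufferCreate(kCFAllocatorDefault, NULL, NO, NULL, NULL,
                                        fmt, numFrames, 1, &timing, 0, NULL, &sbuf);
    if (err == noErr) {
        // Copies the ABL's bytes into a block buffer owned by the sample buffer.
        err = CMSampleBufferSetDataBufferFromAudioBufferList(sbuf, kCFAllocatorDefault,
                                                             kCFAllocatorDefault, 0, abl);
    }

    BOOL ok = (err == noErr) && [writerInput appendSampleBuffer:sbuf];
    if (sbuf) CFRelease(sbuf);
    CFRelease(fmt);
    return ok;
}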
I have multiple test projects in parallel, and one of them does use AVAssetReaderTrackOutput to get CMSampleBufferRefs, which I could copy into a destination; but I wanted to process them through a graph, so I went straight to your book and started with the AudioFile API.
But as you say (and as I briefly alluded to in my first post), I could take those LPCM sample buffers, get ABLs back out of them, and provide them via the callback to the AUGraph. To do that, though, I'm pretty sure I'd need to buffer them: the graph will ask me for some "arbitrary" amount of data, while I can only get data in chunks of whatever size AVF decides to hand me, so I'd have to read ahead some amount, copy the data, figure out where to cut it, hand that back to the callback, and so on. Those are all things I'm sure I could figure out in some number of hours, but I really don't want to. With an mp4/aif/etc file via AudioFile, it's simple configuration rather than actual work.
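Just to illustrate what I mean, the read-ahead path would look something like this. Again, only a sketch; output is an AVAssetReaderTrackOutput already configured for interleaved LPCM, and EnqueuePCM is a stand-in for whatever FIFO or ring buffer the render callback would drain (I haven't written that part, which is exactly the work I'd rather avoid).

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Hypothetical FIFO the render callback would later drain; not shown here.
extern void EnqueuePCM(const AudioBufferList *abl, UInt32 frames);

// Sketch only: pull LPCM sample buffers off an AVAssetReaderTrackOutput and
// turn each one back into an AudioBufferList. Assumes the outputSettings
// requested interleaved LPCM, so a single-buffer ABL is enough.
void DrainReaderOutput(AVAssetReaderTrackOutput *output)
{
    CMSampleBufferRef sbuf;
    while ((sbuf = [output copyNextSampleBuffer]) != NULL) {
        AudioBufferList abl;
        CMBlockBufferRef block = NULL;
        OSStatus err = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            sbuf, NULL, &abl, sizeof(abl), NULL, NULL,
            kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment, &block);
        if (err == noErr) {
            // The ABL points into the retained block buffer, so EnqueuePCM must
            // copy the data out before we release it.
            EnqueuePCM(&abl, (UInt32)CMSampleBufferGetNumSamples(sbuf));
            CFRelease(block);
        }
        CFRelease(sbuf);
    }
}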
*That is, your book doesn't cover driving a graph by looping and calling AudioUnitRender() yourself. The only examples I could find, in your book or anywhere else on the net, show AudioUnitRender() being called inside a render callback, where you're already handed the action flags, timestamp, etc., which just get passed straight through to AudioUnitRender. It may be trivial, but it wasn't really clear to me how to create my own timestamp for the render call, which flags are actually necessary (0, it seems), how many frames to ask for at a time, how to know the maximum frame count (especially if there's a timepitch unit involved), how to increment the timestamp, or what happens if you ask for frames after the input/source/generator units run out of data. I believe I figured it out, but I'm not 100% confident.
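For the record, here's the shape of the loop I ended up with. Treat it as a sketch under my own assumptions (generic output unit at the end of the graph, interleaved 32-bit float stereo, known total frame count), not as something blessed.

#include <AudioToolbox/AudioToolbox.h>
#include <stdlib.h>

// Sketch only: drive the graph offline by calling AudioUnitRender() on the
// generic output unit in a loop. framesPerSlice must not exceed the graph's
// kAudioUnitProperty_MaximumFramesPerSlice.
static OSStatus PullGraph(AudioUnit outputUnit, SInt64 totalFrames)
{
    const UInt32 framesPerSlice = 4096;
    const UInt32 channels = 2;

    AudioBufferList abl;
    abl.mNumberBuffers = 1;                    // one interleaved buffer
    abl.mBuffers[0].mNumberChannels = channels;
    abl.mBuffers[0].mDataByteSize = framesPerSlice * channels * sizeof(Float32);
    abl.mBuffers[0].mData = malloc(abl.mBuffers[0].mDataByteSize);

    // Only the sample time needs to be valid; start at 0 and advance it by the
    // number of frames pulled on each pass.
    AudioTimeStamp ts = {0};
    ts.mFlags = kAudioTimeStampSampleTimeValid;
    ts.mSampleTime = 0;

    OSStatus err = noErr;
    for (SInt64 done = 0; done < totalFrames && err == noErr; done += framesPerSlice) {
        SInt64 remaining = totalFrames - done;
        UInt32 frames = (remaining < framesPerSlice) ? (UInt32)remaining : framesPerSlice;

        AudioUnitRenderActionFlags flags = 0;  // no flags appear to be needed here
        abl.mBuffers[0].mDataByteSize = frames * channels * sizeof(Float32);

        err = AudioUnitRender(outputUnit, &flags, &ts, 0, frames, &abl);
        if (err == noErr) {
            // ... wrap abl in a CMSampleBuffer and append to the writer input ...
            ts.mSampleTime += frames;
        }
    }

    free(abl.mBuffers[0].mData);
    return err;
}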
--
Seth Willits
On Nov 13, 2012, at 11:41 AM, Chris Adamson wrote:
> If you don't mind mixing in some AV Foundation with your Core Audio, consider an AVAssetReader. Set it up to give you an AVAssetReaderTrackOutput from the file's audio track, then call that class' copyNextSampleBuffer() to get CMSampleBufferRefs. Core Media then has some convenience functions to get the sample data in a Core Audio format (either as an AudioBufferList, or separate calls to get the packet descriptions and the data buffer).
>
> Actually, you have a bigger problem to deal with: your AAC data can't go into an AUGraph without first being converted to LPCM. I think you can have AVAssetReader do this conversion for you, by setting the outputSettings of the AVAssetReaderTrackOutput to some AUGraph-friendly LPCM format. Assuming that works, you'd be receiving LPCM in the CMSampleBufferRefs, and could then just fetch that data as AudioBufferLists, which should be what you need in your AUGraph (particularly if you're doing this in an AURenderCallback… usual caveats about slow/indeterminate operations in render callbacks, yada yada).
>
> Hope this helps. Heck, I just hope it's somewhat correct. Grain of salt: this is pretty speculative.
>
> --Chris
>
> On Nov 13, 2012, at 2:14 PM, Seth Willits <email@hidden> wrote:
>
>>
>> I have some .mov files with nothing but a single AAC track in them that I need to pipe through an AUGraph. I figured out all of the AUGraph stuff (*pats myself on the back*), but I was using AudioFileOpenURL with aif files to get an AudioFileID to use with kAudioUnitSubType_AudioFilePlayer units, and now I see that AudioFileOpenURL doesn't open mov files, even though they're nearly identical to mp4 files.
>>
>> At this point, one reasonable option I see is to transcode the .movs to .mp4s using passthrough; the files are relatively small, so it'd be quick. But is there another way that's less work? I don't want to have to read the samples from the file myself, buffer them, and hand them off in a callback to an AudioUnit, etc.
>>
>>
>> --
>> Seth Willits
>