Sample code and questions about Audio File Services
- Subject: Sample code and questions about Audio File Services
- From: Bjorn Roche <email@hidden>
- Date: Tue, 11 Mar 2008 19:48:18 -0400
Hey there,
I've been researching options for reading audio files in a more
flexible way in my app. Right now I have my own library that reads
most AIFF/C and WAV files, and I'd like to add support for SD2 as well
as support for some more metadata such as regions that are defined
inside the files. I've looked into libsndfile and libaudiofile which
do not currently meet my needs but are at least portable and well
documented. I suspect I could add support for regions pretty easily.
Apple's Audio File Services appears to meet my needs but is not
portable, and the documentation (at least the version here: http://developer.apple.com/documentation/MusicAudio/Reference/AudioFileConvertRef/Reference/reference.html)
leaves me with many concerns. Some of them were addressed by
reading the header file, but I'm wondering what other people's
experiences are: were they able to get everything working easily,
including metadata? Is there some sample code I'm missing?
Many thanks!
bjorn
(For the curious, or perhaps the brave ones who want to answer my
questions, I've attached my irate, ranty complaints about the docs
below. Apologies for their irritated tone. I was... irritated.)
I've worked a lot with audio files and I found this document difficult
to follow. For example,
*) The AudioFileRegion contains markers defining the start and end.
The docs state:
"Typically, a region consists of at least two markers designating the
beginning and end of the segment. Other markers might define
additional meta information such as sync point."
Hmmm. That's pretty vague. "Typically... at least two...". It really
needs to ALWAYS have at least two: a region has a start and an end. As
for the rest of it, yes, there could be a sync point or other stuff as
well as a start and an end, but how do we know what's what? The
authors of the library may intend the extreme values to be the start
and end, or the first to be the start and the second to be the end,
but how do we know? (There's no real reason a sync point couldn't be
outside the bounds of a region -- I've never seen this implemented in
software, but it could be). This desperately needs clarification.
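In case it helps anyone hitting the same ambiguity: the workaround I'd reach for is to not trust marker order at all, and just take the extreme frame positions as the region's bounds. A sketch, using simplified stand-in structs rather than the real AudioFileRegion/AudioFileMarker (the real ones live in AudioFile.h and carry more fields, such as mName and mSMPTETime):

```c
#include <float.h>

/* Hypothetical, trimmed-down mirrors of CoreAudio's marker/region
 * structs, for illustration only. */
typedef struct {
    double mFramePosition;
} Marker;

typedef struct {
    unsigned mNumberMarkers;
    const Marker *mMarkers;
} Region;

/* Defensive heuristic: since the docs don't say which marker is which,
 * treat the smallest frame position as the region start, the largest
 * as the end, and anything in between as auxiliary (e.g. a sync
 * point). Returns 0 on success, -1 if there are fewer than two
 * markers. */
static int region_bounds(const Region *r, double *start, double *end)
{
    if (r->mNumberMarkers < 2)
        return -1;
    double lo = DBL_MAX, hi = -DBL_MAX;
    for (unsigned i = 0; i < r->mNumberMarkers; i++) {
        double p = r->mMarkers[i].mFramePosition;
        if (p < lo) lo = p;
        if (p > hi) hi = p;
    }
    *start = lo;
    *end = hi;
    return 0;
}
```

Of course, this heuristic would pick the wrong bounds if a sync point really did fall outside the region, which is exactly why the docs need to spell out the convention.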
*) When defining, say, markers to save to a new file, the docs say
nothing of what values have to be defined. Does the user need to
define SMPTETime and FramePosition, or is just one sufficient? Many
parts of the docs have holes like this.
*) The documentation for mFrameType simply states "The marker type."
What are the possible values? Many parts of the docs need further
explanation to be usable. (perhaps this is connected to the previous
question...)
*) Many things are missing. Where, for example, is
AudioFileGetProperty()? I eventually had to find it in the
AudioFile.h header. With this document alone it is impossible to open
and read an audio file, because you need this function to find out
what format the file's data is in.
bjorn
-----------------------------
Bjorn Roche
XO Wave
Digital Audio Production and Post-Production Software
http://www.xowave.com
http://blog.bjornroche.com
http://myspace.com/xowave
Coreaudio-api mailing list (email@hidden)