
Re: Beginner question


  • Subject: Re: Beginner question
  • From: Alexander Dvorak <email@hidden>
  • Date: Wed, 15 Sep 2004 23:23:25 -0400

> Chris and Others,
>
> Thank you for your continued help. The process is starting to become clearer to me.

>> Essentially, you want to take the data you get in inInputBuffer, and
>> put it somewhere until you're done recording. If you have enough
>> memory, you could just keep appending the data to a memory buffer,
>> until the user presses stop.

> So writing the information from the InputBuffer to a memory buffer would be done in the IOProc? What variable type would the memory buffer be? What kind of code would append the incoming data to a memory buffer?
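
As an illustrative sketch only (the MyRecorder struct and the MyIOProc name
are made up here, and a real IOProc would normally preallocate its storage
rather than call realloc on the I/O thread), appending each incoming chunk
to a growable memory buffer could look something like this:

#include <CoreAudio/CoreAudio.h>
#include <stdlib.h>
#include <string.h>

/* The "memory buffer": a growable block of Float32 samples. */
typedef struct {
    float  *samples;
    size_t  bytesUsed;
    size_t  bytesAllocated;
} MyRecorder;

static OSStatus MyIOProc(AudioDeviceID inDevice,
                         const AudioTimeStamp *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp *inInputTime,
                         AudioBufferList *outOutputData,
                         const AudioTimeStamp *inOutputTime,
                         void *inClientData)
{
    MyRecorder *rec = (MyRecorder *)inClientData;
    UInt32 i;

    for (i = 0; i < inInputData->mNumberBuffers; i++) {
        const AudioBuffer *buf = &inInputData->mBuffers[i];

        /* Grow the buffer if this chunk won't fit.  (Allocating on the
           I/O thread is not ideal; preallocating a big buffer up front
           is safer in practice.) */
        if (rec->bytesUsed + buf->mDataByteSize > rec->bytesAllocated) {
            size_t newSize = rec->bytesAllocated ? rec->bytesAllocated * 2 : 1 << 20;
            while (newSize < rec->bytesUsed + buf->mDataByteSize)
                newSize *= 2;
            rec->samples = realloc(rec->samples, newSize);
            rec->bytesAllocated = newSize;
        }

        /* Append the incoming samples to the end of the buffer. */
        memcpy((char *)rec->samples + rec->bytesUsed, buf->mData, buf->mDataByteSize);
        rec->bytesUsed += buf->mDataByteSize;
    }

    return noErr;
}

The recorder struct reaches the IOProc through the last argument of
AudioDeviceAddIOProc(), which is the same pointer parameter discussed
further down.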


>> Once stop is pressed, you can take your
>> memory buffer and write it out to a file.

> So, I will end up with a buffer filled with AIFF-formatted sound? If
> this is the case, how can I have this saved to a file? I know how to
> get the user to choose a file (using the save panel methods in Cocoa).
> How would I (1) get the program to stop recording when the user presses
> the stop button, (2) bring up a save panel, and (3) save this AIFF
> buffer to a file? I think, to answer my own question, I would (1)
> link a button to a "stop" action method. In this method I would call
> AudioDeviceStop() and (2) call up a save panel (to get the name and
> location of the new file). It is the third step that throws me. How
> would I save the AIFF buffer (if that is what the IOProc gives me) as
> an AIFF file?

Saving the buffer, as I said before, is up to you. If you've never read up on file I/O before, then you might want to read one of the many samples available online. I think that fopen() and fwrite() and fclose() from the C standard library are what you are looking for. Combine that with some information about the AIFF file format and you should have exactly what you need.
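
To make the three steps concrete, here is a rough sketch of a Cocoa "stop"
action, reusing the hypothetical MyRecorder/MyIOProc names from the sketch
above and assuming the controller keeps the AudioDeviceID and the recorder
around as instance variables. Note that the fwrite() call below only dumps
the raw Float32 samples; a real AIFF file needs a FORM container with COMM
and SSND chunks (and big-endian integer sample data) written in front of
the samples, which is where reading up on the AIFF format comes in.

/* Somewhere in the controller's @implementation (sketch only). */
- (IBAction)stop:(id)sender
{
    /* (1) Stop the device and unregister the IOProc. */
    AudioDeviceStop(device, MyIOProc);
    AudioDeviceRemoveIOProc(device, MyIOProc);

    /* (2) Ask the user where to save. */
    NSSavePanel *panel = [NSSavePanel savePanel];
    [panel setRequiredFileType:@"aiff"];
    if ([panel runModal] != NSOKButton)
        return;

    /* (3) Write the captured samples with plain C file I/O.  This writes
       only the raw samples -- the AIFF header still has to be written
       first for other applications to be able to open the file. */
    FILE *fp = fopen([[panel filename] fileSystemRepresentation], "wb");
    if (fp != NULL) {
        fwrite(recorder.samples, 1, recorder.bytesUsed, fp);
        fclose(fp);
    }
}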

> What do you mean by "multiplexed data structure", and how would this impact
> what I am trying to do?

Multiplexed data is just audio stored in a left, right, left, right fashion. So you'll see a bunch of numbers that represent sample values for the left channel and right channel, one after another. For example, if you were receiving sound *only* on the left channel, you'd see something like this:

0.7, 0.0, 0.6, 0.0, 0.7, 0.0, 0.9, 0.0, ...

Does that clear things up?
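
For instance (assuming 32-bit float samples laid out left, right, left,
right, as above), pulling just the left channel out of an interleaved
stereo buffer is a matter of reading every other value:

#include <stddef.h>

/* Copy the left channel out of an interleaved stereo buffer:
   samples are laid out L, R, L, R, ..., so the left channel
   lives at the even indices. */
static void ExtractLeftChannel(const float *interleaved, float *left, size_t frameCount)
{
    size_t i;
    for (i = 0; i < frameCount; i++)
        left[i] = interleaved[2 * i];
}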

>> I'm not sure that writing the data to disk during the IOProc is
>> necessary, given what you described you want to do. You can pass
>> a pointer along to the IOProc that would allow you to send Cocoa
>> messages to the object that set up the IOProc in the first place.

> You lost me here.

The IOProc has a pointer parameter at the end of its parameter list. When you first tell the device about the IOProc, you can provide a pointer that will be passed back to the IOProc on every call. If you pass a pointer to 'self', then you can send messages to the object that registered the IOProc in the first place.
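
As a variation on the earlier sketch (again with made-up names, and
assuming the controller implements some -appendInputData: method to do the
actual buffering), registering 'self' as that pointer looks roughly like
this:

/* Registering the IOProc: the last argument comes back, untouched,
   as inClientData on every callback. */
- (void)startRecording
{
    AudioDeviceAddIOProc(device, MyIOProc, (void *)self);
    AudioDeviceStart(device, MyIOProc);
}

static OSStatus MyIOProc(AudioDeviceID inDevice,
                         const AudioTimeStamp *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp *inInputTime,
                         AudioBufferList *outOutputData,
                         const AudioTimeStamp *inOutputTime,
                         void *inClientData)
{
    /* Cast the client data back to the object that registered the proc
       and send it an ordinary Cocoa message (the -appendInputData:
       method is assumed to exist on the controller). */
    id controller = (id)inClientData;
    [controller appendInputData:inInputData];
    return noErr;
}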

Unfortunately, if you have not done much C programming that has
followed the same style as this (passing pointers along to callbacks),
then there is a little bit of a learning curve ahead of you.

> MTCoreAudio actually looks more difficult than Core Audio! Also, when
> I tried compiling the Test... example, there were 33 errors. I think I
> will stick to trying to figure out Core Audio! Thank you for the
> suggestion, though!

I'm not sure why you got errors, but you might want to verify that the framework is installed in the right place and that your Xcode project points to the right framework location. The example should compile with no errors at all. If you can follow that example, then you'll be able to do a lot with CoreAudio.

Another example that mixes Cocoa and plain CoreAudio can be found here:

http://www.audiosynth.com/sinewavedemo.html

There's a lot of information to wade through out there, and I'm sure
you will get it soon enough with enough examples.

Cheers,

Chris
http://www.supermegaultragroovy.com

