
Playing wave file data


  • Subject: Playing wave file data
  • From: B&L <email@hidden>
  • Date: Wed, 15 Oct 2003 21:44:23 +0200

Hello

I have figured out how to load a wave file and then play the data from it, using the AudioFileReadBytes() and AudioDeviceAddIOProc() functions. The file is mono, 16 bits per sample. This all builds on the Sinewave demo example code written by James McCartney.
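
Roughly, my setup looks like this (a stripped-down sketch; gWaveData, gWaveBytes and gPlayPos are just placeholder names for my own globals, not anything from the Sinewave demo):

#include <CoreAudio/CoreAudio.h>
#include <AudioToolbox/AudioFile.h>

static void   *gWaveData;   /* raw 16-bit mono samples read with AudioFileReadBytes() */
static UInt32  gWaveBytes;  /* number of bytes in gWaveData */
static UInt32  gPlayPos;    /* current read position, in bytes */

/* IOProc registered with AudioDeviceAddIOProc(); the HAL calls this
   whenever it needs another buffer of output samples. */
static OSStatus MyIOProc(AudioDeviceID          inDevice,
                         const AudioTimeStamp  *inNow,
                         const AudioBufferList *inInputData,
                         const AudioTimeStamp  *inInputTime,
                         AudioBufferList       *outOutputData,
                         const AudioTimeStamp  *inOutputTime,
                         void                  *inClientData)
{
    /* The default output device hands me interleaved Float32 stereo in
       outOutputData->mBuffers[0].mData; I fill it from gWaveData here. */
    return noErr;
}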

My problem is that the sound is awful! So there seems to be something I don't understand about playing wave data, and I have looked around for hints on how to do it.

I have also tried Apple's latest example "PlayAudioFileLite", which complains that the constant "kAudioConverterDecompressionMagicCookie" does not exist! I just removed that part from the code, and it compiles (with warnings about the precompiled headers in CarbonSound not having the same date!) and plays the AIFF file fine. But it fails with a segmentation fault when I try to play my wave file! Both iTunes and QuickTime play the file fine.

Well, back to the way I try to play the file.

Here is how I interpret the data:

The data is in little-endian format, so I read one byte (a) and then the other (b). The b byte is shifted up 8 positions and the a byte goes into the low 8 bits.

Then I convert it to a signed int with some byte handling, remembering that 0x8000 is the same as zero. Everything below that is inverted and then negated; everything above it is interpreted as a positive value.
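
If I understand it correctly, 16-bit PCM wave data is plain two's complement, so the sign handling should boil down to a cast (just a sketch; byteStream and i are placeholder names for my own variables):

UInt8  a = byteStream[i];        /* low byte  */
UInt8  b = byteStream[i + 1];    /* high byte */
UInt16 raw = (UInt16)((b << 8) | a);

/* 16-bit PCM samples in a wave file are two's complement, so
   reinterpreting the bits as SInt16 gives the signed value directly
   (-32768 .. 32767). */
SInt16 sample = (SInt16)raw;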

Since the output should be a float between -1 and 1, I multiply the data by 0.00000000001; anything higher made it sound like my speakers would blow :-)

This float is then written to the output buffer, for both left and right. I have also tried writing a 0 to one or the other of the sound channels.
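
In code, the scaling and writing step is roughly this (a sketch; 1.0/32768.0 is the conventional full-scale divisor for 16-bit samples, and outBuffer and frame are placeholder names for the IOProc's Float32 output buffer and the current frame index):

/* Convert the 16-bit sample to Float32 and duplicate it into both
   channels of the interleaved stereo output buffer. */
Float32 scaled = (Float32)sample / 32768.0f;

Float32 *outBuffer = (Float32 *)outOutputData->mBuffers[0].mData;
outBuffer[2 * frame]     = scaled;   /* left  */
outBuffer[2 * frame + 1] = scaled;   /* right */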

Whatever I do, the sound is awful. I can recognize the sound, but there is a lot of noise in there as well!

Does anybody have some hints as to what I am doing wrong?

As you can probably gather from the above, I am quite the newbie at sound programming, but you have to start somewhere :-)

I am not quite sure whether I should be using the AudioConverter toolbox or AudioUnits for playing and converting the wave file. That seems to be the way Apple is doing it in the example above.

BTW I am using OS X 10.2.8 (second edition) and PB 2.0.1.

Thanks for any help.

/Brian
_______________________________________________
coreaudio-api mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/coreaudio-api
Do not post admin requests to the list. They will be ignored.
