Re: Audio Streaming
- Subject: Re: Audio Streaming
- From: "Marco Papa" <email@hidden>
- Date: Mon, 28 Jul 2008 00:40:53 -0700
Date: Fri, 25 Jul 2008 16:46:09 -0700
From: Jens Alfke <email@hidden>
Subject: Re: Audio Streaming
To: Erik Aigner <email@hidden>
Cc: email@hidden
On 25 Jul '08, at 2:34 PM, Erik Aigner wrote:
[...]
> Is there an EASY! way to solve this Problem?
> (QTMovie doesn't work in this case btw)
If it's being streamed over HTTP, you can use QTMovie. Otherwise, you
have to roll your own solution using NSURLConnection + AudioFileStream
+ AudioQueue. It's definitely not easy :-(
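For reference, the rough shape of that roll-your-own pipeline, using the real AudioToolbox entry points but with all error handling, formats, buffer management, and callback bodies elided (an outline to show the wiring, not working code — the callback and context names are placeholders):

```c
// Outline only: the AudioToolbox calls are real; everything else is elided.

// 1. Open a parser for the incoming byte stream.
AudioFileStreamID stream;
AudioFileStreamOpen(myContext, MyPropertyListener, MyPacketsProc,
                    kAudioFileMP3Type, &stream);

// 2. As NSURLConnection delivers data (connection:didReceiveData:),
//    feed the raw bytes to the parser.
AudioFileStreamParseBytes(stream, byteCount, bytes, 0);

// 3. When the property listener sees
//    kAudioFileStreamProperty_ReadyToProducePackets, create the output
//    queue from the stream format the parser discovered.
AudioQueueRef queue;
AudioQueueNewOutput(&format, MyOutputCallback, myContext,
                    NULL, NULL, 0, &queue);

// 4. In MyPacketsProc, copy parsed packets into an AudioQueueBuffer,
//    enqueue it, and start playback.
AudioQueueEnqueueBuffer(queue, buffer, packetCount, packetDescs);
AudioQueueStart(queue, NULL);
```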
It would be nice if Apple could put out some sample code showing how
to do this, since people keep asking about it. I have my own code, but
it's not well-factored enough to give out standalone.
—Jens
I just finished version 1.0 of an iPhone application to be submitted to the App Store in the next couple of weeks and I totally agree with Jens: IT IS NOT EASY.
Our app uses CFNetwork + AudioFileStream + Audio Toolbox queues. The additional problem when you must handle "streaming" audio is that 99% of the MP3 streams out there use "Shoutcast metadata", which is interspersed between the audio frames at a fixed interval — fixed per stream, but different from one stream to the next. You end up writing code (like circular queues) that strips off the metadata (mostly song title, artist and station URLs, but it could be anything of any length) before you send the data to the audio queue buffers. If you do not do that, you'll end up with garbled music. You can see the metadata scroll like a marquee under the station name when using iTunes.
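The stripping itself is mechanical once you know the interval: the server's "icy-metaint" response header gives the number of audio bytes between metadata blocks, each block starting with a length byte (length = byte * 16). A minimal stateful stripper might look like this — the type and function names are mine for illustration, not from any Apple sample:

```c
#include <stddef.h>
#include <string.h>

/* State for stripping Shoutcast ("ICY") metadata from an MP3 stream.
   metaint comes from the server's "icy-metaint" response header. */
typedef struct {
    size_t metaint;    /* audio bytes between metadata blocks */
    size_t audio_left; /* audio bytes until the next length byte */
    size_t meta_left;  /* metadata bytes still to skip */
} IcyStripper;

static void icy_init(IcyStripper *s, size_t metaint) {
    s->metaint = metaint;
    s->audio_left = metaint;
    s->meta_left = 0;
}

/* Copy only the audio bytes from in[0..in_len) into out, skipping the
   interleaved metadata. Returns the number of audio bytes produced.
   (Metadata is discarded here; a real player would parse StreamTitle
   out of it first.) */
static size_t icy_strip(IcyStripper *s, const unsigned char *in,
                        size_t in_len, unsigned char *out) {
    size_t produced = 0, i = 0;
    while (i < in_len) {
        if (s->meta_left > 0) {          /* inside a metadata block */
            size_t n = in_len - i < s->meta_left ? in_len - i : s->meta_left;
            i += n;
            s->meta_left -= n;
        } else if (s->audio_left == 0) { /* length byte: size = byte * 16 */
            s->meta_left = (size_t)in[i++] * 16;
            s->audio_left = s->metaint;
        } else {                         /* plain audio bytes */
            size_t n = in_len - i < s->audio_left ? in_len - i : s->audio_left;
            memcpy(out + produced, in + i, n);
            produced += n;
            i += n;
            s->audio_left -= n;
        }
    }
    return produced;
}
```

Because the stripper keeps its own state, it gives the same output no matter how the network layer happens to chunk the incoming bytes — which matters, since CFNetwork delivers data in arbitrarily sized pieces.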
There are actually LOTS of code examples that come with Xcode 3.1. There are examples of an audio client and server that use AudioFileStream + Audio Toolbox queues, and another example that grabs data using the CFNetwork APIs. We were able to lift and re-use 90% of this code and concentrate on the UI.
An additional problem when you have a UI running "concurrently" with the live stream is NOT locking up the UI while the whole pipeline from the network to the audio device is running at full speed. Part of the stream is handled by the main thread (which also handles the UI), while all the audio queue handling code uses callbacks on a separate thread. If you do not set up enough audio buffers, you end up locking up the UI while new packets wait for available audio buffers (which get recycled after being played). Even using POSIX thread mutexes + condition signals (which are used in the Xcode samples) does not completely fix the issue. You end up increasing the number of audio queue buffers until you find enough of them to sustain streams of up to 192 Kbps with no hiccups in the UI.
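The mutex + condition pattern Marco mentions boils down to a counted pool of free buffers: the filling thread blocks when every buffer is still queued for playback, and the audio queue's output callback signals when one is recycled. A minimal sketch, assuming kNumBuffers preallocated buffers (the names here are illustrative, not from the Xcode samples):

```c
#include <pthread.h>

enum { kNumBuffers = 8 };   /* too few, and the filling thread stalls */

typedef struct {
    pthread_mutex_t lock;
    pthread_cond_t  cond;
    int free_count;         /* buffers not currently enqueued for playback */
} BufferPool;

static void pool_init(BufferPool *p) {
    pthread_mutex_init(&p->lock, NULL);
    pthread_cond_init(&p->cond, NULL);
    p->free_count = kNumBuffers;
}

/* Called by the thread that fills buffers with parsed packets. Blocks
   whenever every buffer is waiting to be played — which is exactly the
   UI lockup Marco describes if this runs on the main thread. */
static void pool_acquire(BufferPool *p) {
    pthread_mutex_lock(&p->lock);
    while (p->free_count == 0)
        pthread_cond_wait(&p->cond, &p->lock);
    p->free_count--;
    pthread_mutex_unlock(&p->lock);
}

/* Called from the AudioQueue output callback when a buffer has finished
   playing and is recycled. */
static void pool_release(BufferPool *p) {
    pthread_mutex_lock(&p->lock);
    p->free_count++;
    pthread_cond_signal(&p->cond);
    pthread_mutex_unlock(&p->lock);
}
```

Raising kNumBuffers widens the cushion between the network side and the playback side, which is why adding buffers (rather than cleverer locking) is what ultimately keeps the UI responsive.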
And these are just some of the issues you'll face. The rest involves handling streams that do HTTP redirects (301 or 302), authentication (401), and of course streams that are dead (no HTTP response, or a 404).
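In practice that means the connection layer needs an explicit decision for each response class before any bytes reach the audio pipeline. A small sketch of that dispatch — the enum and function names are made up for this example:

```c
/* Illustrative classification of the HTTP responses a stream URL can
   return; a timeout (no response at all) maps to kGiveUp upstream. */
typedef enum {
    kPlay,            /* 200: start feeding the parser */
    kFollowRedirect,  /* 301/302: re-issue the request at Location: */
    kAskCredentials,  /* 401: retry with authentication */
    kGiveUp           /* 404, 5xx, anything else: dead stream */
} StreamAction;

static StreamAction classify_response(int http_status) {
    switch (http_status) {
        case 200:           return kPlay;
        case 301: case 302: return kFollowRedirect;
        case 401:           return kAskCredentials;
        default:            return kGiveUp;
    }
}
```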
-- Marco
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden