Re: Megalatency
- Subject: Re: Megalatency
- From: Hamish Allan <email@hidden>
- Date: Wed, 5 Jan 2005 17:26:27 +0000
I ask because I was wondering about latency issues for playing movies
with sound over Airport Express. The advice given on the Rogue Amoeba
forums is "Adjust the synchronisation manually in VLC or MPlayer", and I
wondered whether this could perhaps be done automatically if
VLC or MPlayer paid proper attention to timestamps, i.e., with the Audio
Unit requesting slices for playback several seconds into the future.
In my attempts to bring this up on the Rogue Amoeba website, it
transpired that Slipstream was not written as an Audio Unit, which
surprised me because it means, amongst other things, that it will not
play nicely with other Rogue Amoeba software such as Detour.
Paul from Rogue Amoeba has rather avoided the question on the forums
(e.g., http://www.rogueamoeba.com/forum/ubb/Forum7/HTML/000003.html),
so I thought I'd ask here whether it was possible in principle, both to
write the wrapper and to get the latency anywhere near correctly
handled.
Thanks for your replies!
Hamish
On Jan 5, 2005, at 17:02, Dan Nigrin wrote:
He wanted an AudioUnit... though, to your point, I'm not sure why.
Perhaps he wanted to stream only a specific submix from a
larger app wirelessly...?
Dan
At 11:57 AM -0500 1/5/05, email@hidden wrote:
i'm not sure why you would even need Jack for this; if you've got an
app that's producing audio, you just have Slipstream select that app
for broadcast.
-rudy
I wonder whether, when Slipstream is available, you could try to use
it together with JackOSX and its AudioUnit. That might be a
workaround with no coding necessary...
Dan
At 2:35 PM +0000 1/5/05, Hamish Allan wrote:
Hello people,
A quick question: is it theoretically possible to write an Audio
Unit to expose Apple's Airport Express as a system audio output
device?
The technology to stream audio to the APEx exists as JustePort
(http://www.nanocrew.net/software/JustePort-latest.tar.gz), but
nobody seems to be wrapping it in an audio unit. I had hoped that
Slipstream from Rogue Amoeba would be taking that approach, but it
seems not.
I wonder, is there any technical reason why not? Certainly such a
device would have an extremely large latency, but it should be
possible to determine this fairly accurately (based on ping times
and the size of the remote buffer) and therefore request audio slices
with a timestamp far enough into the future. Even if the latency
estimate were not accurate, it would still surely be worth writing a
wrapper so that other music players, etc., can use the APEx?
Thank you for your opinions,
Hamish
Coreaudio-api mailing list (email@hidden)