Intercepting audio from WebKit
- Subject: Intercepting audio from WebKit
- From: Jason Perkins <email@hidden>
- Date: Wed, 25 May 2011 10:22:50 -0400
New to Cocoa and especially Core Audio, so apologies in advance if I've missed something completely obvious.
I have a WebKit-based application that plays Pandora Radio. I'd like to add an equalizer. In order to do this, I have to get at the audio coming out of the Flash plugin.
I developed a proof of concept that uses mach_override to intercept the Core Audio API calls. This works, but it's crude, and I'd like to find a way to do it properly through the public APIs, if possible.
Based on what I've read in the archives and my own experiments, it appears that it's not possible to get my application's audio back out of AUHAL once it has gone in.
It sounds like I could create a user-space driver, à la JackOSX, set it as my application's output device, and process the audio there before sending it on to the real system output device. But I want this to be transparent to my application: the user shouldn't see a new device in System Preferences (as they do with JackRouter), and the device shouldn't be available to other applications. It should all happen under the hood, so to speak. I haven't been able to figure out whether it's possible to create an application-specific audio driver like that, and if so, how to set one up.
Any thoughts? I'm really just trying to get a sense of "yes, that's possible" or "no, that's crazy" before I spend any more time on this path, especially since I already have an (ugly) solution that works.
Thanks!
Jason
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list (email@hidden)