Getting timing data and syncing things
- Subject: Getting timing data and syncing things
- From: HDS <email@hidden>
- Date: Wed, 23 Jan 2013 19:02:45 -0800
Hello all,
I'm VERY much a newbie with Core Audio, and pretty much a newbie with OOP, Cocoa, and Objective-C (though not with C, or with audio in general). That said, I am building a Cocoa app (Mac OS X 10.7 or newer), much of which is already working nicely using AVAudioPlayer. But I need to move it to Core Audio so I can add AUTimePitch. With the Core Audio APIs, then, I have the following requirements:
1) Play an audio file, during which a user will type occasional keystrokes to represent "events". The timing of each event against the audio timeline (file position) needs to be saved, so I need a way to quickly grab the current playback time when a key is pressed, ideally without much latency. Seconds or sample frames would both work. With AVAudioPlayer I can simply read the currentTime property; I'd like something analogous, if it exists (see the first sketch after this list).
2) Once the events have all been entered, I need to play the audio file back and use the stored events to trigger something on screen for each one, analogous to karaoke, where words are highlighted in sync with the audio. For this, I need a way to know when to fire each display event based on the current position in the playing file. It needs to be fairly accurate; plus or minus a few milliseconds would be OK, and NSTimers might not be accurate enough (see the second sketch after this list).
3) I need AUTimePitch during the keystroke-entry process, so the user can enter keystrokes at a slower playback speed while the timings still correlate with the "real" playback speed (i.e., neither sped up nor slowed down). I'm OK with doing math to compensate for the speed difference if necessary (a third sketch follows the next paragraph).
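For concreteness, here's roughly what I have in mind for requirement 1, pieced together from the AudioUnitProperties.h comments and the book's file-player example. It's a sketch, not known-good code; filePlayerUnit, fileSampleRate, and regionStartFrame are placeholders for things in my own setup:

#include <AudioToolbox/AudioToolbox.h>

// Sketch: ask the AudioFilePlayer unit where it is in the file right now.
// Assumes the play-through AUGraph from the book example is running.
static Float64 CurrentFilePositionSeconds(AudioUnit filePlayerUnit,
                                          Float64 fileSampleRate,
                                          Float64 regionStartFrame)
{
    AudioTimeStamp ts = { 0 };
    UInt32 size = sizeof(ts);
    OSStatus err = AudioUnitGetProperty(filePlayerUnit,
                                        kAudioUnitProperty_CurrentPlayTime,
                                        kAudioUnitScope_Global, 0,
                                        &ts, &size);
    if (err != noErr || ts.mSampleTime < 0)  // mSampleTime is -1 before playback starts
        return -1.0;
    // mSampleTime counts frames rendered since the scheduled region began,
    // so add the region's start frame to get an absolute file position.
    return (regionStartFrame + ts.mSampleTime) / fileSampleRate;
}

If kAudioUnitProperty_CurrentPlayTime is the wrong tool for this, I'd love to hear what the right one is.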
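For requirement 2, the best I've come up with is polling that same position from a GCD timer and firing whatever events have come due. Again a sketch under my own assumptions: the Event struct is invented, CurrentFilePositionSeconds() is the helper from the sketch above, and since this is an Objective-C project I'm relying on the blocks runtime:

#include <dispatch/dispatch.h>

typedef struct { Float64 seconds; /* whatever the display needs */ } Event;

// Sketch: poll the play position every 2 ms and fire queued display events
// as their timestamps pass. 'events' must stay alive and be sorted by time.
// The caller keeps the returned timer and dispatch_source_cancel()s it
// when playback stops.
static dispatch_source_t StartEventDispatch(AudioUnit filePlayerUnit,
                                            Float64 fileSampleRate,
                                            Float64 regionStartFrame,
                                            const Event *events, size_t count)
{
    __block size_t next = 0;
    dispatch_source_t timer = dispatch_source_create(
        DISPATCH_SOURCE_TYPE_TIMER, 0, 0, dispatch_get_main_queue());
    dispatch_source_set_timer(timer, DISPATCH_TIME_NOW,
                              2 * NSEC_PER_MSEC,   // 2 ms poll interval
                              NSEC_PER_MSEC / 2);  // 0.5 ms leeway
    dispatch_source_set_event_handler(timer, ^{
        Float64 pos = CurrentFilePositionSeconds(filePlayerUnit,
                                                 fileSampleRate,
                                                 regionStartFrame);
        while (next < count && events[next].seconds <= pos) {
            // fire the on-screen event for events[next] here
            next++;
        }
    });
    dispatch_resume(timer);
    return timer;
}

A 2 ms poll on the main queue may be optimistic; if it jitters, a dedicated high-priority queue that bounces only the UI work back to the main thread seems like the obvious refinement.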
I already have simple file playback working with Core Audio, using example code from the Adamson/Avila book (without a render callback at this point), and I can start and stop playback at arbitrary points in the file (another requirement) using scheduled regions.
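For requirement 3, my plan (such as it is) is to splice the time/pitch unit into that same graph, between the file player and the output unit. I believe the current subtype is kAudioUnitSubType_NewTimePitch, but I'm guessing at the details; graph, filePlayerNode, and outputNode come from my existing setup:

AudioComponentDescription tpDesc = {
    .componentType         = kAudioUnitType_FormatConverter,
    .componentSubType      = kAudioUnitSubType_NewTimePitch,
    .componentManufacturer = kAudioUnitManufacturer_Apple
};
AUNode    timePitchNode;
AudioUnit timePitchUnit;
AUGraphAddNode(graph, &tpDesc, &timePitchNode);
// Rewire: file player -> time/pitch -> output, replacing the direct
// player -> output connection from the book example.
AUGraphDisconnectNodeInput(graph, outputNode, 0);
AUGraphConnectNodeInput(graph, filePlayerNode, 0, timePitchNode, 0);
AUGraphConnectNodeInput(graph, timePitchNode, 0, outputNode, 0);
AUGraphNodeInfo(graph, timePitchNode, NULL, &timePitchUnit);
// ... AUGraphInitialize / AUGraphUpdate as appropriate ...

// Half speed, with pitch preserved by the unit:
AudioUnitSetParameter(timePitchUnit, kNewTimePitchParam_Rate,
                      kAudioUnitScope_Global, 0, 0.5, 0);

And if I understand the pull model correctly, the file player's CurrentPlayTime counts file frames regardless of the rate at which the time/pitch unit consumes them, which would mean timestamps captured at half speed are already in file time and need no compensation at all. I'd appreciate confirmation (or correction) on that.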
So I'm politely begging list readers to point me in the right direction on meeting these requirements: getting quick timestamps during playback, and somehow synchronizing external events against the audio. I'm asking for specific guidelines, not necessarily working code. (Though I'll take it if it's offered!)
Please forgive the newbie-ish nature of these questions, but I'm already choking on all the other material I have to digest just to get the Cocoa UI working, and could really use a leg up with Core Audio. I'm simply stuck trying to figure out how to continue (which API, and what general tactics). Once aimed correctly, I can probably succeed without becoming too much more of an annoyance.
Many thanks in advance, if you can help…
Howard