Re: making a video from still frames that don't change that often plus audio
- Subject: Re: making a video from still frames that don't change that often plus audio
- From: email@hidden
- Date: Sun, 15 Jan 2017 16:04:20 -0500
> On Jan 15, 2017, at 3:12 PM, Quincey Morris <email@hidden> wrote:
>
> On Jan 15, 2017, at 09:22 , email@hidden wrote:
>>
>> I have an iOS presentation app (https://itunes.apple.com/app/redraw/id1114820588?mt=8) that I currently make videos from by AirPlaying it to my Mac and using Screenflow on the Mac to show the iPad screen and record my audio from a microphone (and then edit). I'd like to build this functionality into my app directly
>
> AVFoundation doesn’t seem to have the ability of capturing screen video on iOS — AVCaptureScreenInput is documented as Mac only. That would rule out AVFoundation for the basic video capture within your app. You might be able to capture a series of screen shots, but it has to be done in real time, and that’s going to be tricky to get right on iOS where you’ll need to buffer the captured images to storage that might not be fast enough.
>
> If you mean you want to write a companion Mac app, then I guess you can use AVCaptureScreenInput to capture the raw video, and then you could use AVAssetWriter to export your final, composed video. However, AVAssetWriter is *not* a real-time function, so you couldn’t rely on it keeping up if you tried to export as the user interleaves the still images with the raw video. What you’d need to do is add a playback/edit phase, where you played the raw video, captured the timing of the user edits (letting the playback skip frames if the edits held up the playback), then export the “composition” when the user is done. (Or, you could export in the background *during* editing, which would mean it would be done soon after the user finishes, but this may have adverse effects on playback on a lower-end Mac.)
>
> AVCaptureScreenInput does let you choose the screen, though.
>
> FWIW, since I’m not sure I properly understood exactly what solution you’re looking for.
I'm talking about doing this on the iPad (not with a separate Mac app). I know the only option for recording the screen itself is ReplayKit, but I don't really need to record the screen. I want to write a video in real time that consists of the audio from the microphone and an image that changes periodically (that image happens to be shown in a UIImageView on the second screen of my app). So, given that AVAssetWriter is not real time, I think my best option (if I want to do all the work on the iPad without a separate Mac app) is to use ReplayKit.
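If I go the ReplayKit route, my understanding is that it comes down to something like the sketch below (rough and untested; the method names are mine and error handling is minimal):

import UIKit
import ReplayKit

// Hypothetical helpers on a view controller; names are placeholders.
extension UIViewController {

    func startPresentationRecording() {
        let recorder = RPScreenRecorder.shared()
        recorder.isMicrophoneEnabled = true   // include narration from the microphone

        recorder.startRecording { error in
            if let error = error {
                // e.g. the user declined, or the recorder is unavailable
                print("Could not start recording: \(error)")
            }
        }
    }

    func stopPresentationRecording() {
        RPScreenRecorder.shared().stopRecording { previewController, error in
            // ReplayKit hands back a preview controller for trimming/saving/sharing
            // instead of giving the app a file URL directly.
            DispatchQueue.main.async {
                if let previewController = previewController {
                    self.present(previewController, animated: true, completion: nil)
                }
            }
        }
    }
}

As far as I can tell, that path only gives the user the RPPreviewViewController at the end; the app itself never sees the movie file, so there's not much opportunity for post-processing.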
The other option would be to let the user navigate to a specific image on the screen, record audio for that image, navigate to the next image, record audio for that image, and so on. Then I could probably use AVAssetWriter to write the first audio segment with its fixed image, then the next audio segment with the next fixed image, and so on.
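For the video side of that approach, I'm picturing something roughly like this (a sketch only; the function names, the tuple of image plus duration, and the 1024x768 size are all placeholders, and the per-segment audio would either go into a second AVAssetWriterInput or get merged in afterwards with AVMutableComposition):

import AVFoundation
import UIKit

// Render a UIImage into a CVPixelBuffer that AVAssetWriter can accept.
func makePixelBuffer(from image: UIImage, width: Int, height: Int) -> CVPixelBuffer? {
    let attrs = [kCVPixelBufferCGImageCompatibilityKey as String: true,
                 kCVPixelBufferCGBitmapContextCompatibilityKey as String: true] as CFDictionary
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32ARGB, attrs, &buffer)
    guard status == kCVReturnSuccess, let pixelBuffer = buffer, let cgImage = image.cgImage else {
        return nil
    }
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(pixelBuffer),
                                  width: width, height: height, bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) else {
        return nil
    }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return pixelBuffer
}

// Write one video frame per image, each held on screen for its segment's duration.
func writeStillFrames(_ segments: [(image: UIImage, duration: CMTime)], to url: URL) throws {
    let width = 1024, height = 768          // output size is just an example

    let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
    let input = AVAssetWriterInput(mediaType: .video,
                                   outputSettings: [AVVideoCodecKey: AVVideoCodecType.h264,
                                                    AVVideoWidthKey: width,
                                                    AVVideoHeightKey: height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: input,
                                                       sourcePixelBufferAttributes: nil)
    writer.add(input)
    guard writer.startWriting() else {
        throw writer.error ?? CocoaError(.fileWriteUnknown)
    }
    writer.startSession(atSourceTime: .zero)

    var time = CMTime.zero
    for segment in segments {
        guard let buffer = makePixelBuffer(from: segment.image, width: width, height: height) else { continue }
        while !input.isReadyForMoreMediaData {
            Thread.sleep(forTimeInterval: 0.01)   // real code should use requestMediaDataWhenReady
        }
        if !adaptor.append(buffer, withPresentationTime: time) { break }   // frame shown starting at `time`
        time = CMTimeAdd(time, segment.duration)
    }

    input.markAsFinished()
    writer.endSession(atSourceTime: time)      // extends the last frame to the total duration
    writer.finishWriting { /* the file at `url` is ready here */ }
}

Since nothing here has to keep up with playback, the fact that AVAssetWriter isn't real time shouldn't matter; the per-image audio recordings could then be combined with this (silent) movie in an AVMutableComposition as a final step.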
Thanks,
Dave Reed