Hey, I am trying to run a live camera feed on my device. I want to capture a photo every 3 seconds, but every time it does, it plays a shutter sound, which is bad UX.
Instead, I want to run a live camera stream from the front camera and capture a frame at a fixed interval (~3 seconds).
How can I extract a frame from the live camera feed and store it in a UIImage variable?
Thanks and Cheers!
I understand your problem fully. I ran into the same issue a few days ago, so I put together a complete solution: showing a live camera preview, laying it out properly on the view, getting camera frames continuously, and converting those frames into UIImages efficiently without a memory leak. Feel free to adapt it to your needs. The solution is written in Swift 4.2 and built with Xcode 10.0.
Here is the GitHub repo: https://github.com/anand2nigam/CameraFrameExtractor
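In case the link goes stale, here is a minimal sketch of the general approach: attach an AVCaptureVideoDataOutput to a capture session on the front camera, receive every frame through the sample buffer delegate, and convert one frame to a UIImage every ~3 seconds. No AVCapturePhotoOutput is involved, so there is no shutter sound. The class and property names here (FrameExtractor, onImageCaptured) are illustrative, not necessarily what the repo uses, and you will need an NSCameraUsageDescription entry in Info.plist.

```swift
import UIKit
import AVFoundation

final class FrameExtractor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    private let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "frame.extractor.session")
    // One reused CIContext avoids the memory growth you get from creating a context per frame.
    private let context = CIContext()
    private var lastCaptureTime = Date.distantPast

    /// Called on the main queue with the captured UIImage.
    var onImageCaptured: ((UIImage) -> Void)?

    override init() {
        super.init()
        sessionQueue.async { [weak self] in
            self?.configureSession()
            self?.session.startRunning()
        }
    }

    private func configureSession() {
        session.sessionPreset = .medium

        // Front camera input.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Video data output delivers every frame of the live feed silently.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frame.extractor.buffer"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
    }

    // Delegate callback invoked for every frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Throttle: keep only one frame every ~3 seconds.
        let now = Date()
        guard now.timeIntervalSince(lastCaptureTime) >= 3 else { return }
        lastCaptureTime = now

        // Convert the sample buffer to a UIImage.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        let image = UIImage(cgImage: cgImage)

        DispatchQueue.main.async { [weak self] in
            self?.onImageCaptured?(image)
        }
    }
}
```

Usage is just holding a FrameExtractor instance in your view controller and setting onImageCaptured to store or display the UIImage; for the on-screen preview you would add an AVCaptureVideoPreviewLayer backed by the same session.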
Please test the application on an iPhone or iPad, because the camera will not work in the simulator. Let me know how it works for you, and feel free to contact me if you need any help. Hope it solves your problem. Happy learning.