I am trying to calculate the frame rate of a video format from its CMFormatDescription, but I'm getting strange output that I don't know what to make of. According to the documentation, "value/timescale = seconds". This is also the answer in this question.
The code is being called while getting a video stream from FaceTime camera:
let av = device.activeFormat
let fd = av.formatDescription
// frameDuration is a CMTime; value/timescale should give seconds
print("time scale", fd.frameDuration.timescale)
print("value", fd.frameDuration.value)
print("value/timescale=", Double(fd.frameDuration.value) / Double(fd.frameDuration.timescale))
This is the output:
time scale 480
value 2749654773878
value/timescale= 5728447445.579166
What am I missing? What is the frame rate?
EDIT: It seems that there is a bug, or perhaps something is terribly wrong: the timescale is always equal to the height of the format description. I tried it with a USB camera and they are always equal.
TL;DR: You don't query a device's frame rate, but you can choose the frame rate you want.
The CMTime value you're receiving has its flags field set to zero. This means it is invalid. CMFormatDescriptions describe several media types, and frameDuration is not a video property. An invalid CMTime result is CMFormatDescription's way of saying "not found". I guess the timescale being the format's height is just undefined behaviour, and I'll bet it's different on an Intel Mac.

This problem is easier to see in plain old C, where the sugared Swift frameDuration property is called CMTimeCodeFormatDescriptionGetFrameDuration(). Frame duration is a TimeCode property (whatever that is). Mixing up your format description media types does not result in compiler errors in either language.
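Here is a rough sketch of both points: guarding against the invalid CMTime, and picking a frame rate via the frame duration properties. It assumes you already have a device: AVCaptureDevice from somewhere (discovery session, default device, etc.), and that 30 fps is within one of the format's supported ranges; adjust accordingly.

// Assumes `device` is an AVCaptureDevice obtained elsewhere.
import AVFoundation
import CoreMedia

let fd = device.activeFormat.formatDescription

// 1. frameDuration is a timecode property, so on a *video* format
//    description the returned CMTime is invalid (flags == []).
let duration = fd.frameDuration
if duration.isValid {
    print("frame duration:", CMTimeGetSeconds(duration), "s")
} else {
    print("frameDuration is not valid for this format description")
}

// 2. The supported frame rates live on the capture format itself.
for range in device.activeFormat.videoSupportedFrameRateRanges {
    print("supported:", range.minFrameRate, "-", range.maxFrameRate, "fps")
}

// 3. Choose a frame rate by setting the min/max frame *duration* (1/fps).
do {
    try device.lockForConfiguration()
    let thirtyFPS = CMTime(value: 1, timescale: 30) // 1/30 s per frame
    device.activeVideoMinFrameDuration = thirtyFPS
    device.activeVideoMaxFrameDuration = thirtyFPS
    device.unlockForConfiguration()
} catch {
    print("could not lock device for configuration:", error)
}

Setting min and max frame duration to the same value pins the device to a fixed rate; leaving them as a range lets the device vary the rate (e.g. in low light).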