Why do AVAsset tracks have different timeRanges for the same video file?
I have a fully loaded media file in an AVAsset. When I print the tracks property of this class, I receive this information:
▿ 2 elements
- 0 : <AVAssetTrack: 0x17000fed0, trackID = 1, mediaType = vide>
- 1 : <AVAssetTrack: 0x17000fe90, trackID = 2, mediaType = soun>
So I have one video and one sound AVAssetTrack for the same media file. Next I print the timeRange of each AVAssetTrack.
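For reference, this is roughly how the asset is loaded and how the values below are printed; the file URL is just a placeholder, not the actual file in question:

```swift
import AVFoundation

// A minimal sketch of how the dumps below could be obtained.
// The URL is a placeholder; substitute your own media file.
let url = URL(fileURLWithPath: "/path/to/media.mp4")
let asset = AVURLAsset(url: url)

asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
    guard asset.statusOfValue(forKey: "tracks", error: nil) == .loaded else { return }

    // Swift's dump() produces the "▿" tree output shown below.
    dump(asset.tracks)
    for track in asset.tracks {
        dump(track.timeRange)
    }
}
```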
For video:
▿ CMTimeRange
▿ start : CMTime
- value : 0
- timescale : 1000
▿ flags : CMTimeFlags
- rawValue : 1
- epoch : 0
▿ duration : CMTime
- value : 5000
- timescale : 1000
▿ flags : CMTimeFlags
- rawValue : 1
- epoch : 0
And for sound:
▿ CMTimeRange
▿ start : CMTime
- value : 0
- timescale : 1000
▿ flags : CMTimeFlags
- rawValue : 1
- epoch : 0
▿ duration : CMTime
- value : 5002
- timescale : 1000
▿ flags : CMTimeFlags
- rawValue : 1
- epoch : 0
So why is the duration of the sound AVAssetTrack longer than that of the video one? And that is for the same media file.
May I ask where the file comes from? Maybe it was created by a user who merged an audio track and a video track in a composition after editing them separately, and while editing introduced this small difference. In general, with AVMutableComposition you can merge whatever assets you want; the time ranges do not have to be the same. So I would not be surprised by your findings.
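To illustrate that point, here is a minimal sketch of how such a file could be produced with AVMutableComposition. Nothing forces the two inserted time ranges to match; the source assets and the exact durations here are assumptions for illustration only:

```swift
import AVFoundation

// Builds a composition whose video and audio tracks have slightly
// different durations, mirroring the 5000 ms vs. 5002 ms dumps above.
func makeComposition(videoAsset: AVAsset, audioAsset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()

    guard
        let srcVideo = videoAsset.tracks(withMediaType: .video).first,
        let srcAudio = audioAsset.tracks(withMediaType: .audio).first,
        let dstVideo = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid),
        let dstAudio = composition.addMutableTrack(withMediaType: .audio,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid)
    else {
        throw NSError(domain: "CompositionError", code: -1)
    }

    // The editor is free to insert 5.000 s of video but 5.002 s of audio
    // (assuming the source tracks are at least that long).
    let videoRange = CMTimeRange(start: .zero, duration: CMTime(value: 5000, timescale: 1000))
    let audioRange = CMTimeRange(start: .zero, duration: CMTime(value: 5002, timescale: 1000))

    try dstVideo.insertTimeRange(videoRange, of: srcVideo, at: .zero)
    try dstAudio.insertTimeRange(audioRange, of: srcAudio, at: .zero)

    return composition
}
```

If a composition like this is exported to a file, the resulting asset keeps one track of 5.000 s and one of 5.002 s, which is exactly what dumping each track's timeRange would then report.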