Assigning AVPlayer to AVPlayerLayer rounds the player's currentTime asynchronously


I am trying to display single video in two places and have them both in sync.

I create an AVPlayer, assign it to an AVPlayerLayer, and play it for about a second. After that, a second AVPlayerLayer is created and the same AVPlayer is assigned to it.
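A minimal sketch of that setup (the URL and the layer hosting are placeholders of mine, not from the original code):

```swift
import AVFoundation

// Hypothetical setup matching the description above.
let player = AVPlayer(url: URL(fileURLWithPath: "video.mp4"))
let firstLayer = AVPlayerLayer(player: player)
// ... firstLayer is added to a view's layer hierarchy here ...

player.play()
// About a second later:
player.pause()

// The second layer shares the same AVPlayer instance.
let secondLayer = AVPlayerLayer(player: player)
```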

At that point I expect both layers to display the same frame, as they are backed by the same AVPlayer, but the act of assigning the AVPlayer to the second AVPlayerLayer changes the player's current time for some reason.

Here's code that illustrates my problem. This logic is invoked after the video has been played for about a second and then paused:

print("\(player.currentTime())") // CMTime(value: 111158, timescale: 90000)

_ = AVPlayerLayer(player: player)

print("\(player.currentTime())") // CMTime(value: 111158, timescale: 90000)

DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
    print("\(player.currentTime())") // CMTime(value: 600, timescale: 600)
}

The current time is modified in a similar way in different parts of the video and in different videos, for example from CMTime(value: 175196, timescale: 90000) to CMTime(value: 1220, timescale: 600).
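Converting the reported values to seconds with CMTimeGetSeconds (a sanity check of mine, not from the original post) shows the position really moves, rather than merely being re-expressed in a coarser timescale:

```swift
import CoreMedia

let before = CMTimeGetSeconds(CMTime(value: 111158, timescale: 90000)) // ≈ 1.235 s
let after  = CMTimeGetSeconds(CMTime(value: 600, timescale: 600))      // 1.0 s
```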

I do not call seek(to:toleranceBefore:toleranceAfter:) method at any point.

The periodic time observer is triggered with the new rounded value, but I am unable to figure out what triggered the time change based on the stack trace there.

If the player's time is already at a timescale of 600, there is no time change when creating the AVPlayerLayer.

One option I've tried to mitigate this is to store currentTime before creating the second AVPlayerLayer and then seek the player back to it, but it is unclear at what point exactly this rounding happens. I'm not comfortable dispatching the seek after some arbitrary delay, as that does not look like a long-term solution.
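That mitigation could look roughly like this (a sketch only; as noted above, it is unclear when the rounding happens, so the seek may fire too early):

```swift
// Remember the position, attach the new layer, then seek back with
// zero tolerance for a frame-accurate restore.
let savedTime = player.currentTime()
let secondLayer = AVPlayerLayer(player: player)

player.seek(to: savedTime,
            toleranceBefore: .zero,
            toleranceAfter: .zero) { finished in
    // currentTime() should match savedTime again once this completes.
}
```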

I suppose some logic that prepares the AVPlayer's content for rendering is responsible for this rounding, but I was unable to find anything related to tolerance in the AVPlayerLayer, AVPlayer, or AVPlayerItem documentation.

I would appreciate any advice on why AVPlayer adjusts its current time and how to disable this behaviour.

1 Answer

Answered by Lieksu:

After an in-depth discussion with Apple Developer Technical Support, I received the following explanation:

The reason for this frame change is unrelated to adding new views. The simple fact that the player is paused and resumed can change the timescale, and therefore the value of the player’s current time. Compressed formats often contain frames that carry a complete image, as well as frames that depend on surrounding frames to be decoded. When adding a view after the player has been paused, this effect becomes more apparent, but the underlying reason is the play/pause mechanism itself and not adding subviews or sublayers.
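If the goal is just to verify that the two layers stay on the same frame, one option (my own suggestion, not part of the DTS explanation) is to compare times within one frame duration instead of expecting exact CMTime equality across timescales:

```swift
import CoreMedia

/// Returns true when two times fall within one frame of each other.
/// The default frame duration assumes 30 fps content.
func isSameFrame(_ a: CMTime, _ b: CMTime,
                 frameDuration: CMTime = CMTime(value: 1, timescale: 30)) -> Bool {
    let delta = CMTimeAbsoluteValue(CMTimeSubtract(a, b))
    return CMTimeCompare(delta, frameDuration) <= 0
}
```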

Perhaps this will be helpful to someone.