If you pause and then unpause a MediaStreamTrack by setting .enabled = false and then .enabled = true, there's a slight delay between the .enabled flag becoming true and actually receiving real data.
If you pass the stream to a video element, you'll see that you get a black screen at first, followed by the actual camera data shortly afterwards.
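A minimal sketch of the setup I mean, assuming a page with a video element (the pauseTrack/resumeTrack wrappers are just for illustration):

```javascript
// Illustrative wrappers around the .enabled flag on a video track.
function pauseTrack(track) { track.enabled = false; }
function resumeTrack(track) { track.enabled = true; }

async function demo() {
  // Browser-only: grab the camera and show it in a <video> element.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.querySelector('video');
  video.srcObject = stream;
  await video.play();

  const [track] = stream.getVideoTracks();
  pauseTrack(track);
  // ...some time later...
  resumeTrack(track);
  // track.enabled is now true, but the element briefly shows black
  // frames until the camera actually delivers data again.
}
```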
I'm wondering if there's a way to detect when the 'real' stream becomes available. I could periodically take a snapshot of the stream via a canvas and inspect the pixel data with something like .getImageData(), but that seems expensive.
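For reference, this is roughly the polling approach I'd like to avoid. frameLooksBlack and waitForRealFrames are hypothetical names; the pixel check is split out from the browser APIs:

```javascript
// Returns true if every pixel in an RGBA buffer (ImageData.data) is
// at or below the given brightness threshold, i.e. the frame is black.
function frameLooksBlack(pixels, threshold = 10) {
  for (let i = 0; i < pixels.length; i += 4) {
    if (pixels[i] > threshold ||      // red
        pixels[i + 1] > threshold ||  // green
        pixels[i + 2] > threshold) {  // blue
      return false; // found a non-black pixel
    }
  }
  return true;
}

// Polls the video into an offscreen canvas until a non-black frame shows up.
function waitForRealFrames(video, intervalMs = 100) {
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth || 64;
  canvas.height = video.videoHeight || 64;
  const ctx = canvas.getContext('2d');

  return new Promise(resolve => {
    const timer = setInterval(() => {
      ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
      const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
      if (!frameLooksBlack(data)) {
        clearInterval(timer);
        resolve();
      }
    }, intervalMs);
  });
}
```

It works, but it burns a drawImage plus a getImageData readback on every tick, which is what I'd rather replace with an event.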
Is there an event that is fired once data becomes available?
Note: this question looks similar, but isn't the same. For remote connections, it suggests sending signalling events to let the other end know when the camera is paused/unpaused. I'm doing local processing, so I just want to know when useful data is coming through.