Why does video resolution change when streaming from Android via WebRTC


I'm trying to stream at 640x480 from Chrome on Android using WebRTC, and the video starts off at that, but then the resolution drops to 320x240.

Here are the getUserMedia parameters that are sent:

 "getUserMedia": [
  {
   "origin": "http://webrtc.example.com:3001",
   "pid": 30062,
   "rid": 15,
   "video": "mandatory: {minWidth:640, maxWidth:640, minHeight:480, maxHeight:480}"
  }

My question is: why does the resolution fall? When I try it from Chrome on my Mac, that does not happen. I would like to make adjustments so that the video resolution does not change.
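For reference, the same 640x480 request can be expressed with the standard constraint syntax, where `exact` pins a dimension the way the legacy `minWidth`/`maxWidth` pair does. This is a hedged sketch, not the asker's exact code; the `buildVideoConstraints` helper is hypothetical, and the `getUserMedia` call is guarded because it only exists in a browser:

```javascript
// Build constraints equivalent to the legacy "mandatory" dump above.
// "exact" pins width/height just as minWidth == maxWidth did.
function buildVideoConstraints(width, height) {
  return {
    audio: false,
    video: {
      width: { exact: width },
      height: { exact: height },
    },
  };
}

const constraints = buildVideoConstraints(640, 480);

// Browser-only: request the stream with the pinned resolution.
if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  navigator.mediaDevices
    .getUserMedia(constraints)
    .then((stream) => {
      // Attach the stream to a <video> element or a PeerConnection here.
    })
    .catch((err) => console.error("getUserMedia failed:", err));
}
```

Note that with `exact`, `getUserMedia` rejects outright if the camera cannot deliver 640x480, rather than silently falling back.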

video frames dumped using ffmpeg

chrome://webrtc-internals text dump

I'm using the Licode WebRTC streaming server, but have also seen the same behavior using Kurento.

1 Answer

Answer by xdumaine (accepted)

getUserMedia constraints only affect the media requested from the browser to the hardware and returned as a stream. They have no effect on what is done to that stream afterwards (i.e., when it's streamed over a connection). The degradation you're seeing happens in the PeerConnection layer, not in the getUserMedia layer. It is triggered by the WebRTC implementation when hardware and bandwidth statistics indicate low performance, and is negotiated by both sides.

[Hardware] <-   getUserMedia   -> [javascript client] <- PeerConnection -> [another client]
           <- 640x480 captured ->                     <-  320x240 sent  ->
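If you want the PeerConnection layer to sacrifice framerate rather than resolution when bandwidth drops, the WebRTC spec later added `degradationPreference` on `RTCRtpSendParameters`. This is a hedged sketch under that assumption; the API postdates this answer, browser support varies, and the `preferResolution` helper is hypothetical:

```javascript
// Ask the video sender to keep resolution and drop framerate instead
// when the connection degrades. degradationPreference comes from the
// RTCRtpSendParameters spec; support varies, so treat this as an
// assumption rather than a guaranteed fix.
async function preferResolution(peerConnection) {
  const sender = peerConnection
    .getSenders()
    .find((s) => s.track && s.track.kind === "video");
  if (!sender) return false;

  const params = sender.getParameters();
  params.degradationPreference = "maintain-resolution";
  await sender.setParameters(params);
  return true;
}
```

Call this after the video track has been added to the connection; with `"maintain-resolution"` set, congestion should lower the framerate and bitrate instead of scaling 640x480 down to 320x240.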

You'll have to dig into the source code of each implementation for documentation and evidence of exactly how this is done, but here are references to the behavior:

From the O'Reilly chapter on WebRTC:

The good news is that the WebRTC audio and video engines work together with the underlying network transport to probe the available bandwidth and optimize delivery of the media streams. However, DataChannel transfers require additional application logic: the application must monitor the amount of buffered data and be ready to adjust as needed.

...

WebRTC audio and video engines will dynamically adjust the bitrate of the media streams to match the conditions of the network link between the peers. The application can set and update the media constraints (e.g., video resolution, framerate, and so on), and the engines do the rest—this part is easy.
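To actually observe the engine scaling the stream down, you can poll `getStats()` and read the encoded frame size. A minimal sketch, assuming the modern stats API where `outbound-rtp` video entries carry `frameWidth`/`frameHeight` (older browsers exposed different stat names, and the helper name is hypothetical):

```javascript
// Return the resolution currently being encoded and sent, or null if
// no outbound video stats are available yet.
async function currentSendResolution(peerConnection) {
  const report = await peerConnection.getStats();
  for (const stats of report.values()) {
    if (stats.type === "outbound-rtp" && stats.kind === "video") {
      return { width: stats.frameWidth, height: stats.frameHeight };
    }
  }
  return null;
}
```

Polling this every few seconds while reproducing the problem will show exactly when the sent resolution drops from 640x480 to 320x240, independently of what `getUserMedia` captured.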