I'm working on an Android app that uses the DJI Mobile SDK to send the video stream of a DJI drone to an Ant Media Server. The video is encoded and sent over the RTMP protocol using the SDK's methods. Playing the stream through Ant Media's RTMP-to-WebRTC conversion, which has very low latency, I noticed a problem:
- as soon as streaming is started the latency is about 600ms
- after about 2-3 s the latency begins to degrade linearly, increasing by about 1 s every 3-4 minutes
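For context, the drift described above works out to a small constant rate; a quick sketch of the arithmetic (assuming the 1 s per 3-4 minutes figure is accurate) expresses it as a relative skew between how fast frames are produced and consumed:

```java
import java.util.Locale;

// Back-of-the-envelope check: 1 s of added latency every 3-4 minutes,
// expressed as a relative skew between producer and consumer clocks.
public class DriftEstimate {
    public static void main(String[] args) {
        double skewHigh = 1.0 / (3 * 60);  // 1 s gained per 180 s of wall time
        double skewLow  = 1.0 / (4 * 60);  // 1 s gained per 240 s of wall time
        // Prints: Skew: 0.42% - 0.56%
        System.out.printf(Locale.ROOT, "Skew: %.2f%% - %.2f%%%n",
                skewLow * 100, skewHigh * 100);
    }
}
```

A steady skew of roughly half a percent looks more like a buffer slowly filling (frames arriving slightly faster than they are played out) than like network loss, which would show up as jumps rather than a linear ramp.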
I also tested with the Mobile SDK Sample provided by DJI, and the behavior is the same. Suspecting a performance problem, I then attached the Android Studio Profiler and noticed something really strange: with the profiler attached, streaming performance is perfect. 400-500 ms of constant latency, with no degradation even after hours. It makes no difference whether I launch the app with the profiler already attached or attach it to the running app; what matters is that the profiler is attached before streaming starts.
What are the differences between the profiling environment and the non-profiling environment? Are some parameters (thread priority, system timer resolution, disabled device optimizations, ...) changed on the system or the app while the Android Profiler is running?
EDIT
This is the code snippet that sets the RTMP URL and starts streaming; no more code is needed. All the magic is done by the DJI Mobile SDK.
new Thread() {
    @Override
    public void run() {
        // Point the SDK's live stream manager at the RTMP endpoint
        DJISDKManager.getInstance().getLiveStreamManager().setLiveUrl(rtmpUrl);
        // startStream() returns a DJI result code (0 on success)
        int result = DJISDKManager.getInstance().getLiveStreamManager().startStream();
        DJISDKManager.getInstance().getLiveStreamManager().setStartTime();
    }
}.start();
That issue arises because low-overhead profiling runs with the debuggable parameter set to false; you can't attach a debugger to a process while you are profiling it.
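If the difference really does come down to the debuggable flag, one way to check (an assumption on my part, not something the DJI SDK documents) is to force a debuggable build and see whether the latency drift disappears even without the profiler attached:

```groovy
// app/build.gradle (sketch): force the debuggable flag on a release-style
// build to mimic the environment the profiler runs in.
// Never ship a build with this enabled.
android {
    buildTypes {
        release {
            debuggable true
        }
    }
}
```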