Playing a Live stream from media server on android application


My setup is as follows:

  • OBS Studio to create the video feed
  • Ant Media Server to distribute the stream

Now I'm building an app that will display this stream. I'm currently using ExoPlayer, but I'm having a hard time getting it to work for both RTMP and HLS. I read somewhere that I could embed a web player in my app; would that be easier? Here is my code for ExoPlayer:

    // RTMP URL of the stream published to Ant Media Server
    String url = "rtmp://192.168.1.244/WebRTCApp/379358104902020985845622";

    // Adaptive track selection
    TrackSelection.Factory videoTrackSelectionFactory =
            new AdaptiveTrackSelection.Factory();
    TrackSelector trackSelector =
            new DefaultTrackSelector(videoTrackSelectionFactory);
    SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(this, trackSelector);

    PlayerView playerView = findViewById(R.id.simple_player);
    playerView.setPlayer(player);

    // Create an RTMP data source (requires the ExoPlayer RTMP extension)
    RtmpDataSourceFactory rtmpDataSourceFactory = new RtmpDataSourceFactory();
    MediaSource videoSource = new ExtractorMediaSource.Factory(rtmpDataSourceFactory)
            .createMediaSource(Uri.parse(url));

    player.prepare(videoSource);
    player.setPlayWhenReady(true);

Any help on this would be much appreciated.


1 Answer

Answered by Mick (BEST ANSWER)

Most online video streaming uses Adaptive Bit Rate (ABR) protocols to deliver the video, mainly HLS and DASH these days.

Most media players, like ExoPlayer, support these protocols well, although they are complex, evolving protocols, so there are always edge cases.

Many video conferencing applications use WebRTC, which is a real-time-optimised protocol; the usual approach is to use a WebRTC client for this type of stream.

The difference between the two approaches from a streaming latency point of view, at a very high level, is:

  • ABR protocols prioritise quality and avoiding interruptions, buffering enough of the video to try to guarantee uninterrupted playback. They are usually aimed at movie and live video streaming services. Even for low-latency implementations, the latency is measured in multiple seconds or more.

  • WebRTC prioritises latency and sacrifices quality if necessary. It is typically aimed at latency-sensitive applications like video conferencing, where it is important not to fall behind the discussion even if that means a temporary video glitch or a brief interruption. Latency is usually sub-second.

Ant Media Server comes from the WebRTC side, although recent versions support HLS/CMAF and Low Latency DASH (these are still generally higher latency than WebRTC, as noted above).

For your service, if you are able to use a DASH or HLS stream, you may find it an easier path with ExoPlayer. If you look at the ExoPlayer demo app, for example, you will see DASH and HLS streams but no RTMP ones. You can easily extend or modify the demo app to play your own HLS or DASH stream, and this is often an easy way to start: look at the sample material in assets/media.exolist.json and add your own URL.
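For reference, an entry in the demo app's assets/media.exolist.json looks roughly like the sketch below. The name and URL are placeholders; Ant Media Server typically serves HLS at http://<server>:5080/<AppName>/streams/<streamId>.m3u8, but verify the exact URL in your server's dashboard:

```json
{
  "name": "My Ant Media HLS stream",
  "uri": "http://192.168.1.244:5080/WebRTCApp/streams/379358104902020985845622.m3u8"
}
```

Adding an entry like this to the samples list lets you confirm the stream plays in a known-good player before debugging your own code.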

However, ExoPlayer should also support RTMP via a specific extension, if this is your preferred route.

In theory you simply need to add this dependency to your application:

if your application is using DefaultDataSource or DefaultDataSourceFactory, adding support for RTMP streams is as simple as adding a dependency to the RTMP extension
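For ExoPlayer 2.x, that dependency looks roughly like the line below in the app module's build.gradle. The version shown is only an example; it must match the version of your other ExoPlayer artifacts:

```groovy
// ExoPlayer RTMP extension - keep the version in sync with the core ExoPlayer dependency
implementation 'com.google.android.exoplayer:extension-rtmp:2.9.6'
```

With this on the classpath, the RtmpDataSourceFactory used in the question's code becomes available.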

It would be worth checking the issues list in the extension's repository for any recent issues and/or workarounds.
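For completeness: if you do publish the stream as HLS, the question's code needs only a different MediaSource. Here is a minimal sketch against the same ExoPlayer 2.x APIs used in the question (the URL and the "MyApp" user-agent name are placeholders, and the exoplayer-hls module must be on the classpath):

```java
// HLS URL served by Ant Media Server (placeholder - adjust host, app name and stream id)
String hlsUrl = "http://192.168.1.244:5080/WebRTCApp/streams/379358104902020985845622.m3u8";

SimpleExoPlayer player = ExoPlayerFactory.newSimpleInstance(this, new DefaultTrackSelector());

PlayerView playerView = findViewById(R.id.simple_player);
playerView.setPlayer(player);

// HlsMediaSource handles playlist parsing and segment fetching over HTTP
DataSource.Factory dataSourceFactory =
        new DefaultDataSourceFactory(this, Util.getUserAgent(this, "MyApp"));
MediaSource hlsSource = new HlsMediaSource.Factory(dataSourceFactory)
        .createMediaSource(Uri.parse(hlsUrl));

player.prepare(hlsSource);
player.setPlayWhenReady(true);
```

This is Android UI code, so it belongs in an Activity (e.g. in onCreate); it cannot run standalone. Note that no RTMP extension is needed for this path.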