Where is the audio stream added in libjingle?


I can't figure out where the audio stream is attached to the speaker. Is it possible for me to modify the stream and add it myself later? I have the feeling that libjingle handles the stream and adds it automatically.

I have added the libjingle part of my code:

import AVFoundation
import UIKit

let TAG = "ViewController"
let AUDIO_TRACK_ID = TAG + "AUDIO"
let LOCAL_MEDIA_STREAM_ID = TAG + "STREAM"

class ViewController: UIViewController, RTCSessionDescriptionDelegate, RTCPeerConnectionDelegate {

    var mediaStream: RTCMediaStream!
    var localAudioTrack: RTCAudioTrack!
    var remoteAudioTrack: RTCAudioTrack!
    var renderer: RTCEAGLVideoView!
    var renderer_sub: RTCEAGLVideoView!
    var roomName: String!    

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        initWebRTC()
        sigConnect(wsUrl: "http://192.168.1.59:3000")

        localAudioTrack = peerConnectionFactory.audioTrack(withID: AUDIO_TRACK_ID)
        mediaStream = peerConnectionFactory.mediaStream(withLabel: LOCAL_MEDIA_STREAM_ID)
        mediaStream.addAudioTrack(localAudioTrack)
    }

    var peerConnectionFactory: RTCPeerConnectionFactory! = nil
    var peerConnection: RTCPeerConnection! = nil
    var pcConstraints: RTCMediaConstraints! = nil
    var audioConstraints: RTCMediaConstraints! = nil
    var mediaConstraints: RTCMediaConstraints! = nil

    var wsServerUrl: String! = nil
    var peerStarted: Bool = false

    func initWebRTC() {
        RTCPeerConnectionFactory.initializeSSL()
        peerConnectionFactory = RTCPeerConnectionFactory()

        pcConstraints = RTCMediaConstraints()
        audioConstraints = RTCMediaConstraints()
        mediaConstraints = RTCMediaConstraints(
            mandatoryConstraints: [
                RTCPair(key: "OfferToReceiveAudio", value: "true"),
            ],
            optionalConstraints: nil)
    }

    func prepareNewConnection() -> RTCPeerConnection {
        var iceServers: [RTCICEServer] = []

        iceServers.append(RTCICEServer(uri: NSURL(string: "stun:stun.l.google.com:19302") as URL!,
                                       username: "",
                                       password: ""))

        let rtcConfig: RTCConfiguration = RTCConfiguration()
        rtcConfig.tcpCandidatePolicy = RTCTcpCandidatePolicy.disabled
        rtcConfig.bundlePolicy = RTCBundlePolicy.maxBundle
        rtcConfig.rtcpMuxPolicy = RTCRtcpMuxPolicy.require

        peerConnection = peerConnectionFactory.peerConnection(withICEServers: iceServers, constraints: pcConstraints, delegate: self)
        peerConnection.add(mediaStream)
        return peerConnection
    }


    func peerConnection(_ peerConnection: RTCPeerConnection!, signalingStateChanged stateChanged: RTCSignalingState) {
    }

    func peerConnection(_ peerConnection: RTCPeerConnection!, iceConnectionChanged newState: RTCICEConnectionState) {
    }

    func peerConnection(_ peerConnection: RTCPeerConnection!, iceGatheringChanged newState: RTCICEGatheringState) {
    }

    func peerConnection(_ peerConnection: RTCPeerConnection!, gotICECandidate candidate: RTCICECandidate!) {
        if (candidate != nil) {
            print("iceCandidate: " + candidate.description)
            let json:[String: AnyObject] = [
                "type" : "candidate" as AnyObject,
                "sdpMLineIndex" : candidate.sdpMLineIndex as AnyObject,
                "sdpMid" : candidate.sdpMid as AnyObject,
                "candidate" : candidate.sdp as AnyObject
            ]
            sigSend(msg: json as NSDictionary)
        } else {
            print("End of candidates. -------------------")
        }
    }

    func peerConnection(_ peerConnection: RTCPeerConnection!, addedStream stream: RTCMediaStream!) {
        if (peerConnection == nil) {
            return
        }

        if (stream.audioTracks.count > 1) {
            print("Weird-looking stream: " + stream.description)
            return
        }
    }

    func peerConnection(_ peerConnection: RTCPeerConnection!, removedStream stream: RTCMediaStream!) {
    }

    func peerConnection(_ peerConnection: RTCPeerConnection!, didOpen dataChannel: RTCDataChannel!) {
    }

    func peerConnection(onRenegotiationNeeded peerConnection: RTCPeerConnection!) {

    }
}

My thought is that I can intercept the audio stream in the function below. Is that correct? And, in addition, can I add the stream to the speaker manually?

 func peerConnection(_ peerConnection: RTCPeerConnection!, addedStream stream: RTCMediaStream!) {
     if (peerConnection == nil) {
         return
     }

     if (stream.audioTracks.count > 1) {
         print("Weird-looking stream: " + stream.description)
         return
     }
 }
1 Answer

Answered by manishg:

When the WebRTC call is connected, the WebRTC stack uses the platform APIs to play or record the audio. You can only control things like:

  1. Muting or unmuting the audio stream
  2. Using system APIs to increase or decrease the volume or change the audio configuration
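For point 1, muting in the legacy libjingle binding shown in the question is typically done by toggling the track's enabled flag rather than touching the stream itself. A minimal sketch, assuming the old `RTCAudioTrack`/`RTCMediaStreamTrack` API where tracks expose `setEnabled(_:)`:

```swift
// Sketch: mute/unmute the local audio by disabling the track.
// Assumes the legacy libjingle binding where RTCAudioTrack inherits
// setEnabled(_:) from RTCMediaStreamTrack. A disabled track keeps the
// connection alive but transmits silence.
func setMuted(_ muted: Bool, track: RTCAudioTrack) {
    track.setEnabled(!muted)  // enabled == not muted
}
```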

You can't add the stream to the speaker manually, but you can change the default audio output to the speaker or a headphone so that the WebRTC audio is redirected to the correct output. This can be done using the AVFoundation APIs.
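Concretely, the output route can be changed with `AVAudioSession`. A minimal sketch (assuming iOS and the Swift 3-era API used elsewhere in the question; call this once the call is connected):

```swift
import AVFoundation

// Sketch: redirect WebRTC call audio from the earpiece to the loudspeaker.
func routeAudioToSpeaker() {
    let session = AVAudioSession.sharedInstance()
    do {
        // PlayAndRecord is needed for a bidirectional call.
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
        // Override the default output (the receiver/earpiece) with the speaker.
        try session.overrideOutputAudioPort(.speaker)
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```

Passing `.none` instead of `.speaker` restores the default route; plugging in headphones overrides this automatically.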