iOS Broadcast upload extension fails to start after clicking on "Start broadcast"


Please bear with me, this one needs some context.

In my company, a client requested an iOS app for video calls. This client already has a PWA with lots of content cached inside Safari: PDFs, presentations, videos, and so on. They asked for the ability to share that content through the new video call app. We already had one app based on Twilio, so we checked whether sharing the system screen was possible, and once we confirmed that it was, we decided to go for it.

First, I followed the Twilio ReplayKit example to build the main functionality around it. I successfully created an MVP that used App Groups to share data between the app and the broadcast extension, and I was able to run screen sharing, connect to a Twilio room, and broadcast the system screen. Over time, as I added more and more capabilities to the app, screen sharing stopped working: I could still see my app in the broadcast picker, but after I pressed "Start broadcast", the countdown went to 0 and nothing happened.
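
For context, this is roughly how the main app hands the connection data to the extension. This is a sketch on my side; it assumes the same "group.com.myapp" suite name and keys that appear in the SampleHandler below:

import Foundation

// Main app side: store the connection data in the shared App Group before
// the user taps "Start broadcast". Both the app target and the broadcast
// extension target must have this App Group capability enabled.
func storeBroadcastCredentials(roomId: String, accessToken: String) {
    let defaults = UserDefaults(suiteName: "group.com.myapp")
    defaults?.set(roomId, forKey: "roomId")
    defaults?.set(accessToken, forKey: "userToken")
}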

Here is the worst part: I went back to a previous commit to compare what had changed and what might have caused the problem, and found that screen sharing was broken on that previous commit too. I am not sure if this is the cause, but I had deleted the app's entitlements files (I had about 6 of them) and created a single one for all of my configurations. I kept going back, commit by commit, to check whether it worked, but it never did :|

What is also weird (or maybe not, when you think of the Apple "ecosystem") is that all my other apps, like Discord and Teams, stopped screen sharing as well, with exactly the same symptoms as my app. The same thing happened to a coworker; we are not sure when, but he was working on a different branch, and it probably started after a provisioning profile update (we are not certain). Is it possible that the system blocks my app extension from running? If so, how do I 'reset' it?
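
Since the entitlements files were deleted and recreated, a quick runtime check is whether the App Group entitlement is actually applied; FileManager returns nil here when it is missing or misconfigured (a sketch, using the same group id as below):

import Foundation

// Run this in both the app and the extension. If the App Group entitlement
// was lost when the entitlements files were recreated, this returns nil.
let groupURL = FileManager.default.containerURL(
    forSecurityApplicationGroupIdentifier: "group.com.myapp")
print("App Group container:", groupURL?.path ?? "MISSING - entitlement not applied")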

What did I do to try to fix this?

- I manually created provisioning profiles with all the capabilities I expected in the app.
- I tried restoring the old entitlements.
- I restored the iPhone to factory settings (because Teams and Discord had stopped screen sharing).
- I updated iOS to the newest version (that is when Teams and Discord started working again).
- I tried countless code implementations and project settings.
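
One way to narrow down whether the system launches the extension at all is a stripped-down handler that only logs (a sketch; the subsystem string is just a placeholder). If nothing appears in Console.app when the countdown ends, the extension never launched, which points at signing/entitlements rather than the code inside it:

import ReplayKit
import os.log

class DiagnosticSampleHandler: RPBroadcastSampleHandler {
    // Filter Console.app on the connected device by this subsystem.
    private let log = OSLog(subsystem: "com.myapp.broadcast", category: "lifecycle")

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // If this line never shows up, the system refused to launch the extension.
        os_log("broadcastStarted", log: log)
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        // Intentionally empty: this handler only verifies the launch path.
    }
}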

We checked this on different platforms and different devices. It worked on one iPad, but that one had the newest OS installed. I don't really know what to do now, because it looks like the system broadcast picker blocks some of my apps.

Here is my RPSystemBroadcastPickerView declaration:

lazy var broadcastPicker: RPSystemBroadcastPickerView = {
    let view = RPSystemBroadcastPickerView(
        frame: CGRect(origin: .zero, size: CGSize(width: 80, height: 80)))

    // Must exactly match the broadcast extension's bundle identifier.
    view.preferredExtension = AppData.kBEBundleId
    view.showsMicrophoneButton = true

    // Tinting the system button by digging into the subviews is a known
    // workaround, but it is fragile across iOS versions.
    if let button = view.subviews.first as? UIButton {
        button.imageView?.tintColor = UIColor.white
    }
    return view
}()
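
One thing worth double-checking, since AppData.kBEBundleId is defined elsewhere: an embedded extension's bundle identifier must start with the containing app's bundle identifier, and preferredExtension must match it exactly, otherwise the picker may not list the extension at all. A quick sanity check (a sketch; the "BroadcastExtension" suffix is a hypothetical target name):

// Sanity check: ids like "com.myapp.app.BroadcastExtension" are expected.
if let appBundleId = Bundle.main.bundleIdentifier,
   !AppData.kBEBundleId.hasPrefix(appBundleId) {
    print("preferredExtension \(AppData.kBEBundleId) does not belong to \(appBundleId)")
}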

Here is my SampleHandler code:

import ReplayKit
import TwilioVideo

class SampleHandler: RPBroadcastSampleHandler {

    var room: Room?
    var audioTrack: LocalAudioTrack?
    var videoSource: ReplayKitVideoSource?
    var screenTrack: LocalVideoTrack?
    let audioDevice = ExampleReplayKitAudioCapturer(sampleType: SampleHandler.kAudioSampleType)

    var statsTimer: Timer?
    var disconnectSemaphore: DispatchSemaphore?
    
    static let options = ReplayKitVideoSource.TelecineOptions.p30to24or25
    static let kAudioSampleType = RPSampleBufferType.audioApp
    static let kVideoCodec = VideoCodec.H264
    
    func connectAndStartBroadcast(roomId: String, accessToken: String) {

        createVideoTrackIfNeeded()
        createAudioTrackIfNeeded()
        
        let connectOptions = ConnectOptions(token: accessToken) { [unowned self] in
            $0.audioTracks = [self.audioTrack!]
            $0.videoTracks = [self.screenTrack!]
            $0.preferredVideoCodecs = [SampleHandler.kVideoCodec.codec!]
            $0.encodingParameters = self.getParamsAndFormat().0
            $0.isAutomaticSubscriptionEnabled = false
            $0.roomName = roomId
        }
        
        // Note: with a nil delegate, connection failures are never reported back.
        room = TwilioVideoSDK.connect(options: connectOptions, delegate: nil)
    }

    private func createVideoTrackIfNeeded() {
        guard videoSource == nil,
              screenTrack == nil else {
            return
        }
        
        videoSource = ReplayKitVideoSource(
            isScreencast: false,
            telecineOptions: SampleHandler.options)
        
        screenTrack = LocalVideoTrack(
            source: videoSource!,
            enabled: true,
            name: "Screen")
        
        videoSource?.requestOutputFormat(getParamsAndFormat().1)
    }
    
    private func createAudioTrackIfNeeded() {
        guard audioTrack == nil else { return }
        TwilioVideoSDK.audioDevice = self.audioDevice
        audioTrack = LocalAudioTrack()
    }
    
    private func getParamsAndFormat() -> (EncodingParameters, VideoFormat) {
        return ReplayKitVideoSource
            .getParametersForUseCase(
                videoCodec: SampleHandler.kVideoCodec,
                isScreencast: false,
                telecineOptions: SampleHandler.options)
    }
    
    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        super.broadcastStarted(withSetupInfo: setupInfo)
        
        // Read the connection data the main app stored in the shared App Group.
        let defaults = UserDefaults(suiteName: "group.com.myapp")
        let room = defaults?.string(forKey: "roomId") ?? ""
        let token = defaults?.string(forKey: "userToken") ?? ""

        connectAndStartBroadcast(roomId: room, accessToken: token)
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            videoSource?.processFrame(sampleBuffer: sampleBuffer)

        case .audioApp:
            if SampleHandler.kAudioSampleType == .audioApp {
                ExampleCoreAudioDeviceCapturerCallback(audioDevice, sampleBuffer)
            }

        case .audioMic:
            if SampleHandler.kAudioSampleType == .audioMic {
                ExampleCoreAudioDeviceCapturerCallback(audioDevice, sampleBuffer)
            }

        @unknown default:
            break
        }
    }
}
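
For what it is worth, a more defensive broadcastStarted would fail fast when the shared credentials are missing, so the broadcast visibly errors out instead of silently connecting with empty strings. A sketch of what I mean (this would live inside SampleHandler; the NSError domain and code are arbitrary placeholders):

override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
    super.broadcastStarted(withSetupInfo: setupInfo)

    let defaults = UserDefaults(suiteName: "group.com.myapp")
    guard let room = defaults?.string(forKey: "roomId"), !room.isEmpty,
          let token = defaults?.string(forKey: "userToken"), !token.isEmpty else {
        // Shows the system error UI instead of broadcasting into nothing.
        finishBroadcastWithError(NSError(
            domain: "com.myapp.broadcast", code: 1,
            userInfo: [NSLocalizedDescriptionKey: "Missing roomId or userToken in the App Group."]))
        return
    }
    connectAndStartBroadcast(roomId: room, accessToken: token)
}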