MPVolumeView AirPlay button only responds to touches when mirroring


I am using a custom AVPlayerLayer to display a simple video. I am trying to add AirPlay support, but the AirPlay button, when tapped, does not show anything.

self.player.allowsExternalPlayback = true
...
let airplayButton = MPVolumeView(frame: self.airplayButtonPlaceholder!.bounds)
airplayButton.showsRouteButton = true
airplayButton.showsVolumeSlider = false

self.airplayButtonPlaceholder?.addSubview(airplayButton)
self.airplayButtonPlaceholder?.backgroundColor = UIColor.clear

When I run my code (on a real device), I see the button, but tapping it does nothing. What could be causing this? Is it because I am using a custom AVPlayerLayer and AVPlayer?

EDIT:

When I turn on mirroring through the control center, I can touch the button and it displays the pop up. What's going on?

Accepted answer (by Pochi):

Nothing happens because you haven't properly configured this "new window".

There are two ways to display content over AirPlay.

Mirroring

Doesn't need any configuration.

Note: You don’t need to do anything to make mirroring happen. In iOS 5.0 and later, mirroring—that is, displaying the same content on both the host device and the external display—occurs by default when the user selects an AirPlay video output device.

Extra Window

(check apple guide here)

The steps, as described by Apple, are:

  1. At app startup, check for the presence of an external display and register for the screen connection and disconnection notifications.
  2. When an external display is available—whether at app launch or while your app is running—create and configure a window for it.
  3. Associate the window with the appropriate screen object, show the second window, and update it normally.

Here is the code taken from the Apple docs for quick reference.


- Create a New Window If an External Display Is Already Present

- (void)checkForExistingScreenAndInitializeIfPresent
{
    if ([[UIScreen screens] count] > 1)
    {
        // Get the screen object that represents the external display.
        UIScreen *secondScreen = [[UIScreen screens] objectAtIndex:1];
        // Get the screen's bounds so that you can create a window of the correct size.
        CGRect screenBounds = secondScreen.bounds;

        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = secondScreen;

        // Set up initial content to display...
        // Show the window.
        self.secondWindow.hidden = NO;
    }
}

- Register for Connection and Disconnection Notifications

- (void)setUpScreenConnectionNotificationHandlers
{
    NSNotificationCenter *center = [NSNotificationCenter defaultCenter];

    [center addObserver:self selector:@selector(handleScreenDidConnectNotification:)
            name:UIScreenDidConnectNotification object:nil];
    [center addObserver:self selector:@selector(handleScreenDidDisconnectNotification:)
            name:UIScreenDidDisconnectNotification object:nil];
}

- Handle Connection and Disconnection Notifications

- (void)handleScreenDidConnectNotification:(NSNotification*)aNotification
{
    UIScreen *newScreen = [aNotification object];
    CGRect screenBounds = newScreen.bounds;

    if (!self.secondWindow)
    {
        self.secondWindow = [[UIWindow alloc] initWithFrame:screenBounds];
        self.secondWindow.screen = newScreen;

        // Set the initial UI for the window.
    }
}

- (void)handleScreenDidDisconnectNotification:(NSNotification*)aNotification
{
    if (self.secondWindow)
    {
        // Hide and then delete the window.
        self.secondWindow.hidden = YES;
        self.secondWindow = nil;

    }

}
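Since the question's code is in Swift, the same three steps can be sketched in Swift as well. This is a hedged translation of the Objective-C above, not Apple's own sample: the `ExternalDisplayManager` class name and `start()` method are made up for illustration, and the content setup is left as a comment for you to fill in.

```swift
import UIKit

final class ExternalDisplayManager: NSObject {
    var secondWindow: UIWindow?

    // 1. At startup: check for an already-connected external screen and
    //    register for connect/disconnect notifications.
    func start() {
        if UIScreen.screens.count > 1 {
            configureWindow(for: UIScreen.screens[1])
        }
        let center = NotificationCenter.default
        center.addObserver(self, selector: #selector(screenDidConnect(_:)),
                           name: UIScreen.didConnectNotification, object: nil)
        center.addObserver(self, selector: #selector(screenDidDisconnect(_:)),
                           name: UIScreen.didDisconnectNotification, object: nil)
    }

    // 2./3. Create a window sized to the external screen, associate it with
    //       that screen, and show it.
    private func configureWindow(for screen: UIScreen) {
        guard secondWindow == nil else { return }
        let window = UIWindow(frame: screen.bounds)
        window.screen = screen
        // Set up the content to display, e.g. attach your AVPlayerLayer here.
        window.isHidden = false
        secondWindow = window
    }

    @objc private func screenDidConnect(_ notification: Notification) {
        guard let screen = notification.object as? UIScreen else { return }
        configureWindow(for: screen)
    }

    @objc private func screenDidDisconnect(_ notification: Notification) {
        // Hide and release the window when the external display goes away.
        secondWindow?.isHidden = true
        secondWindow = nil
    }
}
```

Note that on recent iOS versions Apple steers external-display handling toward the scene-based APIs instead of raw `UIScreen` notifications, so treat this as a sketch of the approach the docs describe rather than the current best practice.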

EDIT:

When using an AVPlayerViewController, AirPlay support is already implemented automatically, as described in the documentation here.

AVPlayerViewController automatically supports AirPlay, but you need to perform some project and audio session configuration before it can be enabled in your application.
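A minimal sketch of that configuration, assuming you also enable the "Audio, AirPlay, and Picture in Picture" background mode in your target's capabilities (`videoURL` below is a placeholder for your own video URL):

```swift
import AVKit
import AVFoundation

// Configure the audio session once, e.g. at app launch. The "Audio, AirPlay,
// and Picture in Picture" background mode must also be enabled in the
// target's capabilities for AirPlay to keep playing in the background.
try? AVAudioSession.sharedInstance().setCategory(.playback)
try? AVAudioSession.sharedInstance().setActive(true)

// AVPlayerViewController provides its own AirPlay route button in its
// playback controls, so no MPVolumeView is needed.
let player = AVPlayer(url: videoURL) // videoURL: placeholder for your URL
player.allowsExternalPlayback = true

let playerViewController = AVPlayerViewController()
playerViewController.player = player
// From a view controller:
// present(playerViewController, animated: true) { player.play() }
```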