I've implemented an audio player using AVAudioPlayer (not AVPlayer). I'm able to handle the remote control events with the following method. It works quite alright so far, however I see two more subtypes for these events: UIEventSubtypeRemoteControlEndSeekingForward and UIEventSubtypeRemoteControlEndSeekingBackward.
- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    // If it is a remote control event, handle it accordingly.
    if (event.type == UIEventTypeRemoteControl)
    {
        if (event.subtype == UIEventSubtypeRemoteControlPlay)
        {
            [self playAudio];
        }
        else if (event.subtype == UIEventSubtypeRemoteControlPause)
        {
            [self pauseAudio];
        }
        else if (event.subtype == UIEventSubtypeRemoteControlTogglePlayPause)
        {
            [self togglePlayPause];
        }
        else if (event.subtype == UIEventSubtypeRemoteControlBeginSeekingBackward)
        {
            [self rewindTheAudio]; // this method rewinds the audio by 15 seconds
        }
        else if (event.subtype == UIEventSubtypeRemoteControlBeginSeekingForward)
        {
            [self fastForwardTheAudio]; // this method fast-forwards the audio by 15 seconds
        }
    }
}
So the questions:

1. In order to have things work right, am I supposed to implement those two subtypes, too?

2. This method only enables the rewind, play/pause, and fast forward buttons on the lock screen, but it doesn't display the file title, artwork, and duration. How can I display that info using AVAudioPlayer or AVAudioSession (I don't really want one more library/API to implement this)?

2-a. I discovered MPNowPlayingInfoCenter while searching, and I don't know much about it. Do I have to use it to implement the stuff above? :-[
You are correct, MPNowPlayingInfoCenter is the only way to do this. So go ahead and link with MediaPlayer.framework. In the class that handles playing tracks, import <MediaPlayer/MediaPlayer.h>. Whenever your track changes, do this:
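A minimal sketch of what that update could look like. It assumes your class has an `audioPlayer` property holding the AVAudioPlayer; the title, artist, and artwork values are hypothetical placeholders you would replace with your own track metadata:

```objectivec
#import <MediaPlayer/MediaPlayer.h>

// Sketch: push the current track's metadata to the lock screen.
// self.audioPlayer, the strings, and the image name are assumptions.
- (void)updateNowPlayingInfo {
    NSMutableDictionary *info = [NSMutableDictionary dictionary];

    info[MPMediaItemPropertyTitle] = @"Track Title";        // placeholder
    info[MPMediaItemPropertyArtist] = @"Artist Name";       // placeholder
    info[MPMediaItemPropertyPlaybackDuration] = @(self.audioPlayer.duration);
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = @(self.audioPlayer.currentTime);
    info[MPNowPlayingInfoPropertyPlaybackRate] = @(1.0);    // 1.0 while playing, 0.0 when paused

    UIImage *image = [UIImage imageNamed:@"artwork"];       // placeholder asset
    if (image) {
        info[MPMediaItemPropertyArtwork] =
            [[MPMediaItemArtwork alloc] initWithImage:image];
    }

    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = info;
}
```

Note that the elapsed time is not polled live: the lock screen advances it on its own based on the playback rate you supply, so you should call this again whenever playback state changes (play, pause, seek, or track change) so the displayed position stays accurate.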