  • Music playback: background audio, headphone remote control, and headphone plug/unplug detection

    1. First, link the system framework AVFoundation, then add the following code to the AppDelegate's application-launch handler:

    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setCategory:AVAudioSessionCategoryPlayback error:nil];
    [session setActive:YES error:nil];

    AVAudioSessionCategoryPlayback is the category that enables background playback.

    Adding this code alone does not enable background playback; you also need to declare in the app's Info.plist that it needs to keep running in the background.

    Open Info.plist, add the "Required background modes" key, edit Item 0 to "audio", and press Return; Xcode will autocomplete the value.
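
    In raw plist source, "Required background modes" corresponds to the UIBackgroundModes key; the entry ends up looking roughly like this:

        <key>UIBackgroundModes</key>
        <array>
            <string>audio</string>
        </array>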

    2. Next, register with the system for remote-control events (Remote Control). Add the following code to the ViewController that plays the audio:

    - (void)viewWillAppear:(BOOL)animated
    {
      [super viewWillAppear:animated];  
      [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
      [self becomeFirstResponder];
    }
     
    - (void)viewWillDisappear:(BOOL)animated
    {
      [super viewWillDisappear:animated];  
      [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
      [self resignFirstResponder];
    }
     
    - (BOOL)canBecomeFirstResponder
    {
      return YES;
    }
    3. Registration alone is not enough; for the controls to actually work you also need to respond to the different remote-control events:
    - (void)remoteControlReceivedWithEvent:(UIEvent *)event
    {
      if (event.type == UIEventTypeRemoteControl) {
            switch (event.subtype) {
                case UIEventSubtypeRemoteControlTogglePlayPause:
                    [self resumeOrPause]; // toggle between play and pause
                    break;
     
                case UIEventSubtypeRemoteControlPreviousTrack:
                    [self playPrev]; // previous-track button
                    break;
     
                case UIEventSubtypeRemoteControlNextTrack:
                    [self playNext]; // next-track button
                    break;
     
                default:
                    break;
            }
        }
    }
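    Some event sources send separate Play and Pause subtypes instead of the toggle, so it can be worth handling those as well. A minimal sketch of two extra case labels for the switch above, reusing the same resumeOrPause helper:

        case UIEventSubtypeRemoteControlPlay:
        case UIEventSubtypeRemoteControlPause:
            [self resumeOrPause]; // some sources send Play/Pause rather than TogglePlayPause
            break;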
    4. On the lock screen you can display the current track's artwork and some metadata:
    - (void)configPlayingInfo
    {
      if (NSClassFromString(@"MPNowPlayingInfoCenter")) {
        NSMutableDictionary * dict = [[NSMutableDictionary alloc] init];
        [dict setObject:@"曲目标题" forKey:MPMediaItemPropertyTitle];
        [dict setObject:@"曲目艺术家" forKey:MPMediaItemPropertyArtist];
        [dict setObject:[[[MPMediaItemArtwork alloc] initWithImage:[UIImage imageNamed:@"曲目封面.png"]] autorelease] forKey:MPMediaItemPropertyArtwork];
     
        [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:nil];
        [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:dict];
      }
    }
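    The now-playing dictionary can also carry timing information so that the lock screen shows a progress bar. A small sketch using MediaPlayer framework constants; the duration and elapsed-time values here are placeholders:

        [dict setObject:[NSNumber numberWithDouble:240.0] forKey:MPMediaItemPropertyPlaybackDuration];        // total track length in seconds (placeholder)
        [dict setObject:[NSNumber numberWithDouble:42.0] forKey:MPNowPlayingInfoPropertyElapsedPlaybackTime]; // current position in seconds (placeholder)
        [dict setObject:[NSNumber numberWithDouble:1.0] forKey:MPNowPlayingInfoPropertyPlaybackRate];         // 1.0 while playing, 0.0 while paused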
    5. Monitoring headphone plug/unplug
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(outputDeviceChanged:)
                                                 name:AVAudioSessionRouteChangeNotification
                                               object:[AVAudioSession sharedInstance]];

     - (void)outputDeviceChanged:(NSNotification *)aNotification
     {
         // do your jobs here
     }

    Note how the addObserver parameters are filled in: the object must be [AVAudioSession sharedInstance],
    not nil as we often pass elsewhere; if it is nil here, the notification will not be delivered.
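
    One possible way to fill in the handler is to read the reason for the route change from the notification's userInfo. A minimal sketch, assuming the hasHeadset helper shown later in this post:

     - (void)outputDeviceChanged:(NSNotification *)aNotification
     {
         NSInteger reason = [[aNotification.userInfo objectForKey:AVAudioSessionRouteChangeReasonKey] integerValue];
         if (reason == AVAudioSessionRouteChangeReasonNewDeviceAvailable && [self hasHeadset]) {
             // headphones were just plugged in
         } else if (reason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable && ![self hasHeadset]) {
             // headphones were just unplugged; many players pause playback here
         }
     }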


    1. Detecting an audio input device (microphone)
    - (BOOL)hasMicphone {
        return [[AVAudioSession sharedInstance] inputIsAvailable];
    }
    2. Detecting the output device. We only consider two cases: the device's built-in speaker (iPod touch/iPad/iPhone all have one), and whether headphones with their own output are currently plugged in.
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    All known route values:
    "Headset"  "Headphone"  "Speaker" "SpeakerAndMicrophone" "HeadphonesAndMicrophone"  "HeadsetInOut" "ReceiverAndMicrophone"  "Lineout" 


    Checking whether headphones are present:
     - (BOOL)hasHeadset {  
        #if TARGET_IPHONE_SIMULATOR  
            #warning *** Simulator mode: audio session code works only on a device  
            return NO;  
        #else   
        CFStringRef route;  
        UInt32 propertySize = sizeof(CFStringRef);  
        AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);  
        if((route == NULL) || (CFStringGetLength(route) == 0)){  
            // Silent Mode  
            NSLog(@"AudioRoute: SILENT, do nothing!");  
        } else {  
            NSString* routeStr = (NSString*)route;  
            NSLog(@"AudioRoute: %@", routeStr);  
            /* Known values of route:  
             * "Headset"  
             * "Headphone"  
             * "Speaker"  
             * "SpeakerAndMicrophone"  
             * "HeadphonesAndMicrophone"  
             * "HeadsetInOut"  
             * "ReceiverAndMicrophone"  
             * "Lineout"  
             */  
            NSRange headphoneRange = [routeStr rangeOfString : @"Headphone"];  
            NSRange headsetRange = [routeStr rangeOfString : @"Headset"];  
            if (headphoneRange.location != NSNotFound) {  
                return YES;  
            } else if(headsetRange.location != NSNotFound) {  
                return YES;  
            }  
        }  
        return NO;  
        #endif  
    }

    This code cannot run on the simulator (it will crash outright), so the simulator case has to be handled first, which is what the TARGET_IPHONE_SIMULATOR guard above does.

    3. Forcing the output route

     - (void)resetOutputTarget {  
        BOOL hasHeadset = [self hasHeadset];  
        NSLog (@"Will Set output target is_headset = %@ .", hasHeadset ? @"YES" : @"NO");  
        UInt32 audioRouteOverride = hasHeadset ?  
            kAudioSessionOverrideAudioRoute_None:kAudioSessionOverrideAudioRoute_Speaker;  
        AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);  
    } 
    4. Setting the audio session category (I think of it as a "work mode")
    iOS audio supports several work modes (categories). To implement a given feature, the AudioSession must first be set to a category that supports it. All supported categories are listed below:
     Audio Session Categories  
    Category identifiers for audio sessions, used as values for the setCategory:error: method.  
    NSString *const AVAudioSessionCategoryAmbient;  
    NSString *const AVAudioSessionCategorySoloAmbient;  
    NSString *const AVAudioSessionCategoryPlayback;  
    NSString *const AVAudioSessionCategoryRecord;  
    NSString *const AVAudioSessionCategoryPlayAndRecord;  
    NSString *const AVAudioSessionCategoryAudioProcessing; 
    
    
    See the iOS documentation for the details of each category. In short, AVAudioSessionCategoryRecord is a record-only mode,
    AVAudioSessionCategoryPlayAndRecord supports both recording and playback, and
    AVAudioSessionCategoryPlayback is the ordinary playback mode.
    Setting the category:
     
         - (BOOL)checkAndPrepareCategoryForRecording {  
            recording = YES;  
            BOOL hasMicphone = [self hasMicphone];  
            NSLog(@"Will Set category for recording! hasMicophone = %@", hasMicphone?@"YES":@"NO");  
            if (hasMicphone) {  
                [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord  
                                                       error:nil];  
            }  
            [self resetOutputTarget];  
            return hasMicphone;  
        }  
        - (void)resetCategory {  
            if (!recording) {  
                NSLog(@"Will Set category to static value = AVAudioSessionCategoryPlayback!");  
                [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback  
                                                       error:nil];  
            }  
        }  
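
    A minimal sketch of how these two methods might be called around a recording session (the actual start/stop of the recorder is elided; `recording` is the same flag used above):

        // before starting the recorder
        if ([self checkAndPrepareCategoryForRecording]) {
            // safe to start AVAudioRecorder / Audio Queue recording here
        }
        // ...
        // when the recording finishes
        recording = NO;          // clear the flag so resetCategory takes effect
        [self resetCategory];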

    5. Detecting headphone plug/unplug events

    
    Headphone plug/unplug events are detected by listening for the AudioSession RouteChange event and then checking the headphone state. There are two steps: first register a listener callback, then determine the headphone state inside that callback.
    Registering the listener:
     AudioSessionAddPropertyListener (kAudioSessionProperty_AudioRouteChange,  
                                         audioRouteChangeListenerCallback,  
                                         self);

     


    Our requirement is to react when headphones are plugged in or unplugged, but a RouteChange event can occur for several reasons, so we have to handle each reason type and combine it with the current headphone state. According to the iOS documentation, the possible reasons for a RouteChange event are:
     Audio Session Route Change Reasons  
    Identifiers for the various reasons that an audio route can change while your iOS application is running.  
    enum {  
       kAudioSessionRouteChangeReason_Unknown                    = 0,  
       kAudioSessionRouteChangeReason_NewDeviceAvailable         = 1,  
       kAudioSessionRouteChangeReason_OldDeviceUnavailable       = 2,  
       kAudioSessionRouteChangeReason_CategoryChange             = 3,  
       kAudioSessionRouteChangeReason_Override                   = 4,  
       // this enum has no constant with a value of 5  
       kAudioSessionRouteChangeReason_WakeFromSleep              = 6,  
       kAudioSessionRouteChangeReason_NoSuitableRouteForCategory = 7  
    };
    
    
    See the iOS documentation for the meaning of each reason. The ones we care about are kAudioSessionRouteChangeReason_NewDeviceAvailable (a new device became available),
    kAudioSessionRouteChangeReason_OldDeviceUnavailable (an existing device was removed), and
    kAudioSessionRouteChangeReason_NoSuitableRouteForCategory (the current category has no suitable device).
    When a new device becomes available and headphones are detected, we treat it as a headphone plug-in event; when an existing device is removed and headphones are no longer detected, we treat it as a headphone unplug event; when "no suitable route for the category" occurs, we treat it directly as the microphone being unplugged during recording.
    Obviously this logic is not strictly accurate: if headphones were already present and some other audio device is plugged in, or if there were no headphones and some other audio device is removed, the classification will be wrong. But for our project what matters is not whether it was specifically the headphones that were plugged in or unplugged; what we really need is to re-apply our settings based on the current headphone/microphone state whenever any audio device is added or removed, so this implementation is correct for our purposes.
    The listener callback:
     void audioRouteChangeListenerCallback (  
                                           void                      *inUserData,  
                                           AudioSessionPropertyID    inPropertyID,  
                                           UInt32                    inPropertyValueSize,  
                                           const void                *inPropertyValue  
                                           ) {  
        if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;  
        // Determines the reason for the route change, to ensure that it is not  
        //        because of a category change.  
      
        CFDictionaryRef    routeChangeDictionary = inPropertyValue;  
        CFNumberRef routeChangeReasonRef =  
        CFDictionaryGetValue (routeChangeDictionary,  
                              CFSTR (kAudioSession_AudioRouteChangeKey_Reason));  
        SInt32 routeChangeReason;  
        CFNumberGetValue (routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);  
        NSLog(@" ======================= RouteChangeReason : %d", routeChangeReason);  
        AudioHelper *_self = (AudioHelper *) inUserData;  
        if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {  
            [_self resetSettings];  
            if (![_self hasHeadset]) {  
                [[NSNotificationCenter defaultCenter] postNotificationName:@"ununpluggingHeadse  
                                                                    object:nil];  
            }  
        } else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {  
            [_self resetSettings];  
            if (![_self hasMicphone]) {  
                [[NSNotificationCenter defaultCenter] postNotificationName:@"pluggInMicrophone"  
                                                                    object:nil];  
            }  
        } else if (routeChangeReason == kAudioSessionRouteChangeReason_NoSuitableRouteForCategory) {  
            [_self resetSettings];  
            [[NSNotificationCenter defaultCenter] postNotificationName:@"lostMicroPhone"  
                                                                object:nil];  
        }  
        //else if (routeChangeReason == kAudioSessionRouteChangeReason_CategoryChange  ) {  
        //    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];         
        //}  
        [_self printCurrentCategory];  
    } 
    Once the relevant event has been detected, NSNotificationCenter is used to notify observers of the headphone (with or without microphone) plug/unplug event, which in turn triggers the related handling.
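
    For reference, a minimal sketch of how another object might subscribe to the notifications posted by the callback above (the notification names match those used in the callback; headsetUnplugged: is a hypothetical handler):

        // e.g. in viewDidLoad or init:
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(headsetUnplugged:)
                                                     name:@"unpluggingHeadset"
                                                   object:nil];

        - (void)headsetUnplugged:(NSNotification *)note {
            // e.g. pause playback or stop recording when the headset goes away
        }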


  • Original post: https://www.cnblogs.com/cdp-snail/p/4969785.html