

A Hands-On Guide to Implementing WeChat-Style Short Video on iOS

2019-10-21 18:53:01

A while back, a project required adding a WeChat-style short video feature to the chat module. This post summarizes the problems I ran into and how I solved them, in the hope that it helps others with the same requirement.

Preview of the result:

(demo GIF omitted)

Here are the main problems I ran into:
1. Video cropping: WeChat's short video uses only part of the frame captured by the camera.
2. Choppy scrolling previews: playing videos with AVPlayer stutters badly while the list is scrolling.

Now let's build it step by step.
Part 1: Implementing video recording
1. Implementing the recording class WKMovieRecorder
Create a WKMovieRecorder class that is responsible for video recording.

@interface WKMovieRecorder : NSObject

+ (WKMovieRecorder *)sharedRecorder;
- (instancetype)initWithMaxDuration:(NSTimeInterval)duration;

@end

Define the callback blocks:

/**
 * Recording finished
 *
 * @param info         callback info
 * @param finishReason why recording ended (cancelled or finished normally)
 */
typedef void(^FinishRecordingBlock)(NSDictionary *info, WKRecorderFinishedReason finishReason);

/**
 * Focus area changed
 */
typedef void(^FocusAreaDidChanged)();

/**
 * Authorization check
 *
 * @param success whether authorization succeeded
 */
typedef void(^AuthorizationResult)(BOOL success);

@interface WKMovieRecorder : NSObject

// Callbacks
@property (nonatomic, copy) FinishRecordingBlock finishBlock;              // recording finished
@property (nonatomic, copy) FocusAreaDidChanged focusAreaDidChangedBlock;
@property (nonatomic, copy) AuthorizationResult authorizationResultBlock;

@end

Define a cropSize property that will be used to crop the video:
@property (nonatomic, assign) CGSize cropSize;

Next comes the capture implementation. The code here is fairly long; if you don't feel like reading it, skip ahead to the video cropping section.

Recorder configuration:

@interface WKMovieRecorder () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate, WKMovieWriterDelegate>
{
    AVCaptureSession *_session;
    AVCaptureVideoPreviewLayer *_preview;
    WKMovieWriter *_writer;
    // Pause/recording state
    BOOL _isCapturing;
    BOOL _isPaused;
    BOOL _discont;
    int _currentFile;
    CMTime _timeOffset;
    CMTime _lastVideo;
    CMTime _lastAudio;

    NSTimeInterval _maxDuration;
}

// Session management
@property (nonatomic, strong) dispatch_queue_t sessionQueue;
@property (nonatomic, strong) dispatch_queue_t videoDataOutputQueue;
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureDevice *captureDevice;
@property (nonatomic, strong) AVCaptureDeviceInput *videoDeviceInput;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, strong) AVCaptureConnection *videoConnection;
@property (nonatomic, strong) AVCaptureConnection *audioConnection;
@property (nonatomic, strong) NSDictionary *videoCompressionSettings;
@property (nonatomic, strong) NSDictionary *audioCompressionSettings;
@property (nonatomic, strong) AVAssetWriterInputPixelBufferAdaptor *adaptor;
@property (nonatomic, strong) AVCaptureVideoDataOutput *videoDataOutput;

// Utilities
@property (nonatomic, strong) NSMutableArray *frames;          // recorded frames
@property (nonatomic, assign) CaptureAVSetupResult result;
@property (atomic, readwrite) BOOL isCapturing;
@property (atomic, readwrite) BOOL isPaused;
@property (nonatomic, strong) NSTimer *durationTimer;
@property (nonatomic, assign) WKRecorderFinishedReason finishReason;

@end

Instantiation:

+ (WKMovieRecorder *)sharedRecorder
{
    static WKMovieRecorder *recorder;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        recorder = [[WKMovieRecorder alloc] initWithMaxDuration:CGFLOAT_MAX];
    });

    return recorder;
}

- (instancetype)initWithMaxDuration:(NSTimeInterval)duration
{
    if (self = [self init]) {
        _maxDuration = duration;
        _duration = 0.f;
    }

    return self;
}

- (instancetype)init
{
    self = [super init];
    if (self) {
        _maxDuration = CGFLOAT_MAX;
        _duration = 0.f;
        _sessionQueue = dispatch_queue_create("wukong.movieRecorder.queue", DISPATCH_QUEUE_SERIAL);
        _videoDataOutputQueue = dispatch_queue_create("wukong.movieRecorder.video", DISPATCH_QUEUE_SERIAL);
        dispatch_set_target_queue(_videoDataOutputQueue, dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0));
    }
    return self;
}

2. Initial setup
The initial setup consists of creating the session, checking permissions, and configuring the session.
1) Creating the session
self.session = [[AVCaptureSession alloc] init];
self.result = CaptureAVSetupResultSuccess;

2) Permission check

// Permission check
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusNotDetermined: {
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                self.result = CaptureAVSetupResultSuccess;
            }
        }];
        break;
    }
    case AVAuthorizationStatusAuthorized: {
        break;
    }
    default: {
        self.result = CaptureAVSetupResultCameraNotAuthorized;
    }
}

if (self.result != CaptureAVSetupResultSuccess) {
    if (self.authorizationResultBlock) {
        self.authorizationResultBlock(NO);
    }
    return;
}

3) Configuring the session
Note that an AVCaptureSession must not be configured on the main thread; create a dedicated serial queue for it.
3.1.1 Getting the input device and input stream

AVCaptureDevice *captureDevice = [[self class] deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
_captureDevice = captureDevice;

NSError *error = nil;
_videoDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];

if (!_videoDeviceInput) {
    NSLog(@"Could not find a capture device");
}

3.1.2 Frame rate settings
The frame rate settings are mainly there to accommodate the iPhone 4, a device that really ought to be retired by now.

int frameRate;
if ([NSProcessInfo processInfo].processorCount == 1) {
    if ([self.session canSetSessionPreset:AVCaptureSessionPresetLow]) {
        [self.session setSessionPreset:AVCaptureSessionPresetLow];
    }
    frameRate = 10;
} else {
    if ([self.session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
        [self.session setSessionPreset:AVCaptureSessionPreset640x480];
    }
    frameRate = 30;
}

CMTime frameDuration = CMTimeMake(1, frameRate);

if ([_captureDevice lockForConfiguration:&error]) {
    _captureDevice.activeVideoMaxFrameDuration = frameDuration;
    _captureDevice.activeVideoMinFrameDuration = frameDuration;
    [_captureDevice unlockForConfiguration];
} else {
    NSLog(@"videoDevice lockForConfiguration returned error %@", error);
}

3.1.3 Video output settings
The important point here is to set the orientation on the videoConnection; otherwise the picture will not display correctly when the device rotates.

 

// Video
if ([self.session canAddInput:_videoDeviceInput]) {

    [self.session addInput:_videoDeviceInput];
    self.videoDeviceInput = _videoDeviceInput;
    [self.session removeOutput:_videoDataOutput];

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    _videoDataOutput = videoOutput;
    videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

    [videoOutput setSampleBufferDelegate:self queue:_videoDataOutputQueue];

    videoOutput.alwaysDiscardsLateVideoFrames = NO;

    if ([_session canAddOutput:videoOutput]) {
        [_session addOutput:videoOutput];

        [_captureDevice addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:FocusAreaChangedContext];

        _videoConnection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];

        if (_videoConnection.isVideoStabilizationSupported) {
            _videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }

        UIInterfaceOrientation statusBarOrientation = [UIApplication sharedApplication].statusBarOrientation;
        AVCaptureVideoOrientation initialVideoOrientation = AVCaptureVideoOrientationPortrait;
        if (statusBarOrientation != UIInterfaceOrientationUnknown) {
            initialVideoOrientation = (AVCaptureVideoOrientation)statusBarOrientation;
        }

        _videoConnection.videoOrientation = initialVideoOrientation;
    }
} else {
    NSLog(@"Could not add video device input to the session");
}

3.1.4 Audio settings
To avoid dropping frames, the audio output callback must run on its own serial queue.

// Audio
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

if (!audioDeviceInput) {
    NSLog(@"Could not create audio device input: %@", error);
}

if ([self.session canAddInput:audioDeviceInput]) {
    [self.session addInput:audioDeviceInput];
} else {
    NSLog(@"Could not add audio device input to the session");
}

AVCaptureAudioDataOutput *audioOut = [[AVCaptureAudioDataOutput alloc] init];
// Put audio on its own queue to ensure that our video processing doesn't cause us to drop audio
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("wukong.movieRecorder.audio", DISPATCH_QUEUE_SERIAL);
[audioOut setSampleBufferDelegate:self queue:audioCaptureQueue];

if ([self.session canAddOutput:audioOut]) {
    [self.session addOutput:audioOut];
}
_audioConnection = [audioOut connectionWithMediaType:AVMediaTypeAudio];

One more thing to note: the session configuration code should be wrapped like this
[self.session beginConfiguration];

// ... configuration code ...

[self.session commitConfiguration];
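
Tying this together with the serial queue created in init, the whole configuration can be dispatched off the main thread. This is only a rough sketch of that idea; the method name setupCaptureSession is illustrative and not part of the original demo.

// Sketch only: run the whole configuration on the serial sessionQueue, never on the main thread.
// `setupCaptureSession` is an illustrative name, not the demo's actual method.
- (void)setupCaptureSession
{
    dispatch_async(self.sessionQueue, ^{
        [self.session beginConfiguration];

        // ... add the video/audio inputs and outputs shown above ...

        [self.session commitConfiguration];
    });
}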

For reasons of space, I will only highlight the key parts of the remaining recording code.
3.2 Saving the video
Now the audio and video need to be written to the sandbox from the AVCaptureVideoDataOutputSampleBufferDelegate and AVCaptureAudioDataOutputSampleBufferDelegate callbacks. One thing to watch out for: the first frame captured after the session starts is black and should be discarded. A sketch of such a callback follows.
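A minimal sketch of that delegate callback, assuming a _hasDroppedFirstVideoFrame flag and a WKMovieWriter method for appending buffers; those two names are illustrative, not the demo's actual code.

// Sketch only: drop the first (black) video frame, then hand buffers to the writer.
// `_hasDroppedFirstVideoFrame` and `appendSampleBuffer:mediaType:` are illustrative names.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (connection == _videoConnection && !_hasDroppedFirstVideoFrame) {
        // The first frame after the session starts is black; discard it.
        _hasDroppedFirstVideoFrame = YES;
        return;
    }

    if (self.isCapturing && !self.isPaused) {
        [_writer appendSampleBuffer:sampleBuffer
                          mediaType:(connection == _videoConnection) ? AVMediaTypeVideo : AVMediaTypeAudio];
    }
}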
3.2.1 Create a WKMovieWriter class to encapsulate saving the video
WKMovieWriter's main job is to take the CMSampleBufferRef, crop it with AVAssetWriter, and write the result to the sandbox.
Below is the cropping configuration. AVAssetWriter crops the video according to cropSize. One thing to note: the width of cropSize must be a multiple of 320, otherwise the cropped video ends up with a green line along its right edge.

NSDictionary *videoSettings;
if (_cropSize.height == 0 || _cropSize.width == 0) {
    _cropSize = [UIScreen mainScreen].bounds.size;
}

videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                 AVVideoCodecH264, AVVideoCodecKey,
                 [NSNumber numberWithInt:_cropSize.width], AVVideoWidthKey,
                 [NSNumber numberWithInt:_cropSize.height], AVVideoHeightKey,
                 AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey,
                 nil];
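
For context, here is a rough sketch of how a settings dictionary like this is typically attached to an AVAssetWriterInput. The output URL and variable names are illustrative; this is not WKMovieWriter's exact code.

// Sketch only: feeding the cropping settings into an AVAssetWriter.
// `outputURL` is an illustrative sandbox file URL.
NSError *writerError = nil;
AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                      fileType:AVFileTypeMPEG4
                                                         error:&writerError];

AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
videoInput.expectsMediaDataInRealTime = YES;   // we append live capture buffers

if ([assetWriter canAddInput:videoInput]) {
    [assetWriter addInput:videoInput];
}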

With that, video recording is complete.
Next up is the preview problem.

Part 2: Fixing the preview stutter
1.1 Generating a GIF
Searching around, I found a blog post explaining that the WeChat team avoided the choppy preview by playing the video as a GIF built from images. The sample code in that post was broken, though: playing the images through Core Animation made memory spike until the app crashed. Still, it gave me an idea. A previous project had played a GIF on its launch screen, so I wondered whether I could convert the video into images and then into a GIF for playback. Some googling paid off, and I found a way to turn an array of images into a GIF.

The GIF conversion code:

static void makeAnimatedGif(NSArray *images, NSURL *gifURL, NSTimeInterval duration)
{
    NSTimeInterval perSecond = duration / images.count;

    NSDictionary *fileProperties = @{
        (__bridge id)kCGImagePropertyGIFDictionary: @{
            (__bridge id)kCGImagePropertyGIFLoopCount: @0, // 0 means loop forever
        }
    };

    NSDictionary *frameProperties = @{
        (__bridge id)kCGImagePropertyGIFDictionary: @{
            (__bridge id)kCGImagePropertyGIFDelayTime: @(perSecond), // a float (not double!) in seconds, rounded to centiseconds in the GIF data
        }
    };

    CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)gifURL, kUTTypeGIF, images.count, NULL);
    CGImageDestinationSetProperties(destination, (__bridge CFDictionaryRef)fileProperties);

    for (UIImage *image in images) {
        @autoreleasepool {
            CGImageDestinationAddImage(destination, image.CGImage, (__bridge CFDictionaryRef)frameProperties);
        }
    }

    if (!CGImageDestinationFinalize(destination)) {
        NSLog(@"failed to finalize image destination");
    }
    CFRelease(destination);
}

The conversion worked, but a new problem appeared: generating a GIF with ImageIO makes memory spike instantly to over 100 MB, and generating several GIFs at the same time still crashes the app. To fix this, GIF generation has to go through a serial queue, as in the sketch below.
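A minimal sketch of that serial queue, assuming the makeAnimatedGif function above; the queue label is illustrative.

// Sketch only: funnel GIF generation through one serial queue so that only a single
// ImageIO conversion is in flight at a time. The queue label is illustrative.
static dispatch_queue_t gifQueue;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    gifQueue = dispatch_queue_create("wukong.movieRecorder.gif", DISPATCH_QUEUE_SERIAL);
});

dispatch_async(gifQueue, ^{
    // images, gifURL and duration come from the video-to-UIImage step described next
    makeAnimatedGif(images, gifURL, duration);
});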

1.2 Converting the video into UIImages
The conversion is done with AVAssetReader, AVAssetTrack, and AVAssetReaderTrackOutput.

 

// Convert to UIImages
- (void)convertVideoUIImagesWithURL:(NSURL *)url finishBlock:(void (^)(id images, NSTimeInterval duration))finishBlock
{
    AVAsset *asset = [AVAsset assetWithURL:url];
    NSError *error = nil;
    self.reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    NSTimeInterval duration = CMTimeGetSeconds(asset.duration);
    __weak typeof(self) weakSelf = self;
    dispatch_queue_t backgroundQueue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0);
    dispatch_async(backgroundQueue, ^{
        __strong typeof(weakSelf) strongSelf = weakSelf;

        if (error) {
            NSLog(@"%@", [error localizedDescription]);
        }

        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *videoTrack = [videoTracks firstObject];
        if (!videoTrack) {
            return;
        }

        int m_pixelFormatType;
        // For video playback:
        m_pixelFormatType = kCVPixelFormatType_32BGRA;
        // For other uses, such as video compression:
        // m_pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;

        NSMutableDictionary *options = [NSMutableDictionary dictionary];
        [options setObject:@(m_pixelFormatType) forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:options];

        if ([strongSelf.reader canAddOutput:videoReaderOutput]) {
            [strongSelf.reader addOutput:videoReaderOutput];
        }
        [strongSelf.reader startReading];

        NSMutableArray *images = [NSMutableArray array];
        // Make sure nominalFrameRate > 0; we have seen zero-frame-rate videos recorded on Android
        while ([strongSelf.reader status] == AVAssetReaderStatusReading && videoTrack.nominalFrameRate > 0) {
            @autoreleasepool {
                // Read a video sample
                CMSampleBufferRef videoBuffer = [videoReaderOutput copyNextSampleBuffer];

                if (!videoBuffer) {
                    break;
                }

                [images addObject:[WKVideoConverter convertSampleBufferRefToUIImage:videoBuffer]];

                CFRelease(videoBuffer);
            }
        }
        if (finishBlock) {
            dispatch_async(dispatch_get_main_queue(), ^{
                finishBlock(images, duration);
            });
        }
    });
}

One thing worth noting here: the conversion runs so quickly that the videoBuffers are not released in time, so converting several videos at once still causes memory problems. Wrapping each iteration in an @autoreleasepool makes sure the buffers are drained promptly:

@autoreleasepool {
    // Read a video sample
    CMSampleBufferRef videoBuffer = [videoReaderOutput copyNextSampleBuffer];

    if (!videoBuffer) {
        break;
    }

    [images addObject:[WKVideoConverter convertSampleBufferRefToUIImage:videoBuffer]];

    CFRelease(videoBuffer);
}
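
The convertSampleBufferRefToUIImage: helper used above follows the approach from Apple's Technical Q&A QA1702 (linked in the references below). A minimal sketch of that conversion, assuming the 32BGRA pixel format configured earlier; this body is a reconstruction, not necessarily the demo's exact implementation.

// Sketch of CMSampleBufferRef -> UIImage for 32BGRA buffers (based on Apple QA1702).
+ (UIImage *)convertSampleBufferRefToUIImage:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // BGRA layout matches the kCVPixelFormatType_32BGRA chosen for the reader output.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    UIImage *image = [UIImage imageWithCGImage:cgImage];

    CGImageRelease(cgImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return image;
}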

With that, the parts of the WeChat short-video feature that I consider the hard ones are solved. The rest of the implementation is in the demo, which can be downloaded here.

Pausing video recording: http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html
Fixing the green edge when cropping video: http://stackoverflow.com/questions/22883525/avassetexportsession-giving-me-a-green-border-on-right-and-bottom-of-output-vide
Video cropping: http://stackoverflow.com/questions/15737781/video-capture-with-11-aspect-ratio-in-ios/16910263#16910263
CMSampleBufferRef to image: https://developer.apple.com/library/ios/qa/qa1702/_index.html
Analysis of WeChat's short video: http://www.jianshu.com/p/3d5ccbde0de1

Thanks to the authors of the articles above.

That's all for this article. I hope it helps with your learning, and thank you for supporting VEVB武林网.

