I recently ran into a requirement: record a short video the way WeChat does, play it back on the capture layer, and post it to Moments, where it plays silently and doesn't hurt scrolling performance. I was excited when I first got this requirement, since it was a chance to really dig into AVFoundation. The research had plenty of twists and turns and wore me out, but that's just how we programmers grow. Enough digressing. Today we will look at how to record video with AVCaptureSession + AVCaptureMovieFileOutput, and then compress it and convert it to MP4 with AVAssetExportSession.
First, let's look at which AVFoundation classes are involved in video capture and what each of them does.
The core classes

- AVCaptureSession: the media (audio/video) capture session, responsible for routing captured audio and video data to the output objects. One AVCaptureSession can have multiple inputs and outputs.
- AVCaptureDevice: an input device such as the microphone or a camera. Through this object you can configure physical device properties (focus, white balance, and so on).
- AVCaptureDeviceInput: the input data-management object. You create an AVCaptureDeviceInput from an AVCaptureDevice and add it to the AVCaptureSession.
- AVCaptureVideoPreviewLayer: the camera preview layer, a CALayer subclass that lets you watch the photo or video capture in real time. You create it with the corresponding AVCaptureSession.
- AVCaptureOutput: the output data-management object, which receives the captured data. You normally use one of its subclasses: AVCaptureAudioDataOutput, AVCaptureStillImageOutput, AVCaptureVideoDataOutput, or AVCaptureFileOutput; the instance is added to the AVCaptureSession. Note that the first few deliver their output as NSData, while AVCaptureFileOutput writes its data to a file. AVCaptureFileOutput is likewise not used directly; you use one of its subclasses, AVCaptureAudioFileOutput or AVCaptureMovieFileOutput. Once an input or output is added to an AVCaptureSession, the session establishes connections (AVCaptureConnection) between all compatible inputs and outputs.
The steps to set up video recording are as follows:
1. Create the AVCaptureSession object.
```objc
// Create the capture session (AVCaptureSession) object.
_captureSession = [[AVCaptureSession alloc] init];
if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    // The sessionPreset property controls the video resolution.
    [_captureSession setSessionPreset:AVCaptureSessionPreset640x480];
}
```
2. Use the AVCaptureDevice class methods to get the devices you need: a camera for photos and video, a microphone for audio.
```objc
// Get the camera input device, from which we will create an AVCaptureDeviceInput.
// There is a front and a back camera, so we use a helper method that looks a
// camera up by position.
AVCaptureDevice *videoCaptureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
if (!videoCaptureDevice) {
    NSLog(@"---- failed to get the back camera ----");
    return;
}
// Get an audio input device; just take the first one in the array.
AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
```
3. Initialize the AVCaptureDeviceInput objects from the input devices.
```objc
// Video input object.
// Initialize the input object from its device; it is used to collect the input data.
NSError *error = nil;
_videoCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:videoCaptureDevice error:&error];
if (error) {
    NSLog(@"---- failed to create the video input object: %@ ----", error);
    return;
}
// Audio input object.
_audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
if (error) {
    NSLog(@"---- failed to create the audio input object: %@ ----", error);
    return;
}
```
4. Initialize the output object. For still photos use AVCaptureStillImageOutput; for video use AVCaptureMovieFileOutput.
```objc
// Movie file output object, used to collect the output data.
_caputureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
```
5. Add the input objects (AVCaptureDeviceInput) and the output object (AVCaptureOutput) to the session (AVCaptureSession).
```objc
// Add the video input object to the session (AVCaptureSession).
if ([_captureSession canAddInput:_videoCaptureDeviceInput]) {
    [_captureSession addInput:_videoCaptureDeviceInput];
}
// Add the audio input object to the session.
if ([_captureSession canAddInput:_audioCaptureDeviceInput]) {
    [_captureSession addInput:_audioCaptureDeviceInput];
}
// Add the movie file output object to the session; without this,
// connectionWithMediaType: below would return nil.
if ([_captureSession canAddOutput:_caputureMovieFileOutput]) {
    [_captureSession addOutput:_caputureMovieFileOutput];
}
AVCaptureConnection *captureConnection = [_caputureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
// Enable video stabilization on the connection; here we let the system decide.
if ([captureConnection isVideoStabilizationSupported]) {
    captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
}
```
6. Create the AVCaptureVideoPreviewLayer with the session, add the layer to a container view, and call the session's startRunning method to begin capturing.
```objc
// Create the preview layer from the session (AVCaptureSession).
_captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
// The layer of the view the preview is shown in.
CALayer *layer = self.viewContainer.layer;
layer.masksToBounds = YES;
_captureVideoPreviewLayer.frame = layer.bounds;
_captureVideoPreviewLayer.masksToBounds = YES;
_captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
[layer addSublayer:_captureVideoPreviewLayer];
// Hook the inputs up to the outputs and start rendering to the preview layer.
[_captureSession startRunning];
```
7. Write the captured audio/video data to a file.
Create a record button; tapping it starts recording and writes the movie to the temp directory:

```objc
- (IBAction)takeMovie:(id)sender {
    [(UIButton *)sender setSelected:![(UIButton *)sender isSelected]];
    if ([(UIButton *)sender isSelected]) {
        AVCaptureConnection *captureConnection = [self.caputureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        // Turn on cinematic video stabilization if the active format supports it.
        AVCaptureVideoStabilizationMode stabilizationMode = AVCaptureVideoStabilizationModeCinematic;
        if ([self.captureDeviceInput.device.activeFormat isVideoStabilizationModeSupported:stabilizationMode]) {
            [captureConnection setPreferredVideoStabilizationMode:stabilizationMode];
        }
        // If multitasking is supported, begin a background task so recording
        // can finish if the app is backgrounded.
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
        }
        // Keep the recording orientation in sync with the preview layer. This
        // setting matters: without it the recorded video can come out rotated
        // onto its side.
        captureConnection.videoOrientation = [self.captureVideoPreviewLayer connection].videoOrientation;
        // Set the output file path; here we write to the temp directory.
        NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingString:MOVIEPATH];
        // Convert the path to a URL with fileURLWithPath:. Building the URL via
        // NSBundle can produce a URL the path cannot be read from.
        NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];
        // Start writing the recording buffer to the URL, writing as it records.
        [self.caputureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    } else {
        // Stop recording.
        [self.caputureMovieFileOutput stopRecording];
        [self.captureSession stopRunning];
        [self completeHandle];
    }
}
```
Of course, recording start and stop both have callbacks; the AVCaptureFileOutputRecordingDelegate protocol gives us what we need:
```objc
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
    NSLog(@"---- recording started ----");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    NSLog(@"---- recording finished ----");
}
```
That completes the recording. But now that we have the video, can we upload it to the server right away and share it with our friends?
We can measure how large the recorded file is (in MB) with a method like this:
```objc
- (CGFloat)getfileSize:(NSString *)path {
    NSDictionary *outputFileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:path error:nil];
    NSLog(@"file size: %f", (unsigned long long)[outputFileAttributes fileSize] / 1024.00 / 1024.00);
    return (CGFloat)[outputFileAttributes fileSize] / 1024.00 / 1024.00;
}
```
I tested this myself: a 10-second clip came out at about 4.1 MB, and that was with the resolution at only 640x480. Pretty discouraging, right?
If the recorded video has to be uploaded to a server afterwards, we clearly should not upload the raw recording as-is; it needs to be compressed first. For that we use the AVAssetExportSession class.
Here we create a button; tapping it compresses the video and then recomputes the file size, giving a clear before/after comparison.

```objc
// Compress the video.
- (IBAction)compressVideo:(id)sender {
    NSString *cachePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    NSString *savePath = [cachePath stringByAppendingPathComponent:MOVIEPATH];
    NSURL *saveUrl = [NSURL fileURLWithPath:savePath];
    // Load the asset from the file URL.
    AVURLAsset *avAsset = [[AVURLAsset alloc] initWithURL:saveUrl options:nil];
    // Ask AVAssetExportSession which presets are compatible with the asset.
    NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
    // Compress the video if the low-quality preset is available.
    if ([compatiblePresets containsObject:AVAssetExportPresetLowQuality]) {
        // Create an export session from the asset (AVURLAsset) so it can be
        // re-encoded with the chosen preset.
        AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:avAsset presetName:AVAssetExportPresetLowQuality];
        // Build the output path for the exported file.
        NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
        [formatter setDateFormat:@"yyyy-MM-dd-HH:mm:ss"];
        NSDate *date = [[NSDate alloc] init];
        NSString *outPutPath = [[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject] stringByAppendingPathComponent:[NSString stringWithFormat:@"output-%@.mp4", [formatter stringFromDate:date]]];
        exportSession.outputURL = [NSURL fileURLWithPath:outPutPath];
        // Optimize the file for network use.
        exportSession.shouldOptimizeForNetworkUse = YES;
        // Export as MP4.
        exportSession.outputFileType = AVFileTypeMPEG4;
        // Start the export; the completion block runs when it finishes.
        [exportSession exportAsynchronouslyWithCompletionHandler:^{
            // If the export completed successfully...
            if ([exportSession status] == AVAssetExportSessionStatusCompleted) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    // ...refresh the displayed file size.
                    self.videoSize.text = [NSString stringWithFormat:@"%f MB", [self getfileSize:outPutPath]];
                });
            }
        }];
    }
}
```

Note that step 7 wrote the recording to the temp directory, so if you follow that code exactly, build `savePath` from NSTemporaryDirectory() here rather than from the Documents directory.
After compression, the 10-second video that was about 4 MB comes out at under 1 MB.
Below are a few extensions.
Auto flash

```objc
- (IBAction)flashAutoClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeAuto];
    [self setFlashModeButtonStatus];
}
```

Flash on

```objc
- (IBAction)flashOnClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOn];
    [self setFlashModeButtonStatus];
}
```

Flash off

```objc
- (IBAction)flashOffClick:(UIButton *)sender {
    [self setFlashMode:AVCaptureFlashModeOff];
    [self setFlashModeButtonStatus];
}
```
Notifications
```objc
/**
 *  Add notifications for an input device.
 */
- (void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice {
    // Note: before observing subject-area changes you must first enable
    // subject-area change monitoring on the device.
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled = YES;
    }];
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Capture area changed.
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

- (void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

/**
 *  Remove all notifications.
 */
- (void)removeNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

- (void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Session runtime error.
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}

/**
 *  Device connected.
 *
 *  @param notification the notification object
 */
- (void)deviceConnected:(NSNotification *)notification {
    NSLog(@"Device connected...");
}

/**
 *  Device disconnected.
 *
 *  @param notification the notification object
 */
- (void)deviceDisconnected:(NSNotification *)notification {
    NSLog(@"Device disconnected.");
}

/**
 *  Capture area changed.
 *
 *  @param notification the notification object
 */
- (void)areaChange:(NSNotification *)notification {
    NSLog(@"Capture area changed...");
}

/**
 *  Session runtime error.
 *
 *  @param notification the notification object
 */
- (void)sessionRuntimeError:(NSNotification *)notification {
    NSLog(@"The session hit a runtime error.");
}
```
Private methods
```objc
/**
 *  Get the camera at the given position.
 *
 *  @param position camera position
 *
 *  @return the camera device, or nil if none matches
 */
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position] == position) {
            return camera;
        }
    }
    return nil;
}

/**
 *  Common wrapper for changing device properties.
 *
 *  @param propertyChange block that performs the property change
 */
- (void)changeDeviceProperty:(PropertyChangeBlock)propertyChange {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    // Note: you must call lockForConfiguration: before changing device
    // properties, and unlockForConfiguration when done.
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error while setting device properties: %@", error.localizedDescription);
    }
}

/**
 *  Set the flash mode.
 *
 *  @param flashMode flash mode
 */
- (void)setFlashMode:(AVCaptureFlashMode)flashMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}

/**
 *  Set the focus mode.
 *
 *  @param focusMode focus mode
 */
- (void)setFocusMode:(AVCaptureFocusMode)focusMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}

/**
 *  Set the exposure mode.
 *
 *  @param exposureMode exposure mode
 */
- (void)setExposureMode:(AVCaptureExposureMode)exposureMode {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}

/**
 *  Set the focus and exposure point.
 *
 *  @param point point of interest
 */
- (void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}

/**
 *  Add a tap gesture; tapping focuses at the tapped point.
 */
- (void)addGenstureRecognizer {
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}

- (void)tapScreen:(UITapGestureRecognizer *)tapGesture {
    CGPoint point = [tapGesture locationInView:self.viewContainer];
    // Convert the UI coordinate to a camera coordinate.
    CGPoint cameraPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}

/**
 *  Update the flash button states.
 */
- (void)setFlashModeButtonStatus {
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    AVCaptureFlashMode flashMode = captureDevice.flashMode;
    if ([captureDevice isFlashAvailable]) {
        self.flashAutoButton.hidden = NO;
        self.flashOnButton.hidden = NO;
        self.flashOffButton.hidden = NO;
        self.flashAutoButton.enabled = YES;
        self.flashOnButton.enabled = YES;
        self.flashOffButton.enabled = YES;
        switch (flashMode) {
            case AVCaptureFlashModeAuto:
                self.flashAutoButton.enabled = NO;
                break;
            case AVCaptureFlashModeOn:
                self.flashOnButton.enabled = NO;
                break;
            case AVCaptureFlashModeOff:
                self.flashOffButton.enabled = NO;
                break;
            default:
                break;
        }
    } else {
        self.flashAutoButton.hidden = YES;
        self.flashOnButton.hidden = YES;
        self.flashOffButton.hidden = YES;
    }
}

/**
 *  Show the focus cursor at the given point.
 *
 *  @param point cursor position
 */
- (void)setFocusCursorWithPoint:(CGPoint)point {
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha = 0;
    }];
}

@end
```
Author: 止于浮水 (Jianshu author)
Original link: http://www.lxweimin.com/p/7c57c58c253d/comments/1184468
Copyright belongs to the author. For reprints, please contact the author for authorization and credit "Jianshu author".