Background: Live video streaming is hugely popular right now, so the framework you use to encode and decode video matters a great deal in live-streaming development. There are quite a few such frameworks; this post covers the integration and use of FFmpeg, a powerhouse of video encoding and decoding, to build a simple video player.
Introduction: the video playback process
First, a quick primer on video files. The files we watch every day come in many formats, such as avi, mkv, rmvb, mov and mp4. These are containers: each container format defines how the audio and video data inside it is organized (along with other data, such as subtitles). A container usually wraps a video track and an audio track, also called the video stream and the audio stream. The first step in playing a video file is to demux it according to its format, separating out the video stream, the audio stream and the subtitles (if any). The demuxed data is read into packets, each holding one video frame or audio frame. The matching decoder is then called for each stream: video encoded with H.264 and audio encoded with MP3, for example, go to an H.264 decoder and an MP3 decoder respectively. Decoding yields the raw image data (YUV or RGB) and raw audio data (PCM). Finally, with the two synchronized in time, the images are drawn to the screen and the audio is sent to the sound card, and the result is the video we watch.
FFmpeg's API is designed around exactly this process, so using FFmpeg to work with video files is quite direct. The rest of this post walks through decoding images out of a video file, step by step.
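As orientation, here is a minimal sketch of how those steps map onto FFmpeg's C API (error handling omitted; this is not the player code we build below):

// Sketch: the playback pipeline expressed as FFmpeg API calls (no error handling).
AVFormatContext *fmt = NULL;
avformat_open_input(&fmt, "movie.mp4", NULL, NULL);    // open the container
avformat_find_stream_info(fmt, NULL);                  // probe the streams
int videoIndex = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);

AVPacket packet;
while (av_read_frame(fmt, &packet) >= 0) {             // demux: read one packet
    if (packet.stream_index == videoIndex) {
        // decode the packet into a raw frame (YUV/RGB), convert with
        // libswscale if needed, then display it in sync with the audio
    }
    av_packet_unref(&packet);
}
avformat_close_input(&fmt);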
Properties: the variables to declare (a combined sketch follows this list)
AVFormatContext: holds the format information of the file being read, such as the number of streams and the stream data
AVCodecContext: holds the detailed codec information of the corresponding stream, such as the video's width, height and codec type
pCodec: the actual codec, which provides the functions called for encoding and decoding
AVFrame: the data structure that holds a frame of data; the two frames here hold the image before and after color conversion
AVPacket: audio/video frames are read into packets while parsing the file
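Put together, the declarations look roughly like this (the names are illustrative and simply mirror the player class implemented below):

// The core FFmpeg objects a simple video decoder needs:
AVFormatContext *formatCtx;   // container info: streams, duration, ...
AVCodecContext  *codecCtx;    // codec details of one stream: width, height, codec type, ...
AVCodec         *codec;       // the actual decoder and its functions
AVFrame         *frame;       // one decoded frame (before/after color conversion)
AVPacket        packet;       // one demuxed audio/video packet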
1. How the player works:
FFmpeg decodes the video into individual image frames, which are then displayed one after another at a fixed time interval.
2. Integrating FFmpeg
Download the FFmpeg build script
Build it by following its README
Rough steps:
1. Download the script: https://github.com/kewlbear/FFmpeg-iOS-build-script
2. Unzip it and find the file build-ffmpeg.sh
3. Open Terminal and run the script: ./build-ffmpeg.sh. Since I had not installed Yasm beforehand, running the script failed with the prompt "Homebrew not found, Trying to install...", as shown:
Following the prompt, press Enter to install it and build the FFmpeg static libraries, as shown below:
Here are the compiled static libraries, in the screenshot below:
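With the default settings, the script's output should look roughly like this (the exact list of libraries depends on the script version):

FFmpeg-iOS/
    include/    libavcodec, libavformat, libavutil, libswresample, libswscale headers
    lib/        libavcodec.a, libavformat.a, libavutil.a, libswresample.a, libswscale.a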
Integrate into the project: create a new project and add the compiled static libraries and header files to it (the demo)
Add the dependent libraries
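For a typical FFmpeg build these are libz.tbd, libbz2.tbd and libiconv.tbd, added under Build Phases > Link Binary With Libraries; your build may need others, so check the linker errors.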
Set the header search path. The path must be correct, otherwise the header files will not be found.
I set the path as shown below:
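For reference, if the script's FFmpeg-iOS folder was copied into the project root, the setting would be something like Header Search Paths = $(SRCROOT)/FFmpeg-iOS/include (adjust to wherever you actually placed the headers).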
First press Command + B to build, and make sure it compiles successfully.
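To smoke-test the linkage (a quick check of my own, not part of the original demo), you can log the libavcodec version from any file that includes the headers:

#include <libavcodec/avcodec.h>
// e.g. in viewDidLoad or application:didFinishLaunchingWithOptions:
NSLog(@"libavcodec version: %u, configuration: %s",
      avcodec_version(), avcodec_configuration());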
3. Writing the code
Create a new Objective-C file
//
//  SJMoiveObject.h
//  SJLiveVideo
//
//  Created by king on 16/6/16.
//  Copyright © 2016 king. All rights reserved.
//

#import "Common.h"
#import <UIKit/UIKit.h>
#import "NSString+Extions.h"
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>
#include <libswscale/swscale.h>

@interface SJMoiveObject : NSObject

/* The decoded UIImage */
@property (nonatomic, strong, readonly) UIImage *currentImage;
/* Width and height of the video frames */
@property (nonatomic, assign, readonly) int sourceWidth, sourceHeight;
/* Output image size. Defaults to the source size. */
@property (nonatomic, assign) int outputWidth, outputHeight;
/* Length of the video, in seconds */
@property (nonatomic, assign, readonly) double duration;
/* Current position in the video, in seconds */
@property (nonatomic, assign, readonly) double currentTime;
/* Frame rate of the video */
@property (nonatomic, assign, readonly) double fps;

/* Initialize with a video path */
- (instancetype)initWithVideo:(NSString *)moviePath;
/* Switch to a different resource */
- (void)replaceTheResources:(NSString *)moviePath;
/* Restart playback */
- (void)redialPaly;
/* Read the next frame from the video stream. Returns NO if there are no frames left to read (end of video). */
- (BOOL)stepFrame;
/* Seek to the nearest keyframe at the given time */
- (void)seekTime:(double)seconds;

@end

Now implement the API:

//
//  SJMoiveObject.m
//  SJLiveVideo
//
//  Created by king on 16/6/16.
//  Copyright © 2016 king. All rights reserved.
//

#import "SJMoiveObject.h"

@interface SJMoiveObject ()
@property (nonatomic, copy) NSString *cruutenPath;
@end

@implementation SJMoiveObject
{
    AVFormatContext     *SJFormatCtx;
    AVCodecContext      *SJCodecCtx;
    AVFrame             *SJFrame;
    AVStream            *stream;
    AVPacket            packet;
    AVPicture           picture;
    int                 videoStream;
    double              fps;
    BOOL                isReleaseResources;
}

#pragma mark ------------------------------------
#pragma mark  Initialization
- (instancetype)initWithVideo:(NSString *)moviePath {
    if (!(self = [super init])) return nil;
    if ([self initializeResources:[moviePath UTF8String]]) {
        self.cruutenPath = [moviePath copy];
        return self;
    } else {
        return nil;
    }
}

- (BOOL)initializeResources:(const char *)filePath {
    isReleaseResources = NO;
    AVCodec *pCodec;
    // Register all decoders
    avcodec_register_all();
    av_register_all();
    avformat_network_init();
    // Open the video file
    if (avformat_open_input(&SJFormatCtx, filePath, NULL, NULL) != 0) {
        SJLog(@"Failed to open the file");
        goto initError;
    }
    // Retrieve the stream information
    if (avformat_find_stream_info(SJFormatCtx, NULL) < 0) {
        SJLog(@"Failed to find stream information");
        goto initError;
    }
    // Find the first video stream
    if ((videoStream = av_find_best_stream(SJFormatCtx, AVMEDIA_TYPE_VIDEO, -1, -1, &pCodec, 0)) < 0) {
        SJLog(@"No video stream found");
        goto initError;
    }
    // Get a pointer to the codec context of the video stream
    stream     = SJFormatCtx->streams[videoStream];
    SJCodecCtx = stream->codec;
#if DEBUG
    // Dump detailed information about the video stream
    av_dump_format(SJFormatCtx, videoStream, filePath, 0);
#endif
    if (stream->avg_frame_rate.den && stream->avg_frame_rate.num) {
        fps = av_q2d(stream->avg_frame_rate);
    } else {
        fps = 30;
    }
    // Find the decoder
    pCodec = avcodec_find_decoder(SJCodecCtx->codec_id);
    if (pCodec == NULL) {
        SJLog(@"No decoder found");
        goto initError;
    }
    // Open the decoder
    if (avcodec_open2(SJCodecCtx, pCodec, NULL) < 0) {
        SJLog(@"Failed to open the decoder");
        goto initError;
    }
    // Allocate the video frame
    SJFrame = av_frame_alloc();
    _outputWidth  = SJCodecCtx->width;
    _outputHeight = SJCodecCtx->height;
    return YES;
initError:
    return NO;
}
- (void)seekTime:(double)seconds {
    AVRational timeBase = SJFormatCtx->streams[videoStream]->time_base;
    // Convert seconds into the stream's time_base units
    int64_t targetFrame = (int64_t)((double)timeBase.den / timeBase.num * seconds);
    avformat_seek_file(SJFormatCtx,
                       videoStream,
                       0,
                       targetFrame,
                       targetFrame,
                       AVSEEK_FLAG_FRAME);
    avcodec_flush_buffers(SJCodecCtx);
}
- (BOOL)stepFrame {
    int frameFinished = 0;
    // Keep reading packets until one of them completes a video frame
    while (!frameFinished && av_read_frame(SJFormatCtx, &packet) >= 0) {
        if (packet.stream_index == videoStream) {
            avcodec_decode_video2(SJCodecCtx, SJFrame, &frameFinished, &packet);
        }
    }
    // No frame decoded means end of video: release the resources
    if (frameFinished == 0 && isReleaseResources == NO) {
        [self releaseResources];
    }
    return frameFinished != 0;
}
- (void)replaceTheResources:(NSString *)moviePath {
    if (!isReleaseResources) {
        [self releaseResources];
    }
    self.cruutenPath = [moviePath copy];
    [self initializeResources:[moviePath UTF8String]];
}

- (void)redialPaly {
    [self initializeResources:[self.cruutenPath UTF8String]];
}

#pragma mark ------------------------------------
#pragma mark  Overridden property accessors
- (void)setOutputWidth:(int)newValue {
    if (_outputWidth == newValue) return;
    _outputWidth = newValue;
}

- (void)setOutputHeight:(int)newValue {
    if (_outputHeight == newValue) return;
    _outputHeight = newValue;
}

- (UIImage *)currentImage {
    if (!SJFrame->data[0]) return nil;
    return [self imageFromAVPicture];
}

- (double)duration {
    return (double)SJFormatCtx->duration / AV_TIME_BASE;
}

- (double)currentTime {
    AVRational timeBase = SJFormatCtx->streams[videoStream]->time_base;
    return packet.pts * (double)timeBase.num / timeBase.den;
}

- (int)sourceWidth {
    return SJCodecCtx->width;
}

- (int)sourceHeight {
    return SJCodecCtx->height;
}

- (double)fps {
    return fps;
}
#pragma mark --------------------------
#pragma mark - Internal methods
- (UIImage *)imageFromAVPicture
{
    // Convert the decoded YUV frame into an RGB picture
    avpicture_free(&picture);
    avpicture_alloc(&picture, AV_PIX_FMT_RGB24, _outputWidth, _outputHeight);
    struct SwsContext *imgConvertCtx = sws_getContext(SJFrame->width,
                                                      SJFrame->height,
                                                      AV_PIX_FMT_YUV420P,
                                                      _outputWidth,
                                                      _outputHeight,
                                                      AV_PIX_FMT_RGB24,
                                                      SWS_FAST_BILINEAR,
                                                      NULL,
                                                      NULL,
                                                      NULL);
    if (imgConvertCtx == nil) return nil;
    sws_scale(imgConvertCtx,
              SJFrame->data,
              SJFrame->linesize,
              0,
              SJFrame->height,
              picture.data,
              picture.linesize);
    sws_freeContext(imgConvertCtx);
    // Wrap the RGB data in a CGImage, then in a UIImage
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault;
    CFDataRef data = CFDataCreate(kCFAllocatorDefault,
                                  picture.data[0],
                                  picture.linesize[0] * _outputHeight);
    CGDataProviderRef provider = CGDataProviderCreateWithCFData(data);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef cgImage = CGImageCreate(_outputWidth,
                                       _outputHeight,
                                       8,
                                       24,
                                       picture.linesize[0],
                                       colorSpace,
                                       bitmapInfo,
                                       provider,
                                       NULL,
                                       NO,
                                       kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    CFRelease(data);
    return image;
}
#pragma mark --------------------------
#pragma mark - Releasing resources
- (void)releaseResources {
    SJLog(@"Releasing resources");
    SJLogFunc
    isReleaseResources = YES;
    // Free the RGB picture
    avpicture_free(&picture);
    // Free the packet
    av_packet_unref(&packet);
    // Free the YUV frame
    av_free(SJFrame);
    // Close the decoder
    if (SJCodecCtx) avcodec_close(SJCodecCtx);
    // Close the file
    if (SJFormatCtx) avformat_close_input(&SJFormatCtx);
    avformat_network_deinit();
}

@end
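One caveat: avcodec_decode_video2 and the AVPicture APIs used above are deprecated in FFmpeg 3.x and later. If you build against a recent FFmpeg, the decode step inside stepFrame would look roughly like this instead (a sketch under that assumption, not code from the demo):

// Newer decoding API (FFmpeg >= 3.1): send one packet, then ask for frames.
if (avcodec_send_packet(SJCodecCtx, &packet) == 0) {
    int ret = avcodec_receive_frame(SJCodecCtx, SJFrame);
    if (ret == 0) {
        frameFinished = 1;              // a decoded frame is now in SJFrame
    } else if (ret == AVERROR(EAGAIN)) {
        // the decoder needs more packets before it can emit a frame
    }
}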
For convenience, drag a UIImageView and the buttons onto the storyboard and wire up the outlets.
//
//  ViewController.m
//  SJLiveVideo
//
//  Created by king on 16/6/14.
//  Copyright © 2016 king. All rights reserved.
//

#import "ViewController.h"
#import "SJMoiveObject.h"
#import "SJAudioObject.h"
#import "SJAudioQueuPlay.h"

#define LERP(A,B,C) ((A)*(1.0-C)+(B)*C)

@interface ViewController ()
@property (weak, nonatomic) IBOutlet UIImageView *ImageView;
@property (weak, nonatomic) IBOutlet UILabel *fps;
@property (weak, nonatomic) IBOutlet UIButton *playBtn;
@property (weak, nonatomic) IBOutlet UIButton *TimerBtn;
@property (weak, nonatomic) IBOutlet UILabel *TimerLabel;
@property (nonatomic, strong) SJMoiveObject *video;
@property (nonatomic, strong) SJAudioObject *audio;
@property (nonatomic, strong) SJAudioQueuPlay *audioPlay;
@property (nonatomic, assign) float lastFrameTime;
@end

@implementation ViewController

@synthesize ImageView, fps, playBtn, video;

- (void)viewDidLoad {
    [super viewDidLoad];
    self.video = [[SJMoiveObject alloc] initWithVideo:[NSString bundlePath:@"Dalshabet.mp4"]];
    // self.video = [[SJMoiveObject alloc] initWithVideo:@"/Users/king/Desktop/Stellar.mp4"];
    // self.video = [[SJMoiveObject alloc] initWithVideo:@"/Users/king/Downloads/Worth it - Fifth Harmony ft.Kid Ink - May J Lee Choreography.mp4"];
    // self.video = [[SJMoiveObject alloc] initWithVideo:@"/Users/king/Downloads/4K.mp4"];
    // self.video = [[SJMoiveObject alloc] initWithVideo:@"http://wvideo.spriteapp.cn/video/2016/0328/56f8ec01d9bfe_wpd.mp4"];
    // video.outputWidth = 800;
    // video.outputHeight = 600;
    self.audio = [[SJAudioObject alloc] initWithVideo:@"/Users/king/Desktop/Stellar.mp4"];
    NSLog(@"video duration: %f", video.duration);
    NSLog(@"source size: %d x %d", video.sourceWidth, video.sourceHeight);
    NSLog(@"output size: %d x %d", video.outputWidth, video.outputHeight);
    // [self.audio seekTime:0.0];
    // SJLog(@"%f", [self.audio duration]);
    // AVPacket *packet = [self.audio readPacket];
    // SJLog(@"%ld", [self.audio decode]);
    int tns, thh, tmm, tss;
    tns = video.duration;
    thh = tns / 3600;
    tmm = (tns % 3600) / 60;
    tss = tns % 60;
    // NSLog(@"fps --> %.2f", video.fps);
    // [ImageView setTransform:CGAffineTransformMakeRotation(M_PI)];
    // NSLog(@"%02d:%02d:%02d", thh, tmm, tss);
}

- (IBAction)PlayClick:(UIButton *)sender {
    [playBtn setEnabled:NO];
    _lastFrameTime = -1;
    // Seek to 0.0 seconds
    [video seekTime:0.0];
    [NSTimer scheduledTimerWithTimeInterval:1 / video.fps
                                     target:self
                                   selector:@selector(displayNextFrame:)
                                   userInfo:nil
                                    repeats:YES];
}

- (IBAction)TimerCilick:(id)sender {
    // NSLog(@"current time: %f s", video.currentTime);
    // [video seekTime:150.0];
    // [video replaceTheResources:@"/Users/king/Desktop/Stellar.mp4"];
    if (playBtn.enabled) {
        [video redialPaly];
        [self PlayClick:playBtn];
    }
}
- (void)displayNextFrame:(NSTimer *)timer {
    NSTimeInterval startTime = [NSDate timeIntervalSinceReferenceDate];
    // self.TimerLabel.text = [NSString stringWithFormat:@"%f s", video.currentTime];
    self.TimerLabel.text = [self dealTime:video.currentTime];
    if (![video stepFrame]) {
        [timer invalidate];
        [playBtn setEnabled:YES];
        return;
    }
    ImageView.image = video.currentImage;
    // Smooth the measured frame rate with an exponential moving average (the LERP macro)
    float frameTime = 1.0 / ([NSDate timeIntervalSinceReferenceDate] - startTime);
    if (_lastFrameTime < 0) {
        _lastFrameTime = frameTime;
    } else {
        _lastFrameTime = LERP(frameTime, _lastFrameTime, 0.8);
    }
    [fps setText:[NSString stringWithFormat:@"fps %.0f", _lastFrameTime]];
}

- (NSString *)dealTime:(double)time {
    int tns, thh, tmm, tss;
    tns = time;
    thh = tns / 3600;
    tmm = (tns % 3600) / 60;
    tss = tns % 60;
    // [ImageView setTransform:CGAffineTransformMakeRotation(M_PI)];
    return [NSString stringWithFormat:@"%02d:%02d:%02d", thh, tmm, tss];
}

@end
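A side note on timing: an NSTimer firing every 1/fps seconds is simple but can drift. If smoother pacing matters, a CADisplayLink tied to the screen refresh is an alternative; a hypothetical variant of displayNextFrame: (not in the demo) could look like:

// Hypothetical: drive frame display with CADisplayLink instead of NSTimer.
- (void)startDisplayLink {
    CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                      selector:@selector(displayLinkFired:)];
    // Fire roughly at the video's frame rate (assuming a 60 Hz display)
    link.frameInterval = MAX(1, (NSInteger)(60.0 / video.fps));
    [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)displayLinkFired:(CADisplayLink *)link {
    if (![video stepFrame]) { [link invalidate]; return; }
    ImageView.image = video.currentImage;
}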
Run the app and tap Play.
My test results are as follows:
Original article: http://bbs.520it.com/forum.php?mod=viewthread&tid=707&page=1&extra=#pid3821
My integrated demo, with source on GitHub: https://github.com/xiayuanquan/FFmpegDemo