I've been learning OpenGL recently, and this post is a summary of a practice project I put together after studying it for a while. Let's look at the final result first:
Being a practice project, it uses no third-party frameworks, just AVFoundation and OpenGL ES: AVFoundation captures each frame from the camera, and OpenGL ES applies the effects and displays the image on screen.
(一) Capturing video data with AVFoundation
(1) A few important classes
AVCaptureSession: manages the entire video-capture pipeline
AVCaptureDevice: a capture device, representing hardware such as the camera or the microphone
AVCaptureDeviceInput: an AVCaptureDevice cannot be used directly; it has to be wrapped in an AVCaptureDeviceInput before it can be added to an AVCaptureSession
AVCaptureOutput: the output class; whatever outputs you configure determine the formats in which the capture results are delivered
a AVCaptureStillImageOutput outputs still images
b AVCaptureMovieFileOutput outputs a movie file
c AVCaptureAudioDataOutput outputs per-frame audio data
d AVCaptureVideoDataOutput outputs per-frame video data
For example, since I only need the per-frame video data, configuring an AVCaptureVideoDataOutput is enough.
AVCaptureConnection: represents the connection between an input and an output, used to set some input/output properties
AVCaptureVideoPreviewLayer: a layer for previewing photo/video capture results
(2) Initializing and configuring the session
The basic idea is to create the session, add the input device to it, configure the output format for the captured data, and then start the session to begin receiving data. Note that whenever you change the session's configuration, the changes must be wrapped between beginConfiguration and commitConfiguration calls.
- (void)setup {
    // All video devices
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // Front and back cameras (the back camera is usually first in the list)
    self.frontCamera = [AVCaptureDeviceInput deviceInputWithDevice:videoDevices.lastObject error:nil];
    self.backCamera = [AVCaptureDeviceInput deviceInputWithDevice:videoDevices.firstObject error:nil];
    // Use the back camera as the current input device
    self.videoInputDevice = self.backCamera;
    // Video data output
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [self.videoDataOutput setSampleBufferDelegate:self queue:self.captureQueue];
    // Discard late video frames
    self.videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    // Specify the output pixel format
    self.videoDataOutput.videoSettings = @{
        (__bridge NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    };
    // Configure the session
    [self.captureSession beginConfiguration];
    if ([self.captureSession canAddInput:self.videoInputDevice]) {
        [self.captureSession addInput:self.videoInputDevice];
    }
    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
    }
    // Set the resolution
    [self setVideoPreset];
    [self.captureSession commitConfiguration];
    self.videoConnection = [self.videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    // Set the video output orientation
    self.videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    // Set the fps
    [self updateFps:30];
}
// Set the resolution
- (void)setVideoPreset {
    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        self.captureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
        _width = 1080; _height = 1920;
    } else if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
        self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
        _width = 720; _height = 1280;
    } else {
        self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;
        _width = 480; _height = 640;
    }
}
// Set the fps
- (void)updateFps:(NSInteger)fps {
    // Get the current capture devices
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // Iterate over all devices (front and back cameras)
    for (AVCaptureDevice *vDevice in videoDevices) {
        // Get the maximum fps currently supported
        float maxRate = [(AVFrameRateRange *)[vDevice.activeFormat.videoSupportedFrameRateRanges objectAtIndex:0] maxFrameRate];
        // Only apply the change if the requested fps does not exceed the maximum
        if (maxRate >= fps) {
            // The code that actually changes the fps
            if ([vDevice lockForConfiguration:NULL]) {
                vDevice.activeVideoMinFrameDuration = CMTimeMake(10, (int)(fps * 10)); // 10/(fps*10) = 1/fps seconds per frame
                vDevice.activeVideoMaxFrameDuration = vDevice.activeVideoMinFrameDuration;
                [vDevice unlockForConfiguration];
            }
        }
    }
}
- (AVCaptureSession *)captureSession {
    if (!_captureSession) {
        _captureSession = [[AVCaptureSession alloc] init];
    }
    return _captureSession;
}
- (dispatch_queue_t)captureQueue {
    if (!_captureQueue) {
        _captureQueue = dispatch_queue_create("TMCapture Queue", NULL);
    }
    return _captureQueue;
}
The captured video frames are delivered through a delegate callback:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    [self.delegate captureSampleBuffer:sampleBuffer];
}
(二) Applying effects with OpenGL ES
This part is more involved and assumes some basic OpenGL knowledge. It breaks down into the following steps:
a Create the frameBuffer and renderBuffer
b Create a texture cache and obtain textures from the video frame data
c Compile and link the custom shaders
d Pass the attributes, uniforms, and textures into the shaders
e Draw and display
First, define a custom class inheriting from CAEAGLLayer (the layer class Apple provides specifically for displaying OpenGL image data) and give it a method that accepts video frame data from the outside:
typedef NS_ENUM(NSInteger, LZProgramType) {
    LZProgramTypeVertigo, // vertigo (ghosting)
    LZProgramTypeRag,     // selective blur
    LZProgramTypeShake,   // shake
    LZProgramTypeMosaic   // mosaic
};
@interface LZDisplayLayer : CAEAGLLayer
// Which effect to use
@property (nonatomic, assign) LZProgramType useProgram;
- (instancetype)initWithFrame:(CGRect)frame;
- (void)displayWithPixelBuffer:(CVPixelBufferRef)pixelBuffer;
@end
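Before diving into each step, here is a rough sketch of how the initializer and displayWithPixelBuffer: might tie them together (the helper method names are hypothetical; the real project may organize this differently):
- (instancetype)initWithFrame:(CGRect)frame {
    if (self = [super init]) {
        self.frame = frame;
        self.opaque = YES;
        // An OpenGL ES 2.0 context; every GL call below assumes it is current
        _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext:_context];
        [self createBuffers];
    }
    return self;
}

- (void)displayWithPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    if ([EAGLContext currentContext] != _context) {
        [EAGLContext setCurrentContext:_context];
    }
    [self cleanUpTextures];                          // release the previous frame's Y/UV textures
    [self setupTexturesWithPixelBuffer:pixelBuffer]; // step b
    [self draw];                                     // steps d and e
}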
(1) Creating the frameBuffer and renderBuffer
Each frame we finish drawing ends up in these two buffers; the renderBuffer is bound to the CAEAGLLayer.
- (void)createBuffers
{
    // Create the frame buffer
    glGenFramebuffers(1, &_frameBufferHandle);
    glBindFramebuffer(GL_FRAMEBUFFER, _frameBufferHandle);
    // Create the color render buffer
    glGenRenderbuffers(1, &_colorBufferHandle);
    glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);
    // Allocate the render buffer's storage from the layer
    [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:self];
    // Query the render buffer's dimensions
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &_backingWidth);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &_backingHeight);
    // Attach the renderBuffer to the frameBuffer
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorBufferHandle);
}
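After attaching the render buffer it can be worth checking that the framebuffer is actually complete. This check is not in the original snippet, just a defensive addition:
// Verify the framebuffer is usable before drawing into it
GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if (status != GL_FRAMEBUFFER_COMPLETE) {
    NSLog(@"Failed to make complete framebuffer object %x", status);
}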
(2) Creating the texture cache and obtaining textures from the video frame data
In OpenGL, a texture represents the raw image data (a bitmap). Because the video frames are in YUV420 format (the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange set when configuring the AVCaptureSession), each frame has two planes (a Y plane and a UV plane), so two corresponding textures have to be created. Later, in the shaders, the YUV data is converted to RGB.
Since video frames are rendered very frequently, a texture cache is used. The idea is to set aside a block of memory dedicated to textures; each new texture reuses that memory rather than being allocated from scratch, which improves efficiency when textures are created often.
Creating the texture cache:
/*
 CVOpenGLESTextureCacheCreate
 Purpose: creates a new CVOpenGLESTextureCacheRef texture cache
 Param 1: kCFAllocatorDefault, the default allocator
 Param 2: NULL
 Param 3: the EAGLContext
 Param 4: NULL
 Param 5: out parameter receiving the newly created texture cache
 @result kCVReturnSuccess
 */
CVReturn err;
err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, _context, NULL, &_videoTextureCache);
if (err != kCVReturnSuccess) {
    NSLog(@"Error at CVOpenGLESTextureCacheCreate %d", err);
    return;
}
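The frameWidth and frameHeight used below are the dimensions of the incoming frame; presumably they are read straight from the pixel buffer, something like:
// Frame dimensions, read from the incoming pixel buffer
int frameWidth = (int)CVPixelBufferGetWidth(pixelBuffer);
int frameHeight = (int)CVPixelBufferGetHeight(pixelBuffer);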
Creating the textures:
// The number of planes in the pixel buffer
size_t planeCount = CVPixelBufferGetPlaneCount(pixelBuffer);
/*
 Create Y and UV textures from the pixel buffer. These textures will be drawn into the frame buffer.
 */
// Activate texture unit 0
glActiveTexture(GL_TEXTURE0);
// Create the luminance (Y) texture
/*
 CVOpenGLESTextureCacheCreateTextureFromImage
 Purpose: creates a CVOpenGLESTexture object from a CVImageBuffer
 Param 1: the allocator, kCFAllocatorDefault
 Param 2: the texture cache that will manage the texture
 Param 3: sourceImage
 Param 4: texture attributes, NULL by default
 Param 5: the target texture, GL_TEXTURE_2D
 Param 6: the number of color components in the texture (GL_RGBA, GL_LUMINANCE, GL_RGBA8_OES, GL_RG, and GL_RED (NOTE: on GLES3 use GL_R8 instead of GL_RED))
 Param 7: frame width
 Param 8: frame height
 Param 9: the format of the pixel data
 Param 10: the data type of the pixel data, GL_UNSIGNED_BYTE
 Param 11: planeIndex
 Param 12: out parameter receiving the newly created texture object
 */
CVReturn err;
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _videoTextureCache,
                                                   pixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RED_EXT,
                                                   frameWidth,
                                                   frameHeight,
                                                   GL_RED_EXT,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &_lumaTexture);
if (err) {
    NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
}
// Configure the luminance texture
// Bind the texture
glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
// Set the min/mag filters and the S/T wrap modes
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
// If the plane count is 2, there is a UV plane in addition to Y.
if (planeCount == 2) {
    // Activate texture unit 1 for the UV-plane texture
    glActiveTexture(GL_TEXTURE1);
    // Create the UV-plane (chrominance) texture; the parameters are the same as
    // for the Y texture above, except that the UV plane is half the width and
    // height of the Y plane, uses two components (GL_RG_EXT), and is planeIndex 1.
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       _videoTextureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_RG_EXT,
                                                       frameWidth / 2,
                                                       frameHeight / 2,
                                                       GL_RG_EXT,
                                                       GL_UNSIGNED_BYTE,
                                                       1,
                                                       &_chromaTexture);
    if (err) {
        NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }
    // Bind the texture
    glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
    // Set the min/mag filters and the S/T wrap modes
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
}
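CVOpenGLESTexture objects keep their underlying GL textures alive, so before each new frame the previous frame's textures should be released and the cache flushed, otherwise memory usage keeps growing. A small sketch of that cleanup (assuming the ivars above):
- (void)cleanUpTextures {
    if (_lumaTexture) {
        CFRelease(_lumaTexture);
        _lumaTexture = NULL;
    }
    if (_chromaTexture) {
        CFRelease(_chromaTexture);
        _chromaTexture = NULL;
    }
    // Flushing lets the cache reclaim texture memory that is no longer in use
    CVOpenGLESTextureCacheFlush(_videoTextureCache, 0);
}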
(3) Compiling and linking the custom shaders
The effects require custom fragment shaders; they can't be achieved with OpenGL's built-in functionality or the shaders Apple wraps for you, so we have to compile and link the shaders we write ourselves. There are two steps:
a Compile the shaders
b Link the shaders into a program
Compiling a shader:
- (GLuint)compileShaderWithName:(NSString *)name type:(GLenum)shaderType {
    // 1. Get the shader file path
    NSString *shaderPath = [[NSBundle mainBundle] pathForResource:name ofType:shaderType == GL_VERTEX_SHADER ? @"vsh" : @"fsh"];
    NSError *error;
    NSString *shaderString = [NSString stringWithContentsOfFile:shaderPath encoding:NSUTF8StringEncoding error:&error];
    if (!shaderString) {
        NSAssert(NO, @"Failed to read shader source");
        exit(1);
    }
    // 2. Create a shader of the given shaderType
    GLuint shader = glCreateShader(shaderType);
    // 3. Hand the shader its source
    const char *shaderStringUTF8 = [shaderString UTF8String];
    int shaderStringLength = (int)[shaderString length];
    glShaderSource(shader, 1, &shaderStringUTF8, &shaderStringLength);
    // 4. Compile the shader
    glCompileShader(shader);
    // 5. Check whether compilation succeeded
    GLint compileSuccess;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &compileSuccess);
    if (compileSuccess == GL_FALSE) {
        GLchar messages[256];
        glGetShaderInfoLog(shader, sizeof(messages), 0, &messages[0]);
        NSString *messageString = [NSString stringWithUTF8String:messages];
        NSAssert(NO, @"Shader compilation failed: %@", messageString);
        exit(1);
    }
    // 6. Return the shader
    return shader;
}
Linking the shaders into a program:
- (GLuint)programWithShaderName:(NSString *)shaderName {
    // 1. Compile the vertex and fragment shaders
    GLuint vertexShader = [self compileShaderWithName:@"Vertex" type:GL_VERTEX_SHADER];
    GLuint fragmentShader = [self compileShaderWithName:shaderName type:GL_FRAGMENT_SHADER];
    // 2. Attach both shaders to a program
    GLuint program = glCreateProgram();
    glAttachShader(program, vertexShader);
    glAttachShader(program, fragmentShader);
    // 3. Link the program
    glLinkProgram(program);
    // 4. Check whether linking succeeded
    GLint linkSuccess;
    glGetProgramiv(program, GL_LINK_STATUS, &linkSuccess);
    if (linkSuccess == GL_FALSE) {
        GLchar messages[256];
        glGetProgramInfoLog(program, sizeof(messages), 0, &messages[0]);
        NSString *messageString = [NSString stringWithUTF8String:messages];
        NSAssert(NO, @"Program linking failed: %@", messageString);
        exit(1);
    }
    // 5. Return the program
    return program;
}
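The shared vertex shader Vertex.vsh itself isn't shown in this post. Given the attribute and varying names used in the next step, a minimal version would look roughly like this (a sketch; the actual file may differ):
attribute vec4 position;      // quad vertex, already in clip space
attribute vec2 texCoord;      // texture coordinate for this vertex
varying vec2 texCoordVarying; // interpolated, then read by the fragment shader

void main() {
    gl_Position = position;
    texCoordVarying = texCoord;
}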
(4) Passing the attributes, uniforms, and textures into the shaders
attributes: the vertex and texture-coordinate data that determine the image's position and size; they are passed to the vertex shader, which can then hand them on to the fragment shader
uniforms: constants the app passes to the shaders; they can be passed to both the vertex and the fragment shader
texture: a texture id representing the image data; it can be passed to both shaders, but since the vertex shader in this project doesn't use any textures, it is only passed to the fragment shader
Computing the vertex and texture-coordinate data:
// Set up the quad vertices according to the video's orientation and aspect ratio
CGRect viewBounds = self.bounds;
CGSize contentSize = CGSizeMake(frameWidth, frameHeight);
/*
 AVMakeRectWithAspectRatioInsideRect
 Purpose: returns a scaled CGRect that preserves the aspect ratio given by the CGSize, fitted inside the bounding CGRect
 Param 1: the aspect ratio to preserve
 Param 2: the rect to fit into
 */
CGRect vertexSamplingRect = AVMakeRectWithAspectRatioInsideRect(contentSize, viewBounds);
// Compute the normalized quad coordinates to draw the frame into
CGSize normalizedSamplingSize = CGSizeMake(0.0, 0.0);
CGSize cropScaleAmount = CGSizeMake(vertexSamplingRect.size.width / viewBounds.size.width,
                                    vertexSamplingRect.size.height / viewBounds.size.height);
if (cropScaleAmount.width > cropScaleAmount.height) {
    normalizedSamplingSize.width = 1.0;
    normalizedSamplingSize.height = cropScaleAmount.height / cropScaleAmount.width;
}
else {
    normalizedSamplingSize.width = cropScaleAmount.width / cropScaleAmount.height;
    normalizedSamplingSize.height = 1.0;
}
/*
 The four vertices define the 2D plane region the pixel buffer is drawn into.
 Vertex data built from (-1,-1) and (1,1) as the bottom-left and top-right corners would cover the entire screen.
 */
GLfloat quadVertexData [] = {
    -1 * normalizedSamplingSize.width, -1 * normalizedSamplingSize.height,
         normalizedSamplingSize.width, -1 * normalizedSamplingSize.height,
    -1 * normalizedSamplingSize.width,      normalizedSamplingSize.height,
         normalizedSamplingSize.width,      normalizedSamplingSize.height,
};
/*
 The texture vertices are set up so that we flip the texture vertically.
 This maps our top-left-origin pixel buffer onto OpenGL's bottom-left-origin texture coordinate system.
 */
CGRect textureSamplingRect = CGRectMake(0, 0, 1, 1);
GLfloat quadTextureData[] = {
    CGRectGetMinX(textureSamplingRect), CGRectGetMaxY(textureSamplingRect),
    CGRectGetMaxX(textureSamplingRect), CGRectGetMaxY(textureSamplingRect),
    CGRectGetMinX(textureSamplingRect), CGRectGetMinY(textureSamplingRect),
    CGRectGetMaxX(textureSamplingRect), CGRectGetMinY(textureSamplingRect)
};
Passing the attributes, uniforms, and textures into the shader:
// Select and use the shader program for the current effect
self.usingProgram = self.program[self.useProgram];
glUseProgram(self.usingProgram);
// Vertex position data
int position = glGetAttribLocation(self.usingProgram, "position");
glVertexAttribPointer(position, 2, GL_FLOAT, 0, 0, quadVertexData);
glEnableVertexAttribArray(position);
// Texture coordinate data
int texCoord = glGetAttribLocation(self.usingProgram, "texCoord");
glVertexAttribPointer(texCoord, 2, GL_FLOAT, 0, 0, quadTextureData);
glEnableVertexAttribArray(texCoord);
// Get the uniform locations
// Y (luminance) texture
uniforms[UNIFORM_Y] = glGetUniformLocation(self.usingProgram, "SamplerY");
// UV (chrominance) texture
uniforms[UNIFORM_UV] = glGetUniformLocation(self.usingProgram, "SamplerUV");
// YUV -> RGB conversion matrix
uniforms[UNIFORM_COLOR_CONVERSION_MATRIX] = glGetUniformLocation(self.usingProgram, "colorConversionMatrix");
// Elapsed time
uniforms[UNIFORM_TIME] = glGetUniformLocation(self.usingProgram, "Time");
// Pass the uniform values to the shader:
// the Y and UV samplers read from texture units 0 and 1
glUniform1i(uniforms[UNIFORM_Y], 0);
glUniform1i(uniforms[UNIFORM_UV], 1);
// the YUV -> RGB color conversion matrix
glUniformMatrix3fv(uniforms[UNIFORM_COLOR_CONVERSION_MATRIX], 1, GL_FALSE, _preferredConversion);
// the time elapsed since drawing started
NSTimeInterval time = [[NSDate date] timeIntervalSinceDate:self.startDate];
glUniform1f(uniforms[UNIFORM_TIME], time);
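Neither the uniforms array nor _preferredConversion is declared in the snippets above. Presumably they look something like the following; the matrix here is the BT.601 full-range YUV -> RGB matrix from Apple's sample code, which matches the full-range pixel format chosen during capture (an assumption; the project may use a different matrix):
// Indices into the uniforms array (assumed declaration)
enum {
    UNIFORM_Y,
    UNIFORM_UV,
    UNIFORM_COLOR_CONVERSION_MATRIX,
    UNIFORM_TIME,
    NUM_UNIFORMS
};
GLint uniforms[NUM_UNIFORMS];

// BT.601 full-range YUV -> RGB conversion matrix (column-major)
static const GLfloat kColorConversion601FullRange[] = {
    1.0,    1.0,    1.0,
    0.0,   -0.343,  1.765,
    1.4,   -0.711,  0.0,
};
// e.g. _preferredConversion = kColorConversion601FullRange;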
(5) Drawing and displaying
// Bind the frame buffer
glBindFramebuffer(GL_FRAMEBUFFER, _frameBufferHandle);
// Set the viewport
glViewport(0, 0, _backingWidth, _backingHeight);
// Draw the quad
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
// Bind the render buffer
glBindRenderbuffer(GL_RENDERBUFFER, _colorBufferHandle);
// Present it on screen
[_context presentRenderbuffer:GL_RENDERBUFFER];
(三) Shader effects
The effects are written as custom fragment shaders. The mosaic effect:
precision mediump float;
varying highp vec2 texCoordVarying;
uniform sampler2D SamplerY;
uniform sampler2D SamplerUV;
uniform mat3 colorConversionMatrix;
uniform float Time;
const vec2 TexSize = vec2(375.0, 667.0);
const vec2 mosaicSize = vec2(20.0, 20.0);
const float PI = 3.1415926;
// Sample the Y and UV textures at the given coordinate and convert YUV to RGB
vec4 getRgba(vec2 texCoordVarying) {
    mediump vec3 yuv;
    lowp vec3 rgb;
    yuv.x = (texture2D(SamplerY, texCoordVarying).r - (16.0 / 255.0));
    yuv.yz = (texture2D(SamplerUV, texCoordVarying).rg - vec2(0.5, 0.5));
    rgb = colorConversionMatrix * yuv;
    return vec4(rgb, 1);
}
void main () {
    float duration = 3.0;
    float maxScale = 1.0;
    // Loop over a 3-second cycle; the half sine makes the block size grow and then shrink
    float time = mod(Time, duration);
    float progress = sin(time * (PI / duration));
    float scale = maxScale * progress;
    vec2 finSize = mosaicSize * scale;
    // Map the coordinate into "pixel" space, snap it to the top-left corner of its block, then map it back
    vec2 intXY = vec2(texCoordVarying.x * TexSize.x, texCoordVarying.y * TexSize.y);
    vec2 XYMosaic = vec2(floor(intXY.x / finSize.x) * finSize.x, floor(intXY.y / finSize.y) * finSize.y);
    vec2 UVMosaic = vec2(XYMosaic.x / TexSize.x, XYMosaic.y / TexSize.y);
    gl_FragColor = getRgba(UVMosaic);
}
This effect is something I made up myself; real apps probably don't ship anything like it, haha. The idea is to treat the whole texture as a 375x667 image and split it into small blocks, where the block size varies over time (the positive half of a sine wave). A pixel's coordinates determine which block it falls in, and its color is taken from the pixel at the top-left corner of that block.
The vertigo, selective blur, and shake effects are based on 雷曼同學(xué)'s articles.
That completes the whole pipeline from capturing video to applying filters. The complete project is on GitHub:
https://github.com/linzhesheng/AVFoundationAndOpenGLES