Preface
This article is part of my systematic study of OpenGL ES, recording what I learned along the way.
This post applies that OpenGL ES knowledge to rendering video data captured from the iOS camera.
Environment: Xcode 8.1 + OpenGL ES 2.0
The code is on GitHub: OpenGL ES入门11-相机视频渲染
You are welcome to follow my OpenGL ES入门 series.
Result
Key concepts
- YUV is an image storage format, similar in purpose to RGB. In YUV, Y is luma (brightness): the Y data alone already forms a picture, just a grayscale one. U and V are the chroma (color-difference) components, also written Cb (blue difference) and Cr (red difference). The earliest TV signals adopted YUV for compatibility with black-and-white sets: drop the U and V from a YUV image and what remains is the black-and-white picture. YUV saves bandwidth by discarding chroma. A YUV420 image, for example, takes half the bytes of the same image in RGB, and discarding chroma for adjacent pixels makes little visible difference to the human eye.
- Bytes occupied by a YUV420 image:
size = width * height + (width * height) / 4 + (width * height) / 4
- Bytes occupied by an RGB image:
size = width * height * 3
- Bytes occupied by an RGBA image:
size = width * height * 4
- YUV420 itself comes in several byte layouts: I420, NV12, and NV21 (the sketch after this list shows the resulting plane offsets).
I420: Y, U, and V are stored as three separate planes: Y0,Y1…Yn, U0,U1…Un/4, V0,V1…Vn/4
NV12: Y and interleaved UV are stored as two planes: Y0,Y1…Yn, U0,V0,U1,V1…Un/4,Vn/4
NV21: same as NV12, but with U and V in the opposite order.
- iOS camera output formats. The formats supported by the device are:
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v': NV12 output, video range (luma=[16,235], chroma=[16,240])
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange = '420f': NV12 output, full range (luma=[0,255], chroma=[1,255])
kCVPixelFormatType_32BGRA = 'BGRA': BGRA output
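To make these layouts concrete, here is a small C sketch (the helper names are mine, not part of the demo project) that computes the plane sizes and starting offsets for a width x height YUV420 frame:

#include <stddef.h>

// Sizes/offsets of the planes in a tightly packed YUV420 buffer.
typedef struct {
    size_t ySize;   // luma plane: one byte per pixel
    size_t uOffset; // offset of the first U byte
    size_t vOffset; // offset of the first V byte
    size_t total;   // width * height * 3 / 2
} YUV420Layout;

// I420: three separate planes, Y then U then V.
static YUV420Layout i420Layout(size_t width, size_t height)
{
    YUV420Layout l;
    l.ySize   = width * height;
    l.uOffset = l.ySize;               // U plane follows Y
    l.vOffset = l.ySize + l.ySize / 4; // V plane follows U
    l.total   = l.ySize * 3 / 2;
    return l;
}

// NV12: a Y plane followed by one interleaved UV plane.
static YUV420Layout nv12Layout(size_t width, size_t height)
{
    YUV420Layout l;
    l.ySize   = width * height;
    l.uOffset = l.ySize;     // the interleaved plane starts with U
    l.vOffset = l.ySize + 1; // each V byte directly follows its U byte
    l.total   = l.ySize * 3 / 2;
    return l;
}

For the 640x480 preset used below, total comes to 460,800 bytes, versus 1,228,800 bytes for the same frame in RGBA.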
Implementation
1. Capture video data and set the output format.
- (void)setupSession
{
_captureSession = [[AVCaptureSession alloc] init];
[_captureSession beginConfiguration];
// Set the capture resolution
[_captureSession setSessionPreset:AVCaptureSessionPreset640x480];
// Configure the input device (use the back camera)
AVCaptureDevice *inputCamera = nil;
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
{
if ([device position] == AVCaptureDevicePositionBack)
{
inputCamera = device;
}
}
if (!inputCamera) {
return;
}
NSError *error = nil;
_videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:inputCamera error:&error];
if ([_captureSession canAddInput:_videoInput])
{
[_captureSession addInput:_videoInput];
}
// Configure the data output
_videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[_videoOutput setAlwaysDiscardsLateVideoFrames:NO];
// Set the output pixel format
[_videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[_videoOutput setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
if ([_captureSession canAddOutput:_videoOutput]) {
[_captureSession addOutput:_videoOutput];
}
[_captureSession commitConfiguration];
}
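Note: when building against the iOS 10 SDK (as with Xcode 8.1 here), the app's Info.plist must contain an NSCameraUsageDescription entry; without it the app is terminated the first time it tries to access the camera.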
2. Implement the delegate method to receive the video output.
#pragma mark - <AVCaptureVideoDataOutputSampleBufferDelegate>
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
if (!self.captureSession.isRunning) {
return;
}else if (captureOutput == _videoOutput) {
OpenGLESView *glView = (OpenGLESView *)self.view;
// Dispatch based on the renderer type that was configured
if ([glView.render isMemberOfClass:[GLRenderRGB class]]) {
[self processVideoSampleBufferToRGB1:sampleBuffer];
}else {
[self processVideoSampleBufferToYUV:sampleBuffer];
}
}
}
3. Set up the OpenGLESView. When setting it up we hand it a renderer (GLRender), whose job is to render the different kinds of data (RGBA and YUV).
//
// OpenGLESView.m
// OpenGLES01-环境搭建
//
// Created by qinmin on 2017/2/9.
// Copyright © 2017 qinmin. All rights reserved.
//
#import "OpenGLESView.h"
#import <OpenGLES/ES2/gl.h>
#import "GLUtil.h"
@interface OpenGLESView : UIView
@property (nonatomic, strong) GLRender *render;
- (void)setTexture:(GLTexture *)texture;
- (void)setNeedDraw;
@end
@interface OpenGLESView ()
{
CAEAGLLayer *_eaglLayer;
EAGLContext *_context;
GLuint _colorRenderBuffer;
GLuint _frameBuffer;
GLRender *_render;
}
@end
@implementation OpenGLESView
+ (Class)layerClass
{
// Only a layer of class [CAEAGLLayer class] supports drawing OpenGL content.
return [CAEAGLLayer class];
}
- (void)dealloc
{
}
- (instancetype)initWithFrame:(CGRect)frame
{
if (self = [super initWithFrame:frame]) {
[self setupLayer];
[self setupContext];
}
return self;
}
- (void)layoutSubviews
{
[EAGLContext setCurrentContext:_context];
[self destroyRenderAndFrameBuffer];
[self setupFrameAndRenderBuffer];
}
#pragma mark - Setup
- (void)setupLayer
{
_eaglLayer = (CAEAGLLayer*) self.layer;
// A CALayer is transparent by default; it must be made opaque for its content to be visible
_eaglLayer.opaque = YES;
// Set the drawable properties: no retained backing, and RGBA8 color format
_eaglLayer.drawableProperties = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithBool:NO], kEAGLDrawablePropertyRetainedBacking, kEAGLColorFormatRGBA8, kEAGLDrawablePropertyColorFormat, nil];
}
- (void)setupContext
{
// Use OpenGL ES 2.0. Versions 1.0 and 3.0 are also available; we pick 2.0 for compatibility (the differences between 2.0 and 3.0 are covered later in this series)
_context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (!_context) {
NSLog(@"Failed to initialize OpenGLES 2.0 context");
exit(1);
}
// Make the context we just created the current context
if (![EAGLContext setCurrentContext:_context]) {
NSLog(@"Failed to set current OpenGL context");
exit(1);
}
}
- (void)setupFrameAndRenderBuffer
{
glGenRenderbuffers(1, &_colorRenderBuffer);
glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderBuffer);
// Allocate storage for the color renderbuffer
[_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:_eaglLayer];
glGenFramebuffers(1, &_frameBuffer);
// Make it the current framebuffer
glBindFramebuffer(GL_FRAMEBUFFER, _frameBuffer);
// Attach _colorRenderBuffer to the GL_COLOR_ATTACHMENT0 attachment point
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
GL_RENDERBUFFER, _colorRenderBuffer);
}
#pragma mark - Clean
- (void)destroyRenderAndFrameBuffer
{
glDeleteFramebuffers(1, &_frameBuffer);
_frameBuffer = 0;
glDeleteRenderbuffers(1, &_colorRenderBuffer);
_colorRenderBuffer = 0;
}
#pragma mark - Render
- (void)draw
{
glClearColor(1.0, 1.0, 1.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);
glViewport(0, 0, self.frame.size.width, self.frame.size.height);
// Draw
[_render prepareRender];
// Present the given renderbuffer on screen; here that is the renderbuffer currently bound. Before a renderbuffer can be presented, renderbufferStorage:fromDrawable: must have been called to allocate its storage.
[_context presentRenderbuffer:GL_RENDERBUFFER];
}
#pragma mark - PublicMethod
- (void)setRender:(GLRender *)render
{
_render = render;
}
- (void)setTexture:(GLTexture *)texture
{
[_render setTexture:texture];
}
- (void)setNeedDraw
{
[self draw];
}
@end
4. Create the renderers. A renderer is responsible for creating the GL program, the vertex buffer data, and the texture objects.
//
// GLRGBRender.h
// OpenGLES11-相机视频渲染
//
// Created by mac on 17/3/24.
// Copyright © 2017 Qinmin. All rights reserved.
//
#import <UIKit/UIKit.h>
#import "GLTexture.h"
#import "GLUtil.h"
@interface GLRender : NSObject
@property (nonatomic, assign) GLuint program;
@property (nonatomic, assign) GLuint vertexVBO;
@property (nonatomic, assign) int vertCount;
- (void)setTexture:(GLTexture *)texture;
- (void)prepareRender;
@end
@interface GLRenderRGB : GLRender
@property(nonatomic, assign, readonly) GLuint rgb;
@end
@interface GLRenderYUV : GLRender
@property(nonatomic, assign, readonly) GLuint y;
@property(nonatomic, assign, readonly) GLuint u;
@property(nonatomic, assign, readonly) GLuint v;
@end
////////////////GLRender//////////////////////////
@implementation GLRender
- (void)setupGLProgram
{
}
- (void)setTexture:(GLTexture *)texture
{
}
- (void)prepareRender
{
}
@end
////////////////GLRenderRGB//////////////////////////
@implementation GLRenderRGB
- (instancetype)init
{
if (self = [super init]) {
[self setupGLProgram];
[self setupVBO];
_rgb = createTexture2D(GL_RGBA, 640, 480, NULL);
}
return self;
}
- (void)setupGLProgram
{
NSString *vertFile = [[NSBundle mainBundle] pathForResource:@"vert.glsl" ofType:nil];
NSString *fragFile = [[NSBundle mainBundle] pathForResource:@"frag_rgb.glsl" ofType:nil];
self.program = createGLProgramFromFile(vertFile.UTF8String, fragFile.UTF8String);
glUseProgram(self.program);
}
- (void)setupVBO
{
self.vertCount = 6;
GLfloat vertices[] = {
0.8f, 0.6f, 0.0f, 1.0f, 0.0f, // top right
0.8f, -0.6f, 0.0f, 1.0f, 1.0f, // bottom right
-0.8f, -0.6f, 0.0f, 0.0f, 1.0f, // bottom left
-0.8f, -0.6f, 0.0f, 0.0f, 1.0f, // bottom left
-0.8f, 0.6f, 0.0f, 0.0f, 0.0f, // top left
0.8f, 0.6f, 0.0f, 1.0f, 0.0f, // top right
};
// Create the VBO
self.vertexVBO = createVBO(GL_ARRAY_BUFFER, GL_STATIC_DRAW, sizeof(vertices), vertices);
}
- (void)setTexture:(GLTexture *)texture
{
if ([texture isMemberOfClass:[GLTextureRGB class]]) {
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
GLTextureRGB *rgbTexture = (GLTextureRGB *)texture;
glBindTexture(GL_TEXTURE_2D, _rgb);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width, texture.height, GL_RGBA, GL_UNSIGNED_BYTE, rgbTexture.RGBA);
}
}
- (void)prepareRender
{
glBindBuffer(GL_ARRAY_BUFFER, self.vertexVBO);
glEnableVertexAttribArray(glGetAttribLocation(self.program, "position"));
glVertexAttribPointer(glGetAttribLocation(self.program, "position"), 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, NULL);
glEnableVertexAttribArray(glGetAttribLocation(self.program, "texcoord"));
glVertexAttribPointer(glGetAttribLocation(self.program, "texcoord"), 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, NULL+sizeof(GL_FLOAT)*3);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _rgb);
glUniform1i(glGetUniformLocation(self.program, "image0"), 0);
glDrawArrays(GL_TRIANGLES, 0, self.vertCount);
}
@end
////////////////GLRenderYUV//////////////////////////
@implementation GLRenderYUV
- (instancetype)init
{
if (self = [super init]) {
[self setupGLProgram];
[self setupVBO];
_y = createTexture2D(GL_LUMINANCE, 640, 480, NULL);
_u = createTexture2D(GL_LUMINANCE, 640/2, 480/2, NULL);
_v = createTexture2D(GL_LUMINANCE, 640/2, 480/2, NULL);
}
return self;
}
- (void)setupGLProgram
{
NSString *vertFile = [[NSBundle mainBundle] pathForResource:@"vert.glsl" ofType:nil];
NSString *fragFile = [[NSBundle mainBundle] pathForResource:@"frag.glsl" ofType:nil];
self.program = createGLProgramFromFile(vertFile.UTF8String, fragFile.UTF8String);
glUseProgram(self.program);
}
- (void)setupVBO
{
self.vertCount = 6;
GLfloat vertices[] = {
0.8f, 0.6f, 0.0f, 1.0f, 0.0f, // top right
0.8f, -0.6f, 0.0f, 1.0f, 1.0f, // bottom right
-0.8f, -0.6f, 0.0f, 0.0f, 1.0f, // bottom left
-0.8f, -0.6f, 0.0f, 0.0f, 1.0f, // bottom left
-0.8f, 0.6f, 0.0f, 0.0f, 0.0f, // top left
0.8f, 0.6f, 0.0f, 1.0f, 0.0f, // top right
};
// Create the VBO
self.vertexVBO = createVBO(GL_ARRAY_BUFFER, GL_STATIC_DRAW, sizeof(vertices), vertices);
}
- (void)setTexture:(GLTexture *)texture
{
if ([texture isMemberOfClass:[GLTextureYUV class]]) {
GLTextureYUV *yuvTexture = (GLTextureYUV *)texture;
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glBindTexture(GL_TEXTURE_2D, _y);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width, texture.height, GL_LUMINANCE, GL_UNSIGNED_BYTE, yuvTexture.Y);
glBindTexture(GL_TEXTURE_2D, 0);
glBindTexture(GL_TEXTURE_2D, _u);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width/2, texture.height/2, GL_LUMINANCE, GL_UNSIGNED_BYTE, yuvTexture.U);
glBindTexture(GL_TEXTURE_2D, 0);
glBindTexture(GL_TEXTURE_2D, _v);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texture.width/2, texture.height/2, GL_LUMINANCE, GL_UNSIGNED_BYTE, yuvTexture.V);
glBindTexture(GL_TEXTURE_2D, 0);
}
}
- (void)prepareRender
{
glBindBuffer(GL_ARRAY_BUFFER, self.vertexVBO);
glEnableVertexAttribArray(glGetAttribLocation(self.program, "position"));
glVertexAttribPointer(glGetAttribLocation(self.program, "position"), 3, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, NULL);
glEnableVertexAttribArray(glGetAttribLocation(self.program, "texcoord"));
glVertexAttribPointer(glGetAttribLocation(self.program, "texcoord"), 2, GL_FLOAT, GL_FALSE, sizeof(GLfloat)*5, NULL+sizeof(GL_FLOAT)*3);
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_2D, _y);
glUniform1i(glGetUniformLocation(self.program, "image0"), 0);
glActiveTexture(GL_TEXTURE1);
glBindTexture(GL_TEXTURE_2D, _u);
glUniform1i(glGetUniformLocation(self.program, "image1"), 1);
glActiveTexture(GL_TEXTURE2);
glBindTexture(GL_TEXTURE_2D, _v);
glUniform1i(glGetUniformLocation(self.program, "image2"), 2);
glDrawArrays(GL_TRIANGLES, 0, self.vertCount);
}
@end
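createGLProgramFromFile, createTexture2D, and createVBO live in GLUtil, which this post does not list: the first compiles and links the two shader files into a program, and a minimal sketch of the other two, assuming the simplest behavior consistent with the calls above (my reconstruction, not the project's actual code), might look like this:

static GLuint createTexture2D(GLenum format, int width, int height, void *data)
{
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // Allocate storage up front so each frame can upload via glTexSubImage2D
    glTexImage2D(GL_TEXTURE_2D, 0, format, width, height, 0, format, GL_UNSIGNED_BYTE, data);
    glBindTexture(GL_TEXTURE_2D, 0);
    return texture;
}

static GLuint createVBO(GLenum target, GLenum usage, GLsizeiptr size, const void *data)
{
    GLuint vbo;
    glGenBuffers(1, &vbo);
    glBindBuffer(target, vbo);
    glBufferData(target, size, data, usage);
    return vbo;
}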
5. Create the texture description objects. RGBA data is kept packed in a single buffer, while YUV data is kept as separate planes.
//
// GLTexture.h
// OpenGLES11-相机视频渲染
//
// Created by mac on 17/3/24.
// Copyright © 2017 Qinmin. All rights reserved.
//
#import <Foundation/Foundation.h>
@interface GLTexture : NSObject
@property (assign, nonatomic) int width;
@property (assign, nonatomic) int height;
@end
@interface GLTextureRGB : GLTexture
@property (nonatomic, assign) uint8_t *RGBA;
@end
@interface GLTextureYUV : GLTexture
@property (nonatomic, assign) uint8_t *Y;
@property (nonatomic, assign) uint8_t *U;
@property (nonatomic, assign) uint8_t *V;
@end
@implementation GLTexture
@end
@implementation GLTextureRGB
- (void)dealloc
{
if (_RGBA) {
free(_RGBA);
_RGBA = NULL;
}
}
@end
@implementation GLTextureYUV
- (void)dealloc
{
if (_Y) {
free(_Y);
_Y = NULL;
}
if (_U) {
free(_U);
_U = NULL;
}
if (_V) {
free(_V);
_V = NULL;
}
}
@end
6. Initialize the OpenGLESView and hand it the rendering mode.
- (void)viewDidLoad {
[super viewDidLoad];
OpenGLESView *glView = [[OpenGLESView alloc] initWithFrame:self.view.frame];
// Render via the RGB path (use GLRenderYUV for the YUV path)
GLRender *render = [[GLRenderRGB alloc] init];
[glView setRender:render];
self.view = glView;
UIButton *btn = [[UIButton alloc] initWithFrame:CGRectMake(0, 30, 120, 30)];
[btn addTarget:self action:@selector(startBtnClick:) forControlEvents:UIControlEventTouchUpInside];
[btn setTitle:@"开始" forState:UIControlStateNormal];
[btn setBackgroundColor:[UIColor greenColor]];
[self.view addSubview:btn];
[self setupSession];
}
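The post never shows startBtnClick:; a minimal implementation (my assumption, simply toggling capture) could be:

- (void)startBtnClick:(UIButton *)sender
{
    // Hypothetical handler: start or stop the capture session
    if (self.captureSession.isRunning) {
        [self.captureSession stopRunning];
        [sender setTitle:@"Start" forState:UIControlStateNormal];
    } else {
        [self.captureSession startRunning];
        [sender setTitle:@"Stop" forState:UIControlStateNormal];
    }
}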
7. Process the video data in the appropriate way and pass it to OpenGL ES for rendering.
// Pixel format: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
- (void)processVideoSampleBufferToYUV:(CMSampleBufferRef)sampleBuffer
{
//CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
// Lock the base address before touching the pixel data
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
int pixelWidth = (int) CVPixelBufferGetWidth(pixelBuffer);
int pixelHeight = (int) CVPixelBufferGetHeight(pixelBuffer);
GLTextureYUV *yuv = [[GLTextureYUV alloc] init];
yuv.width = pixelWidth;
yuv.height = pixelHeight;
//size_t count = CVPixelBufferGetPlaneCount(pixelBuffer);
// Copy the Y-plane data out of the CVImageBufferRef
size_t y_size = pixelWidth * pixelHeight;
uint8_t *yuv_frame = malloc(y_size);
uint8_t *y_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
memcpy(yuv_frame, y_frame, y_size);
yuv.Y = yuv_frame;
// Interleaved UV plane
uint8_t *uv_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
size_t uv_size = y_size/2;
// Extract the U data from the interleaved UV plane
size_t u_size = y_size/4;
uint8_t *u_frame = malloc(u_size);
for (int i = 0, j = 0; i < uv_size; i += 2, j++) {
u_frame[j] = uv_frame[i];
}
yuv.U = u_frame;
// Extract the V data from the interleaved UV plane
size_t v_size = y_size/4;
uint8_t *v_frame = malloc(v_size);
for (int i = 1, j = 0; i < uv_size; i += 2, j++) {
v_frame[j] = uv_frame[i];
}
yuv.V = v_frame;
// Unlock
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
dispatch_async(dispatch_get_main_queue(), ^{
OpenGLESView *glView = (OpenGLESView *)self.view;
[glView setTexture:yuv];
[glView setNeedDraw];
});
}
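One caveat: the copies above assume each plane's stride equals its width. CVPixelBuffer rows can be padded, in which case CVPixelBufferGetBytesPerRowOfPlane must be honored; a defensive, stride-aware copy of the Y plane (a sketch, not in the original demo) would be:

size_t yStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
uint8_t *yBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
uint8_t *yCopy = malloc(pixelWidth * pixelHeight);
for (int row = 0; row < pixelHeight; row++) {
    // Copy pixelWidth bytes per row, skipping any padding at the row end
    memcpy(yCopy + row * pixelWidth, yBase + row * yStride, pixelWidth);
}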
// Pixel format: kCVPixelFormatType_32BGRA
- (void)processVideoSampleBufferToRGB:(CMSampleBufferRef)sampleBuffer
{
//CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//size_t count = CVPixelBufferGetPlaneCount(pixelBuffer);
//printf("%zud\n", count);
// Lock the base address before touching the pixel data
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
int pixelWidth = (int) CVPixelBufferGetWidth(pixelBuffer);
int pixelHeight = (int) CVPixelBufferGetHeight(pixelBuffer);
GLTextureRGB *rgb = [[GLTextureRGB alloc] init];
rgb.width = pixelWidth;
rgb.height = pixelHeight;
// BGRA data
//size_t y_size = pixelWidth * pixelHeight;
uint8_t *frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
uint8_t *bgra = malloc(pixelHeight * pixelWidth * 4);
memcpy(bgra, frame, pixelHeight * pixelWidth * 4);
rgb.RGBA = bgra;
// Unlock
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
dispatch_async(dispatch_get_main_queue(), ^{
OpenGLESView *glView = (OpenGLESView *)self.view;
[glView setTexture:rgb];
[glView setNeedDraw];
});
}
// Pixel format: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange or kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
- (void)processVideoSampleBufferToRGB1:(CMSampleBufferRef)sampleBuffer
{
//CFAbsoluteTime startTime = CFAbsoluteTimeGetCurrent();
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
//size_t count = CVPixelBufferGetPlaneCount(pixelBuffer);
//printf("%zud\n", count);
// Lock the base address before touching the pixel data
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
int pixelWidth = (int) CVPixelBufferGetWidth(pixelBuffer);
int pixelHeight = (int) CVPixelBufferGetHeight(pixelBuffer);
GLTextureRGB *rgb = [[GLTextureRGB alloc] init];
rgb.width = pixelWidth;
rgb.height = pixelHeight;
// Y plane
//size_t y_size = pixelWidth * pixelHeight;
uint8_t *y_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
// Interleaved UV plane
uint8_t *uv_frame = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
//size_t uv_size = y_size/2;
// Due to little-endian layout, the "ARGB" output is actually BGRA in memory
uint8_t *bgra = malloc(pixelHeight * pixelWidth * 4);
NV12ToARGB(y_frame, pixelWidth, uv_frame, pixelWidth, bgra, pixelWidth * 4, pixelWidth, pixelHeight);
rgb.RGBA = bgra;
// Unlock
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
dispatch_async(dispatch_get_main_queue(), ^{
OpenGLESView *glView = (OpenGLESView *)self.view;
[glView setTexture:rgb];
[glView setNeedDraw];
});
}
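NV12ToARGB comes from Google's libyuv (see the notes at the end). For reference, its declaration in libyuv/convert_argb.h had roughly this shape in the versions of that era:

int NV12ToARGB(const uint8* src_y, int src_stride_y,
               const uint8* src_uv, int src_stride_uv,
               uint8* dst_argb, int dst_stride_argb,
               int width, int height);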
8. Rotating the image and converting pixels. The frames the camera produces are rotated relative to the screen, so at display time the image must be rotated about the z axis; here we rotate the vertices by -90 degrees in the vertex shader. OpenGL ES treats pixels as RGBA, so we also need to convert the ARGB (in fact BGRA) and YUV data to RGBA.
The rotation matrix for a rotation about the z axis can be expressed as:
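$$
R_z(\theta) =
\begin{pmatrix}
\cos\theta & -\sin\theta & 0 \\
\sin\theta & \cos\theta & 0 \\
0 & 0 & 1
\end{pmatrix}
$$

Note that GLSL's mat3 constructor takes its arguments column by column, so the three lines written out in the shader below are the columns of this matrix (with θ = -90°).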
- The vertex shader, rotating position by -90 degrees.
attribute vec3 position;
attribute vec3 color;
attribute vec2 texcoord;
varying vec2 v_texcoord;
void main()
{
const float degree = radians(-90.0);
// Build the rotation matrix (mat3 takes its arguments as columns)
const mat3 rotate = mat3(
cos(degree), sin(degree), 0.0,
-sin(degree), cos(degree), 0.0,
0.0, 0.0, 1.0
);
gl_Position = vec4(rotate*position, 1.0);
v_texcoord = texcoord;
}
- The RGBA fragment shader. Because the incoming data is BGRA, it swizzles BGRA to RGBA.
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D image0;
void main()
{
// The texture bytes are BGRA; sample and swizzle to RGBA
vec4 color = texture2D(image0, v_texcoord);
// Swap the B and R channels
gl_FragColor = vec4(color.bgr, 1.0);
}
- The YUV fragment shader. Because the data is planar YUV, it converts YUV to RGB (BT.601 coefficients; see the caveat after the shader).
precision mediump float;
varying vec2 v_texcoord;
uniform sampler2D image0;
uniform sampler2D image1;
uniform sampler2D image2;
void main()
{
highp float y = texture2D(image0, v_texcoord).r;
highp float u = texture2D(image1, v_texcoord).r - 0.5;
highp float v = texture2D(image2, v_texcoord).r - 0.5;
// YUV -> RGB conversion (BT.601)
highp float r = y + 0.000 + 1.402 * v;
highp float g = y - 0.344 * u - 0.714 * v;
highp float b = y + 1.772 * u;
gl_FragColor = vec4(r, g, b, 1.0);
}
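A caveat: these constants are the full-range BT.601 coefficients, which match the '420f' format, while step 1 above actually requests '420v' (video range, luma in [16,235] and chroma in [16,240]). For video-range input the sampled values should be expanded first. A sketch of the adjusted sampling (replacing the first three lines of main above):

// Video-range variant (luma 16-235, chroma 16-240):
highp float y = (texture2D(image0, v_texcoord).r - 16.0/255.0) * (255.0/219.0);
highp float u = (texture2D(image1, v_texcoord).r - 128.0/255.0) * (255.0/224.0);
highp float v = (texture2D(image2, v_texcoord).r - 128.0/255.0) * (255.0/224.0);

With y, u, and v rescaled this way, the same 1.402/0.344/0.714/1.772 coefficients apply unchanged.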
Notes
On iOS, GL commands must be issued on the thread whose EAGLContext is current; issuing them from any other thread has no effect. Here the context is created and made current on the main thread, while the camera delegate delivers its video data on a background queue, so we have to dispatch back to the main queue before making any GL calls.
This demo uses libyuv's NV12ToARGB. What that function actually produces is not ARGB but BGRA in memory, which comes down to endianness (libyuv names formats as little-endian words). So the shader only needs to swizzle BGRA to RGBA.