iOS Camera Development Guide

AVFoundation

AVFoundation is the Objective-C framework Apple provides on iOS and OS X for working with time-based media. It is what developers use to build media applications.

The architecture on iOS:
[Figure: AVFoundation stack on iOS]

The framework contains both video-related and audio-related APIs.

For audio and video, its functionality can be grouped into four areas:

[Figure: AVFoundation functions]

1) Capture
Capturing audio, video, and image media from devices and producing data objects we can work with.

2) Edit
Building new assets from existing media segments (audio or video clips): reprocessing them and generating new media. For example: reading, writing, and re-encoding assets, and generating thumbnails.

3) Export
APIs for exporting audio and video, e.g. converting the file format or trimming the duration.

4) Presentation
For example, playback and audio/video preview.

Capture

A device's input sources mainly include the microphone, the camera, the screen, and so on.
Outputs typically include AVCaptureVideoPreviewLayer, AVCaptureAudioPreviewOutput, files, raw buffers, and so on.
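As a minimal sketch (not part of the original article), a camera input device for a given position can be looked up on iOS 10+ with AVCaptureDeviceDiscoverySession; the helper name CameraForPosition is hypothetical:

```objectivec
#import <AVFoundation/AVFoundation.h>

// Find the built-in wide-angle camera for the given position
// (front or back). Returns nil if no matching camera exists.
static AVCaptureDevice *CameraForPosition(AVCaptureDevicePosition position) {
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession
            discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                  mediaType:AVMediaTypeVideo
                                   position:position];
    return discovery.devices.firstObject;
}
```

The XYCameraClient helper used later in this article presumably wraps a lookup like this one.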

How do you perform a single audio/video capture?

To manage capture from a device such as a camera or microphone, you assemble objects representing the inputs and outputs, and use an instance of AVCaptureSession to coordinate the flow of data between them.

A simple session:

[Figure: A single session can configure multiple inputs and outputs]

AVCaptureSession is the core of the whole capture pipeline: it continuously pulls data from the inputs and distributes it to each output, completing a simple session.
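The AVCaptureVideoPreviewLayer output mentioned above attaches directly to the session to display live video. A minimal sketch (not from the original code; `previewView` is a hypothetical UIView):

```objectivec
// Attach a preview layer to the session so the camera feed is
// rendered on screen; the session drives the layer directly.
AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:self.session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = previewView.bounds;
[previewView.layer addSublayer:previewLayer];
```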

Configuring the AVCaptureSession

// Configure the session
- (void)setupSession:(AVCaptureDevicePosition)position error:(NSError * __autoreleasing *)error
{
    dispatch_async(self.sessionQueue, ^{
        [self.session beginConfiguration];
        [self setupSessionInputs:error];
        [self setupSessionOutputs];
        [[self session] commitConfiguration];
    });
}

Starting the session

- (void)startSession {
    dispatch_async([self sessionQueue], ^{
        if (![[self session] isRunning]) {
            [[self session] startRunning];
        }
    });
}

Stopping the session

- (void)stopSession {
    dispatch_async([self sessionQueue], ^{
        [[self session] stopRunning];
    });
}

Monitoring Capture Session State
A capture session posts notifications that you can observe, for example when it starts or stops running, or when it is interrupted. You can register to receive AVCaptureSessionRuntimeErrorNotification if a runtime error occurs. You can also query the session's running property to find out whether it is running, and its interrupted property to find out whether it has been interrupted. In addition, both the running and interrupted properties are key-value observable, and the notifications are posted on the main thread.
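The monitoring described above can be sketched like this (not from the original code; the selector name sessionRuntimeError: is hypothetical):

```objectivec
// Register for the session's runtime-error notification.
// AVCaptureSessionWasInterruptedNotification and
// AVCaptureSessionInterruptionEndedNotification can be observed the same way.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(sessionRuntimeError:)
                                             name:AVCaptureSessionRuntimeErrorNotification
                                           object:self.session];

// Handler: the underlying NSError is delivered in the userInfo dictionary.
- (void)sessionRuntimeError:(NSNotification *)notification {
    NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}
```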

In many cases, however, you need to consider how multiple inputs are represented and how they are connected to the outputs.
An input has its own hardware parameters for flow control, while an output is a passive receiver without many flow-control settings of its own, so Apple introduced AVCaptureConnection. Whenever an output is added to a session, it is assigned a default AVCaptureConnection.
AVCaptureConnection is the control point between the session and an output; much of the real-time data is also obtained from the connection.

[Figure: AVCaptureConnection represents a connection between an input and an output]

Configuring the input device

// Video input
AVCaptureDevice *videoDevice = [XYCameraClient deviceWithMediaType:AVMediaTypeVideo preferringPosition:self.captureDevicePosition];
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
if (videoInput) {
    if ([self.session canAddInput:videoInput]) {
        self.videoDeviceInput = videoInput;
        self.videoDevice = videoInput.device;
        [self.session addInput:videoInput];
    }
}

Configuring the outputs

// === VideoDataOutput ===
AVCaptureVideoDataOutput *videoDataOutPut = [[AVCaptureVideoDataOutput alloc] init];
NSDictionary *videooutputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[videoDataOutPut setVideoSettings:videooutputSettings];
dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutPut setSampleBufferDelegate:self queue:videoCaptureQueue];

if ([_session canAddOutput:videoDataOutPut]) {
    [_session addOutput:videoDataOutPut];
    AVCaptureConnection *videoConnection = [videoDataOutPut connectionWithMediaType:AVMediaTypeVideo];

    if ([videoConnection isVideoStabilizationSupported]) {
        videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
    }

    [videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    if ([self isFrontCamera]) { // mirror the front camera
        [videoConnection setVideoMirrored:YES];
    } else {
        [videoConnection setVideoMirrored:NO];
    }
    [self setVideoConnection:videoConnection];
    [self setVideoDataOutPut:videoDataOutPut];
}

// Photo output configuration
if (@available(iOS 10.0, *)) {
    AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
    photoOutput.highResolutionCaptureEnabled = YES;

    if ([self.session canAddOutput:photoOutput]) {
        [self.session addOutput:photoOutput];
        self.imageOutput = photoOutput;
    }
} else {
    // Still image output (pre-iOS 10)
    AVCaptureStillImageOutput *imageOutput = [[AVCaptureStillImageOutput alloc] init];
    imageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG, (NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]};

    if ([_session canAddOutput:imageOutput]) {
        [_session addOutput:imageOutput];
        self.imageOutput = imageOutput;
    }
}

Receiving the output
1) Video

// AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
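A minimal sketch of implementing this callback (not from the original code): pull the pixel buffer out of the sample buffer for further processing. Note that it is invoked on the videoCaptureQueue set above, not on the main thread.

```objectivec
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Each callback delivers one frame as a CVPixelBuffer
    // (BGRA, per the videoSettings configured earlier).
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return;

    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // ... process the width x height BGRA frame here ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```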

2) Photo

- (void)takePhoto:(AVCaptureVideoOrientation)videoOrientation
          success:(void(^)(CMSampleBufferRef sampleBufferRef))success
          failure:(void(^)(NSError *error))failure
{
    
    id takePhotoBlock = ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        
        if (error) {
            if (failure) failure(error);
        }else {
            if (success) success(sampleBuffer);
        }
    };
    
    if (@available(iOS 10.0, *)) {
        AVCapturePhotoOutput *photoOutput = (AVCapturePhotoOutput *)self.imageOutput;
        AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettingsWithFormat:@{(NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]}];
        photoSettings.highResolutionPhotoEnabled = YES;
        if ( photoSettings.availablePreviewPhotoPixelFormatTypes.count > 0 ) {
            photoSettings.previewPhotoFormat = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : photoSettings.availablePreviewPhotoPixelFormatTypes.firstObject};
        }
        AVCaptureConnection *connection = [(AVCapturePhotoOutput *)self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
        if (connection.isVideoOrientationSupported) {
            connection.videoOrientation = videoOrientation;
        }
        
        // Store the completion block before starting the capture, so the
        // delegate callback cannot fire before the block is set.
        self.takePhotoActionBlock = takePhotoBlock;
        [photoOutput capturePhotoWithSettings:photoSettings delegate:self];
    }else {
        AVCaptureConnection *connection = [(AVCaptureStillImageOutput *)self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
        if (connection.isVideoOrientationSupported) {
            connection.videoOrientation = videoOrientation;
        }
        
        [(AVCaptureStillImageOutput *)self.imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                                                   completionHandler:takePhotoBlock];
    }
}

#pragma mark - AVCapturePhotoCaptureDelegate
// Receive the captured photo data
// NS_DEPRECATED_IOS(10_0, 11_0, "Use -captureOutput:didFinishProcessingPhoto:error: instead.");
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(nullable CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(nullable CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(nullable AVCaptureBracketedStillImageSettings *)bracketSettings error:(nullable NSError *)error
API_AVAILABLE(ios(10.0))
{
    if (self.takePhotoActionBlock) {
        self.takePhotoActionBlock(photoSampleBuffer, error);
    }
}

- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(nullable NSError *)error
API_AVAILABLE(ios(11.0))
{
    CMSampleBufferRef sampleBuffer = [XYCMSampleBufferUtils sampleBufferFromCGImage:photo.pixelBuffer];
    if (self.takePhotoActionBlock) {
        self.takePhotoActionBlock(sampleBuffer, error);
    }
}