AVFoundation
AVFoundation is Apple's Objective-C framework on iOS and OS X for working with time-based media, and it provides the building blocks for developing media applications.
Architecture on iOS: the framework includes both video-related APIs and audio-related APIs.
For audio and video, the functionality it provides can be grouped into four areas:
1) Capture
Capturing audio, video, and still-image media from a device and turning it into data objects we can work with.
2) Edit
Taking existing media segments (audio or video clips), creating new assets from them, and reprocessing them into new media, for example reading, writing, and re-encoding assets, or generating thumbnails.
3) Export
APIs for exporting audio and video, for example changing the file format or trimming the duration (see the sketch after this list).
4) Presentation
For example, playback and audio/video preview.
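As an illustration of the Export area, here is a minimal sketch (not from the original post) that trims an asset to its first 10 seconds and rewrites it as MP4 with AVAssetExportSession; inputURL and outputURL are hypothetical placeholders.
AVAsset *asset = [AVAsset assetWithURL:inputURL];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];
exportSession.outputURL = outputURL;                        // where the exported file is written
exportSession.outputFileType = AVFileTypeMPEG4;             // change the container format
exportSession.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(10, 600)); // trim the duration
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", outputURL);
    }
}];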
Capture
The main input sources on a device include the microphone, the camera, the screen, and so on.
Typical outputs include AVCaptureVideoPreviewLayer, AVCaptureAudioPreviewOutput, files, raw buffers, and so on.
How is a single audio/video capture carried out?
To manage capture from a device such as a camera or microphone, you assemble objects to represent the inputs and outputs, and use an instance of AVCaptureSession to coordinate the data flow between them.
- An instance of AVCaptureDevice represents an input device, such as a camera or microphone
- An instance of a concrete subclass of AVCaptureInput configures the ports of an input device
- An instance of a concrete subclass of AVCaptureOutput manages the output to a movie file or a still image
- An instance of AVCaptureSession coordinates the data flow from the inputs to the outputs
A simple session:
AVCaptureSession is the core of the whole capture pipeline; it continuously pulls data from the inputs and distributes it to the outputs, completing a simple session.
// Configure the session
- (void)setupSession:(AVCaptureDevicePosition)position error:(NSError * __autoreleasing *)error
{
    dispatch_async(self.sessionQueue, ^{
        // Batch the changes so they are applied atomically on commit.
        [self.session beginConfiguration];
        [self setupSessionInputs:error];
        [self setupSessionOutputs];
        [[self session] commitConfiguration];
    });
}
Start the session
- (void)startSession {
    dispatch_async([self sessionQueue], ^{
        if (![[self session] isRunning]) {
            [[self session] startRunning];
        }
    });
}
Stop the session
- (void)stopSession {
    dispatch_async([self sessionQueue], ^{
        [[self session] stopRunning];
    });
}
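The AVCaptureVideoPreviewLayer mentioned earlier can be attached to the session to display the live camera feed. A minimal sketch, assuming self.session is the configured session and previewView is a hypothetical view that hosts the layer:
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill the view, cropping if needed
previewLayer.frame = previewView.bounds;
[previewView.layer addSublayer:previewLayer];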
Monitoring Capture Session State
A capture session posts notifications that you can observe to be told, for example, when it starts or stops running, or when it is interrupted. You can register to receive AVCaptureSessionRuntimeErrorNotification if a runtime error occurs. You can also query the session's running property to find out whether it is running, and its interrupted property to find out whether it has been interrupted. In addition, the running and interrupted properties are key-value observable, and the notifications are posted on the main thread.
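A minimal sketch of what that observation can look like, assuming self.session is the AVCaptureSession configured above:
- (void)observeSessionState {
    // Be notified of runtime errors on the session.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(sessionRuntimeError:)
                                                 name:AVCaptureSessionRuntimeErrorNotification
                                               object:self.session];
    // running (and likewise interrupted) is key-value observable.
    [self.session addObserver:self forKeyPath:@"running" options:NSKeyValueObservingOptionNew context:NULL];
}

- (void)sessionRuntimeError:(NSNotification *)notification {
    NSError *error = notification.userInfo[AVCaptureSessionErrorKey];
    NSLog(@"Capture session runtime error: %@", error);
}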
In many cases, however, you need to consider how multiple inputs are represented and how they are connected to the outputs.
An input has its own hardware parameters for flow control, whereas an output is a passive receiver with few flow-control settings of its own, so Apple introduces AVCaptureConnection: every output, once added to a session, is assigned a default AVCaptureConnection.
AVCaptureConnection is the control node between the session and the output, and much of the real-time data is also obtained from the connection (see its use in the output configuration below).
Configuring the input device
// Video input
AVCaptureDevice *videoDevice = [XYCameraClient deviceWithMediaType:AVMediaTypeVideo preferringPosition:self.captureDevicePosition];
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:error];
if (videoInput) {
    if ([self.session canAddInput:videoInput]) {
        self.videoDeviceInput = videoInput;
        self.videoDevice = videoInput.device;
        [self.session addInput:videoInput];
    }
}
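Only the video input is shown above; a microphone input can be added the same way. A minimal sketch (not from the original code), reusing the same error out-parameter:
// Audio input
AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:error];
if (audioInput && [self.session canAddInput:audioInput]) {
    [self.session addInput:audioInput];
}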
Configuring the outputs
// === VideoDataOutput ===
AVCaptureVideoDataOutput *videoDataOutPut = [[AVCaptureVideoDataOutput alloc] init];
// Deliver frames as 32BGRA pixel buffers.
NSDictionary *videoOutputSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[videoDataOutPut setVideoSettings:videoOutputSettings];
dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutPut setSampleBufferDelegate:self queue:videoCaptureQueue];
if ([_session canAddOutput:videoDataOutPut]) {
    [_session addOutput:videoDataOutPut];
    AVCaptureConnection *videoConnection = [videoDataOutPut connectionWithMediaType:AVMediaTypeVideo];
    if ([videoConnection isVideoStabilizationSupported]) {
        videoConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeStandard;
    }
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    if ([self isFrontCamera]) { // mirror the front camera
        [videoConnection setVideoMirrored:YES];
    } else {
        [videoConnection setVideoMirrored:NO];
    }
    [self setVideoConnection:videoConnection];
    [self setVideoDataOutPut:videoDataOutPut];
}
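The sample-buffer delegate used later also covers audio, so an AVCaptureAudioDataOutput can be configured in the same style. A minimal sketch (not from the original code):
// === AudioDataOutput ===
AVCaptureAudioDataOutput *audioDataOutPut = [[AVCaptureAudioDataOutput alloc] init];
dispatch_queue_t audioCaptureQueue = dispatch_queue_create("Audio Capture Queue", DISPATCH_QUEUE_SERIAL);
[audioDataOutPut setSampleBufferDelegate:self queue:audioCaptureQueue];
if ([_session canAddOutput:audioDataOutPut]) {
    [_session addOutput:audioDataOutPut];
}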
// Configuring the photo output
if (@available(iOS 10.0, *)) {
    AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
    photoOutput.highResolutionCaptureEnabled = YES;
    if ([self.session canAddOutput:photoOutput]) {
        [self.session addOutput:photoOutput];
        self.imageOutput = photoOutput;
    }
} else {
    // Still image output (pre-iOS 10); request 32BGRA to match the format requested
    // from AVCapturePhotoOutput below (AVVideoCodecKey and kCVPixelBufferPixelFormatTypeKey
    // are mutually exclusive in outputSettings).
    AVCaptureStillImageOutput *imageOutput = [[AVCaptureStillImageOutput alloc] init];
    imageOutput.outputSettings = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]};
    if ([_session canAddOutput:imageOutput]) {
        [_session addOutput:imageOutput];
        self.imageOutput = imageOutput;
    }
}
Receiving the output
1) Video
// AVCaptureVideoDataOutputSampleBufferDelegate & AVCaptureAudioDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
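A sketch of a typical implementation of this callback (not from the original post): it pulls the pixel buffer out of a video sample buffer for further processing; for audio sample buffers the image buffer is NULL.
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer) {
        // A video frame in 32BGRA; this runs on videoCaptureQueue, so keep the work light.
    }
}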
2) Photo output
- (void)takePhoto:(AVCaptureVideoOrientation)videoOrientation
          success:(void (^)(CMSampleBufferRef sampleBufferRef))success
          failure:(void (^)(NSError *error))failure
{
    id takePhotoBlock = ^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (error) {
            if (failure) failure(error);
        } else {
            if (success) success(sampleBuffer);
        }
    };
    if (@available(iOS 10.0, *)) {
        AVCapturePhotoOutput *photoOutput = (AVCapturePhotoOutput *)self.imageOutput;
        AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettingsWithFormat:@{(NSString *)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]}];
        photoSettings.highResolutionPhotoEnabled = YES;
        if (photoSettings.availablePreviewPhotoPixelFormatTypes.count > 0) {
            photoSettings.previewPhotoFormat = @{(NSString *)kCVPixelBufferPixelFormatTypeKey : photoSettings.availablePreviewPhotoPixelFormatTypes.firstObject};
        }
        AVCaptureConnection *connection = [(AVCapturePhotoOutput *)self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
        if (connection.isVideoOrientationSupported) {
            connection.videoOrientation = videoOrientation;
        }
        // Store the completion block before triggering the capture; the result is
        // delivered asynchronously via AVCapturePhotoCaptureDelegate.
        self.takePhotoActionBlock = takePhotoBlock;
        [photoOutput capturePhotoWithSettings:photoSettings delegate:self];
    } else {
        AVCaptureConnection *connection = [(AVCaptureStillImageOutput *)self.imageOutput connectionWithMediaType:AVMediaTypeVideo];
        if (connection.isVideoOrientationSupported) {
            connection.videoOrientation = videoOrientation;
        }
        [(AVCaptureStillImageOutput *)self.imageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                                                   completionHandler:takePhotoBlock];
    }
}
#pragma mark - AVCapturePhotoCaptureDelegate
// Retrieve the captured photo data
// NS_DEPRECATED_IOS(10_0, 11_0, "Use -captureOutput:didFinishProcessingPhoto:error: instead.");
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(nullable CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(nullable CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(nullable AVCaptureBracketedStillImageSettings *)bracketSettings error:(nullable NSError *)error
API_AVAILABLE(ios(10.0))
{
    if (self.takePhotoActionBlock) {
        self.takePhotoActionBlock(photoSampleBuffer, error);
    }
}
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(nullable NSError *)error
API_AVAILABLE(ios(11.0))
{
    // Wrap the photo's pixel buffer back into a CMSampleBufferRef via a project utility.
    CMSampleBufferRef sampleBuffer = [XYCMSampleBufferUtils sampleBufferFromCGImage:photo.pixelBuffer];
    if (self.takePhotoActionBlock) {
        self.takePhotoActionBlock(sampleBuffer, error);
    }
}