To take a photo on iOS we usually present the system camera through UIImagePickerController, which is very convenient. Sometimes, however, UIImagePickerController cannot meet our needs, for example when we need a more complex overlay view, and then we have to build a camera control of our own.
0.AVCapture <AVFoundation/AVFoundation.h>
The objects needed for media capture:
1. AVCaptureDevice: an abstraction of a hardware device (such as the front camera or the back camera).
2. AVCaptureInput: represents an input (or one of its subclasses); it configures the ports of the abstract hardware device.
3. AVCaptureOutput: represents the output data and manages writing it to a movie file or to an image.
4. AVCaptureSession: the bridge between inputs and outputs; it coordinates the transfer of data from the inputs to the outputs.
How they relate:
There can be many device inputs and many kinds of output, and the data flow between all of them is controlled by a single capture session. In other words, an AVCaptureDevice is wrapped by an AVCaptureInput, and the session moves its data into the AVCaptureOutput objects. That is how you get from a device to a persisted result, for example capturing an image from the camera into a UIImage.
A question then arises: a video input should feed the video output and an audio input should feed the audio output, so matching connections are needed to wire each input to its output. These connection objects are owned by the AVCaptureSession and are instances of AVCaptureConnection.
An AVCaptureConnection represents one such input-to-output data path. An AVCaptureInput (or subclass) exposes one or more input ports, and through those ports an AVCaptureOutput receives its data.
An AVCaptureConnection therefore controls the data transfer from an input to an output.
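As a quick sketch of that relationship (assuming a hypothetical stillImageOutput that has already been added to a running session), you ask an output for the connection carrying a given media type and can then inspect or configure it:
// Ask the output for the connection that carries video data.
AVCaptureConnection *videoConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if (videoConnection.isActive && videoConnection.isVideoOrientationSupported) {
    // For example, force the video flowing over this connection into portrait orientation.
    videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
}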
1.The Session and its usage pattern
You use an AVCaptureSession instance to coordinate the flow of data from AV input devices to outputs. You add the capture devices and outputs you want to the session, then start the data flow by sending the session a startRunning message, and stop it by sending a stopRunning message.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session startRunning];
In other words, you create a session and send it startRunning; it then runs on its own, moving data from the input devices to the outputs.
If you want to modify a session that is already running (one that has received startRunning), for example to swap in a new device or remove an old one, wrap the changes as follows:
AVCaptureSession *session; // an already-running capture session
[session beginConfiguration];
// Remove an existing capture device.
// Add a new capture device.
// Reset the preset.
[session commitConfiguration];
Of course, if something goes wrong while the session is running, you can observe the relevant notifications (see the Notifications section of the AVCaptureSession Class Reference). The session posts them when a problem occurs, and that is your chance to handle it.
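A minimal sketch of such an observer, assuming session is the AVCaptureSession in question:
// Observe runtime errors posted by the capture session.
[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification
                                                  object:session
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    // The failing NSError is delivered in the userInfo dictionary.
    NSError *error = note.userInfo[AVCaptureSessionErrorKey];
    NSLog(@"capture session runtime error: %@", error);
}];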
2.AVCaptureDevice, mainly used to query the properties of the iPhone's camera hardware.
An input device is an abstraction of the hardware, one to one: each AVCaptureDevice object corresponds to one physical device.
We can therefore use the AVCaptureDevice class methods devices or devicesWithMediaType: to get the full or a filtered list of devices. (We can also check whether a device is currently usable; note that devices can be claimed by other clients, so availability matters.)
You can ask a camera device whether it supports a particular property, for example a focus mode:
[currentDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus];
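Putting those two points together, a small sketch that lists every video-capable device and checks one capability on each:
NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in videoDevices) {
    // localizedName is a human-readable name such as "Back Camera".
    NSLog(@"found %@ at position %ld", device.localizedName, (long)device.position);
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
        NSLog(@"%@ supports continuous autofocus", device.localizedName);
    }
}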
Front and back camera positions:
enum {
    AVCaptureDevicePositionBack = 1,
    AVCaptureDevicePositionFront = 2
};
typedef NSInteger AVCaptureDevicePosition;
Flash mode:
enum {
    AVCaptureFlashModeOff = 0,
    AVCaptureFlashModeOn = 1,
    AVCaptureFlashModeAuto = 2
};
typedef NSInteger AVCaptureFlashMode;
Torch mode:
enum {
    AVCaptureTorchModeOff = 0,
    AVCaptureTorchModeOn = 1,
    AVCaptureTorchModeAuto = 2
};
typedef NSInteger AVCaptureTorchMode;
Focus mode:
enum {
    AVCaptureFocusModeLocked = 0,
    AVCaptureFocusModeAutoFocus = 1,
    AVCaptureFocusModeContinuousAutoFocus = 2
};
typedef NSInteger AVCaptureFocusMode;
Exposure mode:
enum {
    AVCaptureExposureModeLocked = 0,
    AVCaptureExposureModeAutoExpose = 1,
    AVCaptureExposureModeContinuousAutoExposure = 2
};
typedef NSInteger AVCaptureExposureMode;
White balance mode:
enum {
    AVCaptureWhiteBalanceModeLocked = 0,
    AVCaptureWhiteBalanceModeAutoWhiteBalance = 1,
    AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance = 2
};
typedef NSInteger AVCaptureWhiteBalanceMode;
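Before changing any of these properties you must lock the device for configuration and unlock it afterwards. A minimal sketch, assuming device is the AVCaptureDevice you want to reconfigure:
NSError *error = nil;
if ([device isFlashModeSupported:AVCaptureFlashModeAuto] && [device lockForConfiguration:&error]) {
    // Changes made between lockForConfiguration: and unlockForConfiguration take effect together.
    device.flashMode = AVCaptureFlashModeAuto;
    [device unlockForConfiguration];
} else if (error) {
    NSLog(@"could not lock device for configuration: %@", error);
}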
3.Building a CaptureInput and adding it to the Session
Create and configure the input device:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
// The pattern for adding an input to a session: check whether it can be added, and only add it if it can.
AVCaptureSession *captureSession = <#Get a capture session#>;
if ([captureSession canAddInput:input]) {
[captureSession addInput:input];
}
4.The kinds of output and how to use them
On iOS the outputs are MovieFile, VideoData, AudioData and StillImage. They are used in much the same way and differ only in what they cover; all of them inherit from AVCaptureOutput.
The first writes a movie file, the second is for frame-by-frame processing, the third is for audio capture, and the fourth handles still images (taking photos).
All of them are added with the session's addOutput: method.
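As an illustration beyond the still-image case used in this article, a minimal sketch that attaches a frame-by-frame AVCaptureVideoDataOutput (captureSession and the queue label are assumptions, and self must adopt AVCaptureVideoDataOutputSampleBufferDelegate):
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
// Ask for BGRA pixel buffers, a common choice for processing frames on the CPU or with Core Image.
videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
// Frames are delivered to the delegate on this serial queue.
dispatch_queue_t frameQueue = dispatch_queue_create("camera.frame.queue", DISPATCH_QUEUE_SERIAL);
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
if ([captureSession canAddOutput:videoDataOutput]) {
    [captureSession addOutput:videoDataOutput];
}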
5.AVCaptureStillImageOutput: the still image output object
- AVCaptureVideoPreviewLayer: the preview layer that shows what the camera sees. The complete sample below wires these pieces together:
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
@interface ViewController ()
{
}
// The AVCaptureSession object coordinates the data flow from the input device to the outputs
@property (nonatomic, strong)AVCaptureSession *session;
// The AVCaptureDeviceInput object is the input stream
@property (nonatomic, strong)AVCaptureDeviceInput *videoInput;
// The still image output object
@property (nonatomic, strong)AVCaptureStillImageOutput *stillImageOutput;
// The preview layer that shows what the camera sees
@property (nonatomic, strong)AVCaptureVideoPreviewLayer *previewLayer;
// Button for switching between the front and back cameras
@property (nonatomic, strong)UIButton *toggleButton;
// Shutter (take photo) button
@property (nonatomic, strong)UIButton *shutterButton;
// The view that hosts the preview layer
@property (nonatomic, strong)UIView *cameraShowView;
// Shows the photo that was captured
@property (nonatomic, strong)UIImageView *imageShowView;
@end
@implementation ViewController
- (id)init {
self = [super init];
if (self) {
[self initialSession];
[self initCameraShowView];
[self initImageShowView];
[self initButton];
}
return self;
}
- (void)viewDidLoad {
[super viewDidLoad];
// Do any additional setup after loading the view, typically from a nib.
}
- (void)viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
[self setUpCameraLayer];
}
- (void)viewDidAppear:(BOOL)animated {
[super viewDidAppear:animated];
if (self.session) {
[self.session startRunning];
}
}
- (void)viewDidDisappear:(BOOL)animated {
[super viewDidDisappear:animated];
if (self.session) {
[self.session stopRunning];
}
}
- (void)initialSession {
self.session = [[AVCaptureSession alloc] init];
self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:nil];
self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
// Output settings for the still image output; AVVideoCodecJPEG asks for the photo as JPEG data
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey, nil];
[self.stillImageOutput setOutputSettings:outputSettings];
if ([self.session canAddInput:self.videoInput]) {
[self.session addInput:self.videoInput];
}
if ([self.session canAddOutput:self.stillImageOutput]) {
[self.session addOutput:self.stillImageOutput];
}
}
- (void)initCameraShowView {
self.cameraShowView = [[UIView alloc] initWithFrame:self.view.frame];
[self.view addSubview:self.cameraShowView];
}
- (void)initImageShowView {
self.imageShowView = [[UIImageView alloc] initWithFrame:CGRectMake(0, self.view.frame.size.height -200, 200, 200)];
self.imageShowView.contentMode = UIViewContentModeScaleToFill;
self.imageShowView.backgroundColor = [UIColor whiteColor];
[self.view addSubview:self.imageShowView];
}
- (void)initButton {
self.shutterButton = [UIButton buttonWithType:UIButtonTypeSystem];
self.shutterButton.frame = CGRectMake(10, 30, 60, 30);
self.shutterButton.backgroundColor = [UIColor cyanColor];
[self.shutterButton setTitle:@"拍照" forState:UIControlStateNormal];
[self.shutterButton addTarget:self action:@selector(shutterCamera)forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:self.shutterButton];
self.toggleButton = [UIButton buttonWithType:UIButtonTypeSystem];
self.toggleButton.frame = CGRectMake(80, 30, 60, 30);
self.toggleButton.backgroundColor = [UIColor cyanColor];
[self.toggleButton setTitle:@"切换摄像头" forState:UIControlStateNormal];
[self.toggleButton addTarget:self action:@selector(toggleCamera)forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:self.toggleButton];
}
// Returns the camera device at the given position (front or back)
- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices) {
if (device.position == position) {
return device;
}
}
return nil;
}
- (AVCaptureDevice *)frontCamera {
return [self cameraWithPosition:AVCaptureDevicePositionFront];
}
- (AVCaptureDevice *)backCamera {
return [self cameraWithPosition:AVCaptureDevicePositionBack];
}
- (void)setUpCameraLayer {
if (self.previewLayer == nil) {
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
UIView * view = self.cameraShowView;
CALayer * viewLayer = [view layer];
// UIView's clipsToBounds and CALayer's masksToBounds mean the same thing: they decide whether content outside the parent's bounds is shown. YES clips subviews/sublayers that extend beyond the parent; NO leaves them unclipped.
[viewLayer setMasksToBounds:YES];
CGRect bounds = [view bounds];
[self.previewLayer setFrame:bounds];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
[viewLayer addSublayer:self.previewLayer];
}
}
// Action for the shutter button
- (void)shutterCamera {
AVCaptureConnection *videoConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if (!videoConnection) {
return;
}
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
if (imageDataSampleBuffer == NULL) {
return;
}
NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
UIImage *image = [UIImage imageWithData:imageData];
NSLog(@"image size = %@", NSStringFromCGSize(image.size));
self.imageShowView.image = image;
}];
}
// Action for the camera toggle button
- (void)toggleCamera {
NSUInteger cameraCount = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] count];
if (cameraCount > 1) {
NSError *error;
AVCaptureDeviceInput *newVideoInput;
AVCaptureDevicePosition position = [[_videoInput device] position];
if (position == AVCaptureDevicePositionBack) {
newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self frontCamera] error:&error];
} else if (position == AVCaptureDevicePositionFront) {
newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backCamera] error:&error];
} else {
return;
}
if (newVideoInput != nil) {
[self.session beginConfiguration];
[self.session removeInput:self.videoInput];
if ([self.session canAddInput:newVideoInput]) {
[self.session addInput:newVideoInput];
self.videoInput = newVideoInput;
} else {
[self.session addInput:self.videoInput];
}
[self.session commitConfiguration];
} else if (error) {
NSLog(@"toggle carema failed, error = %@", error);
}
}
}
- (void)didReceiveMemoryWarning {
[super didReceiveMemoryWarning];
// Dispose of any resources that can be recreated.
}
@end