I've recently been working on a custom camera demo. The requirement is that the camera points at a spot and, once autofocus succeeds, automatically captures a photo.
- (void)initAVCaptureSession {
    self.session = [[AVCaptureSession alloc] init];
    NSError *error;
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // Observe the focus state so we can capture once autofocus has finished.
    [self.device addObserver:self
                  forKeyPath:@"adjustingFocus"
                     options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                     context:nil];
    // The device must be locked before changing its configuration and unlocked
    // afterwards, otherwise the app crashes.
    [self.device lockForConfiguration:nil];
    // The flash mode could be configured here, e.g.:
    // [self.device setFlashMode:AVCaptureFlashModeAuto];
    [self.device unlockForConfiguration];

    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:self.device error:&error];
    if (error) {
        NSLog(@"%@", error);
    }

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    // Output settings: AVVideoCodecJPEG produces JPEG images.
    NSDictionary *outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG};
    [self.stillImageOutput setOutputSettings:outputSettings];

    if ([self.session canAddInput:self.videoInput]) {
        [self.session addInput:self.videoInput];
    }
    if ([self.session canAddOutput:self.stillImageOutput]) {
        [self.session addOutput:self.stillImageOutput];
    }
    self.session.sessionPreset = AVCaptureSessionPresetHigh;

    // Set up the preview layer. Layer and view work belongs on the main thread.
    WEAKSELF
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    dispatch_async(dispatch_get_main_queue(), ^{
        weakSelf.previewLayer.frame = CGRectMake(0, 0, kScreenWidth, kScreenHeight);
        weakSelf.cameraView.layer.masksToBounds = YES;
        [weakSelf.cameraView.layer addSublayer:weakSelf.previewLayer];
    });
}
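One thing the snippet above does not show is starting the session; nothing reaches the preview layer until -startRunning is called. I would expect something along these lines (a minimal sketch; driving it from the view lifecycle is an assumption, not code from the demo):

// Minimal sketch, assuming the session follows the view controller's lifecycle.
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    if (![self.session isRunning]) {
        [self.session startRunning];   // frames start flowing into the preview layer
    }
}

- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    if ([self.session isRunning]) {
        [self.session stopRunning];    // release the camera when the screen goes away
    }
}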
The autofocus detection itself is implemented with KVO; -initAVCaptureSession registers an observer:
[self.device addObserver:self
              forKeyPath:@"adjustingFocus"
                 options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                 context:nil];
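Worth noting: adding the observer only reports focus activity, it does not by itself make the device focus. In continuous-autofocus mode the device refocuses when the scene changes; otherwise a focus run has to be requested explicitly. A minimal sketch of how that could look (triggerFocusAtCenter is a hypothetical helper, not part of the demo):

// Hypothetical helper: explicitly request a one-shot focus run at the center of
// the frame, which makes "adjustingFocus" flip to YES and then back to NO.
- (void)triggerFocusAtCenter {
    NSError *error = nil;
    if ([self.device lockForConfiguration:&error]) {
        if ([self.device isFocusPointOfInterestSupported]) {
            self.device.focusPointOfInterest = CGPointMake(0.5, 0.5);
        }
        if ([self.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            self.device.focusMode = AVCaptureFocusModeAutoFocus; // one-shot focus run
        }
        [self.device unlockForConfiguration];
    } else {
        NSLog(@"Could not lock device for configuration: %@", error);
    }
}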
Here is the KVO callback for autofocus:
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"adjustingFocus"]) {
        BOOL adjustingFocus = [change[NSKeyValueChangeNewKey] boolValue];
        // adjustingFocus flips back to NO once the focus run has finished.
        if (!adjustingFocus) {
            [timer setFireDate:[NSDate distantFuture]]; // pause the timer
            [self.device removeObserver:self forKeyPath:@"adjustingFocus"];
            AVCaptureConnection *stillImageConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
            UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
            AVCaptureVideoOrientation avcaptureOrientation = [self avOrientationForDeviceOrientation:curDeviceOrientation];
            [stillImageConnection setVideoOrientation:avcaptureOrientation];
            [stillImageConnection setVideoScaleAndCropFactor:self.effectiveScale];
            WEAKSELF
            [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                // The completion handler is not guaranteed to run on the main thread;
                // the UI updates below have to happen there.
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        return;
                    }
                    if (imageDataSampleBuffer == NULL) {
                        // No image data was returned.
                        GPLog(@"The captured photo contains no data!");
                        return;
                    }
                    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                    ALAuthorizationStatus author = [ALAssetsLibrary authorizationStatus];
                    if (author == ALAuthorizationStatusRestricted || author == ALAuthorizationStatusDenied) {
                        // No photo-library permission.
                        [weakSelf showHUDText:@"No permission!"];
                        return;
                    }
                    UIImage *key_image = [UIImage imageWithData:jpegData];
                    if (!weakSelf.isPacket) {
                        // Hiding a packet: show the preview plus the confirm/cancel buttons.
                        weakSelf.cameraImg.hidden = NO;
                        weakSelf.cameraImg.image = [UIImage scaleImage:key_image WithSize:CGSizeMake(480, 480)];
                        weakSelf.cameraObj = key_image;
                        weakSelf.packetCannelBtn.hidden = NO;
                        weakSelf.packetHereBtn.hidden = NO;
                    } else {
                        // Finding a packet: compare the captured photo with the one stored on the server.
                        NSURL *key_file = [NSURL URLWithString:weakSelf.storeModel.key_file_url];
                        [[SDWebImageManager sharedManager] loadImageWithURL:key_file options:0 progress:nil completed:^(UIImage * _Nullable image, NSData * _Nullable data, NSError * _Nullable error, SDImageCacheType cacheType, BOOL finished, NSURL * _Nullable imageURL) {
                            double key_image_double = [GetSimilarity getSimilarityValueWithImgA:image ImgB:key_image];
                            if (key_image_double >= 0.75) {
                                [weakSelf performSegueWithIdentifier:@"PacketDetailVC" sender:weakSelf.storeModel];
                            } else {
                                // Not similar enough: start observing focus again for another attempt.
                                [weakSelf.device addObserver:weakSelf forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
                            }
                        }];
                    }
                });
            }];
        }
    }
}
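The -avOrientationForDeviceOrientation: helper called above is not shown in the demo; a typical mapping would look roughly like this (a sketch, not necessarily the original implementation). Note that the landscape cases are mirrored between UIDeviceOrientation and AVCaptureVideoOrientation:

// Sketch of the orientation mapping used when configuring the still-image connection.
- (AVCaptureVideoOrientation)avOrientationForDeviceOrientation:(UIDeviceOrientation)deviceOrientation {
    switch (deviceOrientation) {
        case UIDeviceOrientationPortraitUpsideDown:
            return AVCaptureVideoOrientationPortraitUpsideDown;
        case UIDeviceOrientationLandscapeLeft:
            return AVCaptureVideoOrientationLandscapeRight;
        case UIDeviceOrientationLandscapeRight:
            return AVCaptureVideoOrientationLandscapeLeft;
        default:
            return AVCaptureVideoOrientationPortrait;
    }
}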
First, a quick word about what the demo is supposed to do. It's essentially a clone of Alipay's AR red packet feature (which can no longer be found in current versions of Alipay). The flow is: the camera autofocuses and captures a photo, the user hides a red packet, image, or video at that spot, and the result is uploaded to the server. When nearby users see that there is a hidden red packet, image, or video at their current location, they use the same autofocusing camera to take a photo, which is then compared with the one previously uploaded to the server. If the two images match, the red packet, image, or video is unlocked. That's the whole demo.
Here's the problem: on an iPhone 5s (my own phone) this code works without any issue, but on an iPhone 7 it simply doesn't. On the iPhone 7 the camera reports a successful autofocus very quickly the first time and automatically captures a photo, but that photo is very blurry or even completely black, so a retake is needed. In the code that runs after autofocus succeeds I call:
[self.device removeObserver:self forKeyPath:@"adjustingFocus"];
The method below handles tapping the retake button and re-adds the autofocus KVO observer:
- (IBAction)actionForPacketCannel:(id)sender {
    [self addTimer];
    self.cameraImg.hidden = YES;
    self.packetHereBtn.hidden = YES;
    self.packetCannelBtn.hidden = YES;
    // Start observing the focus state again for the retake.
    [self.device addObserver:self
                  forKeyPath:@"adjustingFocus"
                     options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                     context:nil];
}
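Since the observer is removed after each capture and re-added on retake, the add/remove calls have to stay balanced; removing an observer that isn't registered (or removing it twice) raises an NSException. One way to guard against that (a sketch, assuming a hypothetical isObservingFocus BOOL property that is not in the original code):

// Sketch: keep KVO registration balanced with a hypothetical isObservingFocus flag.
- (void)startObservingFocus {
    if (!self.isObservingFocus) {
        [self.device addObserver:self
                      forKeyPath:@"adjustingFocus"
                         options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionOld
                         context:nil];
        self.isObservingFocus = YES;
    }
}

- (void)stopObservingFocus {
    if (self.isObservingFocus) {
        [self.device removeObserver:self forKeyPath:@"adjustingFocus"];
        self.isObservingFocus = NO;
    }
}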
The current problem is this: on the iPhone 7, shortly after the view controller appears, autofocus reports success and a photo is captured, but compared with the iPhone 5s the result is far worse; it hardly looks like a focused image at all, and sometimes it's even completely black. Since that's not the photo I want, I tap the retake button, re-add the KVO observer, and the camera starts focusing again, but now it never reports success no matter what. Only if I point the camera at the floor, or bring it within a few centimeters of an object, does focusing succeed, and even then the resulting photo is very blurry. Why? Why do the iPhone 5s and iPhone 7 behave so differently? Is there something different about the iPhone 7's camera? Any pointers would be much appreciated, thanks!