Imitating WeChat's Photo and Short-Video Capture on iOS

I had some free time recently and, after playing with WeChat's photo and short-video feature, decided to try building a clone of it myself. With luck, the process might also suggest optimizations and new features later on, but that is for another day. For now, here is how it looks:

Normal interface:

WechatIMG4.png

Recording a video:

WechatIMG5.png

Photo taken / video recorded successfully:

WechatIMG2.png

Now let's look at the core code.

1. Invocation

GJVideoViewController *ctrl = [[NSBundle mainBundle] loadNibNamed:@"GJVideoViewController" owner:nil options:nil].lastObject;
    ctrl.GJSeconds = 30; // maximum recording duration
    // The block callback hands back either an image or a video URL for further processing
    ctrl.takeBlock = ^(id item) {
        if ([item isKindOfClass:[NSURL class]]) {
            // video URL
            NSURL *videoURL = item;
            NSLog(@"video: %@", videoURL);
        } else {
            // image
            NSLog(@"image: %@", item);
        }
    };
    [self presentViewController:ctrl animated:YES completion:nil];
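Inside the callback you can, for example, persist the result to the photo library. This is a sketch, not part of the original demo; it assumes the image branch receives a `UIImage` and that the app has photo-library permission:

```objc
// Sketch: saving the callback result to the photo library
ctrl.takeBlock = ^(id item) {
    if ([item isKindOfClass:[NSURL class]]) {
        NSURL *videoURL = item;
        // Save the recorded movie if the album supports the format
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(videoURL.path)) {
            UISaveVideoAtPathToSavedPhotosAlbum(videoURL.path, nil, NULL, NULL);
        }
    } else if ([item isKindOfClass:[UIImage class]]) {
        // Save the still image
        UIImageWriteToSavedPhotosAlbum(item, nil, NULL, NULL);
    }
};
```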

The idea behind the core code:

/**
 *
 * 1. The core is a custom camera:
 *      1> Create the capture session - AVCaptureSession
 *      2> Configure the session, e.g. the resolution preset
 *      3> Get the hardware device (the camera) - AVCaptureDevice
 *      4> Since we also record video, get an audio device too - audioCaptureDevice
 *      5> Wrap the devices in inputs - AVCaptureDeviceInput
 *      6> Create the output - AVCaptureMovieFileOutput
 *         (stabilization, the preview layer, etc. can be configured here)
 *      7> Add the inputs and the output to the session
 *    PS: to support focusing while recording, register notifications on the
 *        device so we can react when the captured scene changes.
 */

The custom camera:

- (void)customCamera {

    // Create the session that ties inputs and outputs together
    self.session = [[AVCaptureSession alloc] init];
    // Use the highest resolution preset the device supports
    if ([self.session canSetSessionPreset:AVCaptureSessionPresetHigh]) {
        self.session.sessionPreset = AVCaptureSessionPresetHigh;
    }
    // Get the back camera
    AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];

    // Get an audio input device
    AVCaptureDevice *audioCaptureDevice;
    if (@available(iOS 10.0, *)) {
        // The newer discovery-session API
        AVCaptureDeviceDiscoverySession *audioCaptureDeviceSession = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInMicrophone] mediaType:AVMediaTypeAudio position:AVCaptureDevicePositionUnspecified];
        audioCaptureDevice = audioCaptureDeviceSession.devices.firstObject;
    } else {
        // The deprecated pre-iOS 10 lookup
        audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
    }

    // Wrap the camera in an input
    NSError *error = nil;
    self.captureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Failed to create the video input: %@", error.localizedDescription);
        return;
    }

    // Wrap the microphone in an input
    error = nil;
    AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
    if (error) {
        NSLog(@"Failed to create the audio input: %@", error.localizedDescription);
        return;
    }

    // The movie file output
    self.captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    // Add the inputs to the session
    if ([self.session canAddInput:self.captureDeviceInput]) {
        [self.session addInput:self.captureDeviceInput];
    }
    if ([self.session canAddInput:audioCaptureDeviceInput]) {
        [self.session addInput:audioCaptureDeviceInput];
    }

    // Add the output to the session
    if ([self.session canAddOutput:self.captureMovieFileOutput]) {
        [self.session addOutput:self.captureMovieFileOutput];
    }

    // Enable video stabilization; note the connection only exists once the
    // output has been added to the session
    AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported]) {
        connection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeCinematic;
    }

    // Preview layer that shows the live camera feed
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    self.previewLayer.frame = self.view.bounds;
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
    [self.bgView.layer addSublayer:self.previewLayer];

    // Observe the device so we can react when the captured scene moves
    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
}
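The `getCameraDeviceWithPosition:` helper called at the top of `customCamera` is not shown in the post. A sketch that mirrors how the audio device is looked up above, covering both the pre- and post-iOS 10 APIs, might look like this:

```objc
// Sketch: look up the camera for a given position (front/back)
- (AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position {
    if (@available(iOS 10.0, *)) {
        AVCaptureDeviceDiscoverySession *discovery =
            [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                   mediaType:AVMediaTypeVideo
                                                                    position:position];
        return discovery.devices.firstObject;
    }
    // Deprecated pre-iOS 10 path
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == position) {
            return device;
        }
    }
    return nil;
}
```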

/**
 *  Register notifications on the capture device
 */
- (void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice {
    // Subject-area-change monitoring must be enabled before the notification can fire
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled = YES;
    }];
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Fired when the subject area changes
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

// When the subject area changes, or the device disconnects or reconnects, the matching handler can respond accordingly.
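Neither `changeDeviceProperty:` nor `areaChange:` is shown in the post. A minimal sketch, assuming the helper takes the configuration lock AVFoundation requires and the handler simply re-centers continuous autofocus:

```objc
// Sketch: lock the device, apply the change, unlock
- (void)changeDeviceProperty:(void (^)(AVCaptureDevice *captureDevice))propertyChange {
    AVCaptureDevice *captureDevice = self.captureDeviceInput.device;
    NSError *error = nil;
    // Configuration changes require an exclusive lock on the device
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Failed to lock the device: %@", error.localizedDescription);
    }
}

// Hypothetical handler: refocus at the center when the subject area changes
- (void)areaChange:(NSNotification *)notification {
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusPointOfInterestSupported]) {
            captureDevice.focusPointOfInterest = CGPointMake(0.5, 0.5);
        }
        if ([captureDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            captureDevice.focusMode = AVCaptureFocusModeContinuousAutoFocus;
        }
    }];
}
```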

Video recording:

 //Get the video connection from the movie output
        AVCaptureConnection *connection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        if (![self.captureMovieFileOutput isRecording]) {
            //Begin a background task if multitasking is supported, so the recording can finish if the app is backgrounded
            if ([[UIDevice currentDevice] isMultitaskingSupported]) {
                self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
            }
            //Remove any previous recording
            if (self.saveVideoUrl) {
                [[NSFileManager defaultManager] removeItemAtURL:self.saveVideoUrl error:nil];
            }
            //Keep the recording orientation in sync with the preview layer
            connection.videoOrientation = [self.previewLayer connection].videoOrientation;
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
            NSLog(@"save path is: %@", outputFilePath);
            NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];
            //Start recording
            [self.captureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
        } else {
            [self.captureMovieFileOutput stopRecording];
        }

PS: the AVCaptureFileOutput delegate methods are where the recording duration, the storage location, and any post-processing of the video are handled.

#pragma mark - Movie output delegate
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
    NSLog(@"Recording started...");
    self.seconds = self.GJSeconds;
    [self performSelector:@selector(onStartTranscribe:) withObject:fileURL afterDelay:1.0];
}
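The `onStartTranscribe:` countdown scheduled above is not shown in the post. A sketch, assuming it decrements `self.seconds` once per second and stops the recording when the limit is reached:

```objc
// Hypothetical countdown: fires one second after recording starts,
// then re-schedules itself every second until the time limit is hit
- (void)onStartTranscribe:(NSURL *)fileURL {
    if (![self.captureMovieFileOutput isRecording]) {
        return; // the recording was already stopped manually
    }
    self.seconds--;
    if (self.seconds <= 0) {
        // Time limit reached; the finish delegate will receive the file URL
        [self.captureMovieFileOutput stopRecording];
    } else {
        [self performSelector:@selector(onStartTranscribe:) withObject:fileURL afterDelay:1.0];
    }
}
```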

#pragma mark - Recording finished
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
    NSLog(@"Recording finished.");
    [self changeLayout];
    if (self.isVideo) {
        self.saveVideoUrl = outputFileURL;
        if (!self.player) {
            self.player = [[GJAVPlayer alloc] initWithFrame:self.bgView.bounds withShowInView:self.bgView url:outputFileURL];
        } else if (outputFileURL) {
            self.player.videoUrl = outputFileURL;
            self.player.hidden = NO;
        }
    } else {
        // A photo: turn the clip into a still image
        self.saveVideoUrl = nil;
        [self videoHandlePhoto:outputFileURL];
    }
}
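`videoHandlePhoto:` is not shown in the post either. Since a tap produces a very short recording, one plausible implementation (a sketch under that assumption, not necessarily what the demo does) grabs the first frame with AVAssetImageGenerator and hands it back through `takeBlock`:

```objc
// Sketch: extract the first frame of the short clip as the "photo"
- (void)videoHandlePhoto:(NSURL *)url {
    AVURLAsset *asset = [AVURLAsset assetWithURL:url];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect the video orientation
    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:kCMTimeZero actualTime:NULL error:&error];
    if (cgImage) {
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        if (self.takeBlock) {
            self.takeBlock(image); // deliver the image through the callback
        }
    } else {
        NSLog(@"Failed to extract a frame: %@", error.localizedDescription);
    }
}
```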

The rest of the work is much simpler; if you are interested, take a look at the demo!

If you found this useful, please leave a like!
