iOS: Simple AVFoundation Video Processing

I've been buried in overtime lately TAT, so the detailed walkthrough will have to wait — here's the working code for now 😂; I'll flesh out the explanations when I have time.


Our company is building a live-streaming product: voice chat with guest speakers over PPT courseware, where the slides contain both images and videos. We record the host's operations on the slides along with the audio from the host and the guests, and once the stream ends we compose the audio and the courseware into a single video. The audio/video work breaks down as follows:

Requirements:

  1. Convert a static image to video
  2. Convert an image to video with a watermark
  3. Trim a video
  4. Change a video's resolution
  5. Concatenate videos
  6. Merge audio and video tracks

The implementations below are self-contained — copy, paste, and go.

Converting a Static Image to Video (with Watermark)

+ (void)writeImageAsMovie:(UIImage *)image
                     watermark:(UIImage *)watermark
                        toPath:(NSString*)path
                          size:(CGSize)size
                      duration:(double)duration
                           fps:(int)fps
             withCallbackBlock:(void(^)(BOOL success))callbackBlock
{
    [[NSFileManager defaultManager] removeItemAtPath:path error:NULL];
    
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
    if (error) {
        if (callbackBlock) {
            callbackBlock(NO);
        }
        return;
    }
    NSParameterAssert(videoWriter);
    
    NSDictionary *videoSettings = @{AVVideoCodecKey: AVVideoCodecH264,
                                    AVVideoWidthKey: [NSNumber numberWithInt:size.width],
                                    AVVideoHeightKey: [NSNumber numberWithInt:size.height]};
    
    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];
    
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    [videoWriter addInput:writerInput];
    
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    
    CMTime presentTime = CMTimeMake(0, fps);
    
    while (1)
    {
        if (writerInput.readyForMoreMediaData) {
            // Render the image (and optional watermark) into a +1 pixel buffer.
            CVPixelBufferRef buffer = [LTVideoTools pixelBufferFromCGImage:[image CGImage] watermark:[watermark CGImage] size:size];
            
            // Append the same frame at time 0 and again at the end time, so
            // the clip displays the still image for `duration` seconds.
            BOOL appendSuccess = [LTVideoTools appendToAdapter:adaptor
                                                   pixelBuffer:buffer
                                                        atTime:presentTime
                                                     withInput:writerInput];
            NSAssert(appendSuccess, @"Failed to append");
            
            CMTime endTime = CMTimeMakeWithSeconds(duration, fps);
            BOOL appendSuccess2 = [LTVideoTools appendToAdapter:adaptor
                                                    pixelBuffer:buffer
                                                         atTime:endTime
                                                      withInput:writerInput];
            NSAssert(appendSuccess2, @"Failed to append");
            
            // pixelBufferFromCGImage:... returns a retained buffer;
            // release it here to avoid a leak.
            CVPixelBufferRelease(buffer);
            
            //Finish the session:
            [writerInput markAsFinished];
            
            [videoWriter finishWritingWithCompletionHandler:^{
                if (callbackBlock) {
                    callbackBlock(videoWriter.status == AVAssetWriterStatusCompleted);
                }
            }];
            break;
        }
    }
}
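A quick call-site sketch — the image names, output path, and dimensions here are placeholders; adapt them to your project:

```objc
// Hypothetical call site: turn one slide image into a 5-second 720p clip.
UIImage *slide = [UIImage imageNamed:@"slide01"];
UIImage *logo  = [UIImage imageNamed:@"watermark"];
NSString *out  = [NSTemporaryDirectory() stringByAppendingPathComponent:@"slide01.mp4"];

[LTVideoTools writeImageAsMovie:slide
                      watermark:logo
                         toPath:out
                           size:CGSizeMake(1280, 720)
                       duration:5.0
                            fps:24
              withCallbackBlock:^(BOOL success) {
    NSLog(@"image-to-video %@", success ? @"succeeded" : @"failed");
}];
```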

// Draws `image` centered on an `imageSize` canvas with an optional `watermark`
// on top. Returns a retained (+1) pixel buffer; the caller must release it.
+ (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
                                 watermark:(CGImageRef)watermark
                                      size:(CGSize)imageSize
{
    NSDictionary *options = @{(id)kCVPixelBufferCGImageCompatibilityKey: @YES,
                              (id)kCVPixelBufferCGBitmapContextCompatibilityKey: @YES};
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, imageSize.width,
                                          imageSize.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);
    
    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);
    
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, imageSize.width,
                                                 imageSize.height, 8, 4*imageSize.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    
    CGContextDrawImage(context, CGRectMake(0 + (imageSize.width-CGImageGetWidth(image))/2,
                                           (imageSize.height-CGImageGetHeight(image))/2,
                                           CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    if (watermark) {
        CGContextDrawImage(context, CGRectMake(0 + (imageSize.width-CGImageGetWidth(watermark))/2,
                                               (imageSize.height-CGImageGetHeight(watermark))/2,
                                               CGImageGetWidth(watermark),
                                               CGImageGetHeight(watermark)), watermark);
    }
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);
    
    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    
    return pxbuffer;
}

+ (BOOL)appendToAdapter:(AVAssetWriterInputPixelBufferAdaptor*)adaptor
            pixelBuffer:(CVPixelBufferRef)buffer
                 atTime:(CMTime)presentTime
              withInput:(AVAssetWriterInput*)writerInput
{
    // Spin until the input can accept more data. Acceptable for a two-frame
    // write; prefer requestMediaDataWhenReadyOnQueue:usingBlock: for longer clips.
    while (!writerInput.readyForMoreMediaData) {
        usleep(1);
    }
    
    return [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
}

Trimming a Video

+ (void)trimVideoWithVideoUrlStr:(NSURL *)videoUrl captureVideoWithStartTime:(double)start endTime:(double)end outputPath:(NSURL *)outputURL completion:(void(^)(NSURL *outputURL,NSError *error))completionHandle {
    // Use a 600 timescale so sub-second start/end points aren't truncated
    // to whole seconds.
    CMTime startTime = CMTimeMakeWithSeconds(start, 600);
    CMTime videoDuration = CMTimeMakeWithSeconds(end - start, 600);
    CMTimeRange videoTimeRange = CMTimeRangeMake(startTime, videoDuration);
    
    AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:[AVAsset assetWithURL:videoUrl] presetName:AVAssetExportPresetMediumQuality];
    session.outputURL = outputURL;
    session.outputFileType = AVFileTypeMPEG4;
    session.timeRange = videoTimeRange;
    session.shouldOptimizeForNetworkUse = YES;
    [session exportAsynchronouslyWithCompletionHandler:^{
        if (completionHandle) {
            if (session.error) {
                completionHandle(nil,session.error);
            }else {
                completionHandle(outputURL,nil);
            }
        }
    }];
}
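Call-site sketch (the file paths are placeholders): extract seconds 3–8 of a recorded clip.

```objc
NSURL *src = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"lecture.mp4"]];
NSURL *dst = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"lecture-trimmed.mp4"]];

[LTVideoTools trimVideoWithVideoUrlStr:src
             captureVideoWithStartTime:3.0
                               endTime:8.0
                            outputPath:dst
                            completion:^(NSURL *outputURL, NSError *error) {
    if (error) {
        NSLog(@"trim failed: %@", error);
    } else {
        NSLog(@"trimmed clip at %@", outputURL);
    }
}];
```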

Changing Video Resolution

+ (void)resizeVideoWithAssetURL:(NSURL *)assetURL outputURL:(NSURL *)outputURL preferSize:(CGSize)preferSize doneHandler:(void(^)(NSURL *outputURL,NSError *error))doneHandler {
    AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:assetURL options:nil];
    
    AVAssetTrack *assetVideoTrack = nil;
    AVAssetTrack *assetAudioTrack = nil;
    
    if ([[asset tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        assetVideoTrack = [asset tracksWithMediaType:AVMediaTypeVideo][0];
    }
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        assetAudioTrack = [asset tracksWithMediaType:AVMediaTypeAudio][0];
    }
    
    NSError *error = nil;
    
    AVMutableComposition* mixComposition = [AVMutableComposition composition];
    if (assetVideoTrack) {
        AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetVideoTrack atTime:kCMTimeZero error:&error];
    }
    if (assetAudioTrack) {
        AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetAudioTrack atTime:kCMTimeZero error:&error];
    }
    
    AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
    mutableVideoComposition.renderSize = preferSize;
    mutableVideoComposition.frameDuration = CMTimeMake(1, 24);
    
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    // Build the layer instruction against the composition's video track
    // (the first track added above).
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:(mixComposition.tracks)[0]];
    BOOL isPortrait_ = [LTVideoTools isVideoPortrait:asset];
    CGAffineTransform t = CGAffineTransformIdentity;
    if (isPortrait_) {
        // Portrait footage is stored rotated; rotate it upright, then shift
        // it back into the render frame.
        t = CGAffineTransformRotate(t, M_PI_2);
        t = CGAffineTransformTranslate(t, 0, -preferSize.width);
    }
    // For portrait sources, naturalSize is landscape-oriented, so swap the
    // target dimensions before computing the scale factors.
    preferSize = isPortrait_ ? CGSizeMake(preferSize.height, preferSize.width) : preferSize;
    t = CGAffineTransformScale(t, preferSize.width / assetVideoTrack.naturalSize.width, preferSize.height / assetVideoTrack.naturalSize.height);
    [layerInstruction setTransform:t atTime:kCMTimeZero];
    
    instruction.layerInstructions = @[layerInstruction];
    mutableVideoComposition.instructions = @[instruction];
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputURL.path]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputURL.path error:&error];
    }
    
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    exportSession.videoComposition = mutableVideoComposition;
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.shouldOptimizeForNetworkUse = YES;
    
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            doneHandler(outputURL,nil);
        }else {
            doneHandler(nil,exportSession.error);
        }
    }];
}
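Call-site sketch (paths and target size are placeholders): downscale a clip to 854x480; portrait sources are handled by the transform logic inside the method.

```objc
NSURL *src = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"raw.mp4"]];
NSURL *dst = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"raw-480p.mp4"]];

[LTVideoTools resizeVideoWithAssetURL:src
                            outputURL:dst
                           preferSize:CGSizeMake(854, 480)
                          doneHandler:^(NSURL *outputURL, NSError *error) {
    NSLog(@"resize %@", error ? error : outputURL);
}];
```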

+ (BOOL)isVideoPortrait:(AVAsset *)asset {
    BOOL isPortrait = NO;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
    if ([tracks count] > 0) {
        AVAssetTrack *videoTrack = tracks[0];
        CGAffineTransform t = videoTrack.preferredTransform;
        
        // Portrait
        if (t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            isPortrait = YES;
        }
        // PortraitUpsideDown
        if (t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            isPortrait = YES;
        }
        // LandscapeRight
        if (t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            isPortrait = NO;
        }
        // LandscapeLeft
        if (t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            isPortrait = NO;
        }
    }
    return isPortrait;
}

Merging Audio and Video Tracks

+ (void)mergeVideoAssetAndAudioAssetWithVideoAssetURL:(NSURL *)videoAssetURL audioAssetURL:(NSURL *)audioAssetURL outputURL:(NSURL *)outputURL doneHandler:(void(^)(NSURL *outputURL,NSError *error))doneHandler {
    AVAsset *videoAsset = [AVAsset assetWithURL:videoAssetURL];
    AVAsset *audioAsset = [AVAsset assetWithURL:audioAssetURL];
    
    AVMutableComposition *mainComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoCompositionTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *soundCompositionTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *soundCompositionTrack2 = [mainComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
    // Use firstObject so assets lacking a given track type return nil
    // instead of crashing the objectAtIndex: call.
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    if (videoTrack) {
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];
        [videoCompositionTrack setPreferredTransform:videoTrack.preferredTransform];
    }
    AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (audioTrack) {
        [soundCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:audioTrack atTime:kCMTimeZero error:nil];
    }
    
    AVAssetTrack *audioTrack2 = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (audioTrack2) {
        // Clamp the external audio to the shorter of the two durations so the
        // insert can't run past the end of the audio asset.
        CMTime audioDuration = CMTimeMinimum(audioAsset.duration, videoAsset.duration);
        [soundCompositionTrack2 insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration) ofTrack:audioTrack2 atTime:kCMTimeZero error:nil];
    }
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputURL.path]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputURL.path error:nil];
    }
    
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mainComposition presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL=outputURL;
    exportSession.outputFileType =AVFileTypeMPEG4;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            doneHandler(outputURL,nil);
        }else {
            doneHandler(nil,exportSession.error);
        }
    }];
}
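Call-site sketch (file names are placeholders): lay a recorded narration track over a video clip.

```objc
NSURL *video = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"clip.mp4"]];
NSURL *voice = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"narration.m4a"]];
NSURL *dst   = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"clip-with-voice.mp4"]];

[LTVideoTools mergeVideoAssetAndAudioAssetWithVideoAssetURL:video
                                              audioAssetURL:voice
                                                  outputURL:dst
                                                doneHandler:^(NSURL *outputURL, NSError *error) {
    NSLog(@"merge %@", error ? error : outputURL);
}];
```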

Concatenating Videos

+ (void)mergeVideoWithAssetURLs:(NSArray <NSURL *>*)assetURLs outputURL:(NSURL *)outputURL doneHandler:(void(^)(NSURL *outputURL,NSError *error))doneHandler {
    NSMutableArray *assets = [NSMutableArray array];
    for (NSURL *url in assetURLs) {
        [assets addObject:[AVAsset assetWithURL:url]];
    }
    
    AVMutableComposition *mainComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoCompositionTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *soundCompositionTrack = [mainComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
    CMTime insertTime = kCMTimeZero;
    
    for (AVAsset *videoAsset in assets) {
        AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (videoTrack) {
            [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:videoTrack atTime:insertTime error:nil];
            [videoCompositionTrack setPreferredTransform:videoTrack.preferredTransform];
        }
        // firstObject returns nil (rather than crashing) for clips that have
        // no audio track.
        AVAssetTrack *audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        if (audioTrack) {
            [soundCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:audioTrack atTime:insertTime error:nil];
        }
        
        insertTime = CMTimeAdd(insertTime, videoAsset.duration);
    }
    
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputURL.path]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputURL.path error:nil];
    }
    
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mainComposition presetName:AVAssetExportPresetMediumQuality];
    exportSession.outputURL=outputURL;
    exportSession.outputFileType =AVFileTypeMPEG4;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // Delete the source clips once the export finishes. Iterate the URL
        // array — `assets` holds AVAsset objects, not NSURLs.
        for (NSURL *url in assetURLs) {
            if ([[NSFileManager defaultManager] fileExistsAtPath:url.path]) {
                [[NSFileManager defaultManager] removeItemAtURL:url error:nil];
            }
        }
        
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            doneHandler(outputURL,nil);
        }else {
            doneHandler(nil,exportSession.error);
        }
    }];
}
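Putting it together — a hypothetical end-of-stream pipeline (file names are placeholders): concatenate the per-slide clips, then lay the recorded voice track over the result.

```objc
NSArray<NSURL *> *clips = @[
    [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"slide01.mp4"]],
    [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"slide02.mp4"]],
];
NSURL *joined  = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"joined.mp4"]];
NSURL *voice   = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"voice.m4a"]];
NSURL *finalMp4 = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"final.mp4"]];

[LTVideoTools mergeVideoWithAssetURLs:clips outputURL:joined doneHandler:^(NSURL *outputURL, NSError *error) {
    if (error) { NSLog(@"concat failed: %@", error); return; }
    // Step 2: overlay the recorded narration onto the joined video.
    [LTVideoTools mergeVideoAssetAndAudioAssetWithVideoAssetURL:outputURL
                                                  audioAssetURL:voice
                                                      outputURL:finalMp4
                                                    doneHandler:^(NSURL *out, NSError *err) {
        NSLog(@"pipeline %@", err ? err : out);
    }];
}];
```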