1. AVFoundation's media composition functionality is built on AVComposition, a subclass of AVAsset. A composition combines pieces of several media resources into a custom temporal arrangement, which can then be treated as an independent media item for presentation or processing. Like any AVAsset, a composition is a container of one or more media tracks of a given type; the tracks in an AVComposition are instances of AVCompositionTrack, a subclass of AVAssetTrack. A composition track itself consists of one or more media segments. AVComposition does not conform to NSCoding, so it cannot be archived to disk; a custom model is needed to persist that state. AVComposition and AVCompositionTrack are immutable objects that provide read-only access to the resource; to build your own composition, use their mutable subclasses AVMutableComposition and AVMutableCompositionTrack.
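A minimal sketch of how the mutable and immutable variants relate (the track-building step is elided here and covered in the sections below):
AVMutableComposition *mutableComposition = [AVMutableComposition composition];
// ...build up tracks and segments on the mutable composition...
// Copying yields an immutable AVComposition snapshot that is safe to hand
// to a player or export session while editing continues on the original.
AVComposition *immutableSnapshot = [mutableComposition copy];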
2. Time handling. Because of the inherent imprecision of float and double types, they are unsuitable for advanced timed-media work: a single rounding error can cause a dropped frame or an audio glitch. Apple's Core Media framework therefore defines CMTime, whose instances can mark a specific point in time or represent a duration.
typedef struct {
CMTimeValue value; // 64-bit signed integer (the numerator)
CMTimeScale timescale; // 32-bit signed integer (the denominator)
CMTimeFlags flags; // bit flags describing the time's state, e.g. whether it is valid, indefinite, or has been rounded
CMTimeEpoch epoch; // distinguishes otherwise-equal times, e.g. across loop iterations
} CMTime;
3. Creating CMTime values
CMTime t1 = CMTimeMake(3, 1);
CMTimeShow(t1); // --> {3/1 = 3.000}
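A few other common ways to obtain CMTime values; these are all standard Core Media APIs, and the timescale of 600 is simply a conventional video-friendly choice:
CMTime fiveSeconds = CMTimeMakeWithSeconds(5.0, 600); // 5 seconds expressed at timescale 600
CMTimeShow(fiveSeconds); // --> {3000/600 = 5.000}
CMTime zero = kCMTimeZero; // predefined zero constant
if (CMTIME_IS_VALID(fiveSeconds)) { // macro that inspects the flags field
// safe to use
}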
4. Adding and subtracting times
CMTime t1 = CMTimeMake(5,1);
CMTime t2 = CMTimeMake(3,1);
CMTime result = CMTimeAdd(t1, t2); // addition
CMTimeShow(result); // --> {8/1 = 8.000}
result = CMTimeSubtract(t1, t2); // subtraction
CMTimeShow(result); // --> {2/1 = 2.000}
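Two related helpers worth knowing here (both standard Core Media functions): CMTimeGetSeconds converts to a double for display, and CMTimeCompare orders two times:
Float64 seconds = CMTimeGetSeconds(result); // 2.0; useful for display, not for further arithmetic
if (CMTimeCompare(t1, t2) > 0) { // returns -1, 0, or 1, like strcmp
NSLog(@"t1 is later than t2");
}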
5. CMTimeRange: a time range made up of two CMTime values; the first defines the start time and the second defines the duration.
typedef struct {
CMTime start;
CMTime duration;
} CMTimeRange;
CMTime t = CMTimeMake(5, 1);
CMTime t2 = CMTimeMake(10,1);
CMTimeRange timeRange = CMTimeRangeMake(t, t);
CMTimeRangeShow(timeRange); // --> {{5/1 = 5.000}, {5/1 = 5.000}}
timeRange = CMTimeRangeFromTimeToTime(t, t2);
CMTimeRangeShow(timeRange); // --> {{5/1 = 5.000}, {5/1 = 5.000}}
6. Time-range calculations.
Intersection of two time ranges:
CMTimeRange r1 = CMTimeRangeMake(kCMTimeZero, CMTimeMake(5, 1));
CMTimeRange r2 = CMTimeRangeMake(CMTimeMake(2, 1), CMTimeMake(5, 1));
CMTimeRange range = CMTimeRangeGetIntersection(r1, r2);
CMTimeRangeShow(range); // --> {{2/1 = 2.000}, {3/1 = 3.000}}
Union of two time ranges:
range = CMTimeRangeGetUnion(r1, r2);
CMTimeRangeShow(range); // --> {{0/1 = 0.000}, {7/1 = 7.000}}
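A natural companion query (also a standard Core Media function) checks whether a range contains a given time:
CMTime t3 = CMTimeMake(3, 1);
if (CMTimeRangeContainsTime(range, t3)) { // is the 3-second mark inside {0s, 7s}?
NSLog(@"range contains t3");
}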
7. Foundation methods. When creating an asset for use in a composition, instantiate an AVURLAsset directly with URLAssetWithURL:options:. The options parameter lets you customize how the asset is initialized by passing an NSDictionary containing one or more initialization options. An example of loading an MP4:
NSURL *url = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
NSDictionary *options = @{AVURLAssetPreciseDurationAndTimingKey: @YES}; // yields accurate duration and timing when loading via the AVAsynchronousKeyValueLoading protocol, at some extra cost
AVAsset *asset = [AVURLAsset URLAssetWithURL:url options:options];
NSArray *keys = @[@"tracks", @"duration", @"commonMetadata"];
[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
// check each key's load status before reading the corresponding property
NSError *error = nil;
AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
if (status == AVKeyValueStatusLoaded) {
// tracks are now safe to read
}
}];
8. Creating composition assets. When creating a composition track you must specify the media type it supports and supply a track identifier. The identifier is what you would use to retrieve the track later, but it is normally set to the constant kCMPersistentTrackID_Invalid, which delegates the job of picking a suitable track ID to the framework; the assigned identifiers count up 1...n (see the retrieval sketch after the code below).
AVAsset *goldenGateAsset = ...; // prepared golden gate asset
AVAsset *teaGardenAsset = ...; // prepared tea garden asset
AVAsset *soundTrackAsset = ...; // prepared sound track asset
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; // add the video track
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; // add the audio track
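If the track is needed again later, the framework-assigned identifier can be read back from the track and passed to trackWithTrackID: (a standard AVComposition method); keeping the ID around is up to the application:
CMPersistentTrackID videoTrackID = videoTrack.trackID; // ID the framework assigned
AVMutableCompositionTrack *sameTrack = [composition trackWithTrackID:videoTrackID];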
9. Inserting individual media segments into the composition tracks
CMTime cursorTime = kCMTimeZero;
CMTime videoDuration = CMTimeMake(5, 1);
CMTimeRange videoTimeRange = CMTimeRangeMake(kCMTimeZero, videoDuration);
AVAssetTrack *assetTrack = [[goldenGateAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
[videoTrack insertTimeRange:videoTimeRange ofTrack:assetTrack atTime:cursorTime error:nil];
cursorTime = CMTimeAdd(cursorTime, videoDuration); // advance the cursor
assetTrack = [[teaGardenAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
[videoTrack insertTimeRange:videoTimeRange ofTrack:assetTrack atTime:cursorTime error:nil];
cursorTime = kCMTimeZero; //reset cursor time
CMTime audioDuration = composition.duration;
CMTimeRange audioTimeRange = CMTimeRangeMake(kCMTimeZero, audioDuration);
assetTrack = [[soundTrackAsset tracksWithMediaType:AVMediaTypeAudio] firstObject];
[audioTrack insertTimeRange:audioTimeRange ofTrack:assetTrack atTime:cursorTime error:nil];
This composition now behaves like any other AVAsset: it can be played, exported, or otherwise processed.
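For example, a minimal playback sketch (presenting the video through an AVPlayerLayer is assumed and elided):
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[composition copy]];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
// attach an AVPlayerLayer bound to this player to display the video, then:
[player play];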
10. The 15 Seconds sample app (covering video playback, reading metadata, and image extraction).
// This interface is used to create playable and exportable versions of a composition.
@protocol THComposition <NSObject>
- (AVPlayerItem *)makePlayable;
- (AVAssetExportSession *)makeExportable;
@end
// A basic composition implementation
@interface THBasicComposition : NSObject <THComposition>
@property (nonatomic, readonly, strong) AVComposition *composition;
@property (nonatomic, strong) AVAudioMix *audioMix;
+ (instancetype)compositionWithComposition:(AVComposition *)composition;
- (instancetype)initWithComposition:(AVComposition *)composition audioMix:(AVAudioMix *)audioMix;
@end
@implementation THBasicComposition
+ (instancetype)compositionWithComposition:(AVComposition *)composition {
return [[self alloc] initWithComposition:composition audioMix:nil];
}
- (instancetype)initWithComposition:(AVComposition *)composition audioMix:(AVAudioMix *)audioMix {
self = [super init];
if (self) {
_composition = composition;
_audioMix = audioMix;
}
return self;
}
- (AVPlayerItem *)makePlayable {
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:[self.composition copy]];
playerItem.audioMix = self.audioMix;
return playerItem;
}
- (AVAssetExportSession *)makeExportable {
NSString *preset = AVAssetExportPresetHighestQuality;
AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:[self.composition copy] presetName:preset];
session.audioMix = self.audioMix;
return session;
}
@end
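A quick usage sketch of this class; the composition variable stands in for any AVMutableComposition built as in section 9:
id <THComposition> basic = [THBasicComposition compositionWithComposition:[composition copy]];
AVPlayerItem *item = [basic makePlayable];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];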
11. Building a composition. For each THComposition implementation, the application has a corresponding THCompositionBuilder responsible for constructing instances.
@protocol THCompositionBuilder <NSObject>
- (id <THComposition>)buildComposition; // builds the AVComposition and its associated tracks
@end
@interface THBasicCompositionBuilder : NSObject <THCompositionBuilder>
@property (strong, nonatomic) THTimeline *timeline;
@property (strong, nonatomic) AVMutableComposition *composition;
- (id)initWithTimeline:(THTimeline *)timeline;
@end
@implementation THBasicCompositionBuilder
- (id)initWithTimeline:(THTimeline *)timeline {
self = [super init];
if (self) {
_timeline = timeline;
}
return self;
}
- (id <THComposition>)buildComposition {
self.composition = [AVMutableComposition composition];
[self addCompositionTrackOfType:AVMediaTypeVideo withMediaItems:self.timeline.videos];
[self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.voiceOvers];
[self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.musicItems];
return [THBasicComposition compositionWithComposition:self.composition];
}
- (void)addCompositionTrackOfType:(NSString *)mediaType withMediaItems:(NSArray *)mediaItems {
if (!THIsEmpty(mediaItems)) {
CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
AVMutableCompositionTrack *compositionTrack = [self.composition addMutableTrackWithMediaType:mediaType preferredTrackID:trackID];
CMTime cursorTime = kCMTimeZero;
for (THMediaItem *item in mediaItems) {
if (CMTIME_COMPARE_INLINE(item.startTimeInTimeline, !=, kCMTimeInvalid)) {
cursorTime = item.startTimeInTimeline;
}
AVAssetTrack *assetTrack = [[item.asset tracksWithMediaType:mediaType] firstObject];
[compositionTrack insertTimeRange:item.timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
cursorTime = CMTimeAdd(cursorTime, item.timeRange.duration);
}
}
}
@end
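Putting the builder to work; the THTimeline instance is assumed to have been populated elsewhere in the app:
THTimeline *timeline = ...; // assumed: videos, voice-overs, and music items already added
THBasicCompositionBuilder *builder = [[THBasicCompositionBuilder alloc] initWithTimeline:timeline];
id <THComposition> composition = [builder buildComposition];
AVPlayerItem *playerItem = [composition makePlayable];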
12. Exporting the composition
@interface THCompositionExporter : NSObject
@property (nonatomic) BOOL exporting;
@property (nonatomic) float progress;
- (instancetype)initWithComposition:(id<THComposition>)composition;
- (void)beginExport; // drives the actual export process
@end
@interface THCompositionExporter ()
@property (strong, nonatomic) id <THComposition> composition;
@property (strong, nonatomic) AVAssetExportSession *exportSession;
@end
@implementation THCompositionExporter
- (instancetype)initWithComposition:(id<THComposition>)composition {
self = [super init];
if (self) {
_composition = composition;
}
return self;
}
- (void)beginExport { // handle the export
self.exportSession = [self.composition makeExportable];
self.exportSession.outputURL = [self exportURL];
self.exportSession.outputFileType = AVFileTypeAppleM4V; // matches the .m4v extension produced by exportURL
[self.exportSession exportAsynchronouslyWithCompletionHandler:^{
dispatch_async(dispatch_get_main_queue(), ^{
AVAssetExportSessionStatus status = self.exportSession.status;
if (status == AVAssetExportSessionStatusCompleted) {
[self writeExportedVideoToAssetsLibrary];
}else {
[UIAlertView showAlertWithTitle:@"Export Failed" message:@"The requested export failed."]; // convenience category from the sample project
}
});
}];
self.exporting = YES;
[self monitorExportProgress];
}
- (void)monitorExportProgress { // monitor the export progress
double delayInSeconds = 0.1;
int64_t delta = (int64_t)(delayInSeconds * NSEC_PER_SEC);
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, delta);
dispatch_after(popTime, dispatch_get_main_queue(), ^{
AVAssetExportSessionStatus status = self.exportSession.status;
if (status == AVAssetExportSessionStatusExporting) {
self.progress = self.exportSession.progress;
[self monitorExportProgress]; // keep polling while the export is running
}else {
self.exporting = NO;
}
});
}
- (NSURL *)exportURL {
NSString *filePath = nil;
NSUInteger count = 0;
do {
filePath = NSTemporaryDirectory();
NSString *numberString = count > 0 ? [NSString stringWithFormat:@"-%lu", (unsigned long)count] : @"";
NSString *fileNameString = [NSString stringWithFormat:@"m%@.m4v", numberString];
filePath = [filePath stringByAppendingPathComponent:fileNameString];
count++;
} while ([[NSFileManager defaultManager] fileExistsAtPath:filePath]);
return [NSURL fileURLWithPath:filePath];
}
- (void)writeExportedVideoToAssetsLibrary {
NSURL *exportURL = self.exportSession.outputURL;
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportURL]) {
[library writeVideoAtPathToSavedPhotosAlbum:exportURL completionBlock:^(NSURL *assetURL, NSError *error) {
if (error) {
// notify the user that saving to the assets library failed
}
[[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];
}];
}else {
NSLog(@"Video could not be exported to the assets library.");
}
}
@end
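Finally, a sketch of how a controller might drive the exporter; observing progress via KVO is an assumption about the UI wiring, not part of the class above:
THCompositionExporter *exporter = [[THCompositionExporter alloc] initWithComposition:composition];
[exporter addObserver:self forKeyPath:@"progress" options:NSKeyValueObservingOptionNew context:NULL];
[exporter beginExport];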