1. Mixing audio. The first problem is that the music track starts at full volume and then stops abruptly when the composed assets run out, which listeners find jarring. Gradually fading the music in at the start and out at the end gives a much better experience. The second problem is handling the voice-over track: the music drowns the voice-over out. The standard fix for this kind of track conflict is a technique called ducking: lower the music's volume while the voice-over plays, then restore it to its previous level once the voice-over finishes. Both problems can be solved with AVAudioMix.
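As a minimal sketch of the fade-in plus ducking behavior described above (the specific times, the 0.2 duck level, and the `musicTrack` variable are all assumptions for illustration, not values from the text):

```objc
// Assumed: musicTrack is the composition's music track, and a voice-over
// plays from 4s to 8s. All times and levels here are illustrative.
CMTime voStart = CMTimeMake(4, 1);  // voice-over begins (assumed)
CMTime voEnd   = CMTimeMake(8, 1);  // voice-over ends (assumed)
CMTime rampDur = CMTimeMake(1, 2);  // half-second ramps

AVMutableAudioMixInputParameters *params =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack];

// Fade the music in over the first two seconds.
[params setVolumeRampFromStartVolume:0.0f
                         toEndVolume:1.0f
                           timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(2, 1))];

// Duck: ramp down just before the voice-over, ramp back up after it ends.
[params setVolumeRampFromStartVolume:1.0f
                         toEndVolume:0.2f
                           timeRange:CMTimeRangeMake(CMTimeSubtract(voStart, rampDur), rampDur)];
[params setVolumeRampFromStartVolume:0.2f
                         toEndVolume:1.0f
                           timeRange:CMTimeRangeMake(voEnd, rampDur)];

AVMutableAudioMix *mix = [AVMutableAudioMix audioMix];
mix.inputParameters = @[params];
```

A symmetrical fade-out at the end of the timeline would be one more ramp down to 0.0 over the final seconds of the music track.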
2. Automated volume adjustment. The core of audio mixing is adjusting the volume of the composition's audio tracks. When a composition is played or exported, the default behavior is to play every audio track at full (normal) volume. Volume is conventionally described in decibels, but AVFoundation uses a floating-point value ranging from 0.0 (silence) to 1.0 (full volume). An audio track's volume defaults to 1.0, but this can be changed through an AVMutableAudioMixInputParameters instance, which lets you automate the volume at a specific point in time or over a given time range.
3. AVMutableAudioMixInputParameters provides two methods for volume automation. setVolume:atTime: sets the volume immediately at the specified time; the volume then remains constant for the rest of the track's duration until another volume adjustment occurs. setVolumeRampFromStartVolume:toEndVolume:timeRange: smoothly ramps the volume from one value to another over a given time range.
AVCompositionTrack *track = ...; // the audio track in the composition
CMTime twoSeconds = CMTimeMake(2, 1);
CMTime fourSeconds = CMTimeMake(4, 1);
CMTime sevenSeconds = CMTimeMake(7, 1);

AVMutableAudioMixInputParameters *parameters =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
[parameters setVolume:0.5f atTime:kCMTimeZero];
CMTimeRange range = CMTimeRangeFromTimeToTime(twoSeconds, fourSeconds);
[parameters setVolumeRampFromStartVolume:0.5f toEndVolume:0.8f timeRange:range];
[parameters setVolume:0.3f atTime:sevenSeconds];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
// The audio mix is applied by setting it as the audioMix property of an
// AVPlayerItem (for playback) or an AVAssetExportSession (for export).
audioMix.inputParameters = @[parameters];
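For the export case mentioned in the comment above, a hedged sketch might look like this (`composition` and `outputURL` are assumed to exist in the surrounding code; the M4A preset is just one possible choice):

```objc
// Bake the volume automation into an exported audio file.
AVAssetExportSession *session =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetAppleM4A];
session.audioMix = audioMix;            // apply the mix to the export
session.outputFileType = AVFileTypeAppleM4A;
session.outputURL = outputURL;          // destination URL (assumed)
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        // The exported file contains the ramped volume levels.
    }
}];
```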
4. THCompositionBuilder exposes a simple interface that accepts a THTimeline. The THTimeline object captures the state of the application's timeline and provides the data needed to build the composition.
@interface THAudioMixCompositionBuilder () // class extension for private properties
@property (nonatomic, strong) THTimeline *timeline;
@property (nonatomic, strong) AVMutableComposition *composition;
@end

@implementation THAudioMixCompositionBuilder
- (id)initWithTimeline:(THTimeline *)timeline {
    self = [super init];
    if (self) {
        _timeline = timeline;
    }
    return self;
}

- (id <THComposition>)buildComposition {
    self.composition = [AVMutableComposition composition];
    [self addCompositionTrackOfType:AVMediaTypeVideo withMediaItems:self.timeline.videos];
    [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.voiceOvers];
    AVMutableCompositionTrack *musicTrack =
        [self addCompositionTrackOfType:AVMediaTypeAudio withMediaItems:self.timeline.musicItems];
    AVAudioMix *audioMix = [self buildAudioMixWithTrack:musicTrack];
    return [THAudioMixComposition compositionWithComposition:self.composition audioMix:audioMix];
}
- (AVMutableCompositionTrack *)addCompositionTrackOfType:(NSString *)type withMediaItems:(NSArray *)mediaItems {
    if (!THIsEmpty(mediaItems)) {
        CMPersistentTrackID trackID = kCMPersistentTrackID_Invalid;
        AVMutableCompositionTrack *compositionTrack =
            [self.composition addMutableTrackWithMediaType:type preferredTrackID:trackID];
        CMTime cursorTime = kCMTimeZero; // set the insert cursor to time 0
        for (THMediaItem *item in mediaItems) {
            // If the item carries an explicit timeline position, move the cursor there.
            if (CMTIME_COMPARE_INLINE(item.startTimeInTimeline, !=, kCMTimeInvalid)) {
                cursorTime = item.startTimeInTimeline;
            }
            AVAssetTrack *assetTrack = [[item.asset tracksWithMediaType:type] firstObject];
            [compositionTrack insertTimeRange:item.timeRange ofTrack:assetTrack atTime:cursorTime error:nil];
            cursorTime = CMTimeAdd(cursorTime, item.timeRange.duration);
        }
        return compositionTrack;
    }
    return nil;
}
- (AVAudioMix *)buildAudioMixWithTrack:(AVMutableCompositionTrack *)track {
    THAudioItem *item = [self.timeline.musicItems firstObject];
    if (item) {
        AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
        AVMutableAudioMixInputParameters *parameters =
            [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
        for (THVolumeAutomation *automation in item.volumeAutomation) {
            [parameters setVolumeRampFromStartVolume:automation.startVolume
                                         toEndVolume:automation.endVolume
                                           timeRange:automation.timeRange];
        }
        audioMix.inputParameters = @[parameters];
        return audioMix;
    }
    return nil;
}

@end