iOS Vision -- (10) Rendering YUV Video with OpenGL ES + GLSL

This article draws on 落影大神's post: iOS开发-OpenGL ES实践教程(一).

Demo for this article
In the previous article we learned how to render camera-captured video frames with OpenGL ES: each CMSampleBuffer recorded by the camera is converted into a CVOpenGLESTexture and then rendered, as shown below:

CMSampleBuffer to texture

In this article we will use OpenGL ES to render YUV video frames. The rendering principle is the same as for the camera; the key difference is how the video frames are obtained, and we will look at two ways to get them. Since there is no recording callback to drive us, we use a timer instead; here, a CADisplayLink:

        // Drive frame pulls with a display link, capped at 30 fps and paused until playback starts.
        displayLink = CADisplayLink(target: self, selector: #selector(displayLinkDidUpdate(_:)))
        displayLink.add(to: RunLoop.current, forMode: RunLoop.Mode.default)
        displayLink.preferredFramesPerSecond = 30
        displayLink.isPaused = true
  • 1. Obtaining frames with AVPlayerItemVideoOutput:
        let videoURL = URL(fileURLWithPath: Bundle.main.path(forResource: "test.mov", ofType: nil)!)
        // DDAssetReader belongs to approach 2 below; it is created here alongside the player.
        self.reader = DDAssetReader(videoURL)
        
        let item = AVPlayerItem(url: videoURL)
        player = AVPlayer(playerItem: item)
        let asset: AVAsset = item.asset
        asset.loadValuesAsynchronously(forKeys: ["tracks"]) {
            if asset.statusOfValue(forKey: "tracks", error: nil) == AVKeyValueStatus.loaded {
                let tracks = asset.tracks(withMediaType: AVMediaType.video)
                if tracks.count > 0 {
                    // Choose the first video track.
                    let videoTrack: AVAssetTrack = tracks.first!
                    videoTrack.loadValuesAsynchronously(forKeys: ["preferredTransform"]) {
                        if videoTrack.statusOfValue(forKey: "preferredTransform", error: nil) == AVKeyValueStatus.loaded {
                            let preferredTransform: CGAffineTransform = videoTrack.preferredTransform
                            let preferredRotation = -1 * atan2(preferredTransform.b, preferredTransform.a)
                            NSLog("preferredRotation ----> \(preferredRotation)")

                            DispatchQueue.main.async {
                                item.add(self.videoOutput)
                                self.player.replaceCurrentItem(with: item)
                                self.videoOutput.requestNotificationOfMediaDataChange(withAdvanceInterval: 0.03)
                                self.player.play()
                            }
                        }
                    }
                }
            }
        }

        // Loop playback: when the item reaches the end, seek back to the start.
        player.actionAtItemEnd = AVPlayer.ActionAtItemEnd.none
        NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: item, queue: OperationQueue.main) { noti in
            self.player.currentItem?.seek(to: CMTime.zero, completionHandler: { suc in

            })
        }

        mProcessQueue = DispatchQueue(label: "mProcessQueue")

        // Request biplanar YUV (NV12) buffers; kCVPixelFormatType_32BGRA would yield RGBA instead.
        videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [String(kCVPixelBufferPixelFormatTypeKey) : kCVPixelFormatType_420YpCbCr8BiPlanarFullRange])
        videoOutput.setDelegate(self, queue: mProcessQueue)
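The preferredRotation computed above recovers the rotation baked into the track's preferredTransform so it can be undone at render time. As a standalone sketch (the matrix values here are hypothetical, standing in for a track recorded in portrait):

```swift
import Foundation

// A video track's rotation can be recovered from the first column of its
// affine transform: angle = atan2(b, a). The sign is flipped so the
// resulting angle *undoes* the recorded rotation.
func preferredRotation(a: Double, b: Double) -> Double {
    return -1 * atan2(b, a)
}

// A 90-degree rotation transform has a = 0, b = 1 (typical portrait footage).
print(preferredRotation(a: 0, b: 1))  // -pi/2: rotate back 90 degrees before display

// An identity transform (a = 1, b = 0) means no rotation is needed.
print(preferredRotation(a: 1, b: 0))  // -0.0
```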

Then we pull video frames from the CADisplayLink callback:

    @objc func displayLinkDidUpdate(_ sender: CADisplayLink) {
        // Calculate the next vsync time, i.e. when the screen will be refreshed next,
        // and ask the output which frame should be on screen at that moment.
        let nextVSync: CFTimeInterval = sender.timestamp + sender.duration
        let outputItemTime: CMTime = videoOutput.itemTime(forHostTime: nextVSync)

        if videoOutput.hasNewPixelBuffer(forItemTime: outputItemTime) {
            // copyPixelBuffer returns nil if no frame is available for that time.
            let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: outputItemTime, itemTimeForDisplay: nil)
            self.renderView.renderBuffer(pixelBuffer: pixelBuffer)
        }
    }
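The idea behind itemTime(forHostTime:) can be illustrated with a simplified model: item time advances from a known anchor point at the playback rate. This stand-in ignores the timebase bookkeeping AVFoundation handles for us, and all names in it are hypothetical:

```swift
import Foundation

// Simplified model of mapping a host (wall-clock) time onto an item's
// timeline: seconds elapsed since the anchor, scaled by the playback rate,
// added to the item time at the anchor.
func itemSeconds(hostTime: Double, anchorHostTime: Double,
                 anchorItemSeconds: Double, rate: Double) -> Double {
    return anchorItemSeconds + (hostTime - anchorHostTime) * rate
}

// Playback started 2 s ago at rate 1.0 from item time 0:
print(itemSeconds(hostTime: 12.0, anchorHostTime: 10.0,
                  anchorItemSeconds: 0.0, rate: 1.0))  // 2.0

// Same host time at double speed, anchored at item time 5 s:
print(itemSeconds(hostTime: 12.0, anchorHostTime: 10.0,
                  anchorItemSeconds: 5.0, rate: 2.0))  // 9.0
```

Querying the frame for the *next* vsync (timestamp + duration) rather than the current timestamp is what keeps the displayed frame from lagging one refresh behind.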

  • 2. Obtaining frames with AVAssetReader + AVAssetReaderTrackOutput:
class DDAssetReader: NSObject {

    var readerVideoTrackOutput: AVAssetReaderTrackOutput!
    var assetReader: AVAssetReader!
    var videoUrl: URL!
    var lock: NSLock!
    
    init(_ url: URL) {
        super.init()
        videoUrl = url
        lock = NSLock()
        customInit()
    }
    
    func customInit() {
        let inputOptions = [AVURLAssetPreferPreciseDurationAndTimingKey : true]
        let inputAsset = AVURLAsset(url: videoUrl, options: inputOptions)
        inputAsset.loadValuesAsynchronously(forKeys: ["tracks"]) {
            DispatchQueue.global().async {
                var error: NSError?
                let tracksStatus = inputAsset.statusOfValue(forKey: "tracks", error: &error)
                if (tracksStatus != AVKeyValueStatus.loaded) {
                    NSLog("error = \(error!)")
                    return
                }
                self.processWithAsset(inputAsset)
            }
        }
    }
    
    func processWithAsset(_ asset: AVAsset) {
        lock.lock()
        NSLog("processWithAsset")

        // try? returns nil on failure and the reader is force-unwrapped below,
        // so a production version should handle the thrown error instead.
        assetReader = try? AVAssetReader(asset: asset)
        
        let outputSettings = [String(kCVPixelBufferPixelFormatTypeKey) : kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
        
        readerVideoTrackOutput = AVAssetReaderTrackOutput(track: asset.tracks(withMediaType: AVMediaType.video).first!, outputSettings: outputSettings)
        readerVideoTrackOutput.alwaysCopiesSampleData = false
        assetReader.add(readerVideoTrackOutput)

        
        if assetReader.startReading() == false {
            NSLog("Error starting to read from asset: \(asset)")
        }
        lock.unlock()
    }
    
    func readBuffer() -> CMSampleBuffer? {
        lock.lock()
        defer { lock.unlock() }

        let sampleBuffer = readerVideoTrackOutput?.copyNextSampleBuffer()

        // When the reader has consumed the whole file, tear it down and
        // re-create it so the video loops.
        if assetReader != nil && assetReader.status == AVAssetReader.Status.completed {
            NSLog("reading completed, restarting reader")
            readerVideoTrackOutput = nil
            assetReader = nil
            customInit()
        }

        return sampleBuffer
    }
}
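The lock-and-restart pattern DDAssetReader uses (serialize readBuffer and the reinitialization behind one lock, and rebuild the reader when the asset is exhausted so the video loops) can be exercised with a simplified stand-in that replaces AVAssetReader with an array of frames. All names here are hypothetical:

```swift
import Foundation

// Simplified stand-in for DDAssetReader's looping read: an NSLock guards
// both the per-frame read and the "completed -> rebuild" transition.
final class LoopingReader {
    private let frames: [Int]
    private var index = 0
    private let lock = NSLock()

    init(frames: [Int]) { self.frames = frames }

    func readBuffer() -> Int? {
        lock.lock()
        defer { lock.unlock() }
        guard !frames.isEmpty else { return nil }
        let frame = frames[index]
        index += 1
        if index == frames.count {
            // "Status completed": restart from the beginning, like
            // DDAssetReader calling customInit() again.
            index = 0
        }
        return frame
    }
}

let reader = LoopingReader(frames: [1, 2, 3])
let pulled = (0..<5).compactMap { _ in reader.readBuffer() }
print(pulled)  // [1, 2, 3, 1, 2]
```

The lock matters because customInit() finishes on a background queue while readBuffer() is called from the display-link callback; without it, the restart could race with an in-flight copyNextSampleBuffer().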

