In Flutter, to implement a video player, or to use the camera to take photos or record video, the preview frames have to be displayed inside the Flutter UI. Besides the PlatformView approach, this can also be done with the Texture (external texture) rendering path.
A Texture can be understood as an object that holds, on the GPU, the image data about to be drawn. The Flutter engine maps the texture's data directly in memory (so no extra data copy between the native side and Flutter is needed) and assigns each Texture an id; on the Dart side, having the textureId is all that is needed to display it. With external textures, the actual data carrier between Flutter and the native side is a PixelBuffer: the native data source (camera, video player, etc.) writes frames into the PixelBuffer, Flutter converts it into an OpenGLES texture, and Skia draws it. The Texture widget's constructor looks like this:
const Texture({
  super.key,
  required this.textureId,
  this.freeze = false,
  this.filterQuality = FilterQuality.low,
});
Flutter external texture implementation flow
Registering the texture
The native side creates an object that implements the FlutterTexture protocol; this object manages the actual texture data.
Register that FlutterTexture object with the FlutterTextureRegistry to obtain a Flutter texture id.
Pass the id to the Dart side over a channel; the Dart side can then use the texture through the Texture widget, taking the id as its parameter (a sketch of the channel handler follows the iOS registration code below).
Rendering the texture
The Dart side declares a Texture widget, indicating that what this widget actually renders is a texture provided by the native side.
The engine side obtains the layer tree; the layer tree's TextureLayer node is responsible for rendering the external texture.
Using the id passed from the Dart side, the engine first finds the previously registered FlutterTexture. That FlutterTexture is implemented by us in native code, and its core is the copyPixelBuffer method.
The Flutter engine calls copyPixelBuffer to obtain the actual texture data, then hands it to the lower layers for GPU rendering.
- Key iOS-side implementation (CameraManager must implement the FlutterTexture protocol)
HeartRateHelperPlugin

public class HeartRateHelperPlugin: NSObject, FlutterPlugin {
    static var channel: FlutterMethodChannel?
    // The registry that manages external textures
    static var texture: FlutterTextureRegistry?
    static var textureId: Int64 = 0

    public static func register(with registrar: FlutterPluginRegistrar) {
        let _channel = FlutterMethodChannel(name: "heart_rate_plugin", binaryMessenger: registrar.messenger())
        channel = _channel
        let instance = HeartRateHelperPlugin()
        registrar.addMethodCallDelegate(instance, channel: _channel)
        // Manage the external textures
        texture = registrar.textures()
        let camera = CameraManager(cameraType: .front)
        // Obtain the textureId; it is passed to Flutter over the MethodChannel
        textureId = texture!.register(camera)
        camera.updateHandle = {
            // Proactively call textureFrameAvailable to tell the TextureRegistry to refresh the frame
            texture?.textureFrameAvailable(textureId)
        }
    }
}
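The registration above obtains the textureId but stops short of sending it to Dart. A minimal sketch of that channel round trip, assuming the id is kept in the static textureId property above; the method name "getTextureId" is an assumption, not from the original plugin:

    // Inside HeartRateHelperPlugin: answer Dart's request for the texture id.
    // "getTextureId" is an assumed method name.
    public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
        switch call.method {
        case "getTextureId":
            // Dart uses this id to build Texture(textureId: id)
            result(HeartRateHelperPlugin.textureId)
        default:
            result(FlutterMethodNotImplemented)
        }
    }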
typealias OnFrameAvailable = () -> Void

class CameraManager: NSObject, FlutterTexture {
    //... capture-session setup omitted (see the sketch after the delegate extension below)
    var updateHandle: OnFrameAvailable?
    private let pixelBufferSynchronizationQueue = DispatchQueue(label: "pixelBufferSynchronizationQueue")
    private var latestPixelBuffer: CVPixelBuffer?

    func copyPixelBuffer() -> Unmanaged<CVPixelBuffer>? {
        var pixelBuffer: Unmanaged<CVPixelBuffer>? = nil
        // Hand the most recent camera frame to the engine.
        // Use a synchronous dispatch because the copyPixelBuffer API requires a synchronous return.
        pixelBufferSynchronizationQueue.sync {
            if let newBuffer = latestPixelBuffer {
                pixelBuffer = Unmanaged.passRetained(newBuffer)
            }
            latestPixelBuffer = nil
        }
        return pixelBuffer
    }
}
// Receive the data of each captured frame and cache it as the latest pixel buffer
extension CameraManager: AVCaptureVideoDataOutputSampleBufferDelegate {
    // MARK: - Export buffer from video frame
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        if !CMSampleBufferDataIsReady(sampleBuffer) {
            print("sample buffer is not ready. Skipping sample")
            return
        }
        // Use a synchronous dispatch to avoid an unnecessary context switch in the common
        // uncontended case; under rare contention it will not block for long, since the
        // critical section is very lightweight.
        pixelBufferSynchronizationQueue.sync {
            if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                self.latestPixelBuffer = pixelBuffer
                // Notify the plugin so it can call textureFrameAvailable
                updateHandle?()
            }
        }
    }
}
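The capture-session setup elided in CameraManager is standard AVFoundation plumbing. A minimal sketch, assuming the front wide-angle camera and 32BGRA output; names such as setupSession and captureQueue are illustrative, not from the original code:

    // A sketch of the elided setup inside CameraManager (AVFoundation is already imported)
    private let session = AVCaptureSession()

    private func setupSession() {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        // The external-texture path expects 32BGRA pixel buffers
        output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        // Deliver frames on a dedicated queue; captureOutput(_:didOutput:from:) above caches them
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "captureQueue"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }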
- Flutter-side implementation
// Build the widget with the external texture id received over the MethodChannel
Texture(
  textureId: state.textureId.value,
),
- Source code analysis
Below is the final painting code for the TextureLayer node on iOS (Android is similar, but the texture is obtained in a slightly different way). The whole process can be broken into three steps:
1) Call copyPixelBuffer on the external texture to obtain a CVPixelBuffer.
2) Call CVOpenGLESTextureCacheCreateTextureFromImage to create an OpenGL texture from it (this is the real texture).
3) Wrap the OpenGL texture in an SkImage and call Skia's drawImage to finish the painting.
void IOSExternalTextureGL::Paint(SkCanvas& canvas, const SkRect& bounds) {
  // Lazily create the GLES texture cache used to turn CVPixelBuffers into GL textures
  if (!cache_ref_) {
    CVOpenGLESTextureCacheRef cache;
    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                                [EAGLContext currentContext], NULL, &cache);
    if (err == noErr) {
      cache_ref_.Reset(cache);
    } else {
      FXL_LOG(WARNING) << "Failed to create GLES texture cache: " << err;
      return;
    }
  }
  // Step 1: fetch the latest CVPixelBuffer from the registered FlutterTexture
  fml::CFRef<CVPixelBufferRef> bufferRef;
  bufferRef.Reset([external_texture_ copyPixelBuffer]);
  if (bufferRef != nullptr) {
    // Step 2: create an OpenGL texture from the pixel buffer
    CVOpenGLESTextureRef texture;
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache_ref_, bufferRef, nullptr, GL_TEXTURE_2D, GL_RGBA,
        static_cast<int>(CVPixelBufferGetWidth(bufferRef)),
        static_cast<int>(CVPixelBufferGetHeight(bufferRef)), GL_BGRA, GL_UNSIGNED_BYTE, 0,
        &texture);
    texture_ref_.Reset(texture);
    if (err != noErr) {
      FXL_LOG(WARNING) << "Could not create texture from pixel buffer: " << err;
      return;
    }
  }
  if (!texture_ref_) {
    return;
  }
  // Step 3: wrap the GL texture in an SkImage and let Skia draw it
  GrGLTextureInfo textureInfo = {CVOpenGLESTextureGetTarget(texture_ref_),
                                 CVOpenGLESTextureGetName(texture_ref_), GL_RGBA8_OES};
  GrBackendTexture backendTexture(bounds.width(), bounds.height(), GrMipMapped::kNo, textureInfo);
  sk_sp<SkImage> image =
      SkImage::MakeFromTexture(canvas.getGrContext(), backendTexture, kTopLeft_GrSurfaceOrigin,
                               kRGBA_8888_SkColorType, kPremul_SkAlphaType, nullptr);
  if (image) {
    canvas.drawImage(image, bounds.x(), bounds.y());
  }
}
Pros and cons of PlatformView vs. Texture
Initially we used the PlatformView approach (the UiKitView widget) for camera video recording, but entering the recording page caused memory to spike, and it did not drop noticeably after leaving the page. After studying the camera plugin, we switched to the Texture approach, which brought a clear performance improvement.
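Part of keeping memory in check is also releasing the external texture when the recording page is closed, so the engine drops its reference to the FlutterTexture object. A sketch, assuming the registry and id are the static properties registered earlier; wiring it to a "disposeTexture" channel call is an assumption:

    // Inside HeartRateHelperPlugin: release the external texture on page close,
    // e.g. from an assumed "disposeTexture" MethodChannel call
    static func disposeTexture() {
        texture?.unregisterTexture(textureId)
        texture = nil
    }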
References:
https://blog.csdn.net/gloryFlow/article/details/139090371
https://juejin.cn/post/6844903662548942855