Today, while using Core Image to composite images, I ran into a problem with alpha being lost: if the top image's alpha is < 1, the alpha of the composited image is lost. I looked into the cause.
FOR BLACK-AND-WHITE TEXT
If you're using the .normal compositing operation you will definitely not get the same result as with .hardLight. Your picture shows the result of the .hardLight operation.
The .normal operation is the classical OVER op with the formula: (Image1 * A1) + (Image2 * (1 – A1)).
The text here is premultiplied (RGB * A), so in this particular case the RGB pattern depends on the alpha's opacity. The RGB of the text image can contain any color, including black. If A = 0 (black alpha) and RGB = 0 (black color) and your image is premultiplied, the whole image is totally transparent; if A = 1 (white alpha) and RGB = 0 (black color), the image is opaque black.
If your text has no alpha and you use the .normal operation, you get the ADD op: Image1 + Image2.
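The OVER formula above is easy to check numerically. A minimal sketch in Python (the pixel values are illustrative, per-channel, in the 0–1 range):

```python
def over(top, a_top, bottom):
    """OVER compositing: result = Image1 * A1 + Image2 * (1 - A1)."""
    return top * a_top + bottom * (1 - a_top)

# A 40%-opaque white top pixel over a black background:
print(over(1.0, 0.4, 0.0))  # 0.4

# With A1 = 0 the top image vanishes entirely; with A1 = 1 it fully
# replaces the bottom:
print(over(1.0, 0.0, 0.5))  # 0.5
print(over(1.0, 1.0, 0.5))  # 1.0
```

With no alpha term at all, the blend degenerates to the simple addition (Image1 + Image2) mentioned above.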
To get what you want, you need to set the compositing operation to .hardLight.
The .hardLight compositing operation works as .multiply if the alpha of the text image is less than 50 percent (A < 0.5, the image is almost transparent).
Formula for .multiply: Image1 * Image2
The .hardLight compositing operation works as .screen if the alpha of the text image is greater than or equal to 50 percent (A >= 0.5, the image is semi-opaque).
Formula 1 for .screen: (Image1 + Image2) – (Image1 * Image2)
Formula 2 for .screen: 1 – (1 – Image1) * (1 – Image2)
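The two .screen formulas are algebraically identical, and the multiply/screen switch described above can be sketched as follows (a simplified per-channel model using the alpha threshold from the answer, not the exact Core Image implementation):

```python
def multiply(i1, i2):
    # .multiply: Image1 * Image2
    return i1 * i2

def screen_v1(i1, i2):
    # .screen, formula 1: (Image1 + Image2) - (Image1 * Image2)
    return (i1 + i2) - (i1 * i2)

def screen_v2(i1, i2):
    # .screen, formula 2: 1 - (1 - Image1) * (1 - Image2)
    return 1 - (1 - i1) * (1 - i2)

def hard_light(i1, i2, a):
    # Switch on the text alpha, as described above:
    # multiply below 50% alpha, screen at or above it.
    return multiply(i1, i2) if a < 0.5 else screen_v2(i1, i2)

# The two screen formulas agree (up to floating-point rounding):
print(screen_v1(0.3, 0.6), screen_v2(0.3, 0.6))
```

Note that .multiply can only darken (the result is at most the smaller input), while .screen can only lighten, which is why the blend flips between them around the alpha threshold.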
The fix is to set the CIContext's kCIContextWorkingColorSpace option to NSNull:
// Disable color management by setting the working color space to NSNull.
CIContext *ciContext = [CIContext contextWithOptions:@{kCIContextWorkingColorSpace: [NSNull null]}];
CIFilter *filter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[filter setDefaults];
CIImage *contentImage = [self getContentCIImage:imageSize];
[filter setValue:inputImage forKey:kCIInputBackgroundImageKey];
[filter setValue:contentImage forKey:kCIInputImageKey];
CIImage *outputImage = filter.outputImage;
// createCGImage: returns a +1 CGImageRef; the caller must CGImageRelease it.
CGImageRef cgImage = [ciContext createCGImage:outputImage fromRect:outputImage.extent];