I'm trying to do an Overlay Blend of a stock image with the output of the camera feed, where the stock image has less than 100% opacity. I figured I could place a GPUImageOpacityFilter in the filter stack and everything would be fine:
- GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
- GPUImagePicture -> GPUImageOpacityFilter (opacity 0.1f) -> MY_GPUImageOverlayBlendFilter
- MY_GPUImageOverlayBlendFilter -> GPUImageView
But the result was not a 0.1f-alpha version of the GPUImagePicture blended into the GPUImageVideoCamera; instead, it softened the colors/contrast of the GPUImagePicture and blended that. So I did some searching and, based on a suggestion, tried getting a UIImage out of the GPUImageOpacityFilter via imageFromCurrentlyProcessedOutput and sending that into the BlendFilter:
- GPUImagePicture -> MY_GPUImageOpacityFilter (opacity 0.1f)
- [MY_GPUImageOpacityFilter imageFromCurrentlyProcessedOutput] -> MY_alphaedImage
- GPUImagePicture (MY_alphaedImage) -> MY_GPUImageOverlayBlendFilter
- GPUImageVideoCamera -> MY_GPUImageOverlayBlendFilter
- MY_GPUImageOverlayBlendFilter -> GPUImageView
This works exactly as I expected. So why do I have to call imageFromCurrentlyProcessedOutput? Shouldn't that already happen in the filter chain? Here are code snippets of the two scenarios above:
First:
//Create the GPUPicture
UIImage *image = [UIImage imageNamed:@"someFile"];
GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
//Create the Opacity filter w/0.5 opacity
GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
opacityFilter.opacity = 0.5f;
[textureImage addTarget:opacityFilter];
//Create the blendFilter
GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];
//Point the cameraDevice's output at the blendFilter
[self._videoCameraDevice addTarget:blendFilter];
//Point the opacityFilter's output at the blendFilter
[opacityFilter addTarget:blendFilter];
[textureImage processImage];
//Point the output of the blendFilter at our previewView
GPUImageView *filterView = (GPUImageView *)self.previewImageView;
[blendFilter addTarget:filterView];
Second:
//Create the GPUPicture
UIImage *image = [UIImage imageNamed:@"someFile"];
GPUImagePicture *textureImage = [[[GPUImagePicture alloc] initWithImage:image] autorelease];
//Create the Opacity filter w/0.5 opacity
GPUImageOpacityFilter *opacityFilter = [[[GPUImageOpacityFilter alloc] init] autorelease];
opacityFilter.opacity = 0.5f;
[textureImage addTarget:opacityFilter];
//Process the image so we get a UIImage with 0.5 opacity of the original
[textureImage processImage];
UIImage *processedImage = [opacityFilter imageFromCurrentlyProcessedOutput];
GPUImagePicture *processedTextureImage = [[[GPUImagePicture alloc] initWithImage:processedImage] autorelease];
//Create the blendFilter
GPUImageFilter *blendFilter = [[[GPUImageOverlayBlendFilter alloc] init] autorelease];
//Point the cameraDevice's output at the blendFilter
[self._videoCameraDevice addTarget:blendFilter];
//Point the opacityFilter's output at the blendFilter
[processedTextureImage addTarget:blendFilter];
[processedTextureImage processImage];
//Point the output of the blendFilter at our previewView
GPUImageView *filterView = (GPUImageView *)self.previewImageView;
[blendFilter addTarget:filterView];