I'm using Brad Larson's GPUImage iOS framework. I'm trying to add a mask to my live camera input, following the example in the FilterShowcase. This simple chain works with other filters, but with GPUImageMaskFilter all I get is a white screen. What's missing?
This is my code:
stillCamera = [[GPUImageStillCamera alloc] init];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 427)];
[self.view addSubview:filterView];
maskFilter = [[GPUImageMaskFilter alloc] init];
[(GPUImageFilter*)maskFilter setBackgroundColorRed:0.0 green:1.0 blue:0.0 alpha:1.0];
UIImage *mask = [UIImage imageNamed:@"mask.png"];
GPUImagePicture *maskImage = [[GPUImagePicture alloc] initWithImage:mask smoothlyScaleOutput:YES];
[maskImage processImage];
[maskImage addTarget:maskFilter];
[stillCamera addTarget:maskFilter];
[maskFilter addTarget:filterView];
[stillCamera startCameraCapture];
I also tried with GPUImageVideoCamera, but that doesn't help. I think the problem is either the filter chain or the way the mask filter is set up.
I checked this question, How to Implement GPUImageMaskFilter using GPUImage, but it deals with still pictures, not live camera input.
Any ideas?
You need to add the targets in the right order first. GPUImageMaskFilter is a two-input filter: the first input is the image being masked and the second is the mask, so the camera has to be added before the mask image:

[stillCamera addTarget:maskFilter];
[maskImage addTarget:maskFilter];

then process the mask image:

[maskImage processImage];

then you can pass the filter result to the view:

[maskFilter addTarget:filterView];
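Putting the above together, here is a minimal sketch of the corrected setup (assuming the same mask.png asset; one caveat I've labeled in the comments is that under ARC the GPUImagePicture must be kept in a strong property or ivar, or it can be deallocated before the filter ever reads from it):

// stillCamera, maskFilter and maskImage should all be strong ivars/properties
// so ARC keeps them alive for the duration of the capture session
stillCamera = [[GPUImageStillCamera alloc] init];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

maskFilter = [[GPUImageMaskFilter alloc] init];

// The mask image source; must NOT be a local variable, or it is released
// before the live chain renders
maskImage = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"mask.png"]];

GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 427)];
[self.view addSubview:filterView];

// Order matters: camera first (the image being masked), mask second
[stillCamera addTarget:maskFilter];
[maskImage addTarget:maskFilter];
[maskImage processImage];

[maskFilter addTarget:filterView];
[stillCamera startCameraCapture];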