I need to show a live camera feed, so I used GPUImage. I followed some documentation and wrote the code below in viewDidLoad, but when I open the app it crashes with the error shown in the log.

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
        videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

        GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
        GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, self.view.frame.size.width, self.view.frame.size.height)];

        // Add the view somewhere so it's visible

        [videoCamera addTarget:customFilter];
        [customFilter addTarget:filteredVideoView];

        [videoCamera startCameraCapture];
    }

Log:

    Failed to load vertex shader
    Failed to compile fragment shader
    Program link log: ERROR: OpenGL ES 2.0 requires exactly one vertex and one fragment shader to validly link.
    Fragment shader compile log: (null)
    Vertex shader compile log: (null)
    *** Assertion failure in -[GPUImageFilter initWithVertexShaderFromString:fragmentShaderFromString:], /Users/ranganathagv/Projects/MobileApp/test/iOS/View/28May_GPU/BradLarson-GPUImage-f67cbd9/framework/Source/GPUImageFilter.m:76
    *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Filter shader link failed'

There is 1 answer

Chan On

Make sure that the shaders have been added to your Copy Bundle Resources build phase. By default, Xcode tries to compile them as source code files, rather than include them in your application bundle like they should be.
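Once the shader is in the Copy Bundle Resources phase, you can confirm it actually ships in the built app with a quick runtime check. This is just a minimal sketch; it assumes the file is named CustomShader.fsh, since GPUImage's initWithFragmentShaderFromFile: looks up the name with a .fsh extension in the main bundle.

    // Hypothetical sanity check: does CustomShader.fsh exist in the built bundle?
    // Run this (e.g. in viewDidLoad) before creating the GPUImageFilter.
    NSString *shaderPath = [[NSBundle mainBundle] pathForResource:@"CustomShader"
                                                           ofType:@"fsh"];
    if (shaderPath == nil) {
        NSLog(@"CustomShader.fsh is missing from the app bundle - check the Copy Bundle Resources build phase.");
    } else {
        NSLog(@"Found shader at %@", shaderPath);
    }

If the path comes back nil, the shader never made it into the bundle, which is exactly the situation that produces the "Filter shader link failed" assertion above.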