GPUImage

As a note: if you run into the error "Unknown class GPUImageView in Interface Builder" or the like when trying to build an interface with Interface Builder, you may need to add -ObjC to your Other Linker Flags in your project's build settings.


GPUImage is a BSD-licensed iOS framework for GPU-accelerated image and video processing, with an Objective-C interface. This interface lets you define input sources for images and video, attach filters in a chain, and send the resulting processed image or video to the screen, to a UIImage, or to a movie on disk.

Images or frames of video are uploaded from source objects, which are subclasses of GPUImageOutput. These include GPUImageVideoCamera (for live video from an iOS camera), GPUImageStillCamera (for taking photos with the camera), GPUImagePicture (for still images), and GPUImageMovie (for movies). Source objects upload still image frames to OpenGL ES as textures, then hand those textures off to the next objects in the processing chain.

Filters and other subsequent elements in the chain conform to the GPUImageInput protocol, which lets them take in the supplied or processed texture from the previous link in the chain and do something with it. Objects one step further down the chain are considered targets, and processing can be branched by adding multiple targets to a single output or filter.

For example, an application that takes in live video from the camera, converts that video to a sepia tone, then displays the video onscreen would set up a chain looking something like the following:

GPUImageVideoCamera -> GPUImageSepiaFilter -> GPUImageView
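A minimal sketch of that chain in code, which also demonstrates branching by sending the sepia output to a second filter as well as the screen (sepiaView and pixellatedView are hypothetical GPUImageViews assumed to already be in your view hierarchy):

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImagePixellateFilter *pixellateFilter = [[GPUImagePixellateFilter alloc] init];

[videoCamera addTarget:sepiaFilter];         // camera -> sepia
[sepiaFilter addTarget:sepiaView];           // branch 1: sepia -> screen
[sepiaFilter addTarget:pixellateFilter];     // branch 2: sepia -> pixellate
[pixellateFilter addTarget:pixellatedView];  // pixellate -> second screen view

[videoCamera startCameraCapture];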

GPUImage needs a few other frameworks to be linked into your application, so you'll need to add the following as linked libraries in your application target:

CoreMedia
CoreVideo
OpenGLES
AVFoundation
QuartzCore

Filtering live video

To filter live video from an iOS device's camera, you can use code like the following:

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, viewWidth, viewHeight)];

// Add the view somewhere so it's visible

[videoCamera addTarget:customFilter];
[customFilter addTarget:filteredVideoView];

[videoCamera startCameraCapture];
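The custom filter above is built from a fragment shader loaded from CustomShader.fsh in the application bundle (the .fsh extension is appended for you). As a sketch of what such a file contains, here is a minimal pass-through shader using GPUImage's standard textureCoordinate varying and inputImageTexture uniform:

varying highp vec2 textureCoordinate;  // texture coordinate from the vertex shader
uniform sampler2D inputImageTexture;   // the incoming frame

void main()
{
    // Sample the input unchanged; replace this line to create an effect.
    gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
}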

Capturing and filtering a still photo

To capture and filter still photos, you can use a process similar to the one for filtering video. Instead of a GPUImageVideoCamera, you use a GPUImageStillCamera:

stillCamera = [[GPUImageStillCamera alloc] init];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

filter = [[GPUImageGammaFilter alloc] init];
[stillCamera addTarget:filter];
GPUImageView *filterView = (GPUImageView *)self.view;
[filter addTarget:filterView];

[stillCamera startCameraCapture];

This will give you a live, filtered feed of the still camera's preview video. Note that this preview video is only provided on iOS 4.3 and higher, so you may need to set that as your deployment target if you wish to have this functionality.

When you want to capture a photo, you use a callback block like the following:

[stillCamera capturePhotoProcessedUpToFilter:filter withCompletionHandler:^(UIImage *processedImage, NSError *error){
    NSData *dataForJPEGFile = UIImageJPEGRepresentation(processedImage, 0.8);

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];

    NSError *error2 = nil;
    if (![dataForJPEGFile writeToFile:[documentsDirectory stringByAppendingPathComponent:@"FilteredPhoto.jpg"] options:NSAtomicWrite error:&error2])
    {
        return;
    }
}];
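If all you need is the JPEG data, GPUImageStillCamera also provides capturePhotoAsJPEGProcessedUpToFilter:withCompletionHandler:, which hands you the encoded bytes directly and skips the intermediate UIImage; a minimal sketch:

[stillCamera capturePhotoAsJPEGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedJPEG, NSError *error){
    // processedJPEG already contains encoded JPEG bytes, ready to write out.
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    [processedJPEG writeToFile:[documentsDirectory stringByAppendingPathComponent:@"FilteredPhoto.jpg"] options:NSAtomicWrite error:nil];
}];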

Processing a still image

There are a couple of ways to process a still image and create a result. The first way you can do this is by creating a still image source object and manually creating a filter chain:

UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];

[stillImageSource addTarget:stillImageFilter];
[stillImageFilter useNextFrameForImageCapture];
[stillImageSource processImage];

UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentFramebuffer];
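The second way is shorter: for a single filter applied to a single image, GPUImageOutput provides imageByFilteringImage:, which creates and tears down the picture source for you:

GPUImageSepiaFilter *stillImageFilter2 = [[GPUImageSepiaFilter alloc] init];
UIImage *quickFilteredImage = [stillImageFilter2 imageByFilteringImage:inputImage];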

Filtering and re-encoding a movie

The following is an example of how you would load a sample movie, pass it through a pixellation filter, then record the result to disk as a 480 x 640 H.264 movie:

movieFile = [[GPUImageMovie alloc] initWithURL:sampleURL];
pixellateFilter = [[GPUImagePixellateFilter alloc] init];
[movieFile addTarget:pixellateFilter];

NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Movie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];

movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(480.0, 640.0)];
[pixellateFilter addTarget:movieWriter];

movieWriter.shouldPassthroughAudio = YES;
movieFile.audioEncodingTarget = movieWriter;
[movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

[movieWriter startRecording];
[movieFile startProcessing];
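When the movie finishes processing, the writer needs to be detached and the file finalized. A common pattern is to do this in the writer's completionBlock (note that referencing the ivars directly inside the block, as here, implicitly retains self; production code would typically go through a weak reference):

[movieWriter setCompletionBlock:^{
    [pixellateFilter removeTarget:movieWriter]; // detach the writer from the chain
    [movieWriter finishRecording];              // close out the movie file
}];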

Combining multiple GPUImage filters

First, declare the following in the .h file:

GPUImagePicture *staticPicture;
GPUImageOutput *brightnessFilter;  // brightness
GPUImageOutput *contrastFilter;    // contrast
GPUImageFilterPipeline *pipeline;  // combined filter pipeline
NSMutableArray *arrayTemp;
UISlider *brightnessSlider;
UISlider *contrastSlider;

Then, in the .m file's viewDidLoad:

UIImage *image = [UIImage imageNamed:@"sample1.jpg"];
staticPicture = [[GPUImagePicture alloc] initWithImage:image smoothlyScaleOutput:YES];

// Brightness
brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
CGRect mainScreenFrame = [[UIScreen mainScreen] applicationFrame];
GPUImageView *GPUView = [[GPUImageView alloc] initWithFrame:mainScreenFrame];
[brightnessFilter forceProcessingAtSize:GPUView.sizeInPixels];
self.view = GPUView;
[brightnessFilter addTarget:GPUView];

brightnessSlider = [[UISlider alloc] initWithFrame:CGRectMake(25.0, mainScreenFrame.size.height - 250, mainScreenFrame.size.width - 50.0, 40.0)];
[brightnessSlider addTarget:self action:@selector(updateSliderValue:) forControlEvents:UIControlEventValueChanged];
brightnessSlider.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
brightnessSlider.minimumValue = 0.0;
brightnessSlider.maximumValue = 1.0;
brightnessSlider.tag = 10;
brightnessSlider.value = 0.0;
[GPUView addSubview:brightnessSlider];
[staticPicture processImage];

// Contrast
contrastFilter = [[GPUImageContrastFilter alloc] init];
[contrastFilter forceProcessingAtSize:GPUView.sizeInPixels];
[contrastFilter addTarget:GPUView];

contrastSlider = [[UISlider alloc] initWithFrame:CGRectMake(25.0, mainScreenFrame.size.height - 190, mainScreenFrame.size.width - 50.0, 40.0)];
[contrastSlider addTarget:self action:@selector(updateSliderValue:) forControlEvents:UIControlEventValueChanged];
contrastSlider.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleTopMargin;
contrastSlider.minimumValue = 0.0;
contrastSlider.maximumValue = 1.0;
contrastSlider.tag = 11;
contrastSlider.value = 0.0;
[GPUView addSubview:contrastSlider];
[staticPicture processImage];

// Combine: put all the filters you want to apply into an array
[staticPicture addTarget:brightnessFilter];
[staticPicture addTarget:contrastFilter];
arrayTemp = [[NSMutableArray alloc] initWithObjects:brightnessFilter, contrastFilter, nil];
pipeline = [[GPUImageFilterPipeline alloc] initWithOrderedFilters:arrayTemp input:staticPicture output:(GPUImageView *)self.view];

Finally, add a method so the UISliders drive the adjustments interactively:

- (void)updateSliderValue:(UISlider *)sender
{
    NSInteger index = sender.tag - 10;
    switch (index)
    {
        case 0:
        {
            GPUImageBrightnessFilter *GPU = (GPUImageBrightnessFilter *)brightnessFilter;
            [GPU setBrightness:brightnessSlider.value];
            [staticPicture processImage];
            NSLog(@"brightness = %f", brightnessSlider.value);
        }
        break;
        case 1:
        {
            GPUImageContrastFilter *GPU = (GPUImageContrastFilter *)contrastFilter;
            [GPU setContrast:contrastSlider.value];
            [staticPicture processImage];
            NSLog(@"contrast = %f", contrastSlider.value);
        }
        break;
        default:
            break;
    }
}
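To read back the combined result of both filters as a UIImage (for example, to save the adjusted photo), you can use the same capture pattern shown earlier, taking the image from the last filter in the chain; a minimal sketch:

[contrastFilter useNextFrameForImageCapture];  // contrastFilter is the last stage
[staticPicture processImage];
UIImage *adjustedImage = [contrastFilter imageFromCurrentFramebuffer];
NSData *jpegData = UIImageJPEGRepresentation(adjustedImage, 0.8);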

https://github.com/BradLarson/GPUImage
