iOS – How do I output a CIFilter to the Camera view?

I've just started with Objective-C, and I'm trying to create a simple app that shows the camera view with a blur effect applied. I got the camera output working with the AVFoundation framework. Now I'm trying to hook up the Core Image framework, but I don't know how: Apple's documentation is confusing me, and searching for guides and tutorials online has turned up nothing. Thanks in advance for your help.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (strong, nonatomic) CIContext *context;

@end

@implementation ViewController

AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;

- (CIContext *)context
{
    if (!_context) {
        _context = [CIContext contextWithOptions:nil];
    }
    return _context;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)viewWillAppear:(BOOL)animated {
    session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];

    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.imageView.frame;

    [previewLayer setFrame:frame];
    [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    [rootLayer insertSublayer:previewLayer atIndex:0];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:stillImageOutput];
    [session startRunning];
}

@end

Solution

Here's something to get you started. It's an updated version of the code from the following link.
https://gist.github.com/eladb/9662102

The trick is to use an AVCaptureVideoDataOutputSampleBufferDelegate.
With this delegate, you can use imageWithCVPixelBuffer: to build a CIImage from the camera buffer.

Right now, though, I'm trying to figure out how to reduce the lag. I'll update as soon as I can.

Update: latency is now minimal, and imperceptible with some effects. Unfortunately, blur seems to be one of the slowest. You may want to look into vImage.
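If you do look into vImage, a box convolution from the Accelerate framework can approximate a Gaussian blur on the CPU much faster than CIGaussianBlur. The sketch below is only a starting point, not part of the original answer: the helper name `BlurPixelBuffer` and the kernel size are my own, and it assumes the video output has been configured to deliver kCVPixelFormatType_32BGRA pixel buffers.

```objc
#import <Accelerate/Accelerate.h>
#import <CoreVideo/CoreVideo.h>

// Hypothetical helper: applies a box blur in place to a BGRA pixel buffer.
static void BlurPixelBuffer(CVPixelBufferRef pixelBuffer, uint32_t kernelSize)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    vImage_Buffer buffer;
    buffer.data = CVPixelBufferGetBaseAddress(pixelBuffer);
    buffer.width = CVPixelBufferGetWidth(pixelBuffer);
    buffer.height = CVPixelBufferGetHeight(pixelBuffer);
    buffer.rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer);

    // Kernel dimensions must be odd.
    if (kernelSize % 2 == 0) kernelSize += 1;

    // In-place box convolution; vImage allocates a temporary buffer
    // internally when tempBuffer is NULL.
    vImageBoxConvolve_ARGB8888(&buffer, &buffer, NULL, 0, 0,
                               kernelSize, kernelSize, NULL,
                               kvImageEdgeExtend);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}
```

Running the box convolution two or three times in a row gets visually close to a Gaussian blur while staying much cheaper per pass.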

#import "ViewController.h"
#import <CoreImage/CoreImage.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) CIContext *coreImageContext;
@property (strong, nonatomic) AVCaptureSession *cameraSession;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoOutput;
@property (strong, nonatomic) UIView *blurCameraView;
@property (strong, nonatomic) CIFilter *filter;
@property BOOL cameraOpen;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.blurCameraView = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    [self.view addSubview:self.blurCameraView];

    // Set up the filter.
    self.filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [self.filter setDefaults];
    [self.filter setValue:@(3.0f) forKey:@"inputRadius"];

    [self setupCamera];
    [self openCamera];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void)setupCamera
{
    self.coreImageContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @(YES)}];

    // Session
    self.cameraSession = [[AVCaptureSession alloc] init];
    [self.cameraSession beginConfiguration];
    [self.cameraSession setSessionPreset:AVCaptureSessionPresetLow];
    [self.cameraSession commitConfiguration];

    // Input
    AVCaptureDevice *shootingCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *shootingDevice = [AVCaptureDeviceInput deviceInputWithDevice:shootingCamera error:NULL];
    if ([self.cameraSession canAddInput:shootingDevice]) {
        [self.cameraSession addInput:shootingDevice];
    }

    // Video output
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.videoOutput.alwaysDiscardsLateVideoFrames = YES;
    [self.videoOutput setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)];
    if ([self.cameraSession canAddOutput:self.videoOutput]) {
        [self.cameraSession addOutput:self.videoOutput];
    }

    if (self.videoOutput.connections.count > 0) {
        AVCaptureConnection *connection = self.videoOutput.connections[0];
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }

    self.cameraOpen = NO;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Get the sample buffer's Core Video image buffer for the media data.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Turn the buffer into an image we can manipulate.
    CIImage *result = [CIImage imageWithCVPixelBuffer:imageBuffer];

    // Filter
    [self.filter setValue:result forKey:@"inputImage"];

    // Render the filtered image and hand it to the layer on the main thread.
    CGImageRef blurredImage = [self.coreImageContext createCGImage:self.filter.outputImage fromRect:result.extent];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.blurCameraView.layer.contents = (__bridge id)blurredImage;
        CGImageRelease(blurredImage);
    });
}

- (void)openCamera {
    if (self.cameraOpen) {
        return;
    }

    self.blurCameraView.alpha = 0.0f;
    [self.cameraSession startRunning];
    [self.view layoutIfNeeded];

    [UIView animateWithDuration:3.0f animations:^{
        self.blurCameraView.alpha = 1.0f;
    }];

    self.cameraOpen = YES;
}

@end
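One further note on the lag (my own observation, not part of the original answer): the context above is created with kCIContextUseSoftwareRenderer set to YES, which forces CPU rendering of every frame. For per-frame filtering, a GPU-backed CIContext is usually much faster. Assuming iOS 9 or later, a Metal-backed context can be created like this:

```objc
#import <Metal/Metal.h>
#import <CoreImage/CoreImage.h>

// A Metal-backed CIContext renders filters on the GPU (iOS 9+).
id<MTLDevice> device = MTLCreateSystemDefaultDevice();
CIContext *gpuContext = [CIContext contextWithMTLDevice:device];
```

Swapping this in for the software-rendered context, with everything else unchanged, should noticeably reduce the per-frame cost of CIGaussianBlur.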
