iOS: face detection on an image taken from the camera

I get the image from the image picker in this delegate method:
// Requires #import <MobileCoreServices/MobileCoreServices.h> for kUTTypeImage / kUTTypeMovie
-(void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = info[UIImagePickerControllerMediaType];

    [self dismissViewControllerAnimated:YES completion:nil];

    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = info[UIImagePickerControllerOriginalImage];

        //imgvprofileImage.image = image;
        //[self detectForFacesInUIImage:[UIImage imageNamed:@"image00.jpg"]];

        [self detectForFacesInUIImage:image];
    }
    else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie])
    {
        // Code here to support video if enabled
    }
}
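
For reference, the picker itself would be presented roughly like this; this is only a sketch under the assumption that the photo comes from the camera source (the presentation code is not shown in the question, and the method name showCamera is mine):

// Sketch (assumed setup, not from the question): present the picker with the camera as source.
// The view controller is assumed to conform to UIImagePickerControllerDelegate and
// UINavigationControllerDelegate.
- (void)showCamera
{
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        return; // no camera available (e.g. on the simulator)
    }
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.mediaTypes = @[(NSString *)kUTTypeImage];
    picker.delegate = self;
    [self presentViewController:picker animated:YES completion:nil];
}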

When I pass an image like this:

[self detectForFacesInUIImage:[UIImage imageNamed:@"image00.jpg"]];

the detection works fine and finds a face, but when I use the image returned from the camera it doesn't work:

[self detectForFacesInUIImage:image]

This is the function I use to detect faces:

-(void)detectForFacesInUIImage:(UIImage *)facePicture
{
    CIImage* image = [CIImage imageWithCGImage:facePicture.CGImage];

    CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy]];

    NSArray* features = [detector featuresInImage:image];

    if (features.count == 0) {
        NSLog(@"There is no faces in captured image ") ;
    }

    for(CIFaceFeature* faceObject in features)
    {
        CGRect modifiedFaceBounds = faceObject.bounds;
        // Flip Y: Core Image uses a bottom-left origin, UIKit a top-left origin
        modifiedFaceBounds.origin.y = facePicture.size.height - faceObject.bounds.size.height - faceObject.bounds.origin.y;

        [self addSubViewWithFrame:facePicture toRect:modifiedFaceBounds] ;
    }
}
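
The helper addSubViewWithFrame:toRect: is called above but never shown. A minimal sketch of what it could look like, assuming the face rect is in image coordinates and the photo is displayed in the imgvprofileImage image view (that outlet appears commented out earlier) with a scale-to-fill content mode:

// Hypothetical implementation of the helper used above; it draws a red box over each face.
// Assumes imgvprofileImage shows the full image scaled to the view's bounds.
// Layer properties may require #import <QuartzCore/QuartzCore.h> on older SDKs.
- (void)addSubViewWithFrame:(UIImage *)image toRect:(CGRect)faceRect
{
    CGFloat scaleX = imgvprofileImage.bounds.size.width  / image.size.width;
    CGFloat scaleY = imgvprofileImage.bounds.size.height / image.size.height;

    CGRect viewRect = CGRectMake(faceRect.origin.x * scaleX,
                                 faceRect.origin.y * scaleY,
                                 faceRect.size.width * scaleX,
                                 faceRect.size.height * scaleY);

    UIView *overlay = [[UIView alloc] initWithFrame:viewRect];
    overlay.backgroundColor = [UIColor clearColor];
    overlay.layer.borderColor = [UIColor redColor].CGColor;
    overlay.layer.borderWidth = 2.0;
    [imgvprofileImage addSubview:overlay];
}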

Solution

The problem is the image orientation. A CGImage carries no orientation metadata, so a photo taken with the camera (which UIKit typically stores as UIImageOrientationRight) appears rotated to the face detector unless you pass the EXIF orientation explicitly.

I don't remember where I got this from, but it works:

- (void) detectForFaces:(CGImageRef)facePicture orientation:(UIImageOrientation)orientation {


    CIImage* image = [CIImage imageWithCGImage:facePicture];

    CIContext *context = [CIContext contextWithOptions:nil];                 // Core Image context
    NSDictionary *opts = @{ CIDetectorAccuracy : CIDetectorAccuracyLow };    // low accuracy is faster
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:context
                                              options:opts];                 // face detector

    int exifOrientation = 1; // default to EXIF 1 ("up") if the orientation is unrecognized
    switch (orientation) {
        case UIImageOrientationUp:
            exifOrientation = 1;
            break;
        case UIImageOrientationDown:
            exifOrientation = 3;
            break;
        case UIImageOrientationLeft:
            exifOrientation = 8;
            break;
        case UIImageOrientationRight:
            exifOrientation = 6;
            break;
        case UIImageOrientationUpMirrored:
            exifOrientation = 2;
            break;
        case UIImageOrientationDownMirrored:
            exifOrientation = 4;
            break;
        case UIImageOrientationLeftMirrored:
            exifOrientation = 5;
            break;
        case UIImageOrientationRightMirrored:
            exifOrientation = 7;
            break;
        default:
            break;
    }


    opts = @{ CIDetectorImageOrientation : @(exifOrientation) };

    NSArray *features = [detector featuresInImage:image options:opts];

    if ([features count] > 0) {
        CIFaceFeature *face = [features lastObject];
        NSLog(@"%@",NSStringFromCGRect(face.bounds));
    }
}

How to use it:

UIImage *image = // some image here;
[self detectForFaces:image.CGImage orientation:image.imageOrientation];
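
If you would rather keep the original detectForFacesInUIImage: untouched, another common approach is to normalize the photo to the "up" orientation before detection. A minimal sketch (the helper name normalizedImage: is mine, not from the answer):

// Hypothetical helper: redraws the image so its pixel data matches UIImageOrientationUp,
// after which the CGImage-based detector works without an EXIF orientation hint.
- (UIImage *)normalizedImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp) {
        return image; // already upright
    }
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}

The original call site then stays the same:

[self detectForFacesInUIImage:[self normalizedImage:image]];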
