Monday 25 May 2015

Face detection with OpenCV in iOS

Hello Readers!
If you want to set up face detection in iOS (iPhone), follow the steps given below.
First of all we need to add the OpenCV framework to our project. There are several ways to do that, but the simplest is as follows:
Build the framework via CMake
First, install Homebrew:
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"
Then install CMake with Homebrew:
brew install cmake

After installing CMake, run the following commands:
=> git clone https://github.com/Itseez/opencv.git
=> sudo ln -s /Applications/Xcode.app/Contents/Developer Developer
=> python opencv/platforms/ios/build_framework.py ios

After a few minutes, in the directory the terminal was pointed at, you will find ios/opencv2.framework.
You will also need the cloned OpenCV master sources (from the git clone step above) for the next part. Repository URL:
https://github.com/Itseez/opencv

In the downloaded folder you will find the “CvEffects” Xcode project, which needs to be included in your project.
Now create a new Xcode project, name it “Sample_Face_Detection”, and add opencv2.framework to it.
In Main.storyboard, on the first ViewController, add one UIImageView and two buttons named “start capture” and “stop capture”, laid out as you see fit.

Now add the files attached to this blog to your Xcode project:
1. glasses.png
2. mustache.png
3. lbpcascade_frontalface.xml
4. haarcascade_mcs_eyepair_big.xml
5. haarcascade_mcs_mouth.xml



It's time to add the code for the ViewController.
In your ViewController, follow these steps:
1. Import these headers in ViewController.h (because they pull in C++, the implementation file must be Objective-C++, i.e. have a .mm extension):
  #import "ios.h"
  #import "CvEffects/RetroFilter.hpp"
  #import "CvEffects/FaceAnimator.hpp"


2. Adopt the CvVideoCameraDelegate protocol:
  @interface ViewController : UIViewController<CvVideoCameraDelegate>


3. Declare the following instance variables in the controller (isCapturing is used later by the start/stop actions):
  CvVideoCamera* videoCamera;
  FaceAnimator::Parameters parameters;
  cv::Ptr<FaceAnimator> faceAnimator;
  BOOL isCapturing;


4. Add the following properties:
  @property (nonatomic, strong) CvVideoCamera* videoCamera;
  @property (nonatomic, strong) IBOutlet UIImageView *imageView;
Connect this imageView outlet to the image view in your storyboard.

5. Add the following actions:
  -(IBAction)startCaptureButtonPressed:(id)sender;
  -(IBAction)stopCaptureButtonPressed:(id)sender;
Connect these actions to the “start capture” and “stop capture” buttons in the storyboard.
In the ViewController's .m (.mm) file, add the following code in viewDidLoad:
  self.videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
  self.videoCamera.delegate = self;
  self.videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionFront;
  self.videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset352x288;
  self.videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
  self.videoCamera.defaultFPS = 30;
The code above initializes CvVideoCamera with the capture settings we want.
  // Load overlay images
  UIImage* resImage = [UIImage imageNamed:@"glasses.png"];
  UIImageToMat(resImage, parameters.glasses, true);
  cvtColor(parameters.glasses, parameters.glasses, CV_BGRA2RGBA);
  resImage = [UIImage imageNamed:@"mustache.png"];
  UIImageToMat(resImage, parameters.mustache, true);
  cvtColor(parameters.mustache, parameters.mustache, CV_BGRA2RGBA);

  // Load cascade classifiers
  NSString* filename = [[NSBundle mainBundle]
                        pathForResource:@"lbpcascade_frontalface"
                        ofType:@"xml"];
  parameters.faceCascade.load([filename UTF8String]);
  filename = [[NSBundle mainBundle]
              pathForResource:@"haarcascade_mcs_eyepair_big"
              ofType:@"xml"];
  parameters.eyesCascade.load([filename UTF8String]);
  filename = [[NSBundle mainBundle]
              pathForResource:@"haarcascade_mcs_mouth"
              ofType:@"xml"];
  parameters.mouthCascade.load([filename UTF8String]);
These XML files are cascade classifiers used to detect the different parts of your face: lbpcascade_frontalface.xml detects the frontal face, haarcascade_mcs_eyepair_big.xml the eye pair, and haarcascade_mcs_mouth.xml the mouth.
Now add the following methods to your file.
To start capture:
  -(IBAction)startCaptureButtonPressed:(id)sender
  {
      [videoCamera start];
      isCapturing = YES;
      faceAnimator = new FaceAnimator(parameters);
  }
To stop capture:
  -(IBAction)stopCaptureButtonPressed:(id)sender
  {
      [videoCamera stop];
      isCapturing = NO;
  }
CvVideoCameraDelegate method, called for every captured frame while the camera is running:
  - (void)processImage:(cv::Mat&)image
  {
      faceAnimator->detectAndAnimateFaces(image);
  }
Now run the app: you will see glasses drawn over your eye pair and a mustache over your upper lip, because the cascade classifiers above detect your eyes and mouth in every frame.


For more such blogs you can visit http://findnerd.com/NerdDigest
