Microsoft Cognitive Services is a set of APIs, SDKs and services that help developers make their applications more intelligent, engaging and discoverable. You can do a lot of cool things with the platform, and today we are going to look at one of them.
We are going to integrate the Microsoft Cognitive Services Face API into an iOS application. We will detect the faces in a given image along with facial attributes such as age, gender and emotions (smile, anger and so on). In other words, you can capture a photo from your iPhone app and find out how many people are in it, their age and gender, and what they are feeling. Sounds exciting? Let's dive in.
This tutorial assumes a basic understanding of iOS application development.
Step 1. Create a single view application.
Step 2. Install CocoaPods if you don't already have it (sudo gem install cocoapods). It's a dependency manager that will help us install third-party SDKs and libraries in our application.
Step 3. Open Terminal and navigate into the project folder.
Step 4. Run pod init. This creates a Podfile in your project folder. Open the Podfile in Xcode (or any text editor).
Step 5. Edit the Podfile with the contents below, then save and close it.
# Uncomment the next line to define a global platform for your project
platform :ios, '9.0'
target 'MyCoolProject' do
  # Comment the next line if you're not using Swift and don't want to use dynamic frameworks
  use_frameworks!

  # Pods for MyCoolProject
  pod 'ProjectOxfordFace', :git => 'https://github.com/Microsoft/Cognitive-Face-iOS'
end
Step 6. Run pod install from Terminal. This installs the SDK and generates an .xcworkspace file in your project folder; open that workspace (not the .xcodeproj) from now on.
Step 7. Integrate an image picker so you can capture an image from the camera. (It's pretty straightforward and you will find many tutorials for this; a minimal sketch is shown below.)
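If you want a starting point, here is a minimal sketch using UIImagePickerController. The openCameraClicked: action name is hypothetical; selectedImage is the UIImage instance variable that the analyse code in Step 9 reads (declared in the snippet after Step 8). Note that iOS will only grant camera access if your Info.plist contains an NSCameraUsageDescription entry.

- (IBAction)openCameraClicked:(id)sender {
    // Bail out on devices (or the simulator) without a camera.
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        NSLog(@"Camera not available on this device");
        return;
    }
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;
    picker.delegate = self; // ViewController adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate
    [self presentViewController:picker animated:YES completion:nil];
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // Keep the captured photo around for the analyse step.
    // Camera photos carry orientation metadata; the UIImage+FixOrientation category imported
    // in Step 8 is there to normalise the image if you see rotated or mis-cropped results.
    selectedImage = info[UIImagePickerControllerOriginalImage];
    [picker dismissViewControllerAnimated:YES completion:nil];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [picker dismissViewControllerAnimated:YES completion:nil];
}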
Step 8. In your ViewController, import the following.
#import <Photos/Photos.h>
#import "UIImage+FixOrientation.h"
#import "UIImage+Crop.h"
#import "ImageHelper.h"
#import <ProjectOxfordFace/MPOFaceServiceClient.h>
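The action code in Step 9 also relies on a few things that are not part of the SDK: the ProjectOxfordFaceEndpoint and ProjectOxfordFaceSubscriptionKey constants (both come from the Face resource you create in the Azure portal) and the selectedImage and detectedFaces instance variables. If your project does not already define them, a minimal setup could look like this; the endpoint below is a placeholder, so substitute your own region and key.

// Hypothetical placeholders -- use the endpoint and subscription key from your own Azure Face resource,
// and skip these definitions if your project already provides them elsewhere.
static NSString *const ProjectOxfordFaceEndpoint = @"https://westus.api.cognitive.microsoft.com/face/v1.0";
static NSString *const ProjectOxfordFaceSubscriptionKey = @"YOUR_SUBSCRIPTION_KEY";

@interface ViewController () <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
{
    UIImage *selectedImage;        // set by the image picker in Step 7
    NSMutableArray *detectedFaces; // filled by the analyse action in Step 9
}
@end

Also initialise detectedFaces somewhere before the first analysis, for example in viewDidLoad with detectedFaces = [NSMutableArray array];, so that removeAllObjects has an array to operate on.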
Step 9. Add an 'Analyse' button to the UI and wire it to the action code below.
- (IBAction)analyseClicked:(id)sender {
    // Create a Face API client with your endpoint and subscription key.
    MPOFaceServiceClient *client = [[MPOFaceServiceClient alloc] initWithEndpointAndSubscriptionKey:ProjectOxfordFaceEndpoint key:ProjectOxfordFaceSubscriptionKey];
    // The API expects raw image data, so encode the picked photo as JPEG.
    NSData *data = UIImageJPEGRepresentation(selectedImage, 0.8);
    // Ask the service to detect faces and return the attributes we care about.
    [client detectWithData:data
              returnFaceId:YES
       returnFaceLandmarks:YES
      returnFaceAttributes:@[@(MPOFaceAttributeTypeGender), @(MPOFaceAttributeTypeAge), @(MPOFaceAttributeTypeEmotion), @(MPOFaceAttributeTypeSmile)]
           completionBlock:^(NSArray *collection, NSError *error) {
        if (error) {
            NSLog(@"Detection failed with error %@", [error localizedDescription]);
            return;
        }
        [self->detectedFaces removeAllObjects]; // detectedFaces is an NSMutableArray
        for (MPOFace *face in collection) {
            // Crop the face out of the original photo using the rectangle returned by the API.
            UIImage *croppedImage = [self->selectedImage crop:CGRectMake(face.faceRectangle.left.floatValue, face.faceRectangle.top.floatValue, face.faceRectangle.width.floatValue, face.faceRectangle.height.floatValue)];
            MPODetectionFaceObject *obj = [[MPODetectionFaceObject alloc] init];
            obj.croppedFaceImage = croppedImage;
            obj.genderText = [NSString stringWithFormat:@"Gender: %@", face.attributes.gender];
            obj.ageText = [NSString stringWithFormat:@"Age: %@", face.attributes.age.stringValue];
            NSLog(@"Gender : %@", face.attributes.gender);
            NSLog(@"Age : %d", face.attributes.age.intValue);
            // Smile and the emotion scores are confidences between 0 and 1, so log them as floats.
            NSLog(@"Smile : %.2f", face.attributes.smile.floatValue);
            NSLog(@"Happiness : %.2f", face.attributes.emotion.happiness.floatValue);
            NSLog(@"Sadness : %.2f", face.attributes.emotion.sadness.floatValue);
            [self->detectedFaces addObject:obj];
        }
    }];
}
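A note on MPODetectionFaceObject: it is just a plain model object that holds the cropped face and its attribute strings. If it is not available in your setup (I believe it comes from Microsoft's sample app rather than from the pod itself), a minimal stand-in like this is enough, since the action above only stores an image and two strings on it.

// Minimal stand-in for MPODetectionFaceObject -- only the properties the analyse action uses.
@interface MPODetectionFaceObject : NSObject
@property (nonatomic, strong) UIImage *croppedFaceImage;
@property (nonatomic, copy) NSString *genderText;
@property (nonatomic, copy) NSString *ageText;
@end

@implementation MPODetectionFaceObject
@end

Once the completion block finishes, detectedFaces holds one object per face found in the photo, so you can reload a table or collection view to show the cropped faces alongside their age and gender text (hop back to the main queue first if the callback does not arrive on the main thread).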