ios - How can I centre an image on a face?


I want to detect the face in an image and centre the image on it. How can I do that? I have this picture: [image]

and I want to display just the face, as in this photo:

[image]

This is quite a bit beyond the scope of a single question. Unless you're going to write your own face-detection algorithm yourself, I'd suggest Apple's native class CIDetector.

Example code:

    CIContext *context = [CIContext contextWithOptions:nil];                    // 1
    NSDictionary *opts = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };      // 2
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:context
                                              options:opts];                    // 3

    opts = @{ CIDetectorImageOrientation :
              [[myImage properties] valueForKey:kCGImagePropertyOrientation] }; // 4
    NSArray *features = [detector featuresInImage:myImage options:opts];        // 5

1 Creates a context; in this example, a context for iOS. (You can use any of the context-creation functions described in Processing Images.) You also have the option of supplying nil instead of a context when you create the detector.

2 Creates an options dictionary to specify the accuracy of the detector. You can specify low or high accuracy. Low accuracy (CIDetectorAccuracyLow) is fast; high accuracy, shown in this example, is thorough but slower.

3 Creates a detector for faces. The only type of detector you can create is one for human faces.

4 Sets up an options dictionary for finding faces. It's important to let Core Image know the image orientation so the detector knows where it can find upright faces. Most of the time you'll read the image orientation from the image itself, and then provide that value to the options dictionary.

5 Uses the detector to find features in an image. The image you provide must be a CIImage object. Core Image returns an array of CIFeature objects, each of which represents a face in the image.
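Note that step 5 needs a CIImage. As a minimal sketch, assuming you start from a UIImage (the variable name photo is hypothetical), you can create the CIImage the detector expects like this:

```objc
// Assumption: 'photo' is a UIImage you already have (e.g. from your photo library).
// CIImage has a direct initializer that wraps the underlying CGImage.
CIImage *myImage = [CIImage imageWithCGImage:photo.CGImage];
```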

Face features include:

- left and right eye positions
- mouth position
- tracking ID and tracking frame count, which Core Image uses to follow a face in a video segment

After you get the array of face features from the CIDetector object, you can loop through the array to examine the bounds of each face and each feature in the faces:

    for (CIFaceFeature *f in features)
    {
        NSLog(@"%@", NSStringFromCGRect(f.bounds));

        if (f.hasLeftEyePosition)
            NSLog(@"Left eye %g %g", f.leftEyePosition.x, f.leftEyePosition.y);

        if (f.hasRightEyePosition)
            NSLog(@"Right eye %g %g", f.rightEyePosition.x, f.rightEyePosition.y);

        if (f.hasMouthPosition)
            NSLog(@"Mouth %g %g", f.mouthPosition.x, f.mouthPosition.y);
    }

Once you've detected the face, you can use standard UIKit/Core Image classes to zoom in and centre it.
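As a rough sketch of that last step (assuming a UIImage named photo with scale 1 and "up" orientation, and a UIImageView named imageView; note that Core Image reports bounds with a bottom-left origin, so the Y coordinate must be flipped before cropping with Core Graphics):

```objc
CIFaceFeature *face = [features firstObject];
if (face) {
    // Core Image uses a bottom-left origin; flip Y for Core Graphics' top-left origin.
    CGRect faceRect = face.bounds;
    faceRect.origin.y = photo.size.height - faceRect.origin.y - faceRect.size.height;

    // Crop the original image down to the face rectangle.
    CGImageRef cropped = CGImageCreateWithImageInRect(photo.CGImage, faceRect);
    UIImage *faceImage = [UIImage imageWithCGImage:cropped];
    CGImageRelease(cropped);

    // Show the cropped face centred in the image view.
    imageView.contentMode = UIViewContentModeCenter;
    imageView.image = faceImage;
}
```

If you want the face centred within the full photo rather than cropped out, you can instead compute an offset from the face rectangle's centre to the image centre and apply it as a translation to the view or scroll view showing the image.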
