Swift version: 5.10
Core Image has a number of feature detectors built right in, including the ability to detect faces, eyes, mouths, smiles, and even blinking in pictures. When you ask it to look for faces in a picture, it returns an array of all the faces it found, each one containing face feature details such as eye position. Here's an example:
if let inputImage = UIImage(named: "taylor-swift"), let ciImage = CIImage(image: inputImage) {
    // Request high-accuracy detection; slower, but better results
    let options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options)!
    let faces = faceDetector.features(in: ciImage)

    if let face = faces.first as? CIFaceFeature {
        print("Found face at \(face.bounds)")

        if face.hasLeftEyePosition {
            print("Found left eye at \(face.leftEyePosition)")
        }

        if face.hasRightEyePosition {
            print("Found right eye at \(face.rightEyePosition)")
        }

        if face.hasMouthPosition {
            print("Found mouth at \(face.mouthPosition)")
        }
    }
}
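To get the smile and blink detection mentioned above, you need to pass extra options when you call features(in:options:). Here's a sketch of how that might look, assuming the same "taylor-swift" image; the detectExpressions(in:) wrapper is just a hypothetical name for this example:

import CoreImage
import UIKit

func detectExpressions(in inputImage: UIImage) {
    guard let ciImage = CIImage(image: inputImage) else { return }

    let options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options)!

    // Ask the detector to also evaluate smiles and closed eyes for each face
    let featureOptions: [String: Any] = [CIDetectorSmile: true, CIDetectorEyeBlink: true]
    let features = faceDetector.features(in: ciImage, options: featureOptions)

    for case let face as CIFaceFeature in features {
        print("Smiling: \(face.hasSmile)")
        print("Left eye closed: \(face.leftEyeClosed)")
        print("Right eye closed: \(face.rightEyeClosed)")
    }
}

Without those options, hasSmile, leftEyeClosed, and rightEyeClosed will always report false, because the detector skips that extra work by default.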
Available from iOS 5.0
This is part of the Swift Knowledge Base, a free, searchable collection of solutions for common iOS questions.