Swift version: 5.6
Core Image has a number of feature detectors built right in, including the ability to detect faces, eyes, mouths, smiles, and even blinking in pictures. When you ask it to look for faces in a picture, it returns an array of all the faces it found, each one containing feature details such as eye position. Here's an example:
import CoreImage
import UIKit

if let inputImage = UIImage(named: "taylor-swift") {
    // Core Image works on CIImage, so convert the UIImage first;
    // cgImage can be nil for CIImage-backed images, hence the unwrap
    let ciImage = CIImage(cgImage: inputImage.cgImage!)

    // Trade speed for accuracy; use CIDetectorAccuracyLow for real-time work
    let options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options)!

    // Each detected face comes back as a CIFaceFeature
    let faces = faceDetector.features(in: ciImage)

    if let face = faces.first as? CIFaceFeature {
        print("Found face at \(face.bounds)")

        if face.hasLeftEyePosition {
            print("Found left eye at \(face.leftEyePosition)")
        }

        if face.hasRightEyePosition {
            print("Found right eye at \(face.rightEyePosition)")
        }

        if face.hasMouthPosition {
            print("Found mouth at \(face.mouthPosition)")
        }
    }
}
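One thing to watch: Core Image measures from the bottom-left corner rather than the top-left like UIKit, so face.bounds and the eye and mouth positions won't line up with a UIView until you flip them. Here's a minimal sketch of that conversion, assuming you're overlaying the untransformed image at its native size; the helper name is just for illustration:

import UIKit

// Flip a Core Image rect (origin bottom-left) into UIKit's
// coordinate space (origin top-left). imageHeight is assumed to be
// the height of the image the detector ran on.
func uiKitRect(from ciRect: CGRect, imageHeight: CGFloat) -> CGRect {
    CGRect(x: ciRect.origin.x,
           y: imageHeight - ciRect.origin.y - ciRect.height,
           width: ciRect.width,
           height: ciRect.height)
}

So you might call uiKitRect(from: face.bounds, imageHeight: inputImage.size.height); if your image has a scale other than 1, remember to account for that too.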
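The example above only reads positions; smiles and blinking are opt-in, requested by passing extra options to features(in:options:). Here's a sketch of the same detector asked to evaluate both, reusing the placeholder image name from above:

import CoreImage
import UIKit

if let inputImage = UIImage(named: "taylor-swift") {
    let ciImage = CIImage(cgImage: inputImage.cgImage!)
    let options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options)!

    // Ask Core Image to also evaluate smiles and closed eyes
    let featureOptions: [String: Any] = [CIDetectorSmile: true, CIDetectorEyeBlink: true]
    let faces = faceDetector.features(in: ciImage, options: featureOptions)

    if let face = faces.first as? CIFaceFeature {
        print("Smiling? \(face.hasSmile)")
        print("Left eye closed? \(face.leftEyeClosed)")
        print("Right eye closed? \(face.rightEyeClosed)")
    }
}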
Available from iOS 5.0