Swift version: 5.6
Core Image has a number of feature detectors built right in, including the ability to detect faces, eyes, mouths, smiles and even blinking in pictures. When you ask it to look for faces in a picture, it will return you an array of all the faces it found, with each one containing face feature details such as eye position. Here's an example:
import CoreImage
import UIKit

if let inputImage = UIImage(named: "taylor-swift") {
    let ciImage = CIImage(cgImage: inputImage.cgImage!)

    // High accuracy is slower, but better suited to still images
    let options = [CIDetectorAccuracy: CIDetectorAccuracyHigh]
    let faceDetector = CIDetector(ofType: CIDetectorTypeFace, context: nil, options: options)!

    let faces = faceDetector.features(in: ciImage)

    if let face = faces.first as? CIFaceFeature {
        print("Found face at \(face.bounds)")

        if face.hasLeftEyePosition {
            print("Found left eye at \(face.leftEyePosition)")
        }

        if face.hasRightEyePosition {
            print("Found right eye at \(face.rightEyePosition)")
        }

        if face.hasMouthPosition {
            print("Found mouth at \(face.mouthPosition)")
        }
    }
}
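That example covers eyes and mouths, but smile and blink detection need to be requested explicitly when you ask for features. Here's a minimal sketch of how that might look, reusing the faceDetector and ciImage constants from above: the CIDetectorSmile and CIDetectorEyeBlink option keys and the hasSmile, leftEyeClosed, and rightEyeClosed properties all belong to Core Image, but the surrounding setup is assumed from the earlier example.

let featureOptions: [String: Any] = [CIDetectorSmile: true, CIDetectorEyeBlink: true]
let features = faceDetector.features(in: ciImage, options: featureOptions)

for case let face as CIFaceFeature in features {
    // These properties are only meaningful if the matching option
    // was passed to features(in:options:)
    print("Smiling: \(face.hasSmile)")
    print("Left eye closed: \(face.leftEyeClosed)")
    print("Right eye closed: \(face.rightEyeClosed)")
}

One thing to watch for: Core Image's coordinate system has its origin in the bottom-left corner, so if you want to draw face.bounds over the image in a UIKit view you'll need to flip the Y axis first.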
Available from iOS 5.0
This is part of the Swift Knowledge Base, a free, searchable collection of solutions for common iOS questions.