
How to detect images using ARImageTrackingConfiguration

Swift version: 5.10

Paul Hudson    @twostraws   

ARKit can automatically scan for images in the world, which means you can attach overlays showing more detail or trigger other behaviors inside your app depending on what was found. There are two important drawbacks you should be aware of before you start:

  1. The images need to be visually distinct, which means they need some amount of detail and color variation. Xcode will warn you if your images aren’t good enough.
  2. ARKit can detect only a fixed number of images at a time, so if you want to detect many you either need to decide which to search for based on location (e.g. iBeacons in an art gallery), or cycle through your selection of pictures constantly. Apple recommends targeting 25 images or fewer.
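If you do need to cycle between sets of images, one approach is to keep each set in its own AR resource group and re-run the session with a different group when needed. Here’s a rough sketch – the group names and the `sceneView` outlet (which the Augmented Reality App template provides) are assumptions for illustration:

```swift
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    // Hypothetical resource group names – replace with your own
    let resourceGroups = ["Paintings", "Sculptures"]

    // Switch the session over to scanning a different resource group
    func startTracking(groupNamed groupName: String) {
        guard let images = ARReferenceImage.referenceImages(inGroupNamed: groupName, bundle: nil) else {
            print("Couldn't load resource group \(groupName).")
            return
        }

        let configuration = ARImageTrackingConfiguration()
        configuration.trackingImages = images

        // Remove anchors from the previous group so old overlays disappear
        sceneView.session.run(configuration, options: [.removeExistingAnchors])
    }
}
```

You might call `startTracking(groupNamed:)` from a timer or in response to a location change, depending on how you decide which pictures to look for.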

To get started detecting images, create a new iOS project using the Augmented Reality App template and SceneKit, then clean it up: open ViewController.swift, clear out everything in viewDidLoad() except the call to super.viewDidLoad() and sceneView.delegate = self, and finally delete the three empty methods at the end. You can also delete art.scnassets, which isn’t needed here.

The first step is to import the pictures you want ARKit to recognize. Remember, these should be digital copies of real-world pictures, so either scan the real-world objects or print your images. These pictures should not just be dragged into your asset catalog – we need to add them in a special way.

In your asset catalog, right-click on the blank space below AppIcon and choose New AR Resource Group. It will be named “AR Resources” by default, but I’d like you to change that to something that represents your images. For example, if you were looking for pictures in an art gallery you might call it Paintings. Now drag your images to where it says “No AR items” to add them to the resource group.

This process creates a set of images that ARKit is able to scan for, and although you can create as many resource groups as you want, only one can be active at a given time.

When you next press Cmd+B to build your project, Xcode will scan your ARKit images to make sure they are suitable for AR detection. At first you will almost certainly get warnings, because Xcode will report that the images need a “non-zero, positive width”. This is because adding PNG files to the ARKit catalog isn’t enough: Xcode needs to know an estimated size of the images in the real world, so it can detect them more accurately. So, select each of your images, then enter their size into the attributes inspector – the default unit is meters, but you’ll probably find it easier to change that to centimeters.

Once you’ve entered a valid size for each image, Xcode’s warnings should go away – if any warnings remain it means your images fail the detection criteria, so read Xcode’s suggestions and try again.

The next step is to tell ARKit that we want to scan for images, and in particular those images we just added. Open ViewController.swift and change viewWillAppear() to this:

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    let configuration = ARImageTrackingConfiguration()

    guard let trackingImages = ARReferenceImage.referenceImages(inGroupNamed: "YourGroupNameHere", bundle: nil) else {
        // failed to read them – crash immediately!
        fatalError("Couldn't load tracking images.")
    }

    configuration.trackingImages = trackingImages
    sceneView.session.run(configuration)
}

Note: Obviously you should change “YourGroupNameHere” to the name of your AR resource group.

That loads the AR resource group you created and asks ARKit to track its images. If for some reason you need to track more than one image at a time, you can set the maximumNumberOfTrackedImages property on your configuration before running the session – it defaults to 1, but modern iPhones can handle about 4.
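For example, to allow two images to be tracked simultaneously you would set the property before calling `run()` – the limit of 2 here is just an illustration:

```swift
let configuration = ARImageTrackingConfiguration()

// Allow ARKit to track up to two detected images at the same time
configuration.maximumNumberOfTrackedImages = 2
```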

Now that tracking is running, the final step is to make the app do something when your image is detected. Here’s some code for the ViewController class that places a translucent blue layer over each detected image:

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    // make sure this is an image anchor, otherwise bail out
    guard let imageAnchor = anchor as? ARImageAnchor else { return nil }

    // create a plane at the exact physical width and height of our reference image
    let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width, height: imageAnchor.referenceImage.physicalSize.height)

    // make the plane have a transparent blue color
    plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.5)

    // wrap the plane in a node and rotate it so it's facing us
    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2

    // now wrap that in another node and send it back
    let node = SCNNode()
    node.addChildNode(planeNode)
    return node
}

Wrapping our node in a parent is helpful so that ARKit can move, rotate, and scale the parent without affecting the child node inside.

Tip: You can read the name of the detected image by using imageAnchor.referenceImage.name – this will match whatever name it has in your asset catalog.
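So, if you wanted different behavior for different pictures, you could check the name inside `renderer(_:nodeFor:)` after the cast to `ARImageAnchor` – “Starry Night” here is a hypothetical image name:

```swift
// Inside renderer(_:nodeFor:), after the ARImageAnchor cast –
// react differently depending on which picture was detected
if imageAnchor.referenceImage.name == "Starry Night" {
    print("Found Starry Night!")
}
```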

That’s all the code you need, so if you run the app on a real device you should be able to try scanning your images. When it runs for the first time you’ll be asked for camera permissions, but after that you’ll find you can detect your images in any orientation, pick them up, move them around, and so on – ARKit is remarkably good at detecting all sorts of variations.


Available from iOS 12.0 – learn more in my book Practical iOS 12


About the Swift Knowledge Base

This is part of the Swift Knowledge Base, a free, searchable collection of solutions for common iOS questions.

