Project 13 was a trivial application if you look solely at the amount of code we had to write, but it’s remarkable behind the scenes thanks to the power of Core Image. Modern iPhones have extraordinary CPU and GPU hardware, and Core Image uses them both to the full so that advanced image transformations can happen in real time – if you didn’t try project 13 on a real device, you really ought to, if only to marvel at how incredibly fast it is!
You also met UIKit’s animation framework for the first time, which is a wrapper on top of another of Apple’s cornerstone frameworks: Core Animation. This is a particularly useful framework to get familiar with because of its simplicity: you tell it what you want (“move this view to position X/Y”) then tell it a duration (“move it over three seconds”), and Core Animation figures out what each individual frame looks like.
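That “describe the end state, let the system fill in the frames” pattern can be sketched like this – the view controller and the red box here are illustrative, not from the project:

```swift
import UIKit

class ExampleViewController: UIViewController {
    let box = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))

    override func viewDidLoad() {
        super.viewDidLoad()
        box.backgroundColor = .red
        view.addSubview(box)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        // Tell UIKit the destination ("move this view to X/Y") and the
        // duration ("over three seconds"); Core Animation interpolates
        // every intermediate frame for us.
        UIView.animate(withDuration: 3) {
            self.box.transform = CGAffineTransform(translationX: 100, y: 100)
        }
    }
}
```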
Here are just some of the other things you’ve now learned:
- UISlider
- CGAffineTransform
- The UIImageWriteToSavedPhotosAlbum() function
- CIContext, and creating then applying Core Image filters using CIFilter
- SKTexture
- SKCropNode
- SKAction, including moveBy(x:y:), wait(forDuration:), sequence(), and even how to run custom closures using run(block:)
- The asyncAfter() method of GCD, which causes code to be run after a delay
- UIImagePickerController, to select pictures from the user’s photo library. We’ll be using this again – it’s a really helpful component to have in your arsenal!
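As a refresher, the Core Image pieces from that list fit together roughly like this – the sepia filter and intensity value are just an example, not the only filter the project used:

```swift
import CoreImage
import UIKit

// A minimal sketch of the Core Image pipeline: wrap a UIImage in a
// CIImage, run it through a CIFilter, then render the result with a
// CIContext. Creating a CIContext is expensive, so in real code you
// should make one and reuse it rather than building one per call.
func applySepia(to input: UIImage, intensity: Float) -> UIImage? {
    let context = CIContext()

    guard let filter = CIFilter(name: "CISepiaTone"),
          let ciInput = CIImage(image: input) else { return nil }

    filter.setValue(ciInput, forKey: kCIInputImageKey)
    filter.setValue(intensity, forKey: kCIInputIntensityKey)

    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent)
    else { return nil }

    return UIImage(cgImage: cgImage)
}
```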