Project 13 was a trivial application if you look solely at the amount of code we had to write, but it’s remarkable behind the scenes thanks to the power of Core Image. Modern iPhones have extraordinary CPU and GPU hardware, and Core Image uses them both to the full so that advanced image transformations can happen in real time – if you didn’t try project 13 on a real device, you really ought to, if only to marvel at how incredibly fast it is!
You also met UIKit’s animation framework for the first time, which is a wrapper on top of another of Apple’s cornerstone frameworks: Core Animation. This is a particularly useful framework to get familiar with because of its simplicity: you tell it what you want (“move this view to position X/Y”) then tell it a duration (“move it over three seconds”), and Core Animation figures out what each individual frame looks like.
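That “tell it what you want, then tell it a duration” pattern can be sketched in a couple of lines. This is an illustrative fragment, not code from the project – `someView` is a stand-in for whatever view you want to move:

```swift
import UIKit

// someView is a hypothetical view we want to animate.
let someView = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 100))

// Describe the end state and the duration; Core Animation
// calculates every intermediate frame for us.
UIView.animate(withDuration: 3) {
    someView.center = CGPoint(x: 200, y: 400)
}
```

Everything you change inside the closure is animated from its current value to the new one over the duration you specify.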
Here are just some of the other things you’ve now learned:
- Creating then applying Core Image filters using CIContext.
- How to run custom closures using sequence().
- The asyncAfter() method of GCD, which causes code to be run after a delay.
- Using UIImagePickerController to select pictures from the user’s photo library. We’ll be using this again – it’s a really helpful component to have in your arsenal!
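As a refresher, here is a minimal sketch of the Core Image pipeline: wrap an input image in a CIImage, configure a filter, then render the result through a CIContext (which is expensive to create, so you should make it once and reuse it). The sepia filter and `inputImage` here are illustrative choices, not the only ones the project used:

```swift
import CoreImage
import UIKit

// Create the context once and reuse it – it is expensive to make.
let context = CIContext()
let filter = CIFilter(name: "CISepiaTone")!

// inputImage is a hypothetical UIImage chosen elsewhere (e.g. from the photo library).
let beginImage = CIImage(image: inputImage)
filter.setValue(beginImage, forKey: kCIInputImageKey)
filter.setValue(0.8, forKey: kCIInputIntensityKey)

// Render the filter's output back into a UIImage we can display.
if let output = filter.outputImage,
   let cgImage = context.createCGImage(output, from: output.extent) {
    let processedImage = UIImage(cgImage: cgImage)
    // show processedImage in an image view, save it, etc.
}
```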
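Swift’s global `sequence(first:next:)` function is one example of running a custom closure to generate values: it calls your closure repeatedly, feeding each result back in, to build a lazy sequence. A small illustrative example:

```swift
// Build a lazy sequence of powers of two by repeatedly
// applying a closure to the previous value.
let powers = sequence(first: 1) { $0 * 2 }

// Take the first five values from the (infinite) sequence.
let firstFive = Array(powers.prefix(5))
print(firstFive) // [1, 2, 4, 8, 16]
```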
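The asyncAfter() delay looks like this in practice – a minimal sketch, assuming you want the work back on the main queue (for example, to update the UI):

```swift
import Dispatch

// Schedule a closure to run on the main queue two seconds from now.
DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
    print("This runs after a two-second delay")
}
```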
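And the UIImagePickerController flow, reduced to its essentials: present the picker, then receive the chosen picture through its delegate. This is a compressed sketch of the pattern rather than the project’s exact code:

```swift
import UIKit

class ViewController: UIViewController,
        UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    @objc func importPicture() {
        let picker = UIImagePickerController()
        picker.allowsEditing = true
        picker.delegate = self
        present(picker, animated: true)
    }

    // Called when the user picks (and optionally crops) an image.
    func imagePickerController(_ picker: UIImagePickerController,
            didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
        guard let image = info[.editedImage] as? UIImage else { return }
        dismiss(animated: true)
        // use image here – display it, filter it, save it…
    }
}
```

Note that the class must conform to both UIImagePickerControllerDelegate and UINavigationControllerDelegate for the `delegate` assignment to compile.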