Project 13 was a trivial application if you look solely at the amount of code we had to write, but it’s remarkable behind the scenes thanks to the power of Core Image. Modern iPhones have extraordinary CPU and GPU hardware, and Core Image uses them both to the full so that advanced image transformations can happen in real time – if you didn’t try project 13 on a real device, you really ought to, if only to marvel at how incredibly fast it is!
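As a reminder of how little code that power requires, here is a minimal sketch of a Core Image pipeline. The function name and the choice of "CISepiaTone" are illustrative stand-ins, not something fixed by project 13:

```swift
import CoreImage
import UIKit

// A minimal sketch, assuming you already have a UIImage to process.
// "CISepiaTone" stands in for whichever filter you want to apply.
func applySepia(to inputImage: UIImage) -> UIImage? {
    let context = CIContext()
    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }

    // Feed the source image in, then configure the filter's parameters.
    filter.setValue(CIImage(image: inputImage), forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    // Render the result back into a UIImage we can show on screen.
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent)
    else { return nil }

    return UIImage(cgImage: cgImage)
}
```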
You also met UIKit’s animation framework for the first time, which is a wrapper on top of another of Apple’s cornerstone frameworks: Core Animation. This is a particularly useful framework to get familiar with because of its simplicity: you tell it what you want (“move this view to position X/Y”) then tell it a duration (“move it over three seconds”), and Core Animation figures out what each individual frame looks like.
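That “state the destination and the duration” pattern might look like this in code; the function and the specific transform are hypothetical examples, not taken from the project:

```swift
import UIKit

// A minimal sketch of the declarative animation pattern: you describe the
// end state and the duration, and Core Animation computes every frame.
func slideAcross(_ view: UIView) {
    UIView.animate(withDuration: 3) {
        // Move the view 100 points right and 200 points down over 3 seconds.
        view.transform = CGAffineTransform(translationX: 100, y: 200)
    }
}
```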
Here are just some of the other things you’ve now learned:
- UISlider.
- CGAffineTransform.
- The UIImageWriteToSavedPhotosAlbum() function.
- CIContext, and creating then applying Core Image filters using CIFilter.
- SKTexture.
- SKCropNode.
- SKActions, including moveBy(x:y:), wait(forDuration:), sequence(), and even how to run custom closures using run(block:).
- The asyncAfter() method of GCD, which causes code to be run after a delay.
- UIImagePickerController, to select pictures from the user’s photo library. We’ll be using this again – it’s a really helpful component to have in your arsenal!
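Several of those SKAction methods compose naturally into one sequence. Here is a hedged sketch of how they might fit together; the function name, the node, and the specific movements are made up for illustration:

```swift
import SpriteKit

// A sketch combining moveBy(x:y:), wait(forDuration:), run(block:), and
// sequence(), assuming `penguin` is an SKSpriteNode already in your scene.
func popUp(_ penguin: SKSpriteNode) {
    let rise = SKAction.moveBy(x: 0, y: 80, duration: 0.2)
    let pause = SKAction.wait(forDuration: 1)
    let fall = rise.reversed()            // the same move, played backwards
    let announce = SKAction.run {
        print("Penguin hidden again")     // custom closure inside the sequence
    }

    // Actions in a sequence run one after another, in order.
    penguin.run(SKAction.sequence([rise, pause, fall, announce]))
}
```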