Swift version: 5.10
Although it’s easy enough to play sound effects and music using AVFoundation, it’s actually one of the most powerful frameworks in iOS, and it can easily add some fun and interesting effects to your apps and games.
For example, one of the most powerful classes in AVFoundation is called AVAudioEngine. Its job is to connect audio processing objects in a chain so that the output of one object is the input for another. You can feed audio into the start, apply processing in the middle, then play the audio as the output, giving you real-time audio manipulation without much effort.
To try this out, we’ll create a simple test that loads an MP3 file and starts playing it, but adjusts the playback speed and pitch of the audio every time the user taps the screen.
First you need a property to store your AVAudioEngine object, along with properties that store an AVAudioUnitTimePitch and an AVAudioUnitVarispeed – the processors that transform the speed and pitch of audio:
// AVAudioEngine and both audio units live in AVFoundation
import AVFoundation

let engine = AVAudioEngine()
let speedControl = AVAudioUnitVarispeed()
let pitchControl = AVAudioUnitTimePitch()
Next you need a method that will play a URL. This takes six steps:
1. Create an AVAudioFile that reads from whatever file URL gets passed into the method.
2. Create an AVAudioPlayerNode that will read in your AVAudioFile. This is like a more advanced AVAudioPlayer, and we can use it as part of our engine connections.
3. Attach the player node, the speed control, and the pitch control to the engine.
4. Connect the parts so that the output from one is the input for another: the player feeds the speed control, the speed control feeds the pitch control, and the pitch control feeds the engine’s main mixer.
5. Schedule the player to play its file from the beginning.
6. Start the engine and the player.
Here’s the code for that, with comments matching the numbers above:
func play(_ url: URL) throws {
// 1: load the file
let file = try AVAudioFile(forReading: url)
// 2: create the audio player
let audioPlayer = AVAudioPlayerNode()
// 3: connect the components to our playback engine
engine.attach(audioPlayer)
engine.attach(pitchControl)
engine.attach(speedControl)
// 4: arrange the parts so that output from one is input to another
engine.connect(audioPlayer, to: speedControl, format: nil)
engine.connect(speedControl, to: pitchControl, format: nil)
engine.connect(pitchControl, to: engine.mainMixerNode, format: nil)
// 5: prepare the player to play its file from the beginning
audioPlayer.scheduleFile(file, at: nil)
// 6: start the engine and player
try engine.start()
audioPlayer.play()
}
Now you can call that method using a path to any audio file in your app bundle.
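For example, assuming your bundle contains a file called example.mp3 – the filename here is just a placeholder – you could start playback like this. Remember that play(_:) is marked throws, so the call needs try:
do {
    // locate the audio file in the app bundle (the name is only an example)
    if let url = Bundle.main.url(forResource: "example", withExtension: "mp3") {
        // hand it over to the method we just wrote
        try play(url)
    }
} catch {
    print("Playback failed: \(error)")
}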
As for changing the pitch and rate, we made pitchControl and speedControl properties so we can adjust them at will. For example:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
pitchControl.pitch += 50
speedControl.rate += 0.1
}
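Both of those properties have documented limits – pitch is measured in cents and accepts values from -2400 up to 2400, while rate supports 0.25 up to 4.0 – so repeated taps will eventually push them out of range. Here’s a rough sketch of the same tap handler with the values clamped to those maximums:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    // pitch is measured in cents; 100 cents is one semitone, 2400 is the maximum
    pitchControl.pitch = min(pitchControl.pitch + 50, 2400)

    // rate is a playback speed multiplier; 4.0 is the highest supported value
    speedControl.rate = min(speedControl.rate + 0.1, 4.0)
}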
As you’ll see, this processing happens incredibly quickly – it’s all done in real time – so you can create fun effects for apps and games in just a few minutes of work!
Available from iOS 8.0
This is part of the Swift Knowledge Base, a free, searchable collection of solutions for common iOS questions.