@Published property wrapper

Forums > SwiftUI

I have an observable object with an @Published property. From my main view I tap a button that presents a sheet to record some audio. Since the published property is changed while the sheet is presented (a new recording is added to it), shouldn't the main view be updated with the new info? When I dismiss the sheet, the main view doesn't display the new recording.

   

Seems like a duplicate of your earlier posting over in the Swift forum. Some code samples would be great and would help us help you. :-)

   

Yes, I wasn't sure where to ask my question, so I asked in both.

What I'm trying to understand is this: I'm in the app with ListOfRecordingsView showing, and I tap the MicButtonIcon to bring up the sheet for recording a new voice recording. I then tap the button to start and stop the recording. The published property recordingsList from RecordAndListenViewModel is updated with the new recording, because stopRecording calls fetchAllRecordings. So why, when I dismiss the sheet and ListOfRecordingsView is showing again, does the new recording not show up?

I'm assuming it's because a sheet is modal. If that is the case, is there a way to force ListOfRecordingsView to refresh when dismissing MakeANewRecordingView?
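For what it's worth, a sheet being modal does not by itself block @Published updates; the usual cause of this symptom is the list view and the sheet observing two *different* instances of the observable object. A minimal sketch of the working pattern, with placeholder names (RecordingStore, ListOfRecordingsSketch, RecordSheetSketch are illustrative, not from the code above):

```swift
import SwiftUI

// Placeholder model: one shared ObservableObject for the list and the sheet.
final class RecordingStore: ObservableObject {
    @Published var recordings: [String] = []
    func addRecording() { recordings.append("Recording \(recordings.count + 1)") }
}

struct ListOfRecordingsSketch: View {
    // The parent owns the single instance...
    @StateObject private var store = RecordingStore()
    @State private var showSheet = false

    var body: some View {
        List(store.recordings, id: \.self, rowContent: Text.init)
            .toolbar { Button("Record") { showSheet = true } }
            .sheet(isPresented: $showSheet) {
                // ...and hands that SAME instance to the sheet. If the sheet
                // created its own RecordingStore here, the list would never
                // see the new recording after dismissal.
                RecordSheetSketch(store: store)
            }
    }
}

struct RecordSheetSketch: View {
    @ObservedObject var store: RecordingStore
    @Environment(\.dismiss) private var dismiss

    var body: some View {
        Button("Stop recording") {
            store.addRecording() // mutates the shared @Published array
            dismiss()            // list updates because it observes the same object
        }
    }
}
```

If MakeANewRecordingView constructs its own RecordAndListenViewModel internally, the list view's copy of recordingsList is never touched, which would match the behavior described.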

My Observable Object (Partial)

class RecordAndListenViewModel: NSObject, ObservableObject, AVAudioPlayerDelegate {
    var audioRecorder: AVAudioRecorder!
    var audioPlayer: AVAudioPlayer!
    let dataController: DataController

    var indexOfPlayer = 0
    @Published var appError: ErrorType?
    @Published var showErrorAlert = false

    @Published var isRecording = false

    @Published var recordingsList = [Recording]()

    @Published var countSec = 0
    @Published var timerCount: Timer?
    @Published var blinkingCount: Timer?
    @Published var timer = "0:00"
    @Published var toggleColor = false
    @Published var showingUnlockView = false
    @Published var showPermissionDeniedView = false

    // The next few properties are needed for monitoring sound
    // so we can get the visualizer working
    @Published public var soundSamples: [Float]
    private var monitorTimer: Timer?
    private var currentSample: Int
    private let numberOfSamples: Int

    var playingURL: URL?
    var fileName = URL(string: "")

    init(numberOfSamples: Int, dataController: DataController) {
        self.dataController = dataController
        self.numberOfSamples = numberOfSamples
        self.soundSamples = [Float](repeating: .zero, count: numberOfSamples)
        self.currentSample = 0

        super.init()

        fetchAllRecordings(usingICloud: dataController.useICloud)
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        for idx in 0..<recordingsList.count where recordingsList[idx].fileURL == playingURL {
            recordingsList[idx].isPlaying = false
        }
    }

    func startRecording() {
        let canCreate = dataController.fullVersionUnlocked || dataController.countNumberOfFiles() < 25

        if canCreate {
            let recordingSession = AVAudioSession.sharedInstance()

            // Check for permission
            switch recordingSession.recordPermission {
            case .undetermined:
                recordingSession.requestRecordPermission { (isGranted) in
                    DispatchQueue.main.async {
                        if !isGranted {
                            self.showPermissionDeniedView = true
                            NSLog("Previously Denied Permission \(recordingSession.recordPermission)")
                        } else {
                            self.setupRecording(session: recordingSession)
                        }
                    }
                }
            case .denied:
                self.showPermissionDeniedView = true
                NSLog("Previously Denied Permission \(recordingSession.recordPermission)")
            case .granted:
                setupRecording(session: recordingSession)
            default:
                break
            }
        } else {
            showingUnlockView.toggle()
        } // End of if canCreate
    }

    /// Sets up a recording session
    private func setupRecording(session: AVAudioSession) {
        do {
            try session.setCategory(.playAndRecord, mode: .default)
            try session.setActive(true)
        } catch {
            appError = ErrorType(error: .setupRecordingError)
            showErrorAlert = true
            NSLog("Cannot setup the Recording: \(error.localizedDescription)")
        }

        if dataController.fullVersionUnlocked && dataController.useICloud {
            // We drop in here to save to iCloud Drive instead of local folder.
            // This checks to see if Documents folder exists if not it creates it
            if let containerURL = FileManager.default.url(forUbiquityContainerIdentifier: nil)?.appendingPathComponent("Documents") {
                if !FileManager.default.fileExists(atPath: containerURL.path, isDirectory: nil) {
                    do {
                        try FileManager.default.createDirectory(at: containerURL, withIntermediateDirectories: true, attributes: nil)
                    } catch {
                        appError = ErrorType(error: .createICloudDirectoryError)
                        showErrorAlert = true
                        NSLog("Couldn't create directory on iCloud: \(error.localizedDescription)")
                    }
                }

                // Create the file now
                fileName = containerURL.appendingPathComponent("Recording: \(Date().toString(dateFormat: "MM:dd:yyyy:HH:mm:ss")).\(dataController.fileExtension)")
            }
        } else {
            let containerURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            fileName = containerURL.appendingPathComponent("Recording: \(Date().toString(dateFormat: "MM:dd:yyyy:HH:mm:ss")).\(dataController.fileExtension)")
        } // End of if dataController.fullVersionUnlocked

        do {
            audioRecorder = try AVAudioRecorder(url: fileName!, settings: audioSettings())
            audioRecorder.prepareToRecord()
            audioRecorder.record()
            isRecording = true

            startMonitoringMic()

            timerCount = Timer.scheduledTimer(withTimeInterval: 1, repeats: true, block: { (_) in
                self.countSec += 1
                self.timer = self.covertSecToMinAndHour(seconds: self.countSec)
            })
            blinkColor()
        } catch {
            appError = ErrorType(error: .startRecordingError)
            showErrorAlert = true
            NSLog("Failed to Start Recording: \(error.localizedDescription)")
        }

    }

    private func audioSettings() -> [String: Any] {
        var settings: [String: Any] = [:]

        if dataController.audioFormat == 0 {
            settings = [
                AVFormatIDKey: Int(kAudioFormatLinearPCM),
                AVSampleRateKey: AudioSampleRate.k44100.rawValue,
                AVNumberOfChannelsKey: 1,
                AVEncoderAudioQualityKey: dataController.getAVEncoderQualityKey(index: dataController.audioQuality).rawValue
            ]
        } else if dataController.audioFormat == 1 {
            settings = [
                AVFormatIDKey: Int(kAudioFormatAppleLossless),
                AVSampleRateKey: AudioSampleRate.k44100.rawValue,
                AVNumberOfChannelsKey: 1,
                AVEncoderAudioQualityKey: dataController.getAVEncoderQualityKey(index: dataController.audioQuality).rawValue
            ]
        } else if dataController.audioFormat == 2 {
            settings = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: AudioSampleRate.k44100.rawValue,
                AVNumberOfChannelsKey: 1,
                AVEncoderAudioQualityKey: dataController.getAVEncoderQualityKey(index: dataController.audioQuality).rawValue
            ]
        }

        return settings
    }

    /// Monitors the incoming sound from the microphone
    private func startMonitoringMic() {
        audioRecorder.isMeteringEnabled = true
        monitorTimer = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true, block: { (_) in
            self.audioRecorder.updateMeters()
            self.soundSamples[self.currentSample] = self.audioRecorder.averagePower(forChannel: 0)
            self.currentSample = (self.currentSample + 1) % self.numberOfSamples
        })
    }

    func stopRecording() {
        audioRecorder.stop()

        isRecording = false

        self.countSec = 0

        timerCount!.invalidate()
        blinkingCount!.invalidate()
        monitorTimer!.invalidate()

        fetchAllRecordings(usingICloud: dataController.useICloud)
    }

    func fetchAllRecordings(usingICloud: Bool) {
        var path: URL?
        recordingsList.removeAll()

        if usingICloud {
            path = (FileManager.default.url(forUbiquityContainerIdentifier: nil)?.appendingPathComponent("Documents"))
        } else {
            path = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        }

        // swiftlint:disable:next force_try
        let directoryContents = try! FileManager.default.contentsOfDirectory(at: path!, includingPropertiesForKeys: nil)

        for idx in directoryContents {
            recordingsList.append(Recording(fileURL: idx, createdAt: getFileDate(for: idx), isPlaying: false))
        }

        recordingsList.sort(by: { $0.createdAt.compare($1.createdAt) == .orderedDescending})
    }
}

Then in my TabBarView I have a MicButtonIcon that, when tapped, brings up a sheet.

struct MicButtonIcon: View {
    @EnvironmentObject var dataController: DataController

    @Binding var showPopUpMenu: Bool
    let proxy: GeometryProxy

    private let device = Device.current

    var body: some View {
        Button(action: { showPopUpMenu.toggle() }, label: {
            ZStack {
                Circle()
                    .foregroundColor(Color(dataController.themeAccentColor))
                    .frame(
                        width:
                            device.isPhone || device.isPod ?
                                proxy.size.width/Constants.flt5 :
                            /// iPad config settings based on which orientation device is in
                            device.orientation == .portrait ?
                                proxy.size.width/Constants.flt6 :
                                proxy.size.width/Constants.flt10,
                        height:
                            device.isPhone || device.isPod ?
                                proxy.size.width/Constants.flt5 :
                            /// iPad config settings based on which orientation device is in
                            device.orientation == .portrait ?
                                proxy.size.width/Constants.flt6 :
                                proxy.size.width/Constants.flt10)
                    .doubleShadow(radius: Constants.flt5)

                Image(systemName: "mic.fill")
                    .resizable()
                    .aspectRatio(contentMode: .fit)
                    .frame(
                        width:
                            device.isPhone || device.isPod ?
                                proxy.size.width/Constants.flt7-Constants.flt6 :
                            /// iPad config settings based on which orientation device is in
                            device.orientation == .portrait ?
                                proxy.size.width/Constants.flt8-Constants.flt9 :
                                proxy.size.width/Constants.flt12-Constants.flt13,
                        height:
                            device.isPhone || device.isPod ?
                                proxy.size.width/Constants.flt7-Constants.flt6 :
                            /// iPad config settings based on which orientation device is in
                            device.orientation == .portrait ?
                                proxy.size.width/Constants.flt8-Constants.flt9 :
                                proxy.size.width/Constants.flt12-Constants.flt13
                    )
                    .foregroundColor(.white)
//                    .rotationEffect(Angle(degrees: showPopUpMenu ? 45 : 0))
            }
            .sheet(isPresented: $showPopUpMenu) {
                MakeANewRecordingView(dataController: dataController, numberOfSamples: 10)
            }
        })
        .offset(
            x: device.isPad ? -Constants.flt27 : -Constants.flt0,
            y: -proxy.size.height/Constants.flt16/Constants.flt2)
        .buttonStyle(BounceButtonStyle())
    }
}
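Notably, the sheet above is built as `MakeANewRecordingView(dataController: dataController, numberOfSamples: 10)`, with no view model passed in, which suggests the sheet creates its own RecordAndListenViewModel. A hedged sketch of an alternative, assuming MakeANewRecordingView's initializer can be changed to accept the model (a hypothetical signature, not the original API):

```swift
import SwiftUI

// Hypothetical restructuring: the presenting view owns one view model and
// passes the same instance into the sheet, so recordingsList changes made
// inside the sheet are visible to any view observing this model.
struct MicButtonContainer: View {
    @StateObject private var viewModel: RecordAndListenViewModel
    @State private var showPopUpMenu = false

    init(dataController: DataController) {
        // StateObject(wrappedValue:) keeps one instance alive across re-renders.
        _viewModel = StateObject(wrappedValue:
            RecordAndListenViewModel(numberOfSamples: 10, dataController: dataController))
    }

    var body: some View {
        Button("Record") { showPopUpMenu = true }
            .sheet(isPresented: $showPopUpMenu) {
                // Assumed initializer taking the shared model via @ObservedObject.
                MakeANewRecordingView(viewModel: viewModel)
            }
    }
}
```

ListOfRecordingsView would then observe the same instance (e.g. via @ObservedObject or @EnvironmentObject) rather than building its own.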

   
