In the middle of a view I'm working on for an iPhone app, I have an HStack with a variable number of custom views. How can I make them react when the user drags a finger across them?
To make it more concrete, I'd like each view to play a sound when it's touched. Imagine dragging a finger across a xylophone or piano and each key playing the appropriate sound.
I can make each view respond when tapped pretty easily.
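The tap code from the original post wasn't preserved, but it was presumably something along these lines (`SoundView` and `playSound(_:)` are placeholder names standing in for my actual view and sound code):

```swift
import SwiftUI

// Hypothetical stand-in for the real sound-playing code (e.g. AVAudioPlayer).
func playSound(_ index: Int) {
    // play the note for key `index`
}

struct SoundView: View {
    let index: Int

    var body: some View {
        RoundedRectangle(cornerRadius: 8)
            .fill(Color.blue)
            .frame(width: 44, height: 120)
            .onTapGesture {
                playSound(index)   // fires on touch-up, once per tap
            }
    }
}
```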
However, taps only work for single sounds, don't activate immediately when a view is touched, and don't work with drags.
I've tried replacing the tap gesture with a drag gesture. That activates as soon as a view is touched, but it only works for the item that's touched first: if I drag my finger from the first SoundView to the second, it doesn't cancel the first sound and play the second.
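A reconstruction of that attempt (the original code wasn't preserved; `SoundView` and `playSound(_:)` are placeholder names). Because SwiftUI delivers the entire gesture to the view where the touch began, `onChanged` keeps firing on the first key even after the finger has moved over its neighbours:

```swift
HStack(spacing: 4) {
    ForEach(0..<7) { index in
        SoundView(index: index)
            .gesture(
                DragGesture(minimumDistance: 0)   // fires on touch-down, not touch-up
                    .onChanged { _ in
                        // Only ever runs for the view that was touched first;
                        // dragging onto a neighbouring key never retargets the gesture.
                        playSound(index)
                    }
            )
    }
}
```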
I've tried putting the drag gesture into the SoundView directly rather than having it attached to a view in the HStack, but that doesn't work either.
I think ideally I'd put state variables inside the SoundView so that it could change visually as well as play a sound when touched, but at the moment, I just can't get the core drag recognition functionality to work even close to the way I want.
I've mostly figured out a way to solve the problem. Posting for feedback as well as future reference for others :)
My solution is based on the HWS Switcharoo video: https://youtu.be/ffV_fYcFoX0
However, what I needed to do was switch things around so that instead of the dragged item responding to what's underneath it, the items underneath responded to the drag.
In the main view, I set up a Bool array and bound one element to each individually created SoundView. At some point I'll generate the required number automatically, but for now, I just manually created as many as I needed.
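A sketch of that setup, using the `soundStates` name from my solution (the key count and SoundView initializer are illustrative):

```swift
import SwiftUI

struct KeyboardView: View {
    // One Bool per key; each SoundView is bound to its own element.
    @State private var soundStates = Array(repeating: false, count: 7)

    var body: some View {
        HStack(spacing: 4) {
            SoundView(isActive: $soundStates[0])
            SoundView(isActive: $soundStates[1])
            SoundView(isActive: $soundStates[2])
            // ...one per key, created manually for now
        }
    }
}
```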
Following the tutorial, I also created an array of CGRect to be populated later. These are what gave me the coordinates on screen to see if I was dragging to the right spot.
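Roughly, each key reports its on-screen frame into the shared `soundFrames` array via a GeometryReader in its background (a sketch of the idea; the exact mechanics follow the Switcharoo tutorial):

```swift
@State private var soundFrames = Array(repeating: CGRect.zero, count: 7)

// Attached to each SoundView so it records where it sits on screen:
SoundView(isActive: $soundStates[index])
    .background(
        GeometryReader { geo in
            Color.clear
                .onAppear {
                    // capture this key's frame in global coordinates
                    soundFrames[index] = geo.frame(in: .global)
                }
        }
    )
```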
The magic happens in the DragGesture, where the drag coordinates get evaluated against the coordinates in soundFrames. If there's a match, I take the array index of the matched frame and use it to toggle the Bool at the same index in the soundStates array, each of which is bound to a SoundView. When a SoundView's state is active, it performs the desired behaviours.
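In outline, the gesture is attached to the whole HStack so a single drag can reach every key (a sketch, not my exact code):

```swift
.gesture(
    DragGesture(minimumDistance: 0)
        .onChanged { drag in
            // Which key, if any, is under the finger right now?
            if let match = soundFrames.firstIndex(where: { $0.contains(drag.location) }) {
                for index in soundStates.indices {
                    // activate the matched key, deactivate all the others
                    soundStates[index] = (index == match)
                }
            }
        }
        .onEnded { _ in
            // lifting the finger silences everything
            soundStates = Array(repeating: false, count: soundStates.count)
        }
)
```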
In the SoundView, I did a bit of a hack where I added a .background modifier that runs a function returning a Color, but in the function, I also play some sound. I added a comment to myself that I should investigate using .onChange(of:) when I start to require iOS 14.
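A sketch of that hack (the shape, colours, and sound code are placeholders):

```swift
struct SoundView: View {
    @Binding var isActive: Bool

    var body: some View {
        RoundedRectangle(cornerRadius: 8)
            .stroke(Color.primary)
            .frame(width: 44, height: 120)
            .background(keyColor())
    }

    // The hack: returns the background Color, but also plays the sound as a
    // side effect of building the view. On iOS 14+ an .onChange(of: isActive)
    // modifier would be a cleaner place for the side effect.
    private func keyColor() -> Color {
        if isActive {
            playSound()
            return .yellow
        }
        return .clear
    }

    private func playSound() {
        // e.g. trigger an AVAudioPlayer for this key (details omitted)
    }
}
```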
I'm sure it can be done better, so feedback definitely welcome :)