Finishing Touches: Haptics

We are probably all so accustomed by now to haptic feedback on our iPhones that we hardly give it a thought. Here are some examples:

  • On the home screen, long press on the background, and the screen enters “jiggly mode” — with a dull thud.

  • On the home screen, long press an app or a folder, and the shortcuts menu opens — with a sharp click.

What are those thuds and clicks that our iPhone produces? They are haptic feedback. They affect us almost below the level of consciousness, which could be why we don’t really notice them. But once you do notice them, you’ll realize that they are everywhere.

Many places in the interface respond to long press in some way; for example, in the Control Center, long press just about any icon, and it toggles or produces a secondary interface with a click.

Many standard iOS views and behaviors are supplemented with automatic haptic feedback. For example, a wheel-style UIPickerView ticks as you pass through its values. A UISwitch clunks as you toggle it. In the Mail app, in the list of messages, when you swipe a message all the way left to delete it, there’s a faint, subtle click in confirmation of the action — and not just in the Mail app, but for any list cell’s leftward swipe action.

These are not noises. They are not sound effects. They are not coming out of your phone’s speakers. They are a tactile effect — physical vibrations within the body of the phone itself. You may be able to hear the mechanism that produces the vibration, but the important thing is the vibration itself.

My goal in this article is to suggest that your app might benefit from the addition of haptic feedback — and to demonstrate to you that haptic feedback is easy to add. You should already be thinking: Is my app the sort of thing that can benefit from haptic feedback? Where within my app might some haptic feedback be useful?

Games, in particular, are the obvious beneficiary of tactile feedback, as anyone who (like me) enjoys playing Candy Crush with the sound turned off but haptics turned on can attest. There’s a definite dopamine hit as the explosion of jellies reverberates through the hand holding the phone.

But any app can benefit from supplementing its visible interface with an occasional physical effect, possibly so subtle as to be almost subconscious, confirming that the user has successfully performed an action — or has failed to perform it.

And any sort of animation can be reinforced with haptic feedback to indicate that it is in transition or that it has ended.

A Little History

Historically, haptic feedback seems to have grown out of 3D Touch, which first appeared on the iPhone 6s and was eventually abandoned with the advent of the iPhone 11. With 3D Touch, you could press harder and harder on something, and when you reached full pressure, it would respond with a satisfying click; this was the basis for “peek and pop” view controller transitions.

I think Apple abandoned 3D Touch for several reasons. Detection of the touch pressure may have called for special hardware whose cost wasn’t sustainable. It wasn’t a particularly discoverable affordance. Most importantly, it divided the world into haves and have-nots, not merely in terms of 3D Touch itself but also in terms of the features that it accessed.

For example, on an iPhone 6s, a 3D Touch press on an icon on the home screen would produce the shortcuts menu. So what about an iPhone 6 or an iPad? Basically, on a device without 3D Touch, there were two choices:

  • The shortcuts menu feature might be missing entirely. That seems harsh. If this is a useful feature, why should half the world be robbed of it? And if it isn’t sufficiently useful to merit being present on non-3D Touch devices, then by Occam’s Razor, why do we need it anywhere at all?

  • The shortcuts menu could be summoned by some other gesture (such as a long press) on non-3D Touch devices. But then again, by Occam’s Razor, what’s the point of 3D Touch itself?

So Apple realized they had gone down a wrong road and removed 3D Touch, replacing it with haptic feedback.

In one way, haptic feedback is not as powerful as 3D Touch, in that it doesn’t afford sensitivity to the degree of pressure. In another way, though, it’s far more powerful, because anything — not just a long press — can produce a “sound” that doesn’t come through the speaker but instead makes itself felt through the body of the phone. And if some devices lack haptic feedback altogether, well, at least the feature itself is still present. On an iPad, a UISwitch may not clunk as you toggle it, but it is still a UISwitch.

Haptic How-To

There are two tactile feedback APIs in iOS:

  • UIFeedbackGenerator was introduced in iOS 10 and requires (I think) an iPhone 7 or later. It provides a basic standard “vocabulary” of subtle responses.

  • Core Haptics appeared in iOS 13 and requires an iPhone 8 or later. It allows you to design your own haptic effects of taps and vibrations, which can be combined into sequences.

I’ll survey the two APIs in that order.

UIFeedbackGenerator

UIFeedbackGenerator is an abstract class with three concrete subclasses:

  • UISelectionFeedbackGenerator: Call its selectionChanged method to emit a barely perceptible tick.

  • UIImpactFeedbackGenerator: Call its impactOccurred method to emit a clunk. You can govern the FeedbackStyle with which the generator is initialized, as well as the intensity of a particular impact.

  • UINotificationFeedbackGenerator: Call its notificationOccurred method with a FeedbackType to emit a tick-tock or tick-a-tock pattern.
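
For instance, here’s a minimal sketch of the impact and notification generators in action; the dragEnded(didDock:) method and its surrounding scenario are hypothetical, invented purely for illustration:

let impact = UIImpactFeedbackGenerator(style: .medium)
let notifier = UINotificationFeedbackGenerator()
func dragEnded(didDock: Bool) {
    if didDock {
        // a satisfying clunk; the intensity can vary per impact
        self.impact.impactOccurred(intensity: 0.8)
    } else {
        // the "error" tick-a-tock pattern
        self.notifier.notificationOccurred(.error)
    }
}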

The basic pattern of usage is pretty simple:

  1. Instantiate the generator and retain it.

  2. Call prepare. If you can do this in advance of the next step, it can reduce latency.

  3. Emit the feedback.

  4. Release the generator. This step isn’t crucial, but it probably preserves battery life by powering down the Taptic Engine immediately.

In this simple example, I provide subtle feedback when the user taps a segment of a UISegmentedControl:

var feedback : UISelectionFeedbackGenerator? = nil
@IBAction func goPrevNext(_ sender : Any) {
    if let seg = sender as? UISegmentedControl {
        self.feedback = self.feedback ?? UISelectionFeedbackGenerator()
        self.feedback!.prepare()
        let f : () -> Void = // what to do, depending on which segment this is
        DispatchQueue.main.async {
            f()
            self.feedback!.selectionChanged()
        }
    }
}

Apple warns that you should think of the feedback generator types in terms of their semantic content rather than aiming at a specific feel. The reason for calling selectionChanged is not to cause a faint internal tick: it’s because the selection really did change! If your goal is to achieve a specific feel, on the other hand, you can use Core Haptics.

Core Haptics

Core Haptics involves a “stack” of classes:

  • CHHapticEvent is a single “note” of feedback. Here are some of its features:

    • An event may be transient (a tap) or continuous (a sustained vibration).

    • It can have a duration, which obviously is more significant for a continuous feedback “note”.

    • It can have a relative time, which is useful for combining events into a sequence.

    • It can have parameters specifying such things as intensity and sharpness that characterize the tactile quality — the “feel” — of the event.

  • CHHapticPattern embraces one or more haptic events, combining them into a single sequence of playable feedback. If a haptic event is like a note, a haptic pattern is like a tune.

  • CHHapticEngine is the persistent object that plays a haptic pattern by way of a player that it generates. The engine must be running in order for any haptics to be produced. (This architecture may remind you of AVFoundation’s AVAudioEngine.)

  • CHHapticPatternPlayer is an object that contains a pattern and can be told to play it. You don’t necessarily need an explicit pattern player; if you expect to reuse the same pattern, it’s nice to maintain a persistent pattern player, but it is also possible to tell the haptic engine directly to play a pattern as a one-off.

There’s much more to know, because Core Haptics is a very powerful technology, but that’s sufficient for a simple working example. So here we go! This example is the Core Haptics analog of the preceding UISegmentedControl example:

var engine : CHHapticEngine?
var player : CHHapticPatternPlayer?
fileprivate func ensureHapticEngine() {
    if self.engine == nil && CHHapticEngine.capabilitiesForHardware().supportsHaptics {
        let params : [CHHapticEventParameter] = [
            .init(parameterID: .hapticIntensity, value: 0.5),
            .init(parameterID: .hapticSharpness, value: 0.3)
        ]
        let event = CHHapticEvent(eventType: .hapticTransient, parameters: params, relativeTime: 0)
        do {
            let pattern = try CHHapticPattern(events: [event], parameters: [])
            self.engine = try CHHapticEngine()
            self.player = try self.engine?.makePlayer(with: pattern)
        } catch {
            print(error)
        }
    }
}
fileprivate func playHapticPattern() {
    self.engine?.start { err in
        if err == nil {
            do {
                self.engine?.notifyWhenPlayersFinished { _ in .stopEngine }
                try self.player?.start(atTime: CHHapticTimeImmediate)
            } catch {
                print(error)
            }
        }
    }
}
@IBAction func goPrevNext(_ sender : Any) {
    self.ensureHapticEngine()
    if let seg = sender as? UISegmentedControl {
        let f : () -> Void = // what to do, depending on which segment this is
        DispatchQueue.main.async {
            self.playHapticPattern()
            f()
        }
    }
}

Configuring the stack of objects for Core Haptics is more work than UIFeedbackGenerator, but the reward is that now we are in total charge of the “tap” feedback characteristics. The architecture in our example may look elaborate, but it’s actually fairly straightforward and is largely just boilerplate. We have just one haptic event type, so we need just one player. And even though the preparatory code looks wordy, when the time comes to play our haptic event, all we have to do is call two utility methods in succession, ensureHapticEngine and playHapticPattern:

  1. In ensureHapticEngine, if we haven’t been called before, we create the event, wrap it up in a pattern, create the engine, and obtain a player for our pattern. Both the engine and the pattern player are then retained.

  2. When it’s time to play our pattern, in playHapticPattern, we start the engine asynchronously (so as not to block the main thread) and tell the pattern player to play. Observe that the command we use, start(atTime:), takes a time offset; we are playing our pattern immediately, but we could take advantage of this offset to coordinate the playback with some other time-based effect, such as an animation.

  3. When the pattern finishes playing, the engine’s notifyWhenPlayersFinished handler stops the engine. In this way, the engine runs only when we actually need to play our pattern. Note that we do not stop the engine by calling its stop method directly; if we did that, we might stop it before the pattern has even had a chance to play, and there would be no feedback.

A Bigger Pattern

That wasn’t a very interesting example, in the sense that our “pattern” wasn’t much of a pattern: it consisted of just a single “note”. Just for practice, let’s build a more elaborate sequence of notes.

A pattern can be built up piece by piece out of events. Alternatively, it can be described as a whole through a dictionary. You can construct the dictionary in code, or you can express it as JSON and play it directly from a text file at runtime.

Here’s a demonstration of the pattern dictionary notation, representing the rhythm of a possibly familiar musical motif:

let params : [[CHHapticPattern.Key : Any]] = [
    [
        .parameterID: CHHapticEvent.ParameterID.hapticSharpness,
        .parameterValue: 0.7,
    ],
    [
        .parameterID: CHHapticEvent.ParameterID.hapticIntensity,
        .parameterValue: 0.8,
    ],
]
typealias patt = [CHHapticPattern.Key : [[CHHapticPattern.Key : [CHHapticPattern.Key : Any]]]]
let d : patt = [
    .pattern: [
        [.event: [
            .eventType: CHHapticEvent.EventType.hapticTransient,
            .eventParameters: params,
            .time: 0.0,
            .eventDuration: 0.1]
        ],
        [.event: [
            .eventType: CHHapticEvent.EventType.hapticTransient,
            .eventParameters: params,
            .time: 0.3,
            .eventDuration: 0.1]
        ],
        [.event: [
            .eventType: CHHapticEvent.EventType.hapticTransient,
            .eventParameters: params,
            .time: 0.6,
            .eventDuration: 0.1]
        ],
        [.event: [
            .eventType: CHHapticEvent.EventType.hapticContinuous,
            .time: 0.9,
            .eventDuration: 1.0]
        ]
    ]
]

Recognize that? It’s the four-note theme from the first movement of Beethoven’s Fifth, reduced to three taps and a vibration. Note the use of the .time attribute to offset the start of each note within the sequence. This might remind you of how animations are lined up sequentially in a keyframe animation or an animation group.
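
To turn that dictionary into playable feedback, hand it to CHHapticPattern’s dictionary initializer and proceed as before; here’s a sketch, assuming the engine property from the earlier example:

do {
    let pattern = try CHHapticPattern(dictionary: d)
    let player = try self.engine?.makePlayer(with: pattern)
    self.engine?.start { err in
        guard err == nil else { return }
        try? player?.start(atTime: CHHapticTimeImmediate)
    }
} catch {
    print(error)
}

Alternatively, if the same structure lives in a JSON file, you can skip the dictionary and hand the file’s URL directly to the engine’s playPattern(from:).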

Hungry For Power

I said a moment ago that Core Haptics is powerful. Here are some more things it can do:

  • A haptic event, in addition to its intensity and sharpness, can have MIDI-style note shaping features such as attack, decay, and release times.

  • A haptic event can have an actual audio component, initialized from an audio file URL. This means that your pattern can incorporate and synchronize both haptic and audio elements. You can temporarily suppress (mute) the playback of either type of element. Note that if your pattern has no audio events, you can greatly reduce the latency of starting up the engine by setting its playsHapticsOnly to true.

  • A pattern player can be a CHHapticAdvancedPatternPlayer, which behaves rather like an audio player: it can pause and resume, seek to a given time, play in a loop, and vary its playback rate.

  • A pattern player, while it is playing, can be sent dynamic parameters (CHHapticDynamicParameter), which behave like an envelope around the parameters of its events. For example, you could suddenly duck the overall intensity of the player, even while it is playing, allowing other sounds to stand out. It can also be assigned a parameter curve (CHHapticParameterCurve), which modulates the parameter envelope gradually.
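
As a sketch of the dynamic parameter idea, here’s how you might suddenly duck the overall intensity; I’m assuming that the engine is running and that player, from the earlier example, is currently playing:

// scale the intensity of everything the player is playing to 30%,
// taking effect immediately
let duck = CHHapticDynamicParameter(
    parameterID: .hapticIntensityControl,
    value: 0.3,
    relativeTime: 0)
try? self.player?.sendParameters([duck], atTime: CHHapticTimeImmediate)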

The Icing on the Cake

You may not previously have considered adding haptic feedback to your app, but once you’ve noticed how pervasive it is, you might suddenly sense its absence as leaving your app feeling rather dry and empty. That’s certainly the case for me. Now that I’m suddenly conscious of haptic feedback, I see all sorts of places where I could be using it in every one of my apps!

Wherever you’ve got animations, transitions, or simple user actions that can benefit from a subtle confirmation, haptic feedback can make things feel cleaner, clearer, more sharply defined. And it doesn’t have to be obstreperous feedback, like Candy Crush’s. Indeed, in some situations, the more subtle, the better. Our fingers and brains are amazingly sensitive; haptic feedback can be almost subliminal and yet psychologically effective.

It’s true that some older iPhones, and all iPads, can’t benefit from haptic feedback, but don’t use that as an excuse for not implementing it in the first place. After all, there is still a huge base of handheld devices that can respond to haptic feedback. This is a technology that can give your app the final polish it needs. Let’s get out there and start polishing.
