
Getting Started With Core Haptics

source link: https://www.raywenderlich.com/10608020-getting-started-with-core-haptics

Apple introduced Core Haptics in iOS 13. It’s a new API that can generate custom, high-resolution tactile and audio feedback, making vibrations and rumbles feel like an old-fashioned gimmick. Get ready to immerse yourself in a world of haptic experiences.

In this tutorial, you’ll enhance an app with new haptic experiences. You’ll learn how to:

  1. Create haptic patterns and play them.
  2. Synchronize audio with haptic events.
  3. Create dynamic haptic patterns that change in response to external stimuli.

This tutorial is fully hands-on!

Note: This intermediate-level tutorial assumes that you’re comfortable building an iOS app using Xcode and writing Swift. You’ll also need a device running iOS 13 that supports haptic feedback. This tutorial uses Snip The Vine, the game from How To Make a Game Like Cut the Rope With SpriteKit, as the starting project. You should read that tutorial first if you’d like an introduction to SpriteKit.

Getting Started

Download the starter project using the Download Materials button at the top or bottom of this tutorial. Open the starter project. Build and run, and… you have a game.

snip_the_vine-231x500.png

You’ve got the cute graphics, your physics simulation and animations are slick and your sound effects put the player right in the jungle. But you have an intense feeling that something’s missing. You’re itching to feel the *snip*-iness as you slice the vine. And what about the *ker-splosh*-iness of that pineapple landing in the water and, of course, the *nom-nom-crunch-munch*-iness when the croc eats it?

It’s time to get tactile with Core Haptics!

Adding Your First Haptic Experience

The best way to start is to quickly make a simple haptic experience and test it. Your first haptic experience will add a little *snip* that players feel when they cut the vine.

The standard process of creating a haptic experience is to:

  1. Check device compatibility.
  2. Create a haptic engine object.
  3. Create a pattern of haptic events.
  4. Make a pattern player.
  5. Start the haptic engine.
  6. Play the pattern.
  7. Stop the haptic engine.
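
Before building this into the game, here’s the whole lifecycle condensed into a single hedged sketch. It follows the seven steps above with error handling trimmed; it only compiles and runs on an iOS 13+ device with haptics hardware, and the function name is just for illustration:

```swift
import CoreHaptics

// A minimal sketch of the full Core Haptics lifecycle, for illustration only.
func playOneTap() throws {
  // 1. Check device compatibility.
  guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

  // 2. Create a haptic engine object.
  let engine = try CHHapticEngine()

  // 3. Create a pattern of haptic events: here, a single sharp tap.
  let tap = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [
      CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
      CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
    ],
    relativeTime: 0)
  let pattern = try CHHapticPattern(events: [tap], parameters: [])

  // 4. Make a pattern player.
  let player = try engine.makePlayer(with: pattern)

  // 5. Start the haptic engine.
  try engine.start()

  // 6. Play the pattern.
  try player.start(atTime: CHHapticTimeImmediate)

  // 7. Stop the haptic engine once all players finish.
  engine.notifyWhenPlayersFinished { _ in .stopEngine }
}
```

You’ll implement each of these steps properly, with fallbacks, in the sections that follow.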

To avoid adding even more code to GameScene.swift, expand SnipTheVine ▸ Classes, then find and open Haptics.swift. It’s an empty file for you to use. Add the following:

import CoreHaptics

class HapticManager {
  // 1
  let hapticEngine: CHHapticEngine

  // 2
  init?() {
    // 3
    let hapticCapability = CHHapticEngine.capabilitiesForHardware()
    guard hapticCapability.supportsHaptics else {
      return nil
    }

    // 4
    do {
      hapticEngine = try CHHapticEngine()
    } catch let error {
      print("Haptic engine Creation Error: \(error)")
      return nil
    }
  }
}

Here’s what this code does:

  1. hapticEngine holds a reference to the CHHapticEngine.
  2. The initializer is failable, so you can check whether it’s nil in the game scene and use that to indicate that haptics are unavailable.
  3. The first thing you do in the initializer is check whether haptics are available. Call CHHapticEngine.capabilitiesForHardware() to obtain an object you can query with a simple supportsHaptics check.
  4. Finally, you create an engine object. The CHHapticEngine initializer can throw, so you wrap it in a do/catch block and return nil if it throws an error.

Note: There are many reasons why haptics could be unavailable on a device, so you need to use do-catch blocks for a lot of the API. This also means you need to make sure you have a fallback for any haptic experience.

Now, push on to get to the point where you can test your first haptic as soon as possible.

Still in Haptics.swift, add this extension to the class:

extension HapticManager {
  private func slicePattern() throws -> CHHapticPattern {
    let slice = CHHapticEvent(
      eventType: .hapticContinuous,
      parameters: [
        CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.35),      
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.25)
      ],
      relativeTime: 0,
      duration: 0.25)

    let snip = CHHapticEvent(
      eventType: .hapticTransient,
      parameters: [
        CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
        CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
      ],
      relativeTime: 0.08)

    return try CHHapticPattern(events: [slice, snip], parameters: [])
  }
}

slicePattern() returns your first Core Haptics pattern. It creates two haptic events and a haptic pattern that uses them. You’ll explore the details of events and patterns shortly, but for now, forge on!

Finally, add a method to HapticManager to play the pattern:

func playSlice() {
  do {
    // 1
    let pattern = try slicePattern()
    // 2
    try hapticEngine.start()
    // 3
    let player = try hapticEngine.makePlayer(with: pattern)
    // 4
    try player.start(atTime: CHHapticTimeImmediate)
    // 5
    hapticEngine.notifyWhenPlayersFinished { _ in
      return .stopEngine
    }
  } catch {
    print("Failed to play slice: \(error)")
  }
}

This code does the following:

  1. You call slicePattern() to, as the name implies, grab your haptic slice pattern.
  2. Then you call start() on the haptic engine.
  3. Make a pattern player with your slice pattern.
  4. Next, play the pattern, calling start(atTime:) with CHHapticTimeImmediate to play it immediately.
  5. Call notifyWhenPlayersFinished(finishedHandler:) and return .stopEngine to stop the engine once it finishes playing.

OK, you’re nearly there, but you need to make some updates to GameScene.swift. Start by opening the file and adding the following property at the top of the class:

private var hapticManager: HapticManager?

Then add the following to the top of didMove(to:) :

hapticManager = HapticManager()

In checkIfVineCut(withBody:), add this line above the line where you run the slice sound action:

hapticManager?.playSlice()

Build and run and slice that vine! Can you feel the gameplay improving already?

Exploring the Events That Make up the Pattern

OK, that was a lot all at once. Now, take a moment for a deeper look at what you did there.

Focusing on the pattern, you can see that it’s made up of events:

let slice = CHHapticEvent(
  eventType: .hapticContinuous,
  parameters: [
    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.35),      
    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.25)
  ],
  relativeTime: 0,
  duration: 0.25)

let snip = CHHapticEvent(
  eventType: .hapticTransient,
  parameters: [
    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
    CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
  ],
  relativeTime: 0.08)

A haptic event, represented by CHHapticEvent, can be one of two types: .hapticTransient or .hapticContinuous. A transient event is instantaneous, like a single drum beat, while a continuous event is more like a rumble. You can also create audio events, but more on that later.

Each event has parameters, each represented by a CHHapticEventParameter, that control the haptic sensation. A parameter has an ID and a value between 0 and 1.

A parameter with the ID .hapticIntensity represents the strength of the sensation; the higher the value, the stronger it is. A parameter with the ID .hapticSharpness represents a physical quality that, at the high end of the scale, creates a precise mechanical feel. At the low end, it has a more rounded, organic feel.

The event’s relativeTime represents the number of seconds from the start of the pattern where the event occurs. For continuous events, there’s also a duration property to define how long the event plays.

Your first haptic effect is a low, soft rumble for 0.25 seconds starting immediately, with a sharp, intense beat that occurs 0.08 seconds from the start. You can represent the pattern in a diagram:

Snip-1-2-480x240.png

Next, you’ll learn how you can use haptic events more efficiently.

Managing Energy Usage

Core Haptics uses the Taptic Engine hardware in your iOS device. Like all hardware components on an iOS device, you’ll want to activate it only when necessary to avoid wasting energy. When to run the engine depends on the needs of your app, but here are some guidelines:

  • If your pattern plays a single transient event immediately, start the engine, play the pattern, then call stop(completionHandler:). This is the most energy-efficient approach, as described in Apple’s haptic pattern documentation.
  • If you’re playing a longer or more complex pattern, call notifyWhenPlayersFinished(finishedHandler:) and return .stopEngine, as you’ve done already.
  • However, if your app plays multiple patterns in succession, or patterns that overlap, as Snip The Vine does, you need the engine to run continuously.

Now, you’ll find out how to make sure the engine keeps running.

First, remove the following call to notifyWhenPlayersFinished(finishedHandler:) from playSlice() :

hapticEngine.notifyWhenPlayersFinished { error in
  return .stopEngine
}

Next, add this to the end of init?() :

do {
  try hapticEngine.start()
} catch let error {
  print("Haptic failed to start Error: \(error)")
}

This makes sure the haptic engine is ready when the game scene loads.

Still in init?(), add the following line to the end:

hapticEngine.isAutoShutdownEnabled = true

This enables the automatic idle shutdown of the haptic engine. That’s responsible energy management on your part. However, this means iOS can shut the haptic engine down at any time, and you can’t assume it will still be running when you need it.

Designing a Haptic Experience

Why did you use those specific events, properties and timings? How did the experience feel to you? Did you get an impression of a blade slicing through the air and a sharp *snip*? These are the questions you need to ask when you design a haptic experience for your users.

A well-designed haptic experience enhances an app in subtle ways; a poorly-designed experience annoys and distracts.

A well-designed haptic experience also takes time. You need to test many subtle variations, but it’s a hoot to be able to say to friends that you just spent the afternoon designing the perfect haptic experience for a crocodile eating a pineapple. :]

Every budding haptic designer should read the Apple Human Interface Guidelines page on Haptics . The section on Designing with Haptics has a great list of design tips.

When to use Core Haptics: The general rule is that if your app uses standard UIKit controls, you get Apple-designed haptics for free.

If you need something a little different, UIFeedbackGenerator is available for notification, impact or selection effects.

If you truly need to get creative, then Core Haptics is there for you. But the jump in complexity from UIFeedbackGenerator to Core Haptics is large.

Now, it’s time to look at how to get the perfect sensation when the croc bites into that pineapple!

Feeding the Crocodile

So how do you start designing a haptic experience?

Build and run and take a look at your crocodile, happily munching his pineapple as it falls.

There are some cues you can draw from: sound and animation. Look in Resources/Sounds and open NomNom.caf in GarageBand or your audio editing app of choice. You can see the waveform of that sound effect:

nom-wave-1-480x131.png

The waveform has two distinct high points: An initial *crunch* , then a smaller *munch* at the end.

Open Haptics.swift and add a new method after slicePattern() :

private func nomNomPattern() throws -> CHHapticPattern {
  let rumble1 = CHHapticEvent(
    eventType: .hapticContinuous,
    parameters: [
      CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
      CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
    ],
    relativeTime: 0,
    duration: 0.15)
  
  let rumble2 = CHHapticEvent(
    eventType: .hapticContinuous,
    parameters: [
      CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.4),
      CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.1)
    ],
    relativeTime: 0.3,
    duration: 0.3)

  return try CHHapticPattern(events: [rumble1, rumble2], parameters: [])
}

This is the start of your crocodile’s *nom-nom* pattern — two continuous events to correspond to the two sounds in the nom nom sound effect.

Nomnom-2-1-480x165.png

Playing Different Patterns

Before playing your new pattern, first write a generic pattern-playing method to avoid code duplication. In Haptics.swift, add the following method above playSlice():

private func playHapticFromPattern(_ pattern: CHHapticPattern) throws {
  try hapticEngine.start()
  let player = try hapticEngine.makePlayer(with: pattern)
  try player.start(atTime: CHHapticTimeImmediate)
}

This method will play any pattern you pass to it. It still must try to start the haptic engine, because iOS may shut the engine down at any time. So, you can now simplify playSlice() :

func playSlice() {
  do {
    let pattern = try slicePattern()
    try playHapticFromPattern(pattern)
  } catch {
    print("Failed to play slice: \(error)")
  }
}

You can also add a method to play your new effect under playSlice() :

func playNomNom() {
  do {
    let pattern = try nomNomPattern()
    try playHapticFromPattern(pattern)
  } catch {
    print("Failed to play nomNom: \(error)")
  }
}

In GameScene.swift, find didBegin(_:), which runs nomNomSoundAction, and add this line above it:

hapticManager?.playNomNom()

Build and run. Of course, you’ll need enough skill to feed the crocodile to test your new effect. :]

Look at runNomNomAnimation(withDelay:) in GameScene.swift . It’s called with a value of 0.15 when the croc catches his treat. The animation runs like this:

  1. Close mouth.
  2. Wait 0.15 seconds.
  3. Open mouth.
  4. Wait 0.15 seconds.
  5. Close mouth.

croc-munch.gif

It’d be great to add a couple of strong beats to match those snapping jaws. To do this, add two more events to nomNomPattern() . Replace the line return try CHHapticPattern ... with:

let crunch1 = CHHapticEvent(
  eventType: .hapticTransient,
  parameters: [
    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
    CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
  ],
  relativeTime: 0)

let crunch2 = CHHapticEvent(
  eventType: .hapticTransient,
  parameters: [
    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.3)
  ],
  relativeTime: 0.3)

return try CHHapticPattern(
  events: [rumble1, rumble2, crunch1, crunch2],
  parameters: [])

As mentioned earlier, a transient event is instantaneous, like a single drum beat. .hapticIntensity represents the strength of the sensation, and .hapticSharpness represents the physical quality. Higher values create stronger and more prominent haptic events. Feel free to tweak the numbers to your liking. Here, you add two transient events to match those snapping jaws.

Build and run. Pretty *snappy* , right? You want to give the user a rewarding feeling, since this is the success result for the game screen. And that satisfying *snap* *snap* is perfect:

Nomnom-3-1-480x165.png

Syncing Audio Events

Up to now, you’ve kept the audio and haptics separate, but Core Haptics also supports audio events in haptic patterns. Before you use this feature, you have to register each audio resource with the haptic engine, which will return a resource ID that you can use to identify which audio waveform to use for an event.

Add the following properties to HapticManager :

var sliceAudio: CHHapticAudioResourceID?
var nomNomAudio: CHHapticAudioResourceID?
var splashAudio: CHHapticAudioResourceID?

These hold your audio resource IDs.

Next, add the following method after init?() :

private func setupResources() {
  do {
    if let path = Bundle.main.url(forResource: "Slice", withExtension: "caf") {
      sliceAudio = try hapticEngine.registerAudioResource(path)
    }
    if let path = Bundle.main.url(forResource: "NomNom", withExtension: "caf") {
      nomNomAudio = try hapticEngine.registerAudioResource(path)
    }
    if let path = Bundle.main.url(forResource: "Splash", withExtension: "caf") {
      splashAudio = try hapticEngine.registerAudioResource(path)
    }
  } catch {
    print("Failed to load audio: \(error)")
  }
}

setupResources() looks up the URL of each of your audio files, then registers each one using registerAudioResource(_:options:), which returns its resource ID.

If a file isn’t found, its property remains nil, and you can check for that in your pattern methods. You haven’t built the splash pattern yet; that’s coming later. ;]

Now, you need to add a call to setupResources() at the end of init?() :

setupResources()

Add the audio event to the pattern in nomNomPattern() . Replace return try CHHapticPattern ... with:

var events = [rumble1, rumble2, crunch1, crunch2]

// 1
if let audioResourceID = nomNomAudio {
  // 2
  let audio = CHHapticEvent(
    audioResourceID: audioResourceID, 
    parameters: [], 
    relativeTime: 0)
  events.append(audio)
}

// 3
return try CHHapticPattern(events: events, parameters: [])

Here’s what this code does:

  1. You check whether nomNomAudio holds a registered audio resource ID.
  2. If it does, you create an audio CHHapticEvent with that resource ID and append it to the events array.
  3. Finally, you create the pattern from the full array of events.

While you’re there, add the audio to the slice pattern. Replace the return try CHHapticPattern ... in slicePattern() with:

var events = [slice, snip]

if let audioResourceID = sliceAudio {
  let audio = CHHapticEvent(
    audioResourceID: audioResourceID,
    parameters: [],
    relativeTime: 0)
  events.append(audio)
}

return try CHHapticPattern(events: events, parameters: [])

This is very similar to what you did in nomNomPattern(): you check that the slice audio resource ID isn’t nil before creating an audio event with it.

Because you’re now including audio in your haptic patterns, there’s no need for the game scene to play that audio. Open GameScene.swift and find setUpAudio().

At the end of that method, you set the sliceSoundAction, splashSoundAction and nomNomSoundAction properties:

sliceSoundAction = .playSoundFileNamed(...)
splashSoundAction = .playSoundFileNamed(...)
nomNomSoundAction = .playSoundFileNamed(...)

You’ll need to change these so that the game scene plays the haptic pattern instead of the audio, but only if the haptic manager successfully registered those audio resource IDs and can play them.

Replace the code from above in setUpAudio() with the following:

guard let manager = hapticManager else {
  sliceSoundAction = .playSoundFileNamed(
    SoundFile.slice,
    waitForCompletion: false)
  nomNomSoundAction = .playSoundFileNamed(
    SoundFile.nomNom,
    waitForCompletion: false)
  splashSoundAction = .playSoundFileNamed(
    SoundFile.splash,
    waitForCompletion: false)
  return
}

setupHaptics(manager)

That code first makes sure that hapticManager isn’t nil. If it is nil, it creates the sound actions as normal. This is the first fallback position.

If hapticManager is not nil, it calls setupHaptics(_:), which you’ll now add under setUpAudio():

private func setupHaptics(_ manager: HapticManager) {
}

You’ll use setupHaptics(_:) to create the SKAction objects that play your haptic patterns, but you also need a fallback in case a haptic audio resource ID is nil. In that situation, you can create an SKAction group that plays the sound and runs the haptic pattern, without audio, together.

Add the following to setupHaptics(_:) :

// 1
let sliceHaptics = SKAction.run {
  manager.playSlice()
}
if manager.sliceAudio != nil {
  // 2
  sliceSoundAction = sliceHaptics
} else {
  // 3
  sliceSoundAction = .group([
    .playSoundFileNamed(SoundFile.slice, waitForCompletion: false),
    sliceHaptics
  ])
}
  1. First, you create the haptics action. It’s a simple run action that calls playSlice().
  2. If sliceAudio is not nil, you assign this action to sliceSoundAction.
  3. However, if sliceAudio is nil, you create a group action with two child actions: the playSoundFileNamed action and your sliceHaptics action.

Now, add the same approach for nomNomSoundAction :

let nomNomHaptics = SKAction.run {
  manager.playNomNom()
}
if manager.nomNomAudio != nil {
  nomNomSoundAction = nomNomHaptics
} else {
  nomNomSoundAction = .group([
    .playSoundFileNamed(SoundFile.nomNom, waitForCompletion: false),
    nomNomHaptics
  ])
}

This is very similar to sliceSoundAction , except that you use nomNomHaptics .

For now, add a simple playSoundFileNamed action for splashSoundAction :

splashSoundAction = .playSoundFileNamed(
  SoundFile.splash,
  waitForCompletion: false)

You haven’t designed that haptic experience yet; this avoids a crash when you run the game and splashSoundAction is nil .

Build and run! Now, Core Haptics plays your slice and the nom-nom audio.

Setting a Reset Handler

Now that you’re using haptic audio resources, you have a new problem to consider. If the haptic server on your device recovers from a failure, then your haptic engine instance resets. When that happens, the engine stops and loses all audio resource ID references. To prevent that, you need a reset handler.

Adding a reset handler is easy. First, add this new method to HapticManager :

func handleEngineReset() {
  do {
    // 1
    try hapticEngine.start()
    // 2
    setupResources()
  } catch {
    print("Failed to restart the engine: \(error)")
  }
}
  1. Apple recommends that you first try to start the engine.
  2. If that works, you restore any audio resource IDs you’ve previously registered.

Next, add the following to init?() to call handleEngineReset() when the engine resets:

hapticEngine.resetHandler = { [weak self] in
  self?.handleEngineReset()
}

See the Apple documentation on Preparing Your App to Play Haptics for more information.

For your next step, you’ll add haptics when the crocodile misses the pineapple.

Ramping Intensity Up and Down — Pineapple Splashdown

Listen to the Splash.caf sound effect: there’s a heavy *splish* followed by a longer, tailing *splash*. Add a new method to HapticManager to make a pattern that represents that sound experience:

private func splashPattern() throws -> CHHapticPattern {
  let splish = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [
      CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
      CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.1)
    ],
    relativeTime: 0)
  
  let splash = CHHapticEvent(
    eventType: .hapticContinuous, 
    parameters: [
      CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
      CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.1)
    ],
    relativeTime: 0.1,
    duration: 0.6)
  
  var events = [splish, splash]
  if let audioResourceID = splashAudio {
    let audio = CHHapticEvent(
      audioResourceID: audioResourceID, 
      parameters: [], 
      relativeTime: 0)
    events.append(audio)
  }
  
  return try CHHapticPattern(events: events, parameters: [])
}

Your new haptic experience has a single strong, but rounded, transient event at the start for the *splish* , and then a longer, softer continuous event that starts at 0.1 seconds and lasts for 0.6 seconds for the *splash* :

Pineapple-1-480x164.png

Before you can play it, you need to add a new method to HapticManager, beneath playNomNom():

func playSplash() {
  do {
    let pattern = try splashPattern()
    try playHapticFromPattern(pattern)
  } catch {
    print("Failed to play splash: \(error)")
  }
}

Return to setupHaptics(_:) in GameScene.swift, remove the temporary splashSoundAction code, then add the following code to set splashSoundAction:

let splashHaptics = SKAction.run {
  manager.playSplash()
}
if manager.splashAudio != nil {
  splashSoundAction = splashHaptics
} else {
  splashSoundAction = .group([
    .playSoundFileNamed(SoundFile.splash, waitForCompletion: false),
    splashHaptics
  ])
}

Build and run and test it. The *splish* works well, but the *splash* is just a long rumble; it’s too one-dimensional. It should be more like a cresting wave. Fortunately, there are event properties that can help you. Update the splash event with three new properties:

let splash = CHHapticEvent(
  eventType: .hapticContinuous, 
  parameters: [
    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.1),
    CHHapticEventParameter(parameterID: .attackTime, value: 0.1),
    CHHapticEventParameter(parameterID: .releaseTime, value: 0.2),
    CHHapticEventParameter(parameterID: .decayTime, value: 0.3)
  ],
  relativeTime: 0.1, 
  duration: 0.6)
  • .attackTime controls how many seconds the event takes to ramp up from 0 to its specified intensity value at the start of the event. Think of it as the ramp-up time.
  • .decayTime is the opposite, representing the time the intensity takes to ramp back down to 0.
  • .releaseTime controls when that decay ramp-down begins.
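
To build intuition for how these three parameters shape the event, here’s a hedged, pure-Swift model of the envelope they describe. It assumes simple linear ramps, which is an approximation for illustration, not Core Haptics’ actual curve, and the function name is made up:

```swift
import Foundation

/// A simplified linear model of a continuous event’s intensity envelope:
/// ramp up from 0 over `attackTime`, hold at full intensity, then ramp
/// down to 0 over `decayTime` starting at `releaseTime`.
func envelopeMultiplier(
  at t: Double,
  attackTime: Double,
  releaseTime: Double,
  decayTime: Double
) -> Double {
  if t < 0 { return 0 }
  if t < attackTime { return t / attackTime }    // ramping up
  if t < releaseTime { return 1.0 }              // holding at full intensity
  if t < releaseTime + decayTime {               // ramping down
    return 1.0 - (t - releaseTime) / decayTime
  }
  return 0                                       // fully decayed
}

// Sample the splash event’s envelope: attack 0.1, release 0.2, decay 0.3.
let samples = [0.05, 0.15, 0.35, 0.6].map {
  envelopeMultiplier(at: $0, attackTime: 0.1, releaseTime: 0.2, decayTime: 0.3)
}
print(samples)  // approximately [0.5, 1.0, 0.5, 0.0]
```

Under this model, the splash reaches full intensity 0.1 seconds in, holds until 0.2 seconds, then fades out completely by 0.5 seconds, just before the 0.6-second event ends.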

Build and run and experience the disappointing letdown of missing the crocodile and splashing into the ocean. Can you feel the wave? It should ramp down to 0 intensity just before the sound finishes playing.

Pineapple-2-480x165.png

Controlling Intensity With a Parameter Curve

Since you’re getting creative with these haptic experiences, why not improve the snip haptic pattern for a more satisfying *SsssNIP* feel? A haptic pattern can also accept parameters that apply to the pattern as a whole.

First, update the slice event property like so:

let slice = CHHapticEvent(
  eventType: .hapticContinuous, 
  parameters: [
    CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.6),
    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
  ], 
  relativeTime: 0, 
  duration: 0.5)

This increases the intensity, sharpness and duration.

Next, create a new type of parameter — a CHHapticParameterCurve — after the two slice and snip events in slicePattern() :

let curve = CHHapticParameterCurve(
  parameterID: .hapticIntensityControl, 
  controlPoints: [
    .init(relativeTime: 0, value: 0.2),
    .init(relativeTime: 0.08, value: 1.0),
    .init(relativeTime: 0.24, value: 0.2),
    .init(relativeTime: 0.34, value: 0.6),
    .init(relativeTime: 0.5, value: 0)
  ], 
  relativeTime: 0)

A parameter curve is similar to an animation curve, and the control points are like animation keyframes.

This parameter curve has the ID .hapticIntensityControl , which acts as a multiplier to all event intensity values across the pattern. Because it’s a curve, the parameter interpolates smoothly between the control points as the pattern plays.

For example, the first control point is at time 0 with value 0.2 , which means that it multiplies all event intensity values by 0.2 at the start. By 0.08 seconds, it will have ramped up smoothly to a multiplier of 1.0 . By 0.24 seconds, it will have ramped smoothly back down to 0.2 , and so on.

Here’s how it looks:

Snip-2-2-480x240.png
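
You can sanity-check those numbers with a small pure-Swift sketch that interpolates between the control points. This assumes straight-line interpolation between points, which is a simplification; the ControlPoint struct and multiplier(at:points:) function here are illustrative, not part of the Core Haptics API:

```swift
import Foundation

/// One control point: a time offset and a multiplier value.
/// Mirrors the (relativeTime, value) pairs above; illustrative only.
struct ControlPoint {
  let relativeTime: Double
  let value: Double
}

/// Linearly interpolates the curve’s multiplier at time `t`.
/// Assumes linear segments between points; Core Haptics may smooth differently.
func multiplier(at t: Double, points: [ControlPoint]) -> Double {
  guard let first = points.first, let last = points.last else { return 1.0 }
  if t <= first.relativeTime { return first.value }
  if t >= last.relativeTime { return last.value }
  for (a, b) in zip(points, points.dropFirst()) where t < b.relativeTime {
    let fraction = (t - a.relativeTime) / (b.relativeTime - a.relativeTime)
    return a.value + fraction * (b.value - a.value)
  }
  return last.value
}

// The snip curve’s control points, as defined above.
let snipCurve = [
  ControlPoint(relativeTime: 0, value: 0.2),
  ControlPoint(relativeTime: 0.08, value: 1.0),
  ControlPoint(relativeTime: 0.24, value: 0.2),
  ControlPoint(relativeTime: 0.34, value: 0.6),
  ControlPoint(relativeTime: 0.5, value: 0)
]

// Halfway through the first segment, the multiplier is halfway from 0.2 to 1.0.
print(multiplier(at: 0.04, points: snipCurve))  // approximately 0.6
```

At 0.04 seconds, halfway through the first ramp, the multiplier sits halfway between 0.2 and 1.0, exactly as the interpolation described above predicts.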

To use the parameter curve, you need to initialize the pattern object using CHHapticPattern(events:parameterCurves:) .

Still in Haptics.swift, replace the return statement in slicePattern() with the following:

return try CHHapticPattern(events: events, parameterCurves: [curve])

This creates the haptic pattern using the curve you specified.

Build and run to experience your new dynamic haptic experience.

Updating Pattern Parameters in Real Time

If you think dynamic parameter curves are cool, wait until you see Core Haptics’ heavy hitter: CHHapticAdvancedPatternPlayer . This is a pattern player that you can control while it’s playing.

There’s something important missing in your game. When the player swishes a finger across the screen, you can see the particle effects, but where’s the feel of the blade slicing through the air? With a CHHapticAdvancedPatternPlayer , you can even control the intensity in real time so it ramps up stronger the faster the player’s finger moves.

swish-particles.gif

First, add a property to HapticManager to hold a reference to your new player:

var swishPlayer: CHHapticAdvancedPatternPlayer?

Next, add a method to create the player:

func createSwishPlayer() {
  let swish = CHHapticEvent(
    eventType: .hapticContinuous, 
    parameters: [
      CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.5),
      CHHapticEventParameter(parameterID: .hapticSharpness, value: 1.0)
    ], 
    relativeTime: 0,
    duration: 60)
  
  do {
    let pattern = try CHHapticPattern(events: [swish], parameters: [])
    swishPlayer = try hapticEngine.makeAdvancedPlayer(with: pattern)
  } catch let error {
    print("Swish player error: \(error)")
  }
}

It’s a simple pattern: a single continuous event with a long duration. You create the player by calling makeAdvancedPlayer(with:) .

Next, in HapticManager, add the following line to setupResources():

createSwishPlayer()

By doing this, you create the swish player whenever you call setupResources(), both in the initializer and in the haptic engine’s reset handler. Player references also reset when the engine resets.

Next, you need to add a method to start the player. Add the following at the end of HapticManager :

func startSwishPlayer() {
  do {
    try hapticEngine.start()
    try swishPlayer?.start(atTime: CHHapticTimeImmediate)
  } catch {
    print("Swish player start error: \(error)")
  }
}

startSwishPlayer() first calls hapticEngine.start() just in case the engine has stopped. Then, it calls the pattern player’s start(atTime:) with CHHapticTimeImmediate , so the player starts immediately.

You’ll also need to add a method to stop the player. Add this at the end of HapticManager as well:

func stopSwishPlayer() {
  do {
    try swishPlayer?.stop(atTime: CHHapticTimeImmediate)
  } catch {
    print("Swish player stop error: \(error)")
  }
}

Here, you stop the pattern player as soon as possible by passing CHHapticTimeImmediate.

Return to GameScene.swift, find touchesBegan(_:with:) and add the following line to start the pattern player when the player begins swiping:

hapticManager?.startSwishPlayer()

Next, find touchesEnded(_:with:) and add the following line to stop the pattern player when the player’s swipe ends:

hapticManager?.stopSwishPlayer()

Build and run and you should experience the player starting and stopping as you move your finger around the screen.

Now, it’s time to add the magic!

Making the Player Dynamic

Next, you’ll make the swish’s intensity depend on the user’s movements. Add the following method to HapticManager :

// 1
func updateSwishPlayer(intensity: Float) {
  // 2
  let intensity = CHHapticDynamicParameter(
    parameterID: .hapticIntensityControl, 
    value: intensity, 
    relativeTime: 0)
  do {
    // 3
    try swishPlayer?.sendParameters([intensity], atTime: CHHapticTimeImmediate)
  } catch let error {
    print("Swish player dynamic update error: \(error)")
  }
}
  1. Your new updateSwishPlayer(intensity:) takes a single Float argument: a value between 0 and 1.
  2. You use that value to create a CHHapticDynamicParameter with the ID .hapticIntensityControl. This parameter works much like the parameter curve you created earlier, acting as a multiplier on all the event intensity values in the pattern. Unlike the curve, though, it’s a one-time change.
  3. You send the dynamic parameter to the player, which applies it immediately to the pattern that’s playing.

Return to GameScene.swift and add the following to touchesMoved(_:with:) :

let distance = CGVector(
  dx: abs(startPoint.x - endPoint.x),
  dy: abs(startPoint.y - endPoint.y))
let distanceRatio = CGVector(
  dx: distance.dx / size.width,
  dy: distance.dy / size.height)
let intensity = Float(max(distanceRatio.dx, distanceRatio.dy)) * 100
hapticManager?.updateSwishPlayer(intensity: intensity)

Every time the system calls touchesMoved(_:with:) , you update your dynamic player’s intensity control value. You calculate the intensity using a simple algorithm: The more you move from the previous touch, the higher the intensity value will be.
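
The normalization math is worth isolating, since it’s the knob you’d tune. This pure-Swift sketch mirrors the calculation above; the clamp to 0...1 at the end is an addition here, on the assumption that an intensity multiplier above 1 buys you nothing (the raw formula frequently exceeds 1, which is why the swish feels strong for all but the slowest movements):

```swift
import Foundation

/// Computes a swish intensity from two touch points, mirroring the
/// touchesMoved(_:with:) math above. The scale factor of 100 comes from
/// the tutorial; the final clamp to 0...1 is an assumption added here
/// so the result is always a valid normalized intensity.
func swishIntensity(
  from start: (x: Double, y: Double),
  to end: (x: Double, y: Double),
  sceneSize: (width: Double, height: Double)
) -> Float {
  let dx = abs(start.x - end.x) / sceneSize.width   // horizontal movement ratio
  let dy = abs(start.y - end.y) / sceneSize.height  // vertical movement ratio
  let raw = max(dx, dy) * 100
  return Float(min(max(raw, 0), 1))                 // clamp to 0...1
}

// Moving 2% of the screen width already maxes out: 0.02 * 100 = 2, clamped to 1.
let fast = swishIntensity(from: (0, 0), to: (15, 0), sceneSize: (750, 1334))
// A very small movement produces a proportionally small intensity.
let slow = swishIntensity(from: (0, 0), to: (3, 0), sceneSize: (750, 1334))
print(fast, slow)  // prints 1.0 0.4
```

Dividing by the scene size keeps the feel consistent across screen dimensions, and the factor of 100 is what makes even modest swipes register strongly.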

Build and run. Snipping the vine should now feel like you’re a Jedi knight wielding a light saber!

Where to Go From Here?

You can download the completed project using the Download Materials button at the top or bottom of this tutorial.

You’ve transformed Snip The Vine from an amusing distraction to a whole new immersive experience! In the field of physics-based-cutting-things-down-so-they-drop-on-animals games on the App Store, it’ll beat them all, hands down.

If you can believe it, you’ve only touched on what Core Haptics can achieve. There’s so much more to explore.

Watch the Apple sessions on Core Haptics from WWDC 2019.

Have a look through the Core Haptics documentation. There are a few sample Xcode projects in there to download as well.

Don’t forget about the Apple Human Interface Guidelines page on Haptics and the tips in Designing with Haptics .

You may also want to read about Apple Haptic and Audio Pattern (AHAP) file format .

I hope you enjoyed this Core Haptics tutorial. If you have any questions or comments, please join the forum discussion below.

