Gestures

Performing gestures

Gestures are a key element in making an interaction lively, and are triggered with the gesture() method. Furhat comes with a library of basic gestures (in the Gestures class), for example the Smile gesture:

furhat.gesture(Gestures.Smile) // Perform a smile

Gestures are asynchronous (non-blocking) by default, but you can make them synchronous (blocking) with the async = false parameter:

furhat.gesture(Gestures.ExpressAnger, async = false) // Express anger and wait until the end of the gesture to carry on

The strength and duration of all gestures in the basic library can be adjusted with the strength and duration parameters. The default value for both is 1.0.

// Perform a smile that is twice as long as the default Smile
furhat.gesture(Gestures.Smile(duration=2.0))

// Perform a smile that is half as strong as the default Smile
furhat.gesture(Gestures.Smile(strength=0.5))

A gesture can be stopped by calling another gesture or by stopping the gesture manually.

// Perform a thinking gesture while doing an API call 
furhat.gesture(GesturesLib.ExpressThinking(duration = 10.0))
// do an API call here and wait for the result
furhat.stopGestures()

Defining gestures

You can define your own gestures in Kotlin like this:

val MySmile = defineGesture("MySmile") {
    frame(0.32, 0.72) {
        SMILE_CLOSED to 0.5
    }
    frame(0.2, 0.72){
        BROW_UP_LEFT to 1.0
        BROW_UP_RIGHT to 1.0
    }
    frame(0.16, 0.72){
        BLINK_LEFT to 0.1
        BLINK_RIGHT to 0.1
    }
    reset(1.04)
}

This gesture definition contains a list of key frames, i.e., points in time (in seconds from the start of the gesture) where certain parameter values should be reached. Each frame lists the parameters and the values they should reach.

You can see all available parameters in the furhatos.gestures.BasicParams enum class (legacy), or in furhatos.gestures.ARKitParams (based on Apple's ARKit) and furhatos.gestures.CharParams (offsetting a character's facial appearance) for use with the new FaceCore face engine.

As the example above shows, a frame can also be given two points in time. This is typically used to sustain a parameter setting: SMILE_CLOSED reaches the value 0.5 after 0.32 seconds, and then stays at that value until 0.72 seconds after the gesture started.

To define the end of the gesture, you typically end the list of frames with a reset(), specified with the number of seconds after the gesture starts (in this example, the gesture is 1.04 seconds long). The reset() command restores all parameters affected by the gesture to their default values. If you do not provide a reset(), the parameters stay at their last values until another gesture affects them (i.e., the gesture is sustained indefinitely).
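To make the timing model concrete, here is a small self-contained Kotlin sketch (illustrative only, not Furhat SDK code) that computes a parameter's value over time for a single frame like the SMILE_CLOSED frame above, assuming linear interpolation between key frames:

```kotlin
// Hypothetical sketch of the key-frame timing model (not Furhat SDK code).
// Assumes a linear ramp from the base value to the frame's target, a hold
// until the frame's end time, and a linear return to the base at reset().
data class Frame(val start: Double, val end: Double, val target: Double)

fun valueAt(t: Double, frame: Frame, resetAt: Double, base: Double = 0.0): Double =
    when {
        t <= 0.0 -> base
        t < frame.start -> base + (frame.target - base) * (t / frame.start) // ramp up
        t <= frame.end -> frame.target                                      // sustain
        t < resetAt -> frame.target * (resetAt - t) / (resetAt - frame.end) // ramp down
        else -> base                                                        // after reset
    }

fun main() {
    val smile = Frame(start = 0.32, end = 0.72, target = 0.5)
    println(valueAt(0.32, smile, resetAt = 1.04)) // 0.5 (target reached)
    println(valueAt(0.50, smile, resetAt = 1.04)) // 0.5 (sustained)
    println(valueAt(1.04, smile, resetAt = 1.04)) // 0.0 (reset)
}
```

The actual face engine may use a different interpolation curve; the sketch only illustrates the reach-sustain-reset shape described above.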

The gesture can then be used like this in the flow:

furhat.gesture(MySmile)

Note: The reset() function does not have control over audio. Similarly, furhat.gesture(cancel) will stop the gesture, but not any audio. It is possible to stop the sound after the gesture has completed with the following function:

furhat.stopSpeaking()

This method of creating gestures can be very time-consuming, but offers a lot of control. If you prefer capturing more natural gestures using facial capture, consider using the Furhat Gesture Capture Tool instead.

Additional parameters

It is also possible to change textures, change the color of the LED ring, and play audio:

val TextureAndAudioGesture = defineGesture {
    frame(1.0) {
        texture("Elsa") //Changes texture to Elsa
    }

    frame(2.0) {
        texture("Ursula") //Changes texture to Ursula
        led(Pixel(0, 0, 255)) //Change LED ring by providing a pixel (R,G,B)
    }

    frame(4.0) {
        audio("classpath:sound/Test_audio.wav") //Plays audio from the resources folder. File is in resources/sound/Test_audio.wav
    }

    frame(15.0) {
        audio("https://bigsoundbank.com/UPLOAD/wav/0283.wav") //Plays audio from the web
    }

    // Resets parameters; resetTexture and resetLed optionally restore the texture
    // and LED ring to the values they had before the gesture started.
    reset(50.0, resetTexture = true, resetLed = true)
}

Note: You can choose to use lipsync on the audio files with the speech parameter, which is by default set to false:

audio("classpath:sound/Test_audio.wav", speech = true)

Defining gestures with parameters

You can easily add parameters to your gesture by defining it as a function instead:

fun Wink(strength: Double = 1.0, duration: Double = 1.0) =
        defineGesture("Wink", strength, duration) {
    frame(0.32) {
        EPICANTHIC_FOLD to 0.3
        EYE_SQUINT_LEFT to 1.0
        BROW_DOWN_LEFT to 1.0
    }
    reset(0.64)
}

Note how we pass the strength and duration parameters to defineGesture. The strength is then automatically applied (multiplied) to the target values, and the duration to the key frame time points. defineGesture also makes sure that the target values never fall out of range, even after the strength multiplier is applied.
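The scaling can be pictured with this standalone sketch (hypothetical helper names, not the SDK implementation): target values are multiplied by strength and clamped into the valid range, while frame times are multiplied by duration:

```kotlin
// Illustrative sketch (not the SDK implementation) of how strength and
// duration could be applied to a key frame. Target values are clamped so
// they never leave the valid [0, 1] range.
fun scaleTarget(value: Double, strength: Double): Double =
    (value * strength).coerceIn(0.0, 1.0)

fun scaleTime(time: Double, duration: Double): Double = time * duration

fun main() {
    println(scaleTarget(0.5, 1.5)) // 0.75
    println(scaleTarget(1.0, 2.0)) // 1.0 (clamped, stays in range)
    println(scaleTime(0.32, 2.0))  // 0.64 -> the frame occurs twice as late
}
```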

We can of course also define any parameters we want, and apply them directly in the definition:

// We add an iterations parameter to specify the number of times to do the wink
fun Wink(strength: Double = 1.0, duration: Double = 1.0, iterations: Int = 1) =
        defineGesture("Wink", duration = duration) {
    for (i in 0 until iterations) {
        frame(0.32 + 0.64 * i) {
            EPICANTHIC_FOLD to 0.3
            EYE_SQUINT_LEFT to 1.0 * strength
            BROW_DOWN_LEFT to 1.0
        }
        reset(0.64 * (i + 1))
    }
}

Note how we only passed duration to the defineGesture method and instead chose to implement our own handling of the strength parameter.

Persistent gestures and priority

Normally a gesture ends after the last frame. You can, however, make it persist by setting the flag persist = true on the last frame, as the built-in gesture CloseEyes does:

val CloseEyes = defineGesture("CloseEyes") {
    frame(0.4, persist = true) {
        BLINK_RIGHT to 1.0
        BLINK_LEFT to 1.0
    }
}

However, when a new gesture is called that involves the same parameters, these parameters will be overridden. So, if for example a Blink gesture is performed after the CloseEyes gesture, Furhat will blink and then keep the eyes open. To specify which gesture should have precedence, you can call it with a priority parameter:

furhat.gesture(CloseEyes, priority=10)

The default priority for all gestures is 0, so this would mean that Furhat would not blink anymore after performing a CloseEyes gesture with a higher priority. To make Furhat open the eyes again, you could call the gesture OpenEyes with a priority that is higher than or equal to the one used to close the eyes:

furhat.gesture(OpenEyes, priority = 10)

Since the OpenEyes gesture does not end with a persistent frame, the parameters BLINK_LEFT and BLINK_RIGHT are now free, which means that Furhat will start blinking again:

val OpenEyes = defineGesture("OpenEyes") {
    frame(0.4) {
        BLINK_RIGHT to 0.0
        BLINK_LEFT to 0.0
    }
}
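The precedence rule can be modeled with a small standalone sketch (illustrative only, not the SDK's internals): a persisted parameter is only taken over by a gesture whose priority is at least as high as the one currently holding it:

```kotlin
// Illustrative model of gesture priority (not Furhat SDK internals).
// A persisted parameter keeps its value and priority; a new gesture only
// takes over the parameter if its priority is >= the current holder's.
data class Hold(val value: Double, val priority: Int)

class ParamState {
    private val holds = mutableMapOf<String, Hold>()

    // Returns true if the new gesture was allowed to set the parameter.
    fun apply(param: String, value: Double, priority: Int): Boolean {
        val current = holds[param]
        return if (current == null || priority >= current.priority) {
            holds[param] = Hold(value, priority)
            true
        } else false
    }

    fun valueOf(param: String): Double = holds[param]?.value ?: 0.0
}

fun main() {
    val state = ParamState()
    state.apply("BLINK_LEFT", 1.0, priority = 10) // CloseEyes with priority 10
    state.apply("BLINK_LEFT", 0.0, priority = 0)  // ordinary Blink: rejected
    println(state.valueOf("BLINK_LEFT"))          // 1.0 -> eyes stay closed
    state.apply("BLINK_LEFT", 0.0, priority = 10) // OpenEyes with priority 10
    println(state.valueOf("BLINK_LEFT"))          // 0.0 -> eyes open again
}
```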

Reactive gestures

By default, Furhat reacts to certain events with gestures:

  • When "prominence" is detected in the synthesized speech (i.e., a part which is stressed), Furhat raises the eyebrows (Gestures.BrowRaise).
  • When the user starts to speak (while the system is listening), Furhat smiles (Gestures.Smile).

You can change or turn off these behaviours like this:

// Note that you have to make the following import for these parameters to be accessible:
import furhatos.autobehavior.*

// ...

// Randomly perform one of the provided gestures at prominence:
furhat.prominenceGesture = listOf(Gestures.BrowFrown, Gestures.Thoughtful)

// Make no gesture at speech start (empty list):
furhat.userSpeechStartGesture = listOf()

Automatic Smile Back

It's also possible (on a robot) to smile back at a user when the user smiles. To activate this functionality, use the furhat object like so:

val SmileBackExample = state {
    onEntry {
        furhat.enableSmileBack = true
    }
}

Now, when the user smiles, Furhat will smile back according to a set of predefined parameters.

Microexpressions

Microexpressions are small facial expressions that run continuously to make Furhat appear more "alive".

Furhat comes with a set of pre-defined microexpressions, but you can configure them from the skill. You can toggle the basic parameters (on/off) like this:

furhat.setDefaultMicroexpression(blinking = true, facialMovements = true, eyeMovements = false)

The basic parameters are:

  • blinking: Furhat blinks
  • facialMovements: Small movements in the face around the mouth and eyes
  • eyeMovements: The eye gaze continuously shifts a little, similar to small saccades.

Note that the eyebrow movements are not part of these microexpressions. Their control is associated with the reactive gestures above and set through the prominence list.

If you want more detailed control, you can define your own microexpressions:

furhat.setMicroexpression(
    defineMicroexpression {
        // Fluctuate facial movements. The first parameter is the frequency, the second the amplitude, the third the adjustment.
        fluctuate(0.025, 0.06, 0.12, BasicParams.BROW_UP_LEFT, BasicParams.BROW_UP_RIGHT)
        fluctuate(0.025, 0.2, 0.5, BasicParams.SMILE_CLOSED)
        fluctuate(0.025, 0.2, 0.5, BasicParams.EXPR_SAD)
        // Adjust eye gaze randomly (between -3 and 3 degrees) with a random interval of 200-400 ms.
        repeat(200..400) {
            adjust(-3.0..3.0, BasicParams.GAZE_PAN)
            adjust(-3.0..3.0, BasicParams.GAZE_TILT)
        }
        // Blinking with a random interval of 2-8 seconds
        repeat(2000..8000, Gestures.Blink)
    })

Using the LED ring

Furhat has a ring of LEDs at its base. While these LEDs primarily function as system indicators (signaling startup, giving feedback when the robot's output volume is changed, etc.), skill developers have access to an API for controlling the LEDs using the furhat.ledStrip.solid(Color) method.

Examples:

furhat.ledStrip.solid(java.awt.Color.RED)
furhat.ledStrip.solid(java.awt.Color(127,0,0))

Only solid colors are supported at the moment. The LEDs will continue to display the selected color until the strip is disabled by setting the color to black, Color(0, 0, 0).

Since the LEDs can generate a significant amount of heat, we advise against using RGB values larger than 127 for extended periods of time.
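To stay under that limit, a small hypothetical helper (not part of the Furhat SDK) can cap each channel before the color is passed to the LED API:

```kotlin
import java.awt.Color

// Hypothetical helper (not part of the Furhat SDK): cap each RGB channel
// at 127 so the LED ring can stay on longer without generating excess heat.
fun dimmed(color: Color, maxChannel: Int = 127): Color =
    Color(
        minOf(color.red, maxChannel),
        minOf(color.green, maxChannel),
        minOf(color.blue, maxChannel)
    )

fun main() {
    val safeRed = dimmed(Color.RED)   // Color.RED is (255, 0, 0)
    println(safeRed.red)              // 127
    // furhat.ledStrip.solid(safeRed) // pass the capped color to the LED API
}
```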