
Furhat Remote API (deprecated)

DEPRECATED: This application is no longer maintained. We recommend using the Realtime API instead.

The Furhat Remote API lets you connect to and send commands to your Furhat robot from a program running on an external computer on the same network. More than 50 programming languages are supported, including Python, C#, JavaScript and Rust (see the Swagger documentation for the full list).

We've created a dedicated Python wrapper to make it as easy as possible to use the API from a Python application.

An introduction, overview and setup instructions for the Remote API and the Python wrapper are available in this video.

Requests for more commands to include in the Remote API can be sent to: tech@furhatrobotics.com

Setup

Run the server on the Robot

The Remote API is included in Standard and Premium packages of the Furhat Robot. It can be started from the web interface of the robot, or in the SDK Launcher. When the API is running, the robot will start a Swagger Kotlin server listening on port 54321.
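
Before generating a client, you can check that the server is reachable with a plain HTTP request to the test endpoint (GET /furhat). Below is a minimal sketch using Python and the third-party requests library; the IP address is a placeholder, replace it with your robot's address:

python
import requests

# Base URL of the Remote API server running on the robot (placeholder IP)
BASE_URL = "http://192.168.1.100:54321"

# GET /furhat is the connection-test endpoint; it should return "hello world" on success
response = requests.get(f"{BASE_URL}/furhat")
print(response.status_code, response.text)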

Testing the Remote API using Postman

If you want, you can test the Remote API with the tool Postman:

  1. Download Postman,
  2. Download and open the Furhat Remote API yaml specification,
  3. Test the different requests (see the API Documentation section below).

Create a client for your specific programming language

In this step you will generate the code in your preferred programming language that will enable your program to communicate with Furhat.

If you want to use the Remote API from Python, we recommend using our pre-built PyPI library, which wraps the Remote API and makes it easier to install and use.

  1. Download Swagger or use their online editor,
  2. Paste the content from the Furhat Client yaml file,
  3. Click 'Generate Client' and select your language,
  4. Incorporate the client into your system and send requests to control the robot.

API Documentation

API Endpoints

All URIs are relative to http://<IP of the robot>:54321

| Endpoint | HTTP request method | Description |
| --- | --- | --- |
| /furhat/attend | POST | Attend a user/location |
| /furhat/face | POST | Change the character and mask, or texture and model (deprecated) |
| /furhat/visibility | POST | Fade the face in or out (FaceCore only) |
| /furhat/gesture | POST | Perform a gesture |
| /furhat/led | POST | Change the colour of the LED strip |
| /furhat/listen | GET | Make the robot listen, and get speech results |
| /furhat/listen/stop | POST | Make the robot stop listening |
| /furhat/say | POST | Make the robot speak |
| /furhat/say/stop | POST | Make the robot stop talking |
| /furhat/voice | POST | Set the voice of the robot |
| /furhat/gestures | GET | Get all gestures |
| /furhat | GET | Test connection |
| /furhat/users | GET | Get current users |
| /furhat/voices | GET | Get all the voices on the robot |

Authentication

None of the endpoints require authentication. Keep in mind that while this makes it easy to use, anyone on a larger network could theoretically listen in. It is recommended to stop the Remote API skill (or turn off the robot) when you are not actively working with it.

Endpoint details

/furhat

GET
Summary:

Test connection

Description:

Used to verify that the server is running; returns "hello world" upon success.

Responses
| Code | Description |
| --- | --- |
| 200 | Status update |

/furhat/gestures

GET
Summary:

Get all gestures

Description:

Returns a JSON array with all gestures on the system (names + duration).

Responses
| Code | Description | Schema |
| --- | --- | --- |
| 200 | A list of possible gestures | [ Gesture ] |

/furhat/voice

POST
Summary:

Set the voice of the robot

Description:

Sets the voice of the robot using the name of the voice. The available voice names can be retrieved with a GET request to /furhat/voices.

Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| name | query | The name of the voice | Yes | string |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
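
For example, switching voices by name could look like the sketch below, using the requests library. The robot address is a placeholder, and 'Matthew' is just an example name; use a name returned by GET /furhat/voices:

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# Set the voice; the voice name is passed as a query parameter
response = requests.post(f"{BASE_URL}/furhat/voice", params={"name": "Matthew"})
print(response.json())  # Status object with success/message fields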

/furhat/voices

GET
Summary:

Get all the voices on the robot

Description:

Returns a JSON array with voice names + languages.

Responses
| Code | Description | Schema |
| --- | --- | --- |
| 200 | Success | [ Voice ] |

/furhat/users

GET
Summary:

Get current users

Description:

Get all current users (max: 2). Returns a JSON array containing Users (Rotation, Location, id).

Responses
| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | [ User ] |
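
A sketch of fetching the current users with the requests library; the robot address is a placeholder, and the field names follow the User model in the Models section below:

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# GET /furhat/users returns a JSON array of User objects (id, rotation, location)
users = requests.get(f"{BASE_URL}/furhat/users").json()
for user in users:
    print(user["id"], user["location"])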

/furhat/say

POST
Summary:

Make the robot speak

Description:

Makes the robot speak, using either text or a URL (linking to a .wav file). If lipsync=true, it uses a .pho file hosted at the same URL, or generates the phonemes itself.

Note: Lipsync does not work for local audio.

Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| text | query | A string containing an utterance the robot should say | No | string |
| url | query | A URL link to an audio file (.wav) | No | string |
| blocking | query | Whether to block execution before completion | No | boolean |
| lipsync | query | If a URL is provided, indicates whether lipsync files should be generated/looked for | No | boolean |
| abort | query | Stops the current speech of the robot | No | boolean |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
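
As a rough sketch, the endpoint can also be called directly with the requests library. The robot address is a placeholder, the audio URL is the same one used in the Python example at the end of this page, and booleans are passed as lowercase strings to stay close to the query-string format:

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# Speak a text string and block until the speech has finished
requests.post(f"{BASE_URL}/furhat/say",
              params={"text": "Hi there!", "blocking": "true"})

# Play a remote .wav file and let the robot generate/look for lipsync data
requests.post(f"{BASE_URL}/furhat/say",
              params={"url": "https://www2.cs.uic.edu/~i101/SoundFiles/gettysburg10.wav",
                      "lipsync": "true"})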

/furhat/attend

POST
Summary:

Attend a user/location

Description:

Provides 3 modes of attention.

  1. Attend user based on enum (CLOSEST, OTHER or RANDOM)
  2. Attend user based on its id (can be retrieved by using /furhat/users)
  3. Attend location based on coordinates (x,y,z)
Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| user | query | Make furhat attend a user. Example: 'CLOSEST' | No | [ ] |
| userid | query | Make furhat attend the specified user | No | string |
| location | query | Make furhat attend a location, usage: x,y,z. Example: -20.0,-5.0,23.0 | No | string |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
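
The three modes map to three different query parameters, as in this requests sketch (the robot address is a placeholder; the user id and coordinates are taken from the Python example at the end of this page):

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# 1. Attend a user selected by enum
requests.post(f"{BASE_URL}/furhat/attend", params={"user": "CLOSEST"})

# 2. Attend a specific user by id (ids come from GET /furhat/users)
requests.post(f"{BASE_URL}/furhat/attend", params={"userid": "virtual-user-1"})

# 3. Attend a location given as x,y,z coordinates
requests.post(f"{BASE_URL}/furhat/attend", params={"location": "0.0,0.2,1.0"})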

/furhat/face

POST
Summary:

Changes the appearance of the robot, using either character and mask, or texture and model.

Description:

Changes either the character and mask (FaceCore face engine) or the texture and model (deprecated OpenSceneGraph face engine), based on the mask/character or model/texture name. Names are case sensitive and can be retrieved from the web interface.

Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| mask | query | Change the mask of the robot | No | string |
| character | query | Change the character of the robot | No | string |
| model | query | Change the model of the robot | No | string |
| texture | query | Change the texture of the robot | No | string |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
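
A sketch of changing the FaceCore character and mask with the requests library. The robot address, character name and mask name below are placeholders; use names from your robot's web interface and remember they are case sensitive:

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# Character and mask names are placeholders and case sensitive;
# retrieve the real names from the robot's web interface
requests.post(f"{BASE_URL}/furhat/face",
              params={"character": "Isabel", "mask": "adult"})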

/furhat/visibility

POST
Summary:

Fade in/out the face

Description:

Triggers an animation which fades the face out to black, or in again, with a set duration in the range [0, 10000] ms. Invalid values are coerced into this range, and a missing duration defaults to 2000 ms. This command is only applicable to the FaceCore face engine.

Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| visible | query | Whether the face should be made visible or not | Yes | boolean |
| duration | query | Duration of the fade animation in milliseconds | No | integer |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
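
For example, fading the face out over one second and back in again could look like this requests sketch (robot address is a placeholder):

python
import requests
import time

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# Fade the face out to black over 1000 ms
requests.post(f"{BASE_URL}/furhat/visibility",
              params={"visible": "false", "duration": 1000})

time.sleep(1.5)  # wait for the fade-out to finish

# Fade the face back in, using the default duration of 2000 ms
requests.post(f"{BASE_URL}/furhat/visibility", params={"visible": "true"})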

/furhat/gesture

POST
Summary:

Perform a gesture

Description:

Performs a gesture based on

  1. Gesture name (retrieve by GET request to /furhat/gestures)
  2. Gesture definition (see the GestureDefinition examples below)
Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| name | query | The gesture to do | No | string |
| blocking | query | Whether to block execution before completion | No | boolean |
| body | body | Definition of the gesture | No | GestureDefinition |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
| 400 | Parameters are wrong | Status |
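
Both variants can be exercised with plain HTTP, as in this requests sketch. The robot address is a placeholder, 'BigSmile' is one of the built-in gestures shown later on this page, and the custom gesture name 'BlinkLeft' is made up for illustration:

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# 1. Trigger a built-in gesture by name
requests.post(f"{BASE_URL}/furhat/gesture", params={"name": "BigSmile"})

# 2. Send a GestureDefinition as the request body (see the examples below)
blink = {
    "name": "BlinkLeft",  # illustrative name
    "frames": [
        {"time": [0.33], "params": {"BLINK_LEFT": 1.0}},
        {"time": [0.67], "params": {"reset": True}},
    ],
    "class": "furhatos.gestures.Gesture",
}
requests.post(f"{BASE_URL}/furhat/gesture", params={"blocking": "true"}, json=blink)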

/furhat/listen

GET
Summary:

Make the robot listen, and get speech results

Description:

Blocking call to get user speech input; the language defaults to en-US, and the language parameter can be used to provide a different language. The return value can be found in the Status object as message and can be:

  • User speech
  • SILENCE
  • INTERRUPTED
  • FAILED
Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| language | query | The language to listen for, defaults to en-US | No | string |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
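
Because the call blocks until the user has finished speaking (or the listen fails), a raw HTTP sketch is just a GET with an optional language parameter. The robot address is a placeholder, and the message field name follows the Status model below:

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# Blocks until speech is recognised; the result is in the Status object's message field
status = requests.get(f"{BASE_URL}/furhat/listen", params={"language": "en-US"}).json()
print(status["message"])  # user speech, or SILENCE / INTERRUPTED / FAILED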

/furhat/listen/stop

POST
Summary:

Make the robot stop listening

Description:

Aborts the listen

Responses
| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |

/furhat/led

POST
Summary:

Change the colour of the LED strip

Description:

Changes the colour of the robot's LED strip. Colour values can be between 0 and 255 (values above 255 are clamped to 255). Any parameter not provided defaults to 0.

Parameters
| Name | Located in | Description | Required | Schema |
| --- | --- | --- | --- | --- |
| red | query | The amount of red | No | integer |
| green | query | The amount of green | No | integer |
| blue | query | The amount of blue | No | integer |

Responses

| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |
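
For example, tinting the LED strip could look like this requests sketch (robot address is a placeholder; the colour values match the Python example at the end of this page):

python
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder, replace with your robot's IP

# Set the LED strip to a warm red; channels left out would default to 0
requests.post(f"{BASE_URL}/furhat/led",
              params={"red": 200, "green": 50, "blue": 50})

# Turn the LEDs off again by sending no colour values (all channels default to 0)
requests.post(f"{BASE_URL}/furhat/led")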

/furhat/say/stop

POST
Summary:

Make the robot stop talking

Description:

Stops the current speech.

Responses
| Code | Description | Schema |
| --- | --- | --- |
| 200 | Successful operation | Status |

Models

Status

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| success | boolean | | No |
| message | string | | No |

Gesture

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| name | string | | No |
| duration | integer (double) | | No |

User

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| id | string | | No |
| rotation | Rotation | | No |
| location | Location | | No |

Location

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| x | double | Robot's left | No |
| y | double | Robot's forward | No |
| z | double | Robot's up | No |

Voice

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| name | string | | No |
| language | string | | No |

Rotation

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| x | double | | No |
| y | double | | No |
| z | double | | No |

GestureDefinition

The class name needs to be furhatos.gestures.Gesture; otherwise it won't be parsed as a gesture. Examples can be found below.

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| name | string | | No |
| frames | [ Frame ] | | No |
| class | string | | No |

GestureDefinition examples

All BasicParams are listed here:

kotlin
    //All parameters have values between 0.0 and 1.0 (Except for the ones at the bottom).
    EXPR_ANGER
    EXPR_DISGUST
    EXPR_FEAR
    EXPR_SAD
    SMILE_CLOSED
    SMILE_OPEN
    SURPRISE
    BLINK_LEFT
    BLINK_RIGHT
    BROW_DOWN_LEFT
    BROW_DOWN_RIGHT
    BROW_IN_LEFT
    BROW_IN_RIGHT
    BROW_UP_LEFT
    BROW_UP_RIGHT

    EYE_SQUINT_LEFT
    EYE_SQUINT_RIGHT

    LOOK_DOWN
    LOOK_LEFT
    LOOK_RIGHT
    LOOK_UP

    PHONE_AAH
    PHONE_B_M_P
    PHONE_BIGAAH
    PHONE_CH_J_SH
    PHONE_D_S_T
    PHONE_EE
    PHONE_EH
    PHONE_F_V
    PHONE_I
    PHONE_K
    PHONE_N
    PHONE_OH
    PHONE_OOH_Q
    PHONE_R
    PHONE_TH
    PHONE_W

    LOOK_DOWN_LEFT
    LOOK_DOWN_RIGHT
    LOOK_LEFT_LEFT
    LOOK_LEFT_RIGHT
    LOOK_RIGHT_LEFT
    LOOK_RIGHT_RIGHT
    LOOK_UP_LEFT
    LOOK_UP_RIGHT

    //The following parameters have values in the range -50.0 to 50.0
    NECK_TILT
    NECK_PAN
    NECK_ROLL
    GAZE_PAN
    GAZE_TILT

Note that you can also use any of the FaceCore-compatible ARKitParams or CharParams, or any pre-recorded gestures, assuming the FaceCore face engine is used.

A couple of gesture examples (the persist field is optional):

Built-in BigSmile

json
{
  "name":"BigSmile",
  "frames":[
    {
      "time":[0.32,0.64],
      "persist":false, <- Optional
      "params":{
        "BROW_UP_LEFT":1,
        "BROW_UP_RIGHT":1,
        "SMILE_OPEN":0.4,
        "SMILE_CLOSED":0.7
        }
    },
    {
      "time":[0.96],
      "persist":false, <- Optional
      "params":{
        "reset":true
        }
    }],
  "class":"furhatos.gestures.Gesture"
}

Custom gesture

json
{
  "frames": [
    {
      "time": [
        0.17, 1.0, 6.0
      ],
      "params": {
        "NECK_ROLL": 25.0,
        "NECK_PAN": -12.0,
        "NECK_TILT": -25.0
      }
    },
    {
      "time": [
        7.0
      ],
      "params": {
        "reset": true
      }
    }
  ],
  "name": "Cool Thing",
  "class": "furhatos.gestures.Gesture"
}

Frame

A list of times can be provided; the params will be applied at each of those times.

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| time | [ double ] | | No |
| params | BasicParam | | No |

BasicParam

All supported parameters are listed in the GestureDefinition examples section above.

| Name | Type | Description | Required |
| --- | --- | --- | --- |
| BasicParam | object | | |

Python Remote API

Description

To simplify the use of the Furhat Remote API from Python, there is a package on PyPi called furhat-remote-api.

Installation

You can install the package using pip:

sh
pip install furhat-remote-api

(you may need to run pip with root permission: sudo pip install furhat-remote-api)

Usage

This shows how the different methods in the Remote API can be invoked from Python.

python
from furhat_remote_api import FurhatRemoteAPI

# Create an instance of the FurhatRemoteAPI class, providing the address of the robot or the SDK running the virtual robot
furhat = FurhatRemoteAPI("localhost")

# Get the voices on the robot
voices = furhat.get_voices()

# Set the voice of the robot
furhat.set_voice(name='Matthew')

# Say "Hi there!"
furhat.say(text="Hi there!")

# Play an audio file (with lipsync automatically added)
furhat.say(url="https://www2.cs.uic.edu/~i101/SoundFiles/gettysburg10.wav", lipsync=True)

# Listen to user speech and return ASR result
result = furhat.listen()

# Perform a named gesture
furhat.gesture(name="BrowRaise")

# Perform a custom gesture
furhat.gesture(body={
    "frames": [
        {
            "time": [
                0.33
            ],
            "params": {
                "BLINK_LEFT": 1.0
            }
        },
        {
            "time": [
                0.67
            ],
            "params": {
                "reset": True
            }
        }
    ],
    "class": "furhatos.gestures.Gesture"
    })

# Get the users detected by the robot
users = furhat.get_users()

# Attend the user closest to the robot
furhat.attend(user="CLOSEST")

# Attend a user with a specific id
furhat.attend(userid="virtual-user-1")

# Attend a specific location (x,y,z)
furhat.attend(location="0.0,0.2,1.0")

# Set the LED lights
furhat.set_led(red=200, green=50, blue=50)