Furhat Remote API
Note: This feature is considered a public BETA and can change significantly based on user feedback!
Furhat Remote API is a way to connect to and give commands to your Furhat robot from a program running on an external computer on the same network. A number of programming languages are supported, including Python, C#, JavaScript, Rust and more (50+ languages; see the Swagger documentation for the full list).
If you want to use it from Python, we recommend using the Python PyPi library, which wraps the Remote API and makes it easier to install and use.
Setup
Run the server on the Robot
Download the Remote API skill file and load it onto your Furhat robot. When the skill is started, the robot starts a Swagger Kotlin server listening on port 54321.
Testing the remote API using Postman
If you want, you can test the Remote API with the tool Postman:
- Download Postman,
- Download and open the Furhat Remote API yaml specification,
- Test the different requests (look for API documentation below).
Create a client for your specific programming language
In this step you will generate the code in your preferred programming language that will enable your program to communicate with Furhat.
- Download Swagger or use their online editor,
- Paste the content of the Furhat Client yaml file,
- Click 'Generate Client' and select your language,
- Incorporate the client into your system and send requests to control the robot.
API Documentation
API Endpoints
All URIs are relative to http://<robot IP>:54321
Method | HTTP request | Description |
---|---|---|
furhatAttendPost | POST /furhat/attend | Attend a user/location |
furhatFacePost | POST /furhat/face | Change the texture and/or model |
furhatGesturePost | POST /furhat/gesture | Perform a gesture |
furhatLedPost | POST /furhat/led | Change the colour of the LED strip |
furhatListenGet | GET /furhat/listen | Make the robot listen, and get speech results |
furhatListenStopPost | POST /furhat/listen/stop | Make the robot stop listening |
furhatSayPost | POST /furhat/say | Make the robot speak |
furhatSayStopPost | POST /furhat/say/stop | Make the robot stop talking |
furhatVoicePost | POST /furhat/voice | Set the voice of the robot |
furhatGesturesGet | GET /furhat/gestures | Get all gestures |
furhatGet | GET /furhat | Test connection |
furhatUsersGet | GET /furhat/users | Get current users |
furhatVoicesGet | GET /furhat/voices | Get all the voices on the robot |
Models
- io.swagger.server.models.BasicParam
- io.swagger.server.models.Frame
- io.swagger.server.models.Gesture
- io.swagger.server.models.GestureDefinition
- io.swagger.server.models.Location
- io.swagger.server.models.Rotation
- io.swagger.server.models.Status
- io.swagger.server.models.User
- io.swagger.server.models.Voice
Authorization
No endpoint requires authorization.
/furhat
GET
Summary:
Test connection
Description:
Used to verify that the server is running; returns "hello world" upon success.
Responses
Code | Description |
---|---|
200 | Status update |
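For a quick sanity check without generating a client, you can call this endpoint directly from Python with the requests library. This is only a sketch; the IP address below is a placeholder for your robot's address.
import requests

# Placeholder address: replace with the IP of your robot (or localhost for the virtual robot)
BASE_URL = "http://192.168.1.100:54321"

# GET /furhat returns "hello world" when the Remote API skill is running
response = requests.get(BASE_URL + "/furhat")
print(response.status_code, response.text)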
/furhat/gestures
GET
Summary:
Get all gestures
Description:
Returns a JSON array with all gestures on the system (names + duration).
Responses
Code | Description | Schema |
---|---|---|
200 | A list of possible gestures | [ Gesture ] |
/furhat/voice
POST
Summary:
Set the voice of the robot
Description:
Sets the voice of the robot using the voice name. The available names can be retrieved with a GET request to /furhat/voices.
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
name | query | The name of the voice | Yes | string |
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
/furhat/voices
GET
Summary:
Get all the voices on the robot
Description:
Returns a JSON array with voice names + languages.
Responses
Code | Description | Schema |
---|---|---|
200 | Success | [ Voice ] |
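For example, a minimal requests-based sketch (not the generated client) that lists the voices and then activates one by name; the JSON field names follow the Voice model documented under Models, and the IP address is a placeholder.
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# GET /furhat/voices returns a JSON array of Voice objects (name + language)
voices = requests.get(BASE_URL + "/furhat/voices").json()
print(voices)

# POST /furhat/voice sets the active voice by name (query parameter)
status = requests.post(BASE_URL + "/furhat/voice", params={"name": voices[0]["name"]})
print(status.json())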
/furhat/users
GET
Summary:
Get current users
Description:
Get all current users (max: 2). Returns a JSON array containing Users (Rotation, Location, id).
Responses
Code | Description | Schema |
---|---|---|
200 | successful operation | [ User ] |
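A short requests sketch; the field names assume the User model documented under Models, and the IP address is a placeholder.
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# GET /furhat/users returns up to two User objects (id, rotation, location)
users = requests.get(BASE_URL + "/furhat/users").json()
for user in users:
    print(user["id"], user["location"])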
/furhat/say
POST
Summary:
Make the robot speak
Description:
Makes the robot speak by either using text or a URL (linking to a .wav file). If lipsync=true, it uses a .pho file hosted at the same URL, or generates phonemes by itself.
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
text | query | A string containing an utterance the robot should say. | No | string |
url | query | A URL link to an audio file (.wav) | No | string |
blocking | query | Whether to block execution before completion | No | boolean |
lipsync | query | If a URL is provided, indicate if lipsync files should be generated/looked for. | No | boolean |
abort | query | Stops the current speech of the robot. | No | boolean |
Responses
Code | Description | Schema |
---|---|---|
200 | successful operation | Status |
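For illustration, a hedged requests sketch showing both the text and url variants; the IP address is a placeholder and the audio URL is the same sample file used in the Python example further down.
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# Speak a text string; blocking=true makes the call return only when speech has finished
requests.post(BASE_URL + "/furhat/say",
              params={"text": "Hello from the Remote API", "blocking": "true"})

# Play a hosted .wav file and ask for lipsync to be generated or looked up
requests.post(BASE_URL + "/furhat/say",
              params={"url": "https://www2.cs.uic.edu/~i101/SoundFiles/gettysburg10.wav",
                      "lipsync": "true"})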
/furhat/attend
POST
Summary:
Attend a user/location
Description:
Provides 3 modes of attention: 1. Attend a user based on an enum (CLOSEST, OTHER or RANDOM) 2. Attend a user based on its id (can be retrieved by using /furhat/users) 3. Attend a location based on coordinates (x,y,z)
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
user | query | Make furhat attend a user. Example 'CLOSEST' | No | [ ] |
userid | query | Make furhat attend specified user | No | string |
location | query | Make furhat attend location, usage: x,y,z. Example -20.0,-5.0,23.0 | No | string |
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
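The three modes map to three different query parameters, as in this requests sketch (the IP address and user id are placeholders):
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# Mode 1: attend a user selected by enum value
requests.post(BASE_URL + "/furhat/attend", params={"user": "CLOSEST"})

# Mode 2: attend a specific user id (ids come from GET /furhat/users)
requests.post(BASE_URL + "/furhat/attend", params={"userid": "virtual-user-1"})

# Mode 3: attend a location given as "x,y,z"
requests.post(BASE_URL + "/furhat/attend", params={"location": "0.0,0.2,1.0"})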
/furhat/face
POST
Summary:
Change the texture and/or model
Description:
Changes the texture or model, based on the model/texture name. Case sensitive. Names can be retrieved from the web interface.
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
model | query | Change the model of the robot | No | string |
texture | query | Change the texture of the robot | No | string |
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
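For example (a requests sketch; the texture name below is only an illustration, use a name from your robot's web interface, and replace the placeholder IP):
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# Change the texture; names are case sensitive
requests.post(BASE_URL + "/furhat/face", params={"texture": "Isabel"})  # example name only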
/furhat/gesture
POST
Summary:
Perform a gesture
Description:
Performs a gesture based on either 1. a gesture name (retrieve via a GET request to /furhat/gestures) or 2. a gesture definition, see the examples below.
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
name | query | The gesture to do | No | string |
blocking | query | Whether to block execution before completion | No | boolean |
definition | body | | No | GestureDefinition |
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
400 | Parameters are wrong | Status |
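Both variants in a requests sketch: the named gesture reuses the built-in BigSmile shown under the GestureDefinition examples, while the definition body is a hypothetical blink. The IP address is a placeholder.
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# Variant 1: perform a built-in gesture by name (names come from GET /furhat/gestures)
requests.post(BASE_URL + "/furhat/gesture", params={"name": "BigSmile"})

# Variant 2: send a GestureDefinition as the JSON request body
definition = {
    "name": "QuickBlink",  # hypothetical custom gesture
    "frames": [
        {"time": [0.33], "params": {"BLINK_LEFT": 1.0}},
        {"time": [0.67], "params": {"reset": True}},
    ],
    "class": "furhatos.gestures.Gesture",
}
requests.post(BASE_URL + "/furhat/gesture", json=definition)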
/furhat/listen
GET
Summary:
Make the robot listen, and get speech results
Description:
Blocking call to get user speech input; the language defaults to en-US, and the language parameter can be used to provide a different language. The return value can be found in the Status object as message and is one of: the user's speech, SILENCE, INTERRUPTED, or FAILED.
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
language | query | The language to listen for, defaults to en-US | No | string |
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
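A requests sketch (placeholder IP); remember that the call blocks until the user stops speaking or listening fails.
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# Blocking GET; the result is a Status object whose message field holds the outcome
status = requests.get(BASE_URL + "/furhat/listen", params={"language": "en-US"}).json()
print(status["message"])  # user speech, or SILENCE / INTERRUPTED / FAILED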
/furhat/listen/stop
POST
Summary:
Make the robot stop listening
Description:
Aborts the listen
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
/furhat/led
POST
Summary:
Change the colour of the LED strip
Description:
Changes the LED strip of the robot. Colour values can be between 0 and 255 (values above 255 are clamped to 255). Any parameter not provided defaults to 0.
Parameters
Name | Located in | Description | Required | Schema |
---|---|---|---|---|
red | query | The amount of red | No | integer |
green | query | The amount of green | No | integer |
blue | query | The amount of blue | No | integer |
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
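For example (requests sketch, placeholder IP):
import requests

BASE_URL = "http://192.168.1.100:54321"  # placeholder robot IP

# Set the LED strip to a warm red; omitted channels default to 0
requests.post(BASE_URL + "/furhat/led", params={"red": 200, "green": 50, "blue": 50})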
/furhat/say/stop
POST
Summary:
Make the robot stop talking
Description:
Stops the current speech.
Responses
Code | Description | Schema |
---|---|---|
200 | Successful operation | Status |
Models
Status
Name | Type | Description | Required |
---|---|---|---|
success | boolean | | No |
message | string | | No |
Gesture
Name | Type | Description | Required |
---|---|---|---|
name | string | | No |
duration | integer (double) | | No |
User
Name | Type | Description | Required |
---|---|---|---|
id | string | | No |
rotation | Rotation | | No |
location | Location | | No |
Location
Name | Type | Description | Required |
---|---|---|---|
x | double | | No |
y | double | | No |
z | double | | No |
Voice
Name | Type | Description | Required |
---|---|---|---|
name | string | | No |
language | string | | No |
Rotation
Name | Type | Description | Required |
---|---|---|---|
x | double | | No |
y | double | | No |
z | double | | No |
GestureDefinition
The class name needs to be furhatos.gestures.Gesture otherwise it won't be parsed as a gesture. Examples can be found below.
Name | Type | Description | Required |
---|---|---|---|
name | string | | No |
frames | [ Frame ] | | No |
class | string | | No |
GestureDefinition examples
All basic params are listed here:
//All parameters have values between 0.0 and 1.0 (Except for the ones at the bottom).
EXPR_ANGER
EXPR_DISGUST
EXPR_FEAR
EXPR_SAD
SMILE_CLOSED
SMILE_OPEN
SURPRISE
BLINK_LEFT
BLINK_RIGHT
BROW_DOWN_LEFT
BROW_DOWN_RIGHT
BROW_IN_LEFT
BROW_IN_RIGHT
BROW_UP_LEFT
BROW_UP_RIGHT
EYE_SQUINT_LEFT
EYE_SQUINT_RIGHT
LOOK_DOWN
LOOK_LEFT
LOOK_RIGHT
LOOK_UP
PHONE_AAH
PHONE_B_M_P
PHONE_BIGAAH
PHONE_CH_J_SH
PHONE_D_S_T
PHONE_EE
PHONE_EH
PHONE_F_V
PHONE_I
PHONE_K
PHONE_N
PHONE_OH
PHONE_OOH_Q
PHONE_R
PHONE_TH
PHONE_W
LOOK_DOWN_LEFT
LOOK_DOWN_RIGHT
LOOK_LEFT_LEFT
LOOK_LEFT_RIGHT
LOOK_RIGHT_LEFT
LOOK_RIGHT_RIGHT
LOOK_UP_LEFT
LOOK_UP_RIGHT
//The following parameters have values in the range -50.0 to 50.0
NECK_TILT
NECK_PAN
NECK_ROLL
GAZE_PAN
GAZE_TILT
A couple of gesture examples:
Built-in BigSmile
{
"name":"BigSmile",
"frames":[
{
"time":[0.32,0.64],
"persist":false, <- Optional
"params":{
"BROW_UP_LEFT":1,
"BROW_UP_RIGHT":1,
"SMILE_OPEN":0.4,
"SMILE_CLOSED":0.7
}
},
{
"time":[0.96],
"persist":false, <- Optional
"params":{
"reset":true
}
}],
"class":"furhatos.gestures.Gesture"
}
Custom gesture
{
"frames": [
{
"time": [
0.17, 1.0, 6.0
],
"params": {
"NECK_ROLL": 25.0,
"NECK_PAN": -12.0,
"NECK_TILT": -25.0
}
},
{
"time": [
7.0
],
"params": {
"reset": true
}
}
],
"name": "Cool Thing",
"class": "furhatos.gestures.Gesture"
}
Frame
A list of times can be provided, at those times the params will be executed.
Name | Type | Description | Required |
---|---|---|---|
time | [ double ] | | No |
params | BasicParam | | No |
BasicParam
All supported parameters are listed under the GestureDefinition examples above.
Name | Type | Description | Required |
---|---|---|---|
BasicParam | object | | |
Python Remote API
Description
To simplify the use of the Furhat Remote API from Python, there is a package on PyPi called furhat-remote-api.
Installation
You can install the package using pip:
pip install furhat-remote-api
(you may need to run pip with root permission: sudo pip install furhat-remote-api)
Usage
This shows how the different methods in the Remote API can be invoked from Python.
from furhat_remote_api import FurhatRemoteAPI
# Create an instance of the FurhatRemoteAPI class, providing the address of the robot or the SDK running the virtual robot
furhat = FurhatRemoteAPI("localhost")
# Get the voices on the robot
voices = furhat.get_voices()
# Set the voice of the robot
furhat.set_voice(name='Matthew')
# Say "Hi there!"
furhat.say(text="Hi there!")
# Play an audio file (with lipsync automatically added)
furhat.say(url="https://www2.cs.uic.edu/~i101/SoundFiles/gettysburg10.wav", lipsync=True)
# Listen to user speech and return ASR result
result = furhat.listen()
# Perform a named gesture
furhat.gesture(name="BrowRaise")
# Perform a custom gesture
furhat.gesture(definition={
"frames": [
{
"time": [
0.33
],
"params": {
"BLINK_LEFT": 1.0
}
},
{
"time": [
0.67
],
"params": {
"reset": True
}
}
],
"class": "furhatos.gestures.Gesture"
})
# Get the users detected by the robot
users = furhat.get_users()
# Attend the user closest to the robot
furhat.attend(user="CLOSEST")
# Attend a user with a specific id
furhat.attend(userid="virtual-user-1")
# Attend a specific location (x,y,z)
furhat.attend(location="0.0,0.2,1.0")
# Set the LED lights
furhat.set_led(red=200, green=50, blue=50)
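As a final sketch, the calls above can be combined into a simple interaction loop. This assumes the object returned by listen() exposes the message field of the Status model described above; treat that attribute access as an assumption rather than documented behaviour.
from furhat_remote_api import FurhatRemoteAPI

furhat = FurhatRemoteAPI("localhost")

# Simple echo loop: repeat what the user says until they say "goodbye"
while True:
    result = furhat.listen()
    heard = result.message  # assumption: Status.message as in the REST model
    if heard in ("SILENCE", "INTERRUPTED", "FAILED"):
        continue
    furhat.say(text="You said " + heard)
    if "goodbye" in heard.lower():
        break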