British Sign Language Machine Learning Demo

The Trigger:

To apply newly gained machine learning knowledge to create a game that could take advantage of BSL's expressive nature.

The Quest:

The software Processing was used for both the input and output programs. The input program adapted the 'Leap Motion via Processing' example provided by Wekinator, which uses the Leap Motion library. This example already provided code for gathering the X, Y, and Z positions of the tips of all four fingers and the thumb. As BSL makes use of every part of the hand, adding data on the pitch, yaw, and roll of the hand helped to improve the accuracy of Wekinator's sign recognition.
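
To illustrate the approach, here is a minimal sketch of the idea (not the adapted example itself; the ports are Wekinator's documented defaults, while the variable names are mine). The input side amounts to reading the hand from the de.voidplus.leapmotion library each frame and sending the values to Wekinator as a /wek/inputs OSC message:

    // Minimal sketch: 15 fingertip coordinates plus hand pitch, yaw and roll,
    // sent to Wekinator as OSC floats every frame.
    import de.voidplus.leapmotion.*;
    import oscP5.*;
    import netP5.*;

    LeapMotion leap;
    OscP5 osc;
    NetAddress wekinator;

    void setup() {
      size(640, 480);
      leap = new LeapMotion(this);
      osc = new OscP5(this, 9000);                      // local listening port (arbitrary)
      wekinator = new NetAddress("127.0.0.1", 6448);    // Wekinator's default input port
    }

    void draw() {
      background(0);
      for (Hand hand : leap.getHands()) {
        OscMessage msg = new OscMessage("/wek/inputs"); // Wekinator's standard input address
        for (Finger finger : hand.getFingers()) {
          PVector tip = finger.getPosition();           // fingertip X, Y, Z (5 fingers x 3 = 15)
          msg.add(tip.x);
          msg.add(tip.y);
          msg.add(tip.z);
        }
        msg.add(hand.getPitch());                       // hand orientation (3 more inputs)
        msg.add(hand.getYaw());
        msg.add(hand.getRoll());
        osc.send(msg, wekinator);
      }
    }

Adding the palm position from hand.getPosition() in the same way would bring the total to the 21 inputs described below.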

Wekinator was then trained using the Dynamic Time Warping (DTW) algorithm on the data provided by the Leap Motion; I demonstrated the signs myself.

The output program offered users a choice of five signs, each with a video clip that I also recorded myself.

You can download this demo from itch.io; however, you'll need a Leap Motion to play it.

The Heroine:

In the interest of time, I decided not to program from scratch; instead, I chose to adapt the following two programs found on the Wekinator website:

  • Wekinator’s LeapMotion_Fingers_15Inputs_Adapted
  • Wekinator’s Processing_TriggerText_1DTW

Libraries used:

  • de.voidplus.leapmotion
  • processing.video

My own work on this project included improving the Leap Motion input approach by increasing the number of inputs from 15 to 21, adding the yaw, pitch, and roll of the hand along with its X, Y, and Z position; the position values were ultimately ignored, as they proved not to benefit Wekinator's accuracy. I also added instructions in the Leap Motion window telling users how to use it.

For the output program, I created the UI template with the video clips, instructions, and buttons. The template was designed for ten video clips and ten buttons; the unused ones were hidden (commented out) in the end, as they were not needed. I took advantage of the Text Trigger code snippet, which gives users feedback when their sign is recognised.
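
A sketch of how that feedback loop might look is below. The video file names are assumptions, and the message format is illustrative: 12000 is Wekinator's default output port and /wek/outputs its standard output address, but the exact message a DTW model sends is configurable, so the real example may differ.

    // Hypothetical sketch of the trigger idea: show feedback text and play
    // the matching clip when Wekinator reports a recognised sign.
    import oscP5.*;
    import netP5.*;
    import processing.video.*;

    OscP5 osc;
    Movie[] clips = new Movie[5];          // one clip per sign option
    String[] names = { "who.mp4", "what.mp4", "where.mp4",
                       "when.mp4", "why.mp4" };   // hypothetical file names
    String feedback = "";
    int current = -1;

    void setup() {
      size(640, 480);
      osc = new OscP5(this, 12000);        // Wekinator's default output port
      for (int i = 0; i < clips.length; i++) {
        clips[i] = new Movie(this, names[i]);
      }
    }

    void oscEvent(OscMessage msg) {
      // Assumed format: the first float argument identifies the matched sign.
      if (msg.checkAddrPattern("/wek/outputs")) {
        int sign = (int) msg.get(0).floatValue();
        if (sign >= 1 && sign <= clips.length) {
          current = sign - 1;
          feedback = "Recognised sign " + sign + "!";
          clips[current].play();
        }
      }
    }

    void movieEvent(Movie m) {
      m.read();                            // load the next video frame
    }

    void draw() {
      background(0);
      if (current >= 0) image(clips[current], 0, 60);
      fill(255);
      text(feedback, 20, 30);
    }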

The Wisdom:

This project proved more challenging than I expected, especially given the Leap Motion's limited capability. As I had not used a Leap Motion before this project, I had an inaccurate impression of its potential, based on what I had seen of its use on the internet. Some of the projects that inspired this one used the Leap Motion with what seemed to me to be high accuracy. I did make allowance for the fact that both of those achievements used American Sign Language (ASL), which uses one-handed fingerspelling rather than BSL's two-handed fingerspelling, for example.

To my disappointment, the Leap Motion was glitchy, and it proved difficult to record accurate examples; for users, this sometimes resulted in confusing outcomes. I attempted to improve this by increasing the number of inputs, which helped somewhat. However, I discovered as I went along that on some days the responses seemed excellent: it could even differentiate between 'who' and 'what', which use the same handshape, the key difference being that the latter uses a rotation while the former uses a waggle. Other times, it failed to recognise either of them at all. So I feel that lighting and setup, such as the distance between the Leap Motion and myself, as well as my posture, might have factored into the degree of success, and these proved impossible to account for.

When I previously wrote a research paper on VR and British Sign Language, and discovered the limited progress made on sign language as input, I thought that accuracy might be improved if these projects took handshapes into consideration, as they provide the basis of many signs. Therefore, I attempted to build a dual model combining a classifier model with a DTW model, requiring a positive outcome from both to produce the final outcome: the correct identification of the sign.
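
The combination step itself would be simple even though I never got the full pipeline working; a sketch of the idea, assuming each model reports its latest result (all names and the time window here are mine):

    // Illustrative dual-model gate: accept a sign only when the handshape
    // classifier and the DTW motion model agree within a short time window.
    int classifierSign = -1;     // latest handshape class reported
    int dtwSign = -1;            // latest DTW gesture match reported
    int classifierTime = 0;      // millis() when each result arrived
    int dtwTime = 0;
    final int WINDOW = 500;      // agreement window in milliseconds

    // Called whenever either model reports a new result over OSC.
    int combine() {
      boolean fresh = abs(classifierTime - dtwTime) < WINDOW;
      if (fresh && classifierSign == dtwSign) {
        return classifierSign;   // both models agree: accept the sign
      }
      return -1;                 // disagreement or stale data: no sign
    }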

I then attempted to use the Wekinator Input Helper to measure the positions of the fingertips relative to the position of the hand (palm), which worked within the Input Helper itself. However, I was unable to get Wekinator to successfully receive the Input Helper's outputs when using the DTW algorithm. This was more successful with a classifier algorithm; unfortunately, I did not have the time to retrain Wekinator and experiment with the AdaBoost, Support Vector Machine, or Naïve Bayes algorithms. This would also have required the Input Helper's aid to create a buffer of inputs, as BSL is dynamic.
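
The measurement the Input Helper performed, each fingertip's position relative to the palm, could alternatively be computed directly in the input sketch using the same Leap Motion library. A minimal sketch of that calculation (the helper's name is mine), which makes the handshape features independent of where the hand hovers over the sensor:

    // Hypothetical helper: express each fingertip relative to the palm, so
    // the handshape features do not depend on the hand's absolute position.
    void addRelativeFingerInputs(Hand hand, OscMessage msg) {
      PVector palm = hand.getPosition();    // palm position in sketch coordinates
      for (Finger finger : hand.getFingers()) {
        PVector tip = finger.getPosition();
        msg.add(tip.x - palm.x);            // fingertip offset from the palm
        msg.add(tip.y - palm.y);
        msg.add(tip.z - palm.z);
      }
    }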