In early August, a graduate of NSTU NETI completed work on a prototype system that translates the language of people with hearing impairments and controls a computer with gestures. Alexey Prikhodko is the only deaf programmer in the world working on the creation of an automatic sign language translator.
In addition to its built-in translation function, the system can control a computer with gestures: hand movements adjust the volume and brightness and move the cursor on the screen without a mouse.
According to Alexey Prikhodko, many companies today claim to have created a ready-made translator for the deaf, advertising and even selling it. "Savvy Motion, the Kinect Sign Language Translator from Microsoft Research, and other large companies have not yet succeeded at translating from sign language into spoken language, so the quality of such applications leaves much to be desired," he remarked.
The difficulty of developing a satisfactory translator stems from the main feature of sign language: its grammar. That is why no program can yet replace live interpreters. Translation does not depend solely on hand configuration and orientation; hand movement, location, and the so-called non-manual component of gestures (facial expression, lip movement, and other articulation cues) are also of crucial importance.
"It is not difficult to translate from written language into sign language; it is difficult to automatically recognize gestures and translate them. Much also depends on the cameras and sensors. Automated systems recognize gestures in two ways: marker-based and marker-less. In a marker-based system, a person wears special gloves, wrist devices, bracelets, or other units that track muscle movement and points on the body. I chose the harder way, which does not require special equipment: my program uses the marker-less method, recognizing a person's gestures with the help of cameras," says Alexey Prikhodko.
The marker-less approach used in Alexey's system superimposes a virtual "grid" on the image received from the cameras. On this "grid", the software's algorithms find the control points needed to recognize gestures. The system then processes the data and performs the specified action: translation or computer control.
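The article does not describe the internals of this "grid", but a common first step in skeletal gesture recognition is to normalize the tracked control points so that recognition does not depend on where the hand appears in the frame or how large it is. The sketch below is an illustration of that idea, not Prikhodko's actual code; the `Keypoint` structure and the wrist-at-index-0 convention are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Keypoint:
    x: float  # pixel coordinates from the camera image
    y: float

def normalize(points: list[Keypoint]) -> list[Keypoint]:
    """Shift the skeletal 'grid' so the wrist (assumed to be point 0)
    becomes the origin, then scale by the largest offset, making the
    point set position- and size-invariant before gesture matching."""
    wrist = points[0]
    shifted = [(p.x - wrist.x, p.y - wrist.y) for p in points]
    scale = max(max(abs(x), abs(y)) for x, y in shifted) or 1.0
    return [Keypoint(x / scale, y / scale) for x, y in shifted]
```

With such a normalization, the same hand shape produces (nearly) the same coordinates whether it is signed close to the camera or far away, which lets simple geometric rules or a trained model compare gestures reliably.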
"If, for example, the model recognizes that the fingers are open and spread, it means the letter V; if the fingertips meet, it means O. A lot depends on the elbow as well. This allows a mathematical model to be formed on the basis of the skeletal one. Each number from this model is then assigned to a coordinate system, and the screen displays which gesture it is," said Alexey Prikhodko.
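The rule Prikhodko describes can be sketched as a toy distance-based classifier: fingertips meeting suggests the ring shape of "O", while separated, extended fingers suggest "V". The thresholds and the specific landmarks used here are illustrative assumptions, not the system's real decision logic.

```python
import math

def dist(a, b):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def classify_letter(thumb_tip, index_tip, middle_tip, wrist):
    """Toy rule-based classifier for two fingerspelling letters.
    Distances are divided by a rough hand size (wrist to middle
    fingertip) so the rules work at any scale. Thresholds are
    illustrative, not taken from the actual system."""
    hand_size = dist(wrist, middle_tip) or 1.0
    # 'O': thumb and index fingertips nearly touch, forming a ring.
    if dist(thumb_tip, index_tip) / hand_size < 0.2:
        return "O"
    # 'V': index and middle fingertips extended and clearly apart.
    if dist(index_tip, middle_tip) / hand_size > 0.3:
        return "V"
    return "unknown"
```

A production recognizer would of course use many more control points, temporal information, and the non-manual components mentioned above, but the principle of mapping keypoint geometry to a discrete label is the same.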
The prototype now successfully translates statements spelled out with the manual alphabet of the deaf. By the time he completes his research work, Alexey plans to teach the system the other components of RSL grammar so that the program can become a finished product for wide use among the deaf.
According to Olga Varinova, head of the laboratory at the Department of Special Training and Rehabilitation Technologies, ISTR, Alexey's development will be useful not only for the deaf but also for training future RSL interpreters. "Today there is an urgent need for such a translator. There are dictionaries for translation from spoken Russian into Russian sign language, but not the other way around. Students find it hard to cope with certain tasks unless they have friends among deaf people who could advise and help them. Therefore, we are counting on Alexey's development," she comments.
In addition to the RSL translator, Alexey Prikhodko works on other projects. In early July 2019, the "Gesture interface" project made the programmer and his team winners of the "Digital breakthrough" competition, held within the federal project "Russia is the country of opportunities". It took the team 36 hours to develop a project for teaching the deaf in RSL, which placed them among the potential participants in the finals of the all-Russian developers' competition.
The "Gesture interface" project took part in the "Digital breakthrough" competition, while the "Mathematics in silence" project participated in the 2017 Potanin summer school.
Currently, Alexey Prikhodko is looking for investors willing to participate in the development of the RSL translator. "I would be able to bring the prototype to a final product in 1-2 years if someone were willing to invest. I do not have enough resources to work on both the system and my PhD thesis. The expenditures include patent registration, legal protection and consulting, as well as an RSL interpreter's fee," said Alexey.
Video of Alexey Prikhodko's system at work
Reference:
In 2013, Alexey Prikhodko received a Bachelor's degree from the Automation and Computer Engineering Department and the Institute of Social Technology and Rehabilitation, NSTU NETI. In 2015, he received a Master's degree at ACED, NSTU NETI; in August 2019 he completed his postgraduate studies at the Automated Control Systems (ACS) department.