Automatic Translation of Sentences to Mexican Sign Language: Rule-based Machine Translation and Animation Synthesis in Avatar

Bella Martinez-Seis, Obdulia Pichardo-Lagunas, Eliot Hernández-Morales, Oscar Rivera-Rodríguez, Sabino Miranda

Abstract


Sign languages are the primary languages of many deaf people. Translation between Mexican Spanish and Mexican Sign Language (LSM) remains an unresolved challenge. This paper addresses two requirements for a proper translation: automatic translation and sign representation. The first concerns the syntax of each language; the second concerns the rendering of sequential signs. We propose a tool that translates sentences from written Spanish to Mexican Sign Language, taking the syntax of both languages into account. We use rule-based machine translation because no large parallel corpus is available. The BLEU score for the translation was about 0.8061, which suggests a good translation. To display the signs, we use a 3D humanoid avatar. Sign languages lack a conventional written form, so we describe each sign with a configuration matrix. We propose a process for sign language synthesis: it takes the configuration matrix of each sign and generates animation rules describing the complete movements and positions the avatar follows to produce the signs. This makes it easy to extend the set of signs the avatar can represent.
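The rule-based translation step described above can be illustrated with a minimal sketch. The stopword list, lexicon entries, and deletion rule below are hypothetical placeholders, not the authors' actual grammar; they only show the general shape of a rule-based Spanish-to-gloss pipeline.

```python
# Hypothetical sketch of a rule-based Spanish -> LSM gloss pipeline.
# Rules and lexicon are illustrative assumptions, not the paper's system.

# Illustrative rule: sign languages commonly omit articles and copulas,
# so a deletion rule removes such function words before glossing.
STOPWORDS = {"el", "la", "los", "las", "un", "una", "es", "son", "de"}

# Toy bilingual lexicon mapping Spanish forms to LSM glosses (assumed).
LEXICON = {
    "niño": "NIÑO",
    "casa": "CASA",
    "come": "COMER",   # naive lemmatization folded into the lexicon
    "manzana": "MANZANA",
}

def translate(sentence: str) -> list[str]:
    """Apply deletion rules, then lexicon lookup, keeping word order."""
    tokens = sentence.lower().strip(".").split()
    content = [t for t in tokens if t not in STOPWORDS]
    return [LEXICON.get(t, t.upper()) for t in content]

print(translate("El niño come una manzana"))
# -> ['NIÑO', 'COMER', 'MANZANA']
```

A real system would add syntactic reordering rules on top of this lookup stage, since gloss order in LSM does not always follow Spanish word order.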

Keywords


Sign languages, automatic translation, avatar, animation synthesis

Full Text: PDF