Fig. 5. Demonstration of communication between a speech-impaired signer and a nonsigner.
a Flow chart of the sign language recognition and communication system, which allows the signer to use sign language and the nonsigner to type directly to engage in the interaction. The sign language delivered by the signer is recognized and translated into text and speech by the AI block. Based on TCP/IP, the client (controlled by the signer, Lily) in the VR interface receives the recognition results and transmits them to the server (operated by the nonsigner, Mary). The nonsigner types in the chat box to respond to the signer. b (i–v) Communication/conversation process in the VR interface between the speech-disordered user Lily and the nonsigner Mary, based on the sign language recognition and communication system. The red rectangles indicate the corresponding reactions of the two users. The photos show one of the authors. c Conversation summary of b. The hand image was created by the authors via Blender. Photo credit: Feng Wen, National University of Singapore.
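The caption describes the relay only at a high level: recognition results travel from the signer's client to the nonsigner's server over TCP/IP, and typed replies travel back. The sketch below illustrates that message flow with plain Python sockets; the host, port, message framing, and function names are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch of the TCP/IP text relay in Fig. 5a.
# HOST/PORT and the UTF-8 framing are hypothetical choices.
import socket

HOST, PORT = "127.0.0.1", 9000  # assumed server address


def run_server():
    """Nonsigner (Mary) side: receive recognized sign text, send a typed reply."""
    with socket.create_server((HOST, PORT)) as srv:
        conn, _addr = srv.accept()
        with conn:
            sign_text = conn.recv(1024).decode("utf-8")
            print(f"Signer: {sign_text}")
            reply = input("Type reply: ")       # nonsigner types in the chat box
            conn.sendall(reply.encode("utf-8"))


def run_client(recognized_text: str):
    """Signer (Lily) side: forward the AI block's recognition result."""
    with socket.create_connection((HOST, PORT)) as sock:
        sock.sendall(recognized_text.encode("utf-8"))
        reply = sock.recv(1024).decode("utf-8")
        print(f"Nonsigner: {reply}")
```

In this sketch the recognized sign text would come from the AI block's output; in the paper's system the same exchange is rendered inside the VR interface rather than on a console.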