This paper presents an American Sign Language (ASL) semantic communications scheme. The scheme consists of a semantic encoder that leverages a convolutional neural network (CNN) to effectively exploit the ASL alphabet. The encoded information is transmitted using 24-ary quadrature amplitude modulation (24-QAM). Additionally, this paper introduces a dataset in which red-green-blue (RGB) landmarks and key-points are overlaid onto the acquired images, enriching the depiction of hand posture. Numerical results quantify the training, testing, and communication performance of the proposed system, emphasizing the attainable benefits and stimulating meaningful discussion.
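As a concrete illustration of the encoder side, the sketch below shows a minimal CNN classifier over the 24 static ASL fingerspelling letters (J and Z are typically excluded because they involve motion). The architecture, layer sizes, and input resolution are illustrative assumptions, not the paper's actual network.

```python
import torch
import torch.nn as nn

NUM_CLASSES = 24  # static ASL letters (J and Z require motion, typically excluded)

class ASLSemanticEncoder(nn.Module):
    """Minimal CNN mapping a hand image to one of 24 ASL letter classes (assumed architecture)."""
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x).flatten(1)  # (batch, 128) feature vector
        return self.classifier(h)        # class logits, one per ASL letter

# Example: a batch of 64x64 RGB hand crops (resolution is an assumption)
logits = ASLSemanticEncoder()(torch.randn(8, 3, 64, 64))
symbols = logits.argmax(dim=1)  # semantic symbols handed to the modulator
```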
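Each semantic symbol can then be mapped to one point of a 24-point constellation, plausibly one point per static letter. The rectangular 6x4 grid below is only one possible 24-point geometry, assumed here for illustration; the paper's exact constellation may differ.

```python
import numpy as np

# Hypothetical 24-point constellation: a 6x4 rectangular QAM grid,
# one point per static ASL letter (the paper's exact geometry may differ).
I_levels = np.arange(-5, 6, 2)   # 6 in-phase levels: -5, -3, -1, 1, 3, 5
Q_levels = np.arange(-3, 4, 2)   # 4 quadrature levels: -3, -1, 1, 3
constellation = np.array([i + 1j * q for i in I_levels for q in Q_levels])  # 24 points
constellation /= np.sqrt(np.mean(np.abs(constellation) ** 2))  # unit average power

def modulate(symbols: np.ndarray) -> np.ndarray:
    """Map class indices (0..23) to constellation points."""
    return constellation[symbols]

def demodulate(rx: np.ndarray) -> np.ndarray:
    """Minimum-distance detection back to class indices."""
    return np.abs(rx[:, None] - constellation[None, :]).argmin(axis=1)

# Toy AWGN link: modulate, add noise, detect.
tx = modulate(np.array([0, 5, 12, 23]))
rx = tx + 0.05 * (np.random.randn(4) + 1j * np.random.randn(4))
print(demodulate(rx))  # recovered letter indices
```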
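For the dataset-construction step, one common way to overlay RGB hand landmarks and key-point connections onto acquired images is MediaPipe Hands; whether the authors used this exact tool is not stated in the abstract, so the snippet below is a hypothetical sketch of such an overlay.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

# Hypothetical overlay step: draw hand landmarks and key-point connections
# onto a single acquired image (file names are placeholders).
with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    image = cv2.imread("asl_letter.jpg")  # hypothetical input image
    result = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        for hand_landmarks in result.multi_hand_landmarks:
            mp_draw.draw_landmarks(image, hand_landmarks, mp_hands.HAND_CONNECTIONS)
    cv2.imwrite("asl_letter_overlay.jpg", image)
```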
V. Kouvakis, S. E. Trevlakis, A.-A. A. Boulogeorgos, T. Tsiftsis, K. Singh, and N. Qi, “When Sign Language Meets Semantic Communications,” in 2024 IEEE International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), Valencia, Spain, 2024.