Automated coding Non-manual Markers: Sign language corpora analysis using FaceReader

Authors

  • Letícia Kaori Hanada, Universidade Estadual de Campinas (UNICAMP)

DOI:

https://doi.org/10.36942/revincluso.v3i1.899

Keywords:

FaceReader; Non-Manual Expressions; Coding; sign languages.

Abstract

The present study introduces the possibility of coding facial expressions and head movements in different sign languages using the FaceReader program. Considering that Non-Manual Markers (NMMs) (facial expressions, head and torso movements) are an important part of signed language grammar, FaceReader’s automatic coding ability would be very useful for investigating larger amounts of signed data. The program can calibrate participants’ faces and automatically annotate emotions (emotive facial expressions), Action Units (AUs) (grammatical NMMs) and their intensities, as well as head movements. Overall, the program seems highly beneficial for recognizing facial and head movements in sign languages. However, it has some disadvantages, such as pricing, the lack of coding for torso movements and head protraction/retraction, and difficulties in analyzing participants with beards or glasses. Some of these disadvantages can be addressed with other software or methodological strategies.
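
To make the described output concrete, the short Python sketch below shows one way a FaceReader log could be post-processed for corpus work, for example to list the frames in which a given Action Unit is active before aligning them with manual annotations. It is a minimal illustration under stated assumptions, not code from the study or from Noldus: the file name, the tab-separated layout, and the column labels ("Video Time", "Action Unit 04 - Brow Lowerer") are hypothetical, since the actual export format depends on the FaceReader version and export settings, and AU intensities are assumed to be exported as FACS A–E labels.

import csv

def frames_with_active_au(path, au_column="Action Unit 04 - Brow Lowerer",
                          active_levels=("B", "C", "D", "E")):
    """List (timestamp, intensity) pairs where the chosen AU is active.

    Treating FACS level B or higher as "active" is an analyst-chosen
    threshold, not a program default.
    """
    hits = []
    with open(path, newline="", encoding="utf-8") as log:
        # Assumed layout: one row per video frame, tab-separated columns.
        for row in csv.DictReader(log, delimiter="\t"):
            level = row.get(au_column, "").strip()
            if level in active_levels:
                hits.append((row.get("Video Time", ""), level))
    return hits

# Hypothetical export file name; replace with a real FaceReader log.
for timestamp, level in frames_with_active_au("participant01_detailed_log.txt"):
    print(timestamp, "AU4 intensity", level)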



References

Abelin, Å. (2004). Cross-Cultural Multimodal Interpretation of Emotional Expressions: An Experimental Study of Spanish and Swedish. In SPEECH PROSODY 2004, INTERNATIONAL CONFERENCE.

Allport, F. H. (1924). Social psychology. Boston, Houghton.

Asch, S. E. (1952). Social psychology. Englewood Cliffs, New Jersey: Prentice-Hall.

Ambadar, Z., Schooler, J. W., & Cohn, J. F. (2005). Deciphering the enigmatic face: The importance of facial dynamics in interpreting subtle facial expressions. PSYCHOLOGICAL SCIENCE, 16(5), 403-410.

Ann, J. (2005). A functional explanation of Taiwan Sign Language handshape frequency. LANGUAGE AND LINGUISTICS-TAIPEI-, 6(2), 217.

Baker-Shenk, C. (1983). A micro-analysis of the nonmanual components of questions in American Sign Language. Unpublished doctoral dissertation, University of California, Berkeley.

Baker-Shenk, C. L., & Cokely, D. (1991). American Sign Language: A teacher's resource text on grammar and culture. Gallaudet University Press.

Berenz, N. (2002). Insights into person deixis. SIGN LANGUAGE & LINGUISTICS, 5(2), 203-227.

Bergman, B. (1984). Non-manual components of signed language: Some sentence types in Swedish Sign Language. RECENT RESEARCH ON EUROPEAN SIGN LANGUAGES, 49-59.

Blossom, M., & Morgan, J. L. (2006). Does the face say what the mouth says? A study of infants’ sensitivity to visual prosody. PROCEEDINGS OF THE 30TH ANNUAL BOSTON UNIVERSITY CONFERENCE ON LANGUAGE DEVELOPMENT. Somerville, MA.

Braem, P. B. (1999). Rhythmic temporal patterns in the signing of deaf early and late learners of Swiss German Sign Language. LANGUAGE AND SPEECH, 42(2-3), 177-208.

Boyes-Braem, P., & Sutton-Spence, R. (Eds.). (2001). The hands are the head of the mouth: The mouth as articulator in sign languages. Hamburg: Signum Press.

Calder, A. J., Young, A. W., Rowland, D., Perrett, D. I., Hodges, J. R., & Etcoff, N. L. (1996). Facial emotion recognition after bilateral amygdala damage: Differentially severe impairment of fear. COGNITIVE NEUROPSYCHOLOGY, 13, 699–745

Ciciliani, T. A., & Wilbur, R. B. (2006). Pronominal system in Croatian Sign Language. SIGN LANGUAGE & LINGUISTICS, 9(1-2), 95-132.

Cohn, J. F., Ambadar, Z., & Ekman, P. (2007). Observer-based measurement of facial expression with the Facial Action Coding System. THE HANDBOOK OF EMOTION ELICITATION AND ASSESSMENT, 1(3), 203-221.

Cohn, J. F., & Ekman, P. (2005). Measuring facial action. In J. A. Harrigan, R. Rosenthal, & K. R. Scherer (Eds.), THE NEW HANDBOOK OF NONVERBAL BEHAVIOR RESEARCH (pp. 9–64). New York: Oxford University Press.

Crasborn, O., Sloetjes, H., Auer, E., & Wittenburg, P. (2006). Combining video and numeric data in the analysis of sign languages with the ELAN annotation software. In 2ND WORKSHOP ON THE REPRESENTATION AND PROCESSING OF SIGN LANGUAGES: LEXICOGRAPHIC MATTERS AND DIDACTIC SCENARIOS (pp. 82-87). ELRA.

Crasborn, O. A., & Sloetjes, H. (2010). Using ELAN for annotating sign language corpora in a team setting.

Crasborn, O., & Sloetjes, H. (2008). Enhanced ELAN functionality for sign language corpora. In 6TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2008)/3RD WORKSHOP ON THE REPRESENTATION AND PROCESSING OF SIGN LANGUAGES: CONSTRUCTION AND EXPLOITATION OF SIGN LANGUAGE CORPORA (pp. 39-43).

Crasborn, O., & Van der Kooij, E. (2013). The phonology of focus in Sign Language of the Netherlands. JOURNAL OF LINGUISTICS, 49(3), 515-565.

Dachkovsky, S., & Sandler, W. (2009). Visual intonation in the prosody of a sign language. LANGUAGE AND SPEECH, 52 (2-3), 287-314.

Darwin, C. (1872). The expression of the emotions in man and animals. London: John Murray.

Davidson, R. J., Ekman, P., Saron, C. D., Senulis, J. A., & Friesen, W. V. (1990). Approach-withdrawal and cerebral asymmetry: Emotional expression and brain physiology: I. JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY, 58, 330–341

Dinculescu, A. et al. (2019). Automatic identification of anthropological face landmarks for emotion detection. 2019 9th International Conference on Recent Advances in Space Technologies (RAST). IEEE, 585-590.

Eccarius, P., & Brentari, D. (2007). Symmetry and dominance: A cross-linguistic study of signs and classifier constructions. LINGUA, 117(7), 1169-1201.

Ekman, P. (1973). Universal facial expressions in emotion. STUDIA PSYCHOLOGICA, 15(2), 140.

Ekman, P. (1992). Are there basic emotions? PSYCHOLOGICAL REVIEW, 99, 550–553. doi: 10.1037/0033-295X.99.3.550

Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System. Palo Alto, CA: Consulting Psychologists Press.

Ekman, P., & Friesen, W. V. (1982). Rationale and reliability for EMFACS coders. Unpublished manuscript.

Ekman, P., Friesen, W. V., & Tomkins, S. S. (1971). Facial affect scoring technique: A first validation study. SEMIOTICA, 3, 37–58.

Ekman, P., Friesen, W. V., & Hager, J. C. (Eds.). (2002). Facial Action Coding System [E-book]. Salt Lake City, UT: Research Nexus.

Ekman, P., Friesen, W. V., & O’Sullivan, M. (1988). Smiles when lying. JOURNAL OF PERSONALITY AND SOCIAL PSYCHOLOGY, 54, 414–420.

Ekman, P., & Rosenberg, E. (Eds.). (2005). What the face reveals (2nd ed.). New York: Oxford University Press

Engberg-Pedersen, E. (1990). Pragmatics of nonmanual behaviour in Danish Sign Language. SLR, 87, 121-128.

Engberg-Pedersen, E. (1993). Space in Danish Sign Language: The semantics and morphosyntax of the use of space in a visual language (Vol. 19). Gallaudet University Press.

FaceReader: emotion analysis. (n.d.). Noldus. Retrieved June 21, 2023, from https://www.noldus.com/facereader

Ferreira-Brito, L. F. (1995). Por uma gramática de línguas de sinais. Rio de Janeiro: Tempo Brasileiro.

Fontes, M. A., & Madureira, S. (n.d.). Um experimento sobre a linguagem não verbal na detecção de efeitos de sentidos: o questionamento da autenticidade. ESTUDOS EM VARIAÇÃO LINGUÍSTICA NAS LÍNGUAS ROMÂNICAS-2, 138.

Fung, C. H., Sze, F., Lam, S., & Tang, G. (2008). Simultaneity vs. sequentiality: Developing a transcription system of Hong Kong Sign Language acquisition data. In SIGN-LANG@ LREC 2008 (pp. 22-27). European Language Resources Association (ELRA).

Gu, S., Wang, F., Yuan, T., Guo, B., and Huang, H. (2015). Differentiation of primary emotions through neuromodulators: review of literature. INT. J. NEUROL. RES. 1, 43–50. doi: 10.17554/j.issn.2313-5611.2015.01.19

Gu, S., Wang, W., Wang, F., and Huang, J. H. (2016). Neuromodulator and emotion biomarker for stress induced mental disorders. NEURAL PLAST. 2016:2609128. doi: 10.1155/2016/2609128

Henner, J., Geer, L. C., & Lillo-Martin, D. (2013, May). Calculating frequency of occurrence of ASL handshapes. In LSA ANNUAL MEETING EXTENDED ABSTRACTS (Vol. 4, pp. 16-1).

Hodge, G. C. E., & Ferrara, L. (2014). Showing the story: Enactment as performance in Auslan narratives. In SELECTED PAPERS FROM THE 44TH CONFERENCE OF THE AUSTRALIAN LINGUISTIC SOCIETY, 2013 (Vol. 44, pp. 372-397). University of Melbourne.

Izard, C. E. (1979). Facial expression scoring manual (FESM). Newark: University of Delaware Press.

Izard, C. E. (1983). Maximally discriminative facial movement coding system (MAX). Unpublished manuscript, University of Delaware, Newark

Izard, C. E., & Dougherty, L. M. (1982). Two complementary systems for measuring facial expressions in infants and children. MEASURING EMOTIONS IN INFANTS AND CHILDREN, 1, 97-126.

Jack, R., Garrod, O., and Schyns, P. (2014). Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. CURR. BIOL. 24, 187–192. doi: 10.1016/j.cub.2013.11.064

Kaiser, S. (2002). Facial expressions as indicators of “functional” and “dysfunctional” emotional processes. In M. Katsikitis (Ed.), THE HUMAN FACE: MEASUREMENT AND MEANING (pp. 235– 253). Dordrecht, Netherlands: Kluwer Academic.

Kanade, T., Cohn, J., & Tian, Y. (2000). Comprehensive database for facial expression analysis. Proceedings fourth IEEE international conference on automatic face and gesture recognition (cat. No. PR00580). IEEE, 46-53.

Küntzler, T., Höfling, T. T. A., & Alpers, G. W. (2021). Automatic facial expression recognition in standardized and non-standardized emotional expressions. FRONTIERS IN PSYCHOLOGY, 12, 1086.

Lackner, A. (2015). Linguistic functions of head and body movements in Austrian Sign Language (ÖGS). A corpus-based analysis:(Karl-Franzens-University Graz, 2013). SIGN LANGUAGE & LINGUISTICS, 18(1), 151-157.

Levenson, R. W., Ekman, P., & Friesen, W. V. (1990). Voluntary facial action generates emotion-specific autonomic nervous system activity. PSYCHOPHYSIOLOGY, 27, 363–384.

Lewinski, P., den Uyl, T. M., & Butler, C. (2014). Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader. JOURNAL OF NEUROSCIENCE, PSYCHOLOGY, AND ECONOMICS, 7(4), 227.

Liddell, S. K. (1978). Nonmanual signals and relative clauses in ASL. In: Patricia Siple (Ed.), Understanding language through sign language research. New York: Academic Press.

Liddell, S. K. (1980). American Sign Language syntax. The Hague: Mouton

Liddell, S. K. (1986). Head thrust in ASL conditional marking. SIGN LANGUAGE STUDIES, 52(1), 244-262.

Loijens, L., & Krips, O. (2018). FaceReader methodology note. A WHITE PAPER BY NOLDUS INFORMATION TECHNOLOGY.

Lutz, C., & White, G. M. (1986). The anthropology of emotions. ANNUAL REVIEW OF ANTHROPOLOGY, 15(1), 405-436.

Malatesta, C. Z., Culver, C., Tesman, J. R., & Shephard, B. (1989). The development of emotion expression during the first two years of life. MONOGRAPHS OF THE SOCIETY FOR RESEARCH IN CHILD DEVELOPMENT, 54.

Mansourian, S., Corcoran, J., Enjin, A., Lofstedt, C., Dacke, M., and Stensmyr, M. (2016). Fecal-derived phenol induces egg-laying aversion in Drosophila. CURR. BIOL. 26, 2762–2769. doi: 10.1016/j.cub.2016.07.065

Matias, R., & Cohn, J. F. (1993). Are max-specified infant facial expressions during face-to-face interaction consistent with differential emotions theory? DEVELOPMENTAL PSYCHOLOGY, 29, 524–531.

Measure advertisement effectiveness with Emotion AI. (n.d.). FaceReader-online. Retrieved June 21, 2023, from https://www.facereader-online.io/main

Meurant, L. (2008). The Speaker’s Eye Gaze: Creating deictic, anaphoric and pseudo-deictic spaces of reference. SIGN LANGUAGES: SPINNING AND UNRAVELING THE PAST, PRESENT AND FUTURE. TISLR, 9, 403-414.

Padden, C. (1990). The relation between space and grammar in ASL verb morphology. SIGN LANGUAGE RESEARCH: THEORETICAL ISSUES, 118-132.

Puupponen, A., Wainio, T., Burger, B., & Jantunen, T. (2015). Head movements in Finnish Sign Language on the basis of Motion Capture data: A study of the form and function of nods, nodding, head thrusts, and head pulls. SIGN LANGUAGE & LINGUISTICS, 18(1), 41-89.

Quadros, R. M., & Karnopp, L. B. (2004). Língua de sinais brasileira: estudos linguísticos. Porto Alegre: Artmed Editora.

Reilly, J. S., McIntire, M. L., & Seago, H. (1992). Affective prosody in American sign language. SIGN LANGUAGE STUDIES, 113-128.

Reilly, J. (2006). How faces come to serve grammar: The development of nonmanual morphology in American Sign Language. Advances in the sign language development of deaf children, 262-290.

Sandler, W. (2012). 4. Visual prosody. In SIGN LANGUAGE (pp. 55-76). De Gruyter Mouton.

Schalber, K. (2006). What is the chin doing?: An analysis of interrogatives in Austrian sign language. SIGN LANGUAGE & LINGUISTICS, 9(1-2), 133-150.

Sloan, D. M., Strauss, M. E., Quirk, S. W., & Sajatovic, M. (1997). Subjective and expressive emotional responses in depression. JOURNAL OF AFFECTIVE DISORDERS, 46, 135–141.

Tian, Y., Kanade, T., & Cohn, J. F. (2011). Facial expression recognition. HANDBOOK OF FACE RECOGNITION, 487-519.

Tomkins, S. (1962). Affect imagery consciousness: Volume I: The positive affects. Springer publishing company.

Tomkins, S. (1963). Affect imagery consciousness: Volume II: The negative affects. Springer publishing company.

Transforming the perfect brow shape. (2004). Brow Diva. Retrieved June 21, 2023, from https://browdiva.com/blog/transforming-the-perfect-brow-shape

Wang, F., and Pereira, A. (2016). Neuromodulation, emotional feelings and affective disorders. MENS SANA MONOGR. 14, 5–29. doi: 10.4103/0973-1229.154533

Wilbur, R. B. (1987). American Sign Language: linguistic and applied dimensions. Little, Brown and Co.

Wilbur, R. B. (1990). Why syllables? What the notion means for ASL research. THEORETICAL ISSUES IN SIGN LANGUAGE RESEARCH, 1, 81-108.

Wilbur, R. B., & Patschke, C. G. (1998). Body leans and the marking of contrast in American Sign Language. JOURNAL OF PRAGMATICS, 30(3), 275-303.

Wilbur, R. B. (2013). Phonological and prosodic layering of nonmanuals in American Sign Language. In THE SIGNS OF LANGUAGE REVISITED (pp. 196-220). Psychology Press.

Wittenburg, P., Brugman, H., Russel, A., Klassmann, A., & Sloetjes, H. (2006). ELAN: A professional framework for multimodality research. In 5TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2006) (pp. 1556-1559).

Zeshan, U. (2006). Interrogative and negative constructions in sign languages (p. 375). Ishara Press.

Published

2023-11-09

How to Cite

Hanada, L. K. (2023). Automated coding Non-manual Markers: Sign language corpora analysis using FaceReader. Revincluso - Revista Inclusão & Sociedade, 3(1). https://doi.org/10.36942/revincluso.v3i1.899