
Hardware and software complex for the detection and identification of informative features in images of means of nonverbal communication


Abstract

Subject of study. The subject of this study was techniques for identifying informative features in dynamic images (video) of the means of nonverbal communication used by operators during a joint constructive activity.

Aim of study. The aim was to develop methods for detecting and identifying significant features of nonverbal communication in dynamic (video) images of partners' faces, i.e., the key features of the partners' joint activity when the verbal communication channel is disabled.

Method. An interaction technique was developed, together with a hardware and software complex for identifying communication markers in images of the facial expressions and eye movements of partners during a joint play activity in which nonverbal communication is a key aspect and the speech communication channel is disabled. In the joint activity, one partner searched for a hidden target in the perceived image using a nonverbal clue from the second partner, who saw the target position on their own screen. The partners were chosen from representatives of two geographically distant cultures, Russian and Chinese. Their interactions, while working with images of the object of activity transmitted via simultaneous video communication between two research centers located in western and eastern Eurasia, were evaluated through simultaneously recorded physiological parameters.

Main results. A new algorithm was designed for investigating images as a means of nonverbal interaction that ensures the achievement of a shared goal without the speech communication channel. The eye movements of the partners, which are the key features of nonverbal communication, were shown to be effective during the constructive joint activity. The task was presented to the Chinese and Russian test subjects as English text. The measurements indicate that nonverbal interaction in the target-search task functions similarly for representatives of both cultures and depends on the roles of the partners and the type of task executed. A correlation was identified between the activities of the different neural networks of the brain that support image recognition during the target search in nonverbal communication. Analysis of the eye-movement images recorded during communication isolated a classic pattern of gaze capture for the guided and guiding partners: gaze switches constantly between assessment of the partner's eyes and the target search across the entire image. The results suggest that 1) the cognitive process of recognizing facial expressions is probably the most complex visual cognitive process, and 2) universal behavioral algorithms exist when a common joint activity understandable to both partners is performed.

Practical significance. A hardware and software complex was designed for detecting significant features of nonverbal communication in images of faces. The effectiveness of the proposed feature-detection method was demonstrated using clear basic signals of nonverbal communication, which opens the way to new methods of training artificial neural networks that provide intuitive nonverbal communication between machines and humans via the recognition of hidden markers of nonverbal communication.
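The abstract reports that the partners' gaze switches constantly between assessing the partner's eyes and searching the whole image for the target. The abstract does not describe the authors' processing pipeline, so the following is only a minimal sketch of how such region switches could be counted from eye-tracker output. The GazeSample structure, the normalized coordinates, and the rectangular eyes_box region are all assumptions introduced here for illustration and are not taken from the article.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class GazeSample:
    t: float  # timestamp, seconds (assumed field layout)
    x: float  # horizontal gaze coordinate, normalized to [0, 1]
    y: float  # vertical gaze coordinate, normalized to [0, 1]


def classify_region(sample: GazeSample,
                    eyes_box: Tuple[float, float, float, float]) -> str:
    """Assign a gaze sample to the partner's eye region or the search field."""
    x0, y0, x1, y1 = eyes_box
    if x0 <= sample.x <= x1 and y0 <= sample.y <= y1:
        return "partner_eyes"
    return "search_field"


def count_region_switches(samples: List[GazeSample],
                          eyes_box: Tuple[float, float, float, float]) -> int:
    """Count how often gaze switches between the two regions of interest."""
    switches = 0
    previous = None
    for s in samples:
        region = classify_region(s, eyes_box)
        if previous is not None and region != previous:
            switches += 1
        previous = region
    return switches


if __name__ == "__main__":
    # Toy trajectory alternating between the partner's eyes (upper left)
    # and the rest of the stimulus image.
    trace = [
        GazeSample(0.00, 0.15, 0.20),  # partner's eyes
        GazeSample(0.25, 0.60, 0.55),  # search field
        GazeSample(0.50, 0.12, 0.18),  # partner's eyes
        GazeSample(0.75, 0.80, 0.70),  # search field
    ]
    print(count_region_switches(trace, eyes_box=(0.05, 0.05, 0.30, 0.35)))  # -> 3
```

A per-role switch rate computed this way (switches per minute for the guiding versus the guided partner) would be one simple way to quantify the gaze-capture pattern described above, but any correspondence to the authors' actual metric is an assumption.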

© 2022 Optica Publishing Group


