{"id":10816,"date":"2016-12-15T14:56:02","date_gmt":"2016-12-15T14:56:02","guid":{"rendered":"https:\/\/lab-ncs.com\/?page_id=10816\/"},"modified":"2023-12-19T00:10:09","modified_gmt":"2023-12-18T23:10:09","slug":"techniques-and-methods-of-non-verbal-behavior-analysis-2","status":"publish","type":"page","link":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/","title":{"rendered":"Methods and techniques of nonverbal behavior analysis"},"content":{"rendered":"<header class=\"icon-heading\"><span class=\"section-header-icon icon icon-key\"><\/span><h2> Methods and techniques of nonverbal behavior analysis<span><\/span><\/h2><\/header>\n<p>In order to perform a professional nonverbal behavior analysis it is fundamental to use techniques able to objectively describe behavior and to attribute a trustworthy meaning to it. The main advantages of the scientific analysis of nonverbal communication are:<\/p>\n<ul>\n<li>To <strong>identify<\/strong> others\u2019 emotions and states of mind with accuracy;<\/li>\n<li>To <strong>anticipate<\/strong> peoples\u2019 behavior;<\/li>\n<li>To <strong>expose<\/strong> lies through the combined analysis of verbal and facial expressions;<\/li>\n<li>To <strong>select<\/strong> the speaker\u2019s strengths and weaknesses during interpersonal relations.<\/li>\n<\/ul>\n<p>It is possible to learn nonverbal behavior analysis techniques in a short time through a focused and interactive training program based on practical exercises.<\/p>\n<h3>The scientific basis<\/h3>\n<p>The first scientific text about emotional expressions was written by Guillaume Benjamin Amand Duchenne de Boulogne, a French neurologist, entitled \u201c<em>M\u00e9canisme de la physionomie humaine, ou Analyse \u00e9lectro-physiologique de l\u2019expression des passions applicable \u00e0 la pratique des arts plastiques\u201d<\/em>. 
Written in 1862, the text demonstrates the method of applying electrodes to the facial muscles in order to establish the relation between facial movements and the emotional expressions associated with them. In his honor, the genuine, authentic smile is nowadays called the Duchenne smile.<\/p>\n<p>In 1872, Charles Darwin wrote <em>The Expression of the Emotions in Man and Animals<\/em>, in which he argues that emotions are an evolutionary product and are therefore innate. Moreover, facial and bodily expressions correspond to these emotions and appear to be the same in humans of different ethnicities as well as in primates and other animals. However, Darwin\u2019s facial expression studies were not continued after his death, owing to the hostility the scientific community showed towards him and his theories: he was criticized for attributing emotions to animals because, according to his detractors, feelings could belong only to humans; he was also blamed for relying on direct observation.<\/p>\n<p>The concept of the universality of basic emotional expressions was rediscovered in the late 1950s. Eminent researchers such as Friesen, Ellsworth, Ekman, Izard, and Birdwhistell tried to validate Darwin\u2019s theory. Together they developed a set of theories, methods, and tests that in their entirety constitute the so-called \u201cFacial Expression Program\u201d. They believed that the origin of emotional expressions and of the emotional experience lay in a precise number of innate neurological programs. We now know that there is a specific path for each emotion that ensures the invariability of the facial expressions associated with that emotion. These innate, phylogenetically inherited neuronal programs give rise to adaptive responses ascribable to emotional families. 
According to evolutionary theory, to which these works belong, emotions have an adaptive function that allows humans to survive by reacting immediately to different stimuli (internal or external, natural and\/or learned).<\/p>\n<p>There are two groups of nonverbal analysis techniques:<\/p>\n<ul>\n<li>The coding techniques, which describe facial and body movements;<\/li>\n<li>The decoding techniques, which interpret and give meaning to those movements.<\/li>\n<\/ul>\n<p><header class=\"icon-heading\"><span class=\"section-header-icon icon icon-user\"><\/span><h2>Scientific Techniques for Facial Expression Analysis<span><\/span><\/h2><\/header><br \/>\n<img loading=\"lazy\" class=\"alignleft wp-image-20720\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif\" alt=\"\" width=\"279\" height=\"360\" \/><\/p>\n<h3>Interpretative System of Facial Expressions (ISFE)<\/h3>\n<p>The Interpretative System of Facial Expressions, developed in 2013 by Jasna Legisa in the NeuroComScience laboratory, is a summary table of the meanings of facial movements. It comprises a set of tables and descriptions that integrate and order facial expressions according to their related emotions. The information is drawn from previous systems and the existing literature on the subject.<\/p>\n<p>Besides primary and secondary emotional expressions, other facial signs are described: manipulators, illustrators and regulators. According to Hjortsj\u00f6 (1969), Izard (1979) and Ekman (1983), emotional expressions are grouped into so-called \u201cbig families\u201d. Each of these \u201cfamilies\u201d includes several facial expressions that, despite slight differences in meaning, share the same emotional classification. 
For example, the \u201csurprise\u201d family comprises genuine surprise, feigned surprise, annoyed surprise, awe and so on.<\/p>\n<p>The ISFE tables place primary emotional movements into 3 categories:<\/p>\n<ul>\n<li><strong>Category 1\u00a0<\/strong>includes the muscular movements that belong to a specific emotion;<\/li>\n<li><strong>Category 2\u00a0<\/strong>comprises the movements that can belong to more than one primary emotion;<\/li>\n<li><strong>Category 3\u00a0<\/strong>includes minor emotional variations that can be part of several emotional families.<\/li>\n<\/ul>\n<p>This categorization improves the accuracy and interpretability of the entire analysis.<\/p>\n<h3>Hjortsj\u00f6 Method: Man&#8217;s Face and Mimic Language<\/h3>\n<p>In 1969, Hjortsj\u00f6, an anatomy professor at Lund University in Sweden, made the first systematic attempt to categorize specific facial movements and their meanings into 8 emotional families. His handbook reports the coding and decoding of facial expressions, making it possible to determine the contractions of the facial muscles, singly or in combination. None of the earlier studies, such as those by Landis (1924), Frois-Wittmann (1930) and Fulcher (1942), achieved such complete results.<\/p>\n<h3>Maximally Discriminative Coding System (MAX)<\/h3>\n<p>This system identifies only the behavioral units of movement to which the authors attributed meaning, as opposed to previous systems, which described the facial muscular movements regardless of their meaning. MAX was developed by Izard in 1979; in 1983 he collaborated with Dougherty and Hembree to establish a more advanced version of MAX: AFFEX. They established facial configurations based, <em>a priori<\/em>, on typical facial expressions of emotions such as anger, sadness, fear, interest, happiness, surprise, pain, disgust, and shame. 
In other words, a prototypical expression was classified for each emotion.<\/p>\n<h3>Emotion FACS (EMFACS) and Facial Action Coding System Affect Interpretation Dictionary (FACSAID)<\/h3>\n<p>Ekman and Friesen attributed interpretative meanings to the FACS action units, describing the expressions of 6 emotional families: happiness, sadness, disgust, anger, surprise and fear. This study, carried out in the 80s, is called\u00a0<em>Emotional FACS <\/em>(EMFACS). Since 1994, Hager has worked at Ekman\u2019s laboratory studying automatic computerized techniques for identifying facial expressions. A database with a new interface was developed, creating the so-called\u00a0<em>FACS Affect Interpretation Dictionary<\/em>\u00a0or\u00a0<em>FACSAID<\/em> system.<\/p>\n<h3>Hanest<\/h3>\n<p>The Hanest manual was published in the same year as the first version of the Facial Action Coding System. Developed by two French scientists (Ermiane &amp; Gergerian) in 1978, it has the same aim as FACS, that is, to describe facial movements.<\/p>\n<h3><a id=\"_Toc388303575\" name=\"_Toc388303575\"><\/a><a id=\"_Toc388303434\" name=\"_Toc388303434\"><\/a>Facial Action Coding System<\/h3>\n<p>In 1978 Paul Ekman and Wallace V. Friesen introduced the Facial Action Coding System (FACS) and in 2002, with Hager\u2019s collaboration, released an expanded version. It\u2019s a descriptive facial coding system and, as a consequence, it doesn\u2019t ascribe meaning to facial expressions; it provides a detailed description of the changes produced by facial movements.<\/p>\n<h3><a id=\"_Toc388303436\" name=\"_Toc388303436\"><\/a><a id=\"_Toc383704119\" name=\"_Toc383704119\"><\/a><a id=\"_Toc381709500\" name=\"_Toc381709500\"><\/a><a id=\"_Toc381709296\" name=\"_Toc381709296\"><\/a><a id=\"_Toc380419987\" name=\"_Toc380419987\"><\/a>Baby Facial Action Coding System (BabyFACS)<\/h3>\n<p>The same system structure used for adults is also used for babies and small children. 
Oster (1993) considered babies\u2019 facial particularities and adapted the descriptions accordingly. The BabyFACS is purely descriptive and does not assign any emotional meaning.<a id=\"_Toc383704121\" style=\"color: #f15a24; font-size: 2.7rem; font-style: inherit; font-weight: 600;\" name=\"_Toc383704121\"><\/a><a id=\"_Toc381709502\" style=\"color: #f15a24; font-size: 2.7rem; font-style: inherit; font-weight: 600;\" name=\"_Toc381709502\"><\/a><a id=\"_Toc381709298\" style=\"color: #f15a24; font-size: 2.7rem; font-style: inherit; font-weight: 600;\" name=\"_Toc381709298\"><\/a><a id=\"_Toc380419989\" style=\"color: #f15a24; font-size: 2.7rem; font-style: inherit; font-weight: 600;\" name=\"_Toc380419989\"><\/a><a id=\"_Toc388303579\" style=\"color: #f15a24; font-size: 2.7rem; font-style: inherit; font-weight: 600;\" name=\"_Toc388303579\"><\/a><a id=\"_Toc388303438\" style=\"color: #f15a24; font-size: 2.7rem; font-style: inherit; font-weight: 600;\" name=\"_Toc388303438\"><\/a><\/p>\n<h3><a id=\"_Toc388303438\" name=\"_Toc388303438\"><\/a><\/h3>\n<h3>A coding and decoding example of facial expressions<\/h3>\n<p><img loading=\"lazy\" class=\"alignleft wp-image-5007 size-large\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/05\/2-facs-au-facial-action-coding-system.png\" alt=\"facs\" width=\"600\" height=\"367\" \/><\/p>\n<h3><strong>Some basic actions of the upper face<\/strong><\/h3>\n<table class=\"mceItemTable\" border=\"0\">\n<tbody>\n<tr>\n<td>1 &#8211; Raising the inner part of the eyebrows<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/1A.jpg\" alt=\"1A\" width=\"204\" height=\"191\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>2 &#8211; Raising the outer part of the eyebrows<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/2C.jpg\" alt=\"2C\" width=\"218\" height=\"239\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>4 
&#8211;\u00a0Lowering and bringing together the eyebrows<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/4B.jpg\" alt=\"4B\" width=\"242\" height=\"251\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>5 &#8211; Wide-open eyes<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/5D.jpg\" alt=\"5D\" width=\"233\" height=\"330\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>6 &#8211; Raising the cheeks<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/6E.jpg\" alt=\"6E\" width=\"235\" height=\"268\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>7 &#8211; Tension of the eyelids<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/7A.jpg\" alt=\"7A\" width=\"233\" height=\"288\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>\n<h3>Some combination examples of the upper face<\/h3>\n<\/td>\n<td><\/td>\n<\/tr>\n<tr>\n<td>1+2+4 (or 3, depending on the coding technique). This combination corresponds to the prototypical expression of FEAR. No other primary emotion has this combination.<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/MAR7582.jpg\" alt=\" MAR7582\" width=\"276\" height=\"290\" \/><\/td>\n<\/tr>\n<tr>\n<td>4+5. 
This combination corresponds to the prototypical expression of ANGER.<\/td>\n<td><img loading=\"lazy\" src=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2014\/07\/MAR7568.jpg\" alt=\" MAR7568\" width=\"273\" height=\"324\" \/><\/td>\n<\/tr>\n<tr>\n<td><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<table class=\"mceItemTable\" style=\"height: 24px;\" border=\"0\" width=\"716\">\n<tbody>\n<tr>\n<td><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<header class=\"icon-heading\"><span class=\"section-header-icon icon icon-hand\"><\/span><h2>Scientific Techniques for Gesture and Posture Analysis<span><\/span><\/h2><\/header>\n<h3>Body Coding System<\/h3>\n<p>The Body Coding System, developed by Jasna Legisa, is a system for coding and decoding gestural and postural motor behaviors in the analysis of nonverbal communication. It analyzes bodily nonverbal expressions by breaking them down into action units, which are then classified to obtain a complete picture of the person\u2019s emotional state. It is based on the observation of tiny bodily changes produced by muscular activity. This technique was born out of the need to answer questions about the links between bodily expressions, emotional experiences and communicative processes.<\/p>\n<p>The aims of the Body Coding System are:<\/p>\n<ul>\n<li>To IMPLEMENT a structured gestural and postural analysis method;<\/li>\n<li>To DEFINE the structure of the movements and to classify them;<\/li>\n<li>To ASCRIBE movements and postures to their emotional, conversational and cultural meanings.<\/li>\n<\/ul>\n<p>A certificate is needed to use the Body Coding System. 
Extensive practice in gestural and postural analysis is required before taking the final exam: learners must be able to recall the bodily action units perfectly and to recognize them as quickly as possible.<\/p>\n<p>Our NCS team will be available to help and guide you through the learning process and to answer any queries and doubts.<\/p>\n<p>The examination committee comprises two or more NeuroComScience BCS experts and face and body emotional behavior analysts.<\/p>\n<h3>The Body Action and Posture coding system<\/h3>\n<p>The Body Action and Posture coding system (B.A.P.; Dael, N., Mortillaro, M., &amp; Scherer, K.R., 2012) is a system for coding and decoding gestural and postural motor behaviors that takes into account various parts of the body. It focuses primarily on the distinction between actions and postures. According to the authors, actions are discrete units of body movement. An action unit corresponds to a local deviation of one or more articulators (head, arm, hand, trunk) away from the previous configuration, after which the articulator can return to the same or a different position (e.g., shaking the head, pointing a finger).<\/p>\n<p>As opposed to postural units, action units occur and change more frequently; furthermore, they have a clear starting point, a relatively short duration and a clear end point. These bodily actions are performed by the head, shoulders, arms (elbows) and legs (knees) and involve actions such as lowering the head, raising the shoulders, gesticulating, scratching, kicking, and so on. The start of an action unit is the temporal point at which the subject changes the current rest position. 
The end of an action unit is the temporal point at which the subject returns to a position (the rest position, the initial one or even a new one).<\/p>\n<p>The Body Action and Posture coding system divides the description into transition and configuration phases, which are always correlated in the realization of a posture. The starting transition corresponds to the beginning of the movement necessary to reach the final position, or to the initial frame in a video. The end of the transition is the temporal point at which the movement described in a specific category is concluded, or the last frame in a video if the movement in question is cut off by the video. This frame marks the beginning of the configuration of the posture. Not every behavioral action interrupts an ongoing position; in fact, a certain body position may not be interrupted by the action of another part of the body. For example, shaking the head does not modify a forward-tilted head position.<\/p>\n<p>During the configuration phase, the person\u2019s final coded position is maintained, but this does not mean that the position is static.<\/p>\n<p>In the Body Action and Posture coding system, postures differ from actions because:<\/p>\n<ul>\n<li>the postures are less subject to frequent changes and consequently have a longer duration;<\/li>\n<li>the postures are firm (small movements do not change or distort the posture);<\/li>\n<li>whereas actions may or may not be present at a given moment, the body is continuously in one postural alignment or another.<\/li>\n<\/ul>\n<p>This means that when a body part is not involved in an action, it is always in a particular posture.<\/p>\n<h3>An Annotation Scheme for Conversational Gestures: How to economically capture timing and form<\/h3>\n<p>This is another coding system for gestural motor behavior, developed by Kipp et al. (2007). 
Their goal was to provide accurate annotations of the gestures a person makes during a\u00a0conversation, in order to offer a general description of these movements, although limited to the study of the arms and hands. They mainly measure the height, distance, radial orientation and trajectory of the arms and hands, but do not take into account all the complex movements that the body can make.<\/p>\n<p>The Neuroges (NGS) system, which also focuses mainly on hand movements, describes gestures in 3 modules:<\/p>\n<ul>\n<li>gestural kinesics;<\/li>\n<li>relational bimanual coding;<\/li>\n<li>gestural functions coding.<\/li>\n<\/ul>\n<p>The first module concerns the characteristics of hand movements: movement versus\u00a0non-movement, the trajectory of the movement, and its dynamism or flow. The second module refers to the relationship between the two hands, both spatial and functional. The third module regards the function and classification of gestures.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>In order to perform a professional nonverbal behavior analysis it is fundamental to use techniques able to objectively describe behavior and to attribute a trustworthy meaning to it. 
The main advantages of the scientific analysis of nonverbal communication are: To identify others\u2019 emotions and states of mind with accuracy; To anticipate peoples\u2019 behavior; To expose &hellip; <a href=\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Methods and techniques of nonverbal behavior analysis&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"page-templates\/page-right-sidebar.php","meta":{"om_disable_all_campaigns":false},"aioseo_notices":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v18.6 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Methods and techniques of nonverbal behavior analysis - NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Methods and techniques of nonverbal behavior analysis - NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.\" \/>\n<meta property=\"og:description\" content=\"In order to perform a professional nonverbal behavior analysis it is fundamental to use techniques able to objectively describe behavior and to attribute a trustworthy meaning to it. 
The main advantages of the scientific analysis of nonverbal communication are: To identify others\u2019 emotions and states of mind with accuracy; To anticipate peoples\u2019 behavior; To expose &hellip; Continue reading &quot;Methods and techniques of nonverbal behavior analysis&quot;\" \/>\n<meta property=\"og:url\" content=\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/\" \/>\n<meta property=\"og:site_name\" content=\"NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.\" \/>\n<meta property=\"article:modified_time\" content=\"2023-12-18T23:10:09+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"13 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebSite\",\"@id\":\"https:\/\/lab-ncs.com\/#website\",\"url\":\"https:\/\/lab-ncs.com\/\",\"name\":\"NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.\",\"description\":\"Analisi delle espressioni facciali\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/lab-ncs.com\/?s={search_term_string}\"},\"query-input\":\"required 
name=search_term_string\"}],\"inLanguage\":\"en-US\"},{\"@type\":\"ImageObject\",\"@id\":\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#primaryimage\",\"inLanguage\":\"en-US\",\"url\":\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif\",\"contentUrl\":\"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif\",\"width\":400,\"height\":516},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#webpage\",\"url\":\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/\",\"name\":\"Methods and techniques of nonverbal behavior analysis - NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.\",\"isPartOf\":{\"@id\":\"https:\/\/lab-ncs.com\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#primaryimage\"},\"datePublished\":\"2016-12-15T14:56:02+00:00\",\"dateModified\":\"2023-12-18T23:10:09+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/lab-ncs.com\/en\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Methods and techniques of nonverbal behavior analysis\"}]}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Methods and techniques of nonverbal behavior analysis - NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/","og_locale":"en_US","og_type":"article","og_title":"Methods and techniques of nonverbal behavior analysis - NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.","og_description":"In order to perform a professional nonverbal behavior analysis it is fundamental to use techniques able to objectively describe behavior and to attribute a trustworthy meaning to it. The main advantages of the scientific analysis of nonverbal communication are: To identify others\u2019 emotions and states of mind with accuracy; To anticipate peoples\u2019 behavior; To expose &hellip; Continue reading \"Methods and techniques of nonverbal behavior analysis\"","og_url":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/","og_site_name":"NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.","article_modified_time":"2023-12-18T23:10:09+00:00","og_image":[{"url":"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif"}],"twitter_misc":{"Est. 
reading time":"13 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebSite","@id":"https:\/\/lab-ncs.com\/#website","url":"https:\/\/lab-ncs.com\/","name":"NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non verbale.","description":"Analisi delle espressioni facciali","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/lab-ncs.com\/?s={search_term_string}"},"query-input":"required name=search_term_string"}],"inLanguage":"en-US"},{"@type":"ImageObject","@id":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#primaryimage","inLanguage":"en-US","url":"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif","contentUrl":"https:\/\/lab-ncs.com\/wp-content\/uploads\/2018\/06\/lettura-delle-espressioni-facciali.gif","width":400,"height":516},{"@type":"WebPage","@id":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#webpage","url":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/","name":"Methods and techniques of nonverbal behavior analysis - NeuroComScience, Laboratorio di analisi comportamentale - Comunicazione non 
verbale.","isPartOf":{"@id":"https:\/\/lab-ncs.com\/#website"},"primaryImageOfPage":{"@id":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#primaryimage"},"datePublished":"2016-12-15T14:56:02+00:00","dateModified":"2023-12-18T23:10:09+00:00","breadcrumb":{"@id":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/lab-ncs.com\/en\/techniques-and-methods-of-non-verbal-behavior-analysis-2\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/lab-ncs.com\/en\/"},{"@type":"ListItem","position":2,"name":"Methods and techniques of nonverbal behavior analysis"}]}]}},"_links":{"self":[{"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/pages\/10816"}],"collection":[{"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/comments?post=10816"}],"version-history":[{"count":2,"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/pages\/10816\/revisions"}],"predecessor-version":[{"id":34111,"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/pages\/10816\/revisions\/34111"}],"wp:attachment":[{"href":"https:\/\/lab-ncs.com\/en\/wp-json\/wp\/v2\/media?parent=10816"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}