The Facial Action Coding System (FACS) is a system to taxonomize human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö.[1] It was later adopted by Paul Ekman and Wallace V. Friesen, and published in 1978.[2] Ekman, Friesen, and Joseph C. Hager published a significant update to FACS in 2002.[3] FACS encodes the movements of individual facial muscles from slight, momentary changes in facial appearance. It has proven useful to psychologists and to animators.
[Image: a blind athlete expressing joy in athletic competition; unsighted persons use the same expressions as sighted people, showing that expressions are innate.] In 2009, a study compared spontaneous facial expressions in sighted and blind judo athletes, and found that many facial expressions are innate and not visually learned.[4]
Using the FACS[5] human coders can manually code nearly any anatomically possible facial expression, deconstructing it into the specific "action units" (AU) and their temporal segments that produced the expression. As AUs are independent of any interpretation, they can be used for any higher order decision making process including recognition of basic emotions, or pre-programmed commands for an ambient intelligent environment. The FACS manual is over 500 pages in length and provides the AUs, as well as Ekman's interpretation of their meanings.
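Because AUs are interpretation-free building blocks, a higher-order process can recognize emotions simply by matching an observed AU set against prototype combinations. The sketch below illustrates this idea using a few of the EMFACS prototype combinations listed later in this article; it is a toy illustration, not the full FACS/EMFACS scoring procedure, which also takes intensities and temporal dynamics into account.

```python
# Prototype AU combinations for a few basic emotions (from the EMFACS
# examples given later in this article).
PROTOTYPES = {
    "happiness": {6, 12},
    "sadness": {1, 4, 15},
    "anger": {4, 5, 7, 23},
    "disgust": {9, 15, 17},
}

def match_emotion(observed_aus):
    """Return the emotions whose prototype AUs are all present."""
    observed = set(observed_aus)
    return [emotion for emotion, aus in PROTOTYPES.items()
            if aus <= observed]

print(match_emotion([6, 12]))          # ['happiness']
print(match_emotion([1, 4, 15, 17]))   # ['sadness']
```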
The FACS defines AUs as contractions or relaxations of one or more muscles. It also defines a number of "action descriptors", which differ from AUs in that the authors of the FACS have not specified the muscular basis for the action and have not distinguished specific behaviors as precisely as they have for the AUs.
For example, the FACS can be used to distinguish two types of smiles as follows:[6]

- the insincere, voluntary "Pan-Am" smile: contraction of the zygomaticus major alone;
- the involuntary Duchenne smile: contraction of the zygomaticus major together with the inferior part of the orbicularis oculi.
The FACS is designed to be self-instructional. People can learn the technique from a number of sources including manuals and workshops,[7] and obtain certification through testing.[8]
Although the labeling of expressions currently requires trained experts, researchers have had some success in using computers to automatically identify the FACS codes.[9] One obstacle to automatic FACS code recognition is a shortage of manually coded ground truth data.[10]
Baby FACS (Facial Action Coding System for Infants and Young Children)[11] is a behavioral coding system that adapts the adult FACS to code facial expressions in infants aged 0–2 years. Its codes correspond to specific underlying facial muscles, with criteria tailored to infant facial anatomy and expression patterns.
It was created by Dr. Harriet Oster and colleagues to address the limitations of applying adult FACS directly to infants, whose facial musculature, proportions, and developmental capabilities differ significantly.
The FACS has been proposed for use in the analysis of depression,[12] and for the measurement of pain in patients unable to express themselves verbally.[13]
Cross-species applications

The original FACS has been modified to analyze facial movements in several non-human primates, namely chimpanzees,[14] rhesus macaques,[15] gibbons and siamangs,[16] and orangutans.[17] More recently, it has also been developed for domestic species, including dogs,[18] horses[19] and cats.[20] As with the human FACS, the animal FACS manuals are available online for each species, with the respective certification tests.[21]
Because of its anatomical basis, the FACS can be used to compare facial repertoires across species. A study by Vick and colleagues (2006) suggests that the FACS can be modified by taking differences in underlying morphology into account. Such considerations enable a comparison of the homologous facial movements present in humans and chimpanzees, showing that the facial expressions of both species result from notable appearance changes. The development of FACS tools for different species allows the objective and anatomical study of facial expressions in communicative and emotional contexts. Furthermore, a cross-species analysis of facial expressions can help to answer interesting questions, such as which emotions are uniquely human.[22]
The Emotional Facial Action Coding System (EMFACS)[23] and the Facial Action Coding System Affect Interpretation Dictionary (FACSAID)[24] consider only emotion-related facial actions. Examples of these are:
Emotion    Action units
Happiness  6+12
Sadness    1+4+15
Surprise   1+2+5B+26
Fear       1+2+4+5+7+20+26
Anger      4+5+7+23
Disgust    9+15+17
Contempt   R12A+R14A

Computer-generated imagery

FACS coding is also used extensively in computer animation, in particular for computer facial animation, with facial expressions being expressed as vector graphics of AUs.[25] FACS vectors are used as weights for blend shapes corresponding to each AU, with the resulting face mesh then being used to render the finished face.[26][27] Deep learning techniques can be used to determine the FACS vectors from face images obtained during motion capture acting, facial motion capture or other performances.[28]
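The blend-shape weighting described above can be sketched as follows. The vertex counts, AU deltas, and weight values are illustrative placeholders, not any particular animation package's data or API: each AU contributes a per-vertex offset from the neutral mesh, scaled by its weight in the FACS vector.

```python
import numpy as np

# Toy blend-shape setup: a neutral face mesh and one "delta" mesh per AU
# (the per-vertex offset that full activation of that AU would produce).
neutral = np.zeros((4, 3))            # 4 vertices, xyz coordinates
deltas = {
    6: np.full((4, 3), 0.2),          # AU 6: cheek raiser (placeholder delta)
    12: np.full((4, 3), 0.5),         # AU 12: lip corner puller (placeholder)
}

def blend(facs_weights):
    """Return the face mesh for a dict of {AU number: weight in [0, 1]}."""
    mesh = neutral.copy()
    for au, weight in facs_weights.items():
        mesh += weight * deltas[au]   # weighted sum of blend-shape deltas
    return mesh

smile = blend({6: 1.0, 12: 0.8})      # a Duchenne-style smile
print(smile[0])                       # each coordinate: 0.2*1.0 + 0.5*0.8 = 0.6
```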
Codes for action units

For clarification, the FACS is an index of facial expressions, but does not actually provide any bio-mechanical information about the degree of muscle activation. Though muscle activation is not part of the FACS, the main muscles involved in each facial expression have been added here.
Action units (AUs) are the fundamental actions of individual muscles or groups of muscles.
Action descriptors (ADs) are unitary movements that may involve the actions of several muscle groups (e.g., a forward‐thrusting movement of the jaw). The muscular basis for these actions has not been specified and specific behaviors have not been distinguished as precisely as for the AUs.
For the most accurate annotation, the FACS suggests agreement from at least two independent certified FACS encoders.
Intensities of the FACS are annotated by appending letters A–E (for minimal-maximal intensity) to the action unit number (e.g. AU 1A is the weakest trace of AU 1 and AU 1E is the maximum intensity possible for the individual person).
There are other modifiers present in FACS codes for emotional expressions, such as "R" which represents an action that occurs on the right side of the face and "L" for actions which occur on the left. An action which is unilateral (occurs on only one side of the face) but has no specific side is indicated with a "U" and an action which is bilateral but has a stronger side is indicated with an "A" for "asymmetric".
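The intensity and laterality notation described above can be parsed mechanically. The sketch below handles a simple "+"-separated code string with an optional R/L/U/A prefix and an optional A–E intensity suffix; it is an illustration of the notation only, and real FACS scoring has further conventions not handled here.

```python
import re

# One FACS code: optional laterality prefix (R/L/U/A), AU number,
# optional intensity letter A-E (e.g. "R12A" = right-side AU 12 at
# intensity A).
CODE = re.compile(r"^(?P<side>[RLUA])?(?P<au>\d+)(?P<intensity>[A-E])?$")

def parse(code_string):
    """Parse e.g. '1A+R12A+26' into (side, AU number, intensity) tuples."""
    units = []
    for part in code_string.split("+"):
        match = CODE.match(part.strip())
        if not match:
            raise ValueError(f"unrecognized FACS code: {part!r}")
        units.append((match["side"], int(match["au"]), match["intensity"]))
    return units

print(parse("1A+R12A+26"))
# [(None, 1, 'A'), ('R', 12, 'A'), (None, 26, None)]
```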
List of AUs and ADs (with underlying facial muscles)

Head movement codes

AU number  FACS name                    Action
51         Head turn left
52         Head turn right
53         Head up
54         Head down
55         Head tilt left
M55        Head tilt left               The onset of the symmetrical 14 is immediately preceded or accompanied by a head tilt to the left.
56         Head tilt right
M56        Head tilt right              The onset of the symmetrical 14 is immediately preceded or accompanied by a head tilt to the right.
57         Head forward
M57        Head thrust forward          The onset of 17+24 is immediately preceded, accompanied, or followed by a head thrust forward.
58         Head back
M59        Head shake up and down       The onset of 17+24 is immediately preceded, accompanied, or followed by an up-down head shake (nod).
M60        Head shake side to side      The onset of 17+24 is immediately preceded, accompanied, or followed by a side to side head shake.
M83        Head upward and to the side  The onset of the symmetrical 14 is immediately preceded or accompanied by a movement of the head, upward and turned or tilted to either the left or right.

Eye movement codes

AU number  FACS name                                Action
61         Eyes turn left
M61        Eyes left                                The onset of the symmetrical 14 is immediately preceded or accompanied by eye movement to the left.
62         Eyes turn right
M62        Eyes right                               The onset of the symmetrical 14 is immediately preceded or accompanied by eye movement to the right.
63         Eyes up
64         Eyes down
65         Walleye
66         Cross-eye
M68        Upward rolling of eyes                   The onset of the symmetrical 14 is immediately preceded or accompanied by an upward rolling of the eyes.
69         Eyes positioned to look at other person  The 4, 5, or 7, alone or in combination, occurs while the eye position is fixed on the other person in the conversation.
M69        Head or eyes look at other person        The onset of the symmetrical 14 or AUs 4, 5, and 7, alone or in combination, is immediately preceded or accompanied by a movement of the eyes or of the head and eyes to look at the other person in the conversation.
Visibility codes

AU number  FACS name
70         Brows and forehead not visible
71         Eyes not visible
72         Lower face not visible
73         Entire face not visible
74         Unscorable

Gross behavior codes

These codes are reserved for recording information about gross behaviors that may be relevant to the facial actions that are scored.
AU number  FACS name                  Muscular basis
29         Jaw thrust
30         Jaw sideways
31         Jaw clencher               masseter
32         [Lip] bite
33         [Cheek] blow
34         [Cheek] puff
35         [Cheek] suck
36         [Tongue] bulge
37         Lip wipe
38         Nostril dilator            nasalis (pars alaris)
39         Nostril compressor         nasalis (pars transversa) and depressor septi nasi
40         Sniff
41         Lid droop                  levator palpebrae superioris (relaxation)
42         Slit                       orbicularis oculi muscle
43         Eyes closed                relaxation of levator palpebrae superioris
44         Squint                     corrugator supercilii and orbicularis oculi muscle
45         Blink                      relaxation of levator palpebrae superioris; contraction of orbicularis oculi (pars palpebralis)
46         Wink                       orbicularis oculi
50         Speech
80         Swallow
81         Chewing
82         Shoulder shrug
84         Head shake back and forth
85         Head nod up and down
91         Flash
92         Partial flash
97*        Shiver/tremble
98*        Fast up-down look