Plastic and Reconstructive Surgery Global Open
2020 Mar 11;8(3):e2688. doi: 10.1097/GOX.0000000000002688

Customized Precision Facial Assessment: An AI-assisted Analysis of Facial Microexpressions for Advanced Aesthetic Treatment

Chih-Wei Li,*† Chao-Chin Wang,§ Che-Yi Chou,‖** Chrang-Shi Lin††
PMCID: PMC7253289  PMID: 32537345

Summary:

We introduce a novel protocol based on an artificial intelligence (AI)-assisted facial expression analysis system, Customized Precision Facial Assessment (CPFA), to evaluate and quantify the microexpressions of aesthetic concern. With the help of CPFA, physicians can conduct static and dynamic assessments of their patients' microexpressions and perform quantitative measurements before and after treatment. By detecting microexpressions and the active action units of the facial muscles that drive them, physicians are better able to optimize treatment with minimal intervention through precise localization of the foci of aesthetic concern. We present 3 cases treated with neuromodulators and injectable fillers and show the differences in treatment areas and outcomes between CPFA-oriented and human-facilitated treatments. Negative facial expressions decreased in all 3 cases in the CPFA group, whereas they decreased in only cases 1 and 2 in the human-facilitated group, and the CPFA group showed a greater decrease in negative facial expression scores. This pilot study demonstrates that CPFA can objectively recognize and quantify the facial action units associated with negative emotions, allowing the physician to customize treatment for individual patients, with promising results.

INTRODUCTION

Becoming more attractive is one of the most important reasons for seeking cosmetic treatment.1 Attractiveness is strongly associated with facial expressions,2 which also contribute to first impressions,3 and a happy facial expression is usually connected to a positive mood and looks more attractive. In contrast, sad and angry facial expressions are perceived as negative and less attractive. Enhancing positive facial features while reducing negative ones is therefore a sound strategy for beautification. However, some microexpressions are difficult for the human eye to detect, which is where an artificial intelligence (AI)-assisted facial analysis system (FaceReader, Noldus, Wageningen, The Netherlands)4 may play a role. We propose a novel protocol based on this system, Customized Precision Facial Assessment (CPFA), to evaluate and quantify the microexpressions of aesthetic concern. This pilot study aims to determine whether CPFA can objectively recognize and quantify the facial action units (AUs) associated with negative emotions and thereby guide physicians in customizing treatments for individual patients.

METHOD

CPFA comprises static and dynamic analyses. Patients were first instructed to keep a neutral face for 30 seconds for the static analysis while their facial AUs were continuously monitored by CPFA. They were then asked to perform 6 basic facial expressions (disgust, sadness, happiness, fear, anger, and surprise) for the subsequent dynamic analysis. With CPFA, the muscle actions producing these expressions were analyzed and marked with the facial action coding system5,6 (Fig. 1), and the degree of each facial expression was quantitatively recorded as the "facial expression score."
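For readers who want to picture the data this protocol generates, the following minimal sketch shows one way a CPFA session record could be organized: per-frame AU activations and expression scores for the static and dynamic phases. The class and field names are hypothetical and are not taken from FaceReader or the authors' software.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical data model for one CPFA session; names are illustrative only.
@dataclass
class FrameAnalysis:
    timestamp_s: float
    active_aus: Dict[str, float]          # e.g. {"AU1": 0.62, "AU15": 0.31}, intensity 0-1
    expression_scores: Dict[str, float]   # e.g. {"sadness": 0.139}, as fractions of 1

@dataclass
class CPFASession:
    patient_id: str
    static_frames: List[FrameAnalysis] = field(default_factory=list)   # 30-s neutral face
    dynamic_frames: List[FrameAnalysis] = field(default_factory=list)  # 6 posed expressions

    def peak_score(self, expression: str) -> float:
        """Peak facial expression score over the whole session, as a percentage."""
        frames = self.static_frames + self.dynamic_frames
        return 100 * max((f.expression_scores.get(expression, 0.0) for f in frames), default=0.0)
```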

Fig. 1. An example of the facial action coding system: action units (AUs) and their corresponding facial expression muscles. DAO, depressor anguli oris. Reprinted with permission from Zarins U, Kondrats S. Anatomy of Facial Expression. © Exonicus, Inc.; 2017.

In this small pilot study, the goal of treatment was to reduce negative facial expressions such as sadness and anger. Once a negative facial expression is detected by CPFA, its severity is recorded as the facial expression score, and the activated AUs of that expression become the targets of intervention; this is termed CPFA-oriented treatment, which consists of neuromodulators and, where appropriate, injectable fillers. CPFA is conducted at baseline and at 1 and 3 weeks after treatment, and facial expression scores are measured at each visit. After a 12-month washout period for the previous intervention, the negative facial expression scores of the same patients were recorded again as a new baseline. The same physician, blinded to the scores and without the guidance of CPFA, then made his own assessment and performed interventions to reduce the negative facial expressions he perceived; this is termed human-facilitated treatment. Likewise, the facial expression scores were measured 1 and 3 weeks after that intervention.
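The targeting step described above (a negative expression is detected, its severity is recorded, and its activated AUs become the intervention targets) can be illustrated with a short sketch. The threshold and data layout are assumptions for illustration only; the authors do not report a numeric cutoff.

```python
# Illustrative sketch of CPFA-oriented target selection: any negative expression
# above a hypothetical threshold contributes its active AUs to the target list.
NEGATIVE_EXPRESSIONS = {"sadness", "anger"}

def select_targets(detected: dict, threshold: float = 0.05) -> dict:
    """detected maps expression -> (score, active AUs); returns AU -> expressions it drives."""
    targets: dict = {}
    for expression, (score, active_aus) in detected.items():
        if expression in NEGATIVE_EXPRESSIONS and score >= threshold:
            for au in active_aus:
                targets.setdefault(au, []).append(expression)
    return targets

# Example echoing Case 2: anger (14.1%) from AU15 + AU17, sadness from AU1 + AU4.
print(select_targets({
    "anger":   (0.141, ["AU15", "AU17"]),
    "sadness": (0.120, ["AU1", "AU4"]),   # Case 2 sadness score is not reported; placeholder value
}))
# -> {'AU15': ['anger'], 'AU17': ['anger'], 'AU1': ['sadness'], 'AU4': ['sadness']}
```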

The results of CPFA-oriented and human-facilitated treatments are presented, and the degree of reduction in negative facial expression scores was compared between the 2 groups (see Video [online], which displays the study design and case reports).
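As a worked illustration of the outcome measure, the sketch below computes the absolute reduction in negative facial expression score from baseline to week 3, using only the follow-up values explicitly reported in the case reports below; other follow-up scores are not given in the text and are therefore omitted.

```python
# Absolute change in negative facial expression score, baseline minus week 3,
# for the values reported in the case reports. Positive means the expression decreased.
def reduction(baseline_pct: float, week3_pct: float) -> float:
    return baseline_pct - week3_pct

reported = {
    ("case 1", "CPFA",  "sadness"): reduction(13.9, 0.0),   # 13.9% -> 0%
    ("case 1", "human", "sadness"): reduction(6.8, 6.7),    # 6.8%  -> 6.7%
    ("case 3", "human", "anger"):   reduction(10.9, 13.9),  # 10.9% -> 13.9% (worsened)
}
for key, delta in reported.items():
    print(key, f"{delta:+.1f} percentage points")
```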

Video 1. Case report study design. Video from "Customized Precision Facial Assessment: A novel AI-assisted protocol to unveil and quantify the facial micro-expressions for advanced aesthetic treatment."


CASE REPORT

Case 1

CPFA recognized sadness on the patient's face. Further analysis of the AUs indicated that the inner brow raiser (AU1, medial frontalis muscle) was responsible for her sad appearance. After 8 units of abobotulinumtoxinA were injected into the medial frontalis muscle (AU1), the sadness score decreased from 13.9% at baseline to 8.4% at 1 week and to 0% at 3 weeks after treatment.

In the human group, the physician thought the sadness was related to her downturned eyes and injected 8 units of abobotulinumtoxinA into the orbicularis oculi muscle on each side. As a result, the sadness score decreased from 6.8% at baseline to 5.1% at 1 week but was 6.7% at 3 weeks after treatment.

Case 2

CPFA showed an anger score of 14.1%, resulting from activation of the lip corner depressor (AU15) and chin raiser (AU17). It also recognized sadness, arising from the activity of AU1 and AU4. AbobotulinumtoxinA was injected into AU1 (8 U), AU15 (4 U/side), and AU17 (4 U). Hyaluronic acid (Perlane, Galderma LP, Fort Worth, Tex.) 1 mL was injected over AU17 to create a synergistic mechanical obstacle to the overcontracting mentalis muscle.7

In the human-facilitated treatment group, the physician judged the patient to have a sad face, and she received abobotulinumtoxinA to the depressor anguli oris muscles (4 U/side) and hyaluronic acid (Perlane) 1 mL over the mentalis muscle.

Case 3

In case 3, CPFA identified an anger score of 14% caused by the lip corner depressor (AU15, depressor anguli oris muscle) and chin raiser (AU17, mentalis muscle), and abobotulinumtoxinA was injected into AU15 (4 U/side) and AU17 (4 U).

In the human group, the physician injected only the depressor anguli oris muscles (AU15, 4 U/side). Notably, the anger score was 10.9% at baseline, decreased to 5.9% at 1 week, and rose to 13.9% at 3 weeks after treatment.
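For reference, the AU targets, corresponding muscles, and doses used in the CPFA-oriented arm of the 3 cases can be restated as a simple lookup structure. This is only a summary of the case data above, not a dosing guideline.

```python
# Summary of the AU targets treated in the CPFA-oriented arm of the 3 cases.
# Doses are those reported for these specific patients; not a recommendation.
CPFA_TREATMENTS = {
    "case 1": {"AU1":  {"muscle": "medial frontalis",      "abobotulinumtoxinA_U": 8}},
    "case 2": {"AU1":  {"muscle": "medial frontalis",      "abobotulinumtoxinA_U": 8},
               "AU15": {"muscle": "depressor anguli oris", "abobotulinumtoxinA_U": "4 per side"},
               "AU17": {"muscle": "mentalis",              "abobotulinumtoxinA_U": 4,
                        "filler": "hyaluronic acid 1 mL"}},
    "case 3": {"AU15": {"muscle": "depressor anguli oris", "abobotulinumtoxinA_U": "4 per side"},
               "AU17": {"muscle": "mentalis",              "abobotulinumtoxinA_U": 4}},
}
```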

DISCUSSION

Neuromodulators and injectable fillers have been used to smooth wrinkles and facial creases, restore volume loss, and address excessive muscle movement. However, precise evaluation before treatment is crucial to a natural and successful result. In addition to conventional evaluation with static photography, a dynamic imaging system for standardized evaluation could be a breakthrough, provided that a coding system can efficiently mark the results of the dynamic evaluation. The facial action coding system (FACS), a system that taxonomizes human facial movements by their appearance on the face,5,8,9 has therefore been adopted in this scenario. FACS has been used extensively by psychiatrists and animators to study facial expressions and emotions, and it has recently been developed into computer-automated systems. Among these systems, FaceReader can recognize facial expressions in real time, and its performance has been validated using the ADFES and WSEFEP datasets.4,10
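For orientation, the way an automated FACS system maps detected AUs to basic emotions can be illustrated with commonly cited prototype AU combinations from the FACS/EMFACS literature. These prototypes are an approximation used here for illustration only; the exact classification rules implemented in FaceReader are proprietary and are not described in this article.

```python
# Commonly cited prototype AU combinations for the 6 basic emotions (FACS/EMFACS
# literature); shown only to illustrate how AU detections map to emotion labels.
PROTOTYPE_AUS = {
    "happiness": {"AU6", "AU12"},
    "sadness":   {"AU1", "AU4", "AU15"},
    "surprise":  {"AU1", "AU2", "AU5", "AU26"},
    "fear":      {"AU1", "AU2", "AU4", "AU5", "AU7", "AU20", "AU26"},
    "anger":     {"AU4", "AU5", "AU7", "AU23"},
    "disgust":   {"AU9", "AU15", "AU16"},
}

def candidate_emotions(active_aus: set, min_overlap: float = 0.5) -> list:
    """Naive matcher: emotions whose prototype AUs sufficiently overlap the detected AUs."""
    hits = []
    for emotion, prototype in PROTOTYPE_AUS.items():
        overlap = len(prototype & active_aus) / len(prototype)
        if overlap >= min_overlap:
            hits.append((emotion, round(overlap, 2)))
    return sorted(hits, key=lambda h: -h[1])

print(candidate_emotions({"AU1", "AU4", "AU15"}))  # -> [('sadness', 1.0)]
```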

CPFA, a novel protocol based on FaceReader, is the first aesthetic application of this system, which is well established in psychiatry. By detecting microexpressions and the active AUs of the facial muscles that produce them, physicians are more likely to optimize treatment with minimal intervention through precise localization of the foci of aesthetic concern.

In this study, the foci of treatment identified by CPFA were not completely identical to those identified by the physician. In case 1, CPFA indicated that the sad face was caused by activation of the medial frontalis, whereas the physician attributed the sadness to the downturned eyes and therefore treated her orbicularis oculi muscles. In case 2, CPFA identified sadness caused by activation of the medial frontalis, in addition to the anger that was the common finding between CPFA and the physician's evaluation. At 3 weeks after treatment, negative facial expressions had decreased in all 3 cases in the CPFA group but in only cases 1 and 2 in the human-facilitated group. The anger score of case 3 in the human group initially improved at week 1 but rebounded at week 3, probably because the neuromodulator dose was inadequate for the strong muscle activity. In addition, the CPFA-oriented treatment group showed a greater decrease in negative facial expression scores than the human-facilitated group.

CPFA has a wide variety of potential applications in the aesthetic field. It can simply serve as a quantitative measure of facial expression scores before and after treatment. For physicians in training, CPFA could provide a starting guide for treatment; moderately experienced physicians may further improve their treatment outcomes by identifying and better understanding microexpressions that were previously too subtle for the human eye to detect. Furthermore, CPFA, an AI system developed by humans, has the potential to become a training program that in turn teaches humans to identify microexpressions precisely. Through CPFA, physicians gain not only static and dynamic assessments of patients but also quantitative measurements before and after treatment. Its core capability of detecting and quantifying facial microexpressions may be a game changer for aesthetic treatment strategies aimed at natural results.

This pilot study has several limitations. It evaluated only the capability of CPFA-oriented treatments to reduce negative facial expressions; further studies are needed to determine whether the approach works as well for enhancing positive facial expressions. Because of the small number of cases, the study is too preliminary to conclude that CPFA-oriented treatments universally lead to greater reductions in negative facial expressions than human-facilitated ones. Finally, the 12-month washout period may not be adequate for complete degradation of previously placed hyaluronic acid, which may confound precise evaluation.

CONCLUSIONS

We propose CPFA, a novel protocol based on an AI-assisted analytic system, to unveil and quantify static and dynamic facial microexpressions for advanced aesthetic treatment. This pilot study demonstrates that CPFA can objectively recognize and quantify the facial AUs associated with negative emotions, allowing the physician to customize treatment for individual patients, with promising results. Further studies are needed to validate and explore the potential uses of this system.

Footnotes

Published online 11 March 2020.

Disclosure: The authors have no financial interest to declare in relation to the content of this article.

Related Digital Media are available in the full-text version of the article on www.PRSGlobalOpen.com.

REFERENCES

1. Furnham A, Levitas J. Factors that motivate people to undergo cosmetic surgery. Can J Plast Surg. 2012;20:e47–e50.
2. Golle J, Mast FW, Lobmaier JS. Something to smile about: the interrelationship between attractiveness and emotional expression. Cogn Emot. 2014;28:298–310.
3. Ritchie KL, Palermo R, Rhodes G. Forming impressions of facial attractiveness is mandatory. Sci Rep. 2017;7:469.
4. Stöckli S, Schulte-Mecklenbeck M, Borer S, et al. Facial expression analysis with AFFDEX and FACET: a validation study. Behav Res Methods. 2018;50:1446–1460.
5. Ekman P, Friesen W. Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto, Calif.: Consulting Psychologists Press; 1978.
6. Kohler CG, Turner T, Stolar NM, et al. Differences in facial expressions of four universal emotions. Psychiatry Res. 2004;128:235–244.
7. de Maio M. Myomodulation with injectable fillers: an innovative approach to addressing facial muscle movement. Aesthetic Plast Surg. 2018;42:798–814.
8. Hjortsjö C-H. Man's Face and Mimic Language. Lund, Sweden: Studentlitteratur; 1969.
9. Ekman P, Friesen W, Hager J. Facial Action Coding System: The Manual on CD-ROM. Instructor's Guide. Salt Lake City, Utah: Network Information Research Co.; 2002.
10. Lewinski P, den Uyl TM, Butler C. Automated facial coding: validation of basic emotions and FACS AUs in FaceReader. J Neurosci Psychol Econ. 2014;7:227.
