1 JEFSR Vol. 5 No. 1 2020
Detecting Differences Between Concealed and Unconcealed Emotions Using
iMotions EMOTIENT
Shelby Clark1*, B.A., and Shashi K. Jasra1, PhD.
Abstract
Biometric analysis is everywhere – even in our cell phone security through facial and fingerprint
recognition. It has recently become widely useful in forensic settings as well, being used for facial,
fingerprint/palmprint, iris, and voice identification1. Using the iMotions Facial Expression Analysis software, I examined differences in detection between concealed and unconcealed emotions in response to various stimuli, specifically the percentage of time that each emotion was elicited during the stimuli.
Fourteen participants, eight females (F) and six males (M), were shown seven different videos aimed at eliciting
specific emotions to be measured by the iMotions software. Prior to exposure to the stimuli, seven of these
fourteen participants (4F, 3M) were asked to conceal their emotions while watching the following videos. The
seven different emotions that were measured by the software include contempt, disgust, fear, joy, anger,
surprise, and sadness. The alternative hypothesis states that individuals who concealed their emotions during the presented stimuli would show significantly fewer detectable emotions than individuals who were not asked to conceal their emotions. The null hypothesis states that there will be no significant difference between the detectable emotions of the concealed group and the unconcealed group. There was no statistically significant difference in emotion detected between the overall concealed and unconcealed participant averages with regard to time (%) (p = 0.07, α = 0.05).
Keywords: Biometric identification, facial expression analysis
1 Forensic Sciences, Faculty of Science, University of Windsor, 401 Sunset Avenue, Windsor, Ontario. *Communicating Author Contact: Shelby Clark, [email protected]
Introduction
The iMotions software describes biometrics as a way to investigate internal bodily signals that reveal
emotion and other signs of physiological arousal2. Applying this to a forensic setting, biometrics is the
automated recognition of individuals based on both biological and behavioural characteristics. Biometric modalities range from facial, fingerprint/palmprint, and iris recognition to voice identification1. This allows for identification because certain biometric traits are unique to the individual.
The iMotions Facial Expression Analysis module uses an engine called Affectiva (AFFDEX). This engine identifies
thirty-three feature points on the face (e.g. points on the eyes, nose, mouth, etc.). The movements of these
facial feature points are tracked and analyzed throughout the presentation of stimuli. Specifically, this engine
identifies a smile, brow furrow & raise, frown, eye closure, nose wrinkle, various lip motions, chin raises, and
smirks. iMotions then classifies these expressions into seven basic emotions: joy, anger, surprise, fear,
contempt, sadness, and disgust. iMotions also measures positive, negative, and neutral expressions throughout
all stimuli3. Galvanic Skin Response (GSR) measures skin conductivity via activation of the sweat glands, which is commonly triggered by emotional stimulation. This allows researchers to measure emotional arousal even when emotions are not apparent through facial analysis4. An eye tracker is also included in this
iMotions software package, which tracks eye movement throughout presented stimuli and serves as a measure
of participant attention. It analyzes how long someone looks at something, and importantly, measures if a
participant is paying attention to the stimuli2. The combination of facial expression analysis, GSR, and eye
tracking allows for effective emotional analysis, and in this case, the differentiation between concealed and
unconcealed emotional expression.
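Throughout this study, emotion detection is reported as the percentage of stimulus time during which each emotion was registered. The sketch below is a minimal illustration of how such a time-percentage metric can be computed from per-frame emotion evidence scores; it is not the iMotions/AFFDEX API, and the frame scores and the 50-point detection threshold are assumptions made for the example.

```python
# Illustrative only: hypothetical per-frame emotion scores, not iMotions output.
EMOTIONS = ["contempt", "disgust", "fear", "joy", "surprise", "anger", "sadness"]
THRESHOLD = 50.0  # assumed evidence threshold (0-100) for counting a frame

def time_percentages(frames):
    """frames: list of dicts mapping emotion -> evidence score (0-100).
    Returns the percentage of frames on which each emotion was detected."""
    counts = {e: 0 for e in EMOTIONS}
    for frame in frames:
        for e in EMOTIONS:
            if frame.get(e, 0.0) >= THRESHOLD:
                counts[e] += 1
    n = len(frames) or 1  # avoid division by zero on an empty recording
    return {e: 100.0 * counts[e] / n for e in EMOTIONS}

# Example: 4 frames of one stimulus; joy exceeds threshold on 2 of 4 frames.
frames = [
    {"joy": 80.0},
    {"joy": 60.0, "surprise": 55.0},
    {"joy": 10.0},
    {},
]
pct = time_percentages(frames)  # joy -> 50.0, surprise -> 25.0
```

Averaging these per-participant percentages within the concealed and unconcealed groups yields the category values reported in the tables below.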
Materials & Methods
- iMotions software suite (physiological and facial analysis software), version 7.0
- iMotions Shimmer Kit: Galvanic Skin Response and Photoplethysmography biosensors
- Logitech HD camera
- Laptop with Bluetooth capability
- GP3 Professional Bundle
- Additional display monitor
- Windows 10 operating system
A recruitment e-mail was distributed by the science department secretary that advertised our research
to University of Windsor students. The e-mail contained general information about the study, with attached
consent forms and letters of information about both methods of the study. Participants were also recruited in
person through info page handouts, as well as postings in the common areas of campus. Fifteen participants were recruited for the study, nine female and six male. Fourteen participants' data were used in the analysis, as participant R's facial recording did not occur due to technical issues, leaving a ratio of six male to eight female participants.
Table 1
Participant Information
Participant Number M/F Concealed/Unconcealed
R1 F Unconcealed
R3 F Unconcealed
R4 F Concealed
R5 F Unconcealed
R6 M Unconcealed
R7 M Concealed
R8 M Unconcealed
R9 M Concealed
R10 F Unconcealed
R11 F Concealed
R12 M Unconcealed
R13 F Unconcealed
R14 F Concealed
R15 M Unconcealed
Table 2
Stimuli presented to participants and their descriptions
Stimulus Stimulus Description
E1 Contempt – woman being racially discriminatory
E2 Disgust – worm being eaten by insects
E3 Fear – man with camera on his head walking along the outside of a tall building
E4 Joy – dog taking a bath and swimming
E5 Surprise – video of an explosion
E6 Anger – customer yelling at store employee
E7 Sadness – penguin parent discovering deceased child
The following procedure took place in Essex Hall Room 208 at the University of Windsor. Participants
were run through the study between January and March 2019. Prior to the procedure, the workstation was set up: an additional monitor was run through the VGA output of the iMotions laptop, and the Shimmer kit, GP3 eye tracker, and Logitech HD camera were connected through the laptop's USB ports. The Gazepoint program was started and a 9-point calibration cycle was completed. Next, the Shimmer kit wrist module was turned on and the probes attached. The iMotions program was started, and it was verified that each device was properly detected. Finally, calibration settings were set for the study and the participant profile was prepared for the active participant. The procedure was explained to each participant, and a consent form was provided. Once the signed consent form was obtained, the Shimmer kit wrist module was attached to the participant's wrist and fingers (the PPG sensor attaches to the ring finger; the GSR sensors attach to the index and middle fingers). Calibration for gaze tracking was performed prior to the presentation of stimuli. Following this calibration, some participants were asked to conceal their emotions, while the others were given no instruction. Assignment to the concealment condition alternated and was balanced by participant gender. Each video was approximately 15 seconds long, with a cooldown period between videos during which only a black screen was shown. Once the stimulus videos began, the researchers left the room. After the videos concluded, all sensors were removed and the participants were free to go.
Since significance between various concealed and unconcealed groups was analyzed, t-tests were performed in Excel to determine statistical significance at a significance level, alpha (α), of 0.05, or 5%. This 5% is the probability of observing a difference this large when, in reality, no actual difference exists. Any p-value below 0.05 (5%) is considered statistically significant, while anything higher is not.
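The study's t-tests were run in Excel; as a concrete illustration of the same decision rule, the sketch below implements Welch's two-sample t-test in plain Python. Welch's variant and the sample numbers are illustrative assumptions, not the study's data or exact procedure; the two-sided p-value is obtained from the t distribution via the regularized incomplete beta function.

```python
from math import sqrt, exp, log, lgamma

def _betacf(a, b, x):
    # Continued-fraction evaluation for the regularized incomplete beta
    # (Lentz's method, as given in standard numerical references).
    FPMIN, EPS = 1e-300, 3e-12
    qab, qap, qam = a + b, a + 1.0, a - 1.0
    c, d = 1.0, 1.0 - qab * x / qap
    d = 1.0 / (d if abs(d) > FPMIN else FPMIN)
    h = d
    for m in range(1, 200):
        m2 = 2 * m
        aa = m * (b - m) * x / ((qam + m2) * (a + m2))
        d = 1.0 + aa * d
        d = 1.0 / (d if abs(d) > FPMIN else FPMIN)
        c = 1.0 + aa / c
        c = c if abs(c) > FPMIN else FPMIN
        h *= d * c
        aa = -(a + m) * (qab + m) * x / ((a + m2) * (qap + m2))
        d = 1.0 + aa * d
        d = 1.0 / (d if abs(d) > FPMIN else FPMIN)
        c = 1.0 + aa / c
        c = c if abs(c) > FPMIN else FPMIN
        delta = d * c
        h *= delta
        if abs(delta - 1.0) < EPS:
            break
    return h

def _betai(a, b, x):
    # Regularized incomplete beta function I_x(a, b).
    if x <= 0.0:
        return 0.0
    if x >= 1.0:
        return 1.0
    bt = exp(lgamma(a + b) - lgamma(a) - lgamma(b)
             + a * log(x) + b * log(1.0 - x))
    if x < (a + 1.0) / (a + b + 2.0):
        return bt * _betacf(a, b, x) / a
    return 1.0 - bt * _betacf(b, a, 1.0 - x) / b

def welch_ttest(xs, ys):
    """Welch's two-sample t-test; returns (t statistic, two-sided p-value)."""
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((v - mx) ** 2 for v in xs) / (nx - 1)  # sample variances
    vy = sum((v - my) ** 2 for v in ys) / (ny - 1)
    se2 = vx / nx + vy / ny
    t = (mx - my) / sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((vx / nx) ** 2 / (nx - 1) + (vy / ny) ** 2 / (ny - 1))
    p = _betai(df / 2.0, 0.5, df / (df + t * t))  # two-sided p-value
    return t, p

# Hypothetical time-(%) values for two groups (not the study's data):
t, p = welch_ttest([10.0, 12.0, 14.0], [20.0, 22.0, 24.0])
significant = p < 0.05  # the study's decision rule at alpha = 0.05
```

Equal-variance (Student's) t-tests, as Excel's TTEST function can also produce, follow the same p-versus-α comparison; only the variance pooling and degrees of freedom differ.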
Results
Table 5
Average time (%) of detected emotions throughout stimuli presentation
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Overall Concealed Category 1.12 9.47 4.56 5.99 10.47 9.51 12.81
Overall Unconcealed Category 8.21 37.41 6.13 15.5 16.3 13.12 59.51
Unconcealed Female Category 9.31 13.11 24.42 2.15 2.51 34.29 65.33
Concealed Female Category 2.85 11.07 17.33 5.41 1.37 11.52 20.01
Unconcealed Male Category 23.75 13.14 2.91 11.43 15.82 41.59 51.75
Concealed Male Category 10.31 7.44 1.33 3.43 0.8 6.73 3.2
The results of the t-tests and their p-values suggest that the time (%) of emotions detected for the overall concealed and unconcealed participants was not statistically different. iMotions did detect fewer emotions, on average, from participants in the concealed group than from participants in the unconcealed group, but the probability that values this extreme arose by chance exceeds 5%. Therefore, this was not considered a statistically significant difference. T-tests were performed on all data sets in order to determine statistical significance.
Table 6
Average time (%) of elicited emotions in concealed and unconcealed category
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Concealed Category 1.12 9.47 4.56 5.99 10.47 5.11 12.81
Unconcealed Category 8.21 37.41 6.13 15.5 16.3 13.12 59.51
Table 7
P-Values from average concealed vs unconcealed participant emotions
Value Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
P-Value 0.01 <0.01 0.53 0.19 0.04 0.49 <0.01
Figure 1 – Total average time percentage of detected emotions between concealed and unconcealed groups.
There were no statistically significant differences seen in terms of emotion detection between overall
concealed and unconcealed participant averages in regard to time (%) (p = 0.07, α = 0.05).
The data were also divided into female and male groups. The iMotions software detected significantly fewer emotions, on average, from males in the concealed group than from males in the unconcealed group. Among female participants, there was no statistically significant difference in emotions detected between the concealed and unconcealed groups.
Table 8
Average time (%) of detected emotions in male participants
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Concealed 10.31 7.44 1.33 3.43 0.8 6.73 3.2
Unconcealed 23.75 13.14 2.91 11.43 15.82 41.59 51.75
Figure 2 – Total average time percentage of detected emotions between male concealed and unconcealed
groups. Statistical significance was seen between male concealed and unconcealed participant averages in
regard to time (%) (p = 0.02, α = 0.05).
Table 9
Average time (%) of detected emotions in female participants
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Concealed 2.85 11.07 17.33 5.41 1.37 11.52 20.01
Unconcealed 9.31 13.11 24.42 2.15 2.51 34.29 65.33
Figure 3 – Total average time percentage of detected emotions between female concealed and unconcealed
groups. There were no statistically significant differences of emotion detection observed between female
concealed and unconcealed participant averages in regard to time (%) (p = 0.22, α = 0.05).
Table 10
Average time (%) of detected emotions in concealed participants per gender
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Female 2.85 11.07 17.33 5.41 1.37 11.52 20.01
Male 10.31 7.44 1.33 3.43 0.8 6.73 3.2
Figure 4 – Total average time percentage of detected emotions between female and male concealed groups.
There were no statistically significant differences of emotion detection observed between female and male
concealed participant averages in regard to time (%) (p = 0.90, α = 0.05).
Table 11
Average time (%) of detected emotions in unconcealed participants per gender
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Female 9.31 13.11 24.42 2.15 2.51 34.29 65.33
Male 23.75 13.14 2.91 11.43 15.82 41.59 51.75
Figure 5 - Total average time percentage of detected emotions between female and male unconcealed groups.
There were no statistically significant differences of emotion detection observed between female and male
unconcealed participant averages in regard to time (%) (p = 0.11, α = 0.05).
Table 12
Average time (%) of detected emotions during baseline measures
Category Elicited Emotion
Contempt Disgust Fear Joy Surprise Anger Sadness
Concealed 0.33 1.97 4.63 5.4 3.7 1.61 1.84
Unconcealed 6.97 8.61 12.27 11.58 14.95 5.49 13.12
Figure 6 – Total average time percentage of detected emotions during the baseline measure. There were statistically significant differences detected between the concealed and unconcealed group during baseline measures (p < 0.01, α = 0.05).
Table 13
Total average time (%) of detected emotions during E1 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 0.15 0 3.81 9.651 0 21.77 0
R2 0 0 0 0 0 0 0
R3 3.81 0.17 1.85 0 10.31 0.79 0
R4 0 0.17 2.50 31.17 3.36 0 0
R6 0.45 77.70 89.20 66.78 1.99 0 1.66
R7 0 0 0 0 0 0 0
R8 0 0 0 0 0 0 0
R9 0 0 0.33 0 0 0 1.14
R10 7.22 1.17 0 1.45 0 0 0
R11 0 0 0.17 0 0 0 0
R12 0 0 16.27 1.99 14.29 0 61.91
R13 0 0 5.47 2.52 0 0 0
R14 0.90 0 0 0 0 0 0
R15 0.60 0.17 0 9 5.48 0 0
Figure 7 – Total average time percentage of detected emotions during the E1 stimulus presentation (attempting to trigger contempt from participants). There were statistically significant differences detected between the concealed and unconcealed group during this stimulus (p < 0.01, α = 0.05).
Table 14
Total average time (%) of detected emotions during E2 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 1.96 12.46 10.78 26.62 43.76 14.98 31.99
R2 0 0 0 0 0 0 0
R3 0 0 0 0 0.83 0 0
R4 0 3.18 0.49 29.5 11.60 0 0
R6 5.714 69.551 36.711 58.516 21.358 10.394 46.122
R7 0.299 0 0 0 0 0 0
R8 1.805 51.741 96.013 62.833 31.281 77.725 46.736
R9 0 0 6.291 0 6.323 9.89 6.198
R10 80.602 96 73.466 99.668 20.73 27.287 74.327
R11 18.496 8.97 20.598 9.167 36 95.447 89.119
R12 20.965 0 45.763 83.028 14.95 0 92.133
R13 46.142 92.833 83.25 72.651 31.419 11.22 7.008
R14 0 0 0 0 0 0 0
R15 5.12 0.331 0 76.833 26.91 2.997 0.104
Figure 8 – Total average time percentage of detected emotions during the E2 stimulus presentation (attempting to trigger disgust from participants). There were statistically significant differences detected between the concealed and unconcealed group during this stimulus (p < 0.01, α = 0.05).
Table 15
Total average time (%) of detected emotions during E3 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 0 0 0 0 0 2.99 0
R2 0 0 0 0 0 0.16 0
R3 15.07 0 3.87 0 11.31 1.74 6.04
R4 0 1.84 0.33 17.5 3.36 0 0
R6 18.65 38.27 33.39 36.09 82.12 3.78 13.13
R7 0 0 0 0 0 0 0
R8 0.45 0 0.99 0 0.33 0.32 0.21
R10 8.42 0 0 1.33 0 1.58 0
R11 20.15 10.30 8.97 39.5 8 27.00 14.40
R12 0 2.65 0.17 0 8.97 0 0.52
R13 1.51 1.33 3.15 1.34 0.51 0 0.11
R14 0 0 0 0 0 0 0
R15 0 0 0 50.83 6.98 5.99 1.14
Figure 9 – Total average time percentage of detected emotions during the E3 stimulus presentation (attempting to trigger fear from participants). There were no statistically significant differences detected between the concealed and unconcealed group during this stimulus (p = 0.43, α = 0.05).
Table 16
Total average time (%) of detected emotions during E4 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 0 0 5.48 4.66 0 28.71 0
R2 0 0 0 0 0 0 0
R3 0 0 13.45 92.58 6.32 0 0
R4 0 3.85 5.66 36 21.01 0 0
R6 7.07 88.02 100 68.97 2.32 0.16 11.27
R7 0 0 0 0 0 0 0
R8 0.15 2.65 1.16 4.33 1.83 2.21 3.52
R9 17.216 1.9 4.139 26.027 29.784 1.727 6.508
R10 0 0 0 17.61 0 0 0
R11 0 0.17 10.30 0 0 0 0
R12 0 0 20.17 88.85 23.09 0 72.88
R13 0 0 21.39 70.47 0 0 0
R14 0 0 0 0 0 0 0
R15 6.33 10.25 0 80.67 29.24 2.68 0
Figure 10 – Total average time percentage of detected emotions during the E4 stimulus presentation (attempting to trigger joy from participants). There were no statistically significant differences detected between the concealed and unconcealed group during this stimulus (p = 0.19, α = 0.05).
Table 17
Total average time (%) of detected emotions during E5 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 0 0 0 0 0 0 0.73
R2 1.80 0 0 0 0 0.63 0
R3 20.40 14.55 25.38 0 75.71 68.88 55.26
R4 0 0.17 0 0 0 0 0
R6 20.90 18.14 3.65 3.04 6.46 0.79 3.52
R7 0.15 0 0 0 0 0.16 0.62
R8 0.15 0 0 0 0 1.90 1.76
R9 0 0 0 0 0 0 0
R10 23.16 51.67 76.78 56.98 58.21 77.60 71.12
R11 43.61 58.80 43.19 73.17 89.17 91.05 81.04
R12 0 0.66 0 0 0.17 0 0
R13 0.15 0.83 2.99 0.17 0.85 2.44 0
R14 0 0 2.5 0 0 0 0
R15 0 13.22 0 0 3.65 8.83 1.35
Figure 11 – Total average time percentage of detected emotions during the E5 stimulus presentation (attempting to trigger surprise from participants). There was a statistically significant difference detected between the concealed and unconcealed group during this stimulus (p = 0.04, α = 0.05).
Table 18
Total average time (%) of detected emotions during E6 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 0 0 0 0 0 8.04 0.21
R2 0 0 0 0 0 0 0
R3 15.53 91.64 35.63 4.89 8.32 77.41 99.79
R4 0 43.48 0.49 1.5 2.35 0 0
R6 1.35 20.63 18.11 46.71 34.11 45.83 71.56
R7 0 0 0 0 0 0 0
R8 0 0 3.49 3.67 7.65 8.70 4.97
R9 0 0 0 10.788 0.166 64.364 67.562
R10 1.20 0 0 0.66 0 0 0
R11 9.17 0.66 0.66 6.67 49.83 99.53 94.72
R12 0 0 0 0 1.33 0 7.87
R13 1.82 1 8.96 0 0 11.87 0
R14 0 0 0.83 0 0 0 0
R15 0.90 0 0 0 1.50 10.88 0
Figure 12 – Total average time percentage of detected emotions during the E6 stimulus presentation
(attempting to trigger anger from participants). There were no statistically significant differences detected
between the concealed and unconcealed group during this stimulus (p = 0.43, α = 0.05).
Table 19
Total average time (%) of detected emotions during E7 stimulus presentation
Participant Emotion Elicited
Contempt Disgust Fear Joy Surprise Anger Sadness
R1 12.97 48.84 21.23 14.98 88.69 11.20 62.42
R2 0 0 0 0 0 0 2.29
R3 5.02 79.10 5.21 1.69 61.73 48.03 98.86
R4 0 0 0.50 0 1.513 0 0
R6 0 2.16 0 41.65 77.48 2.84 6.10
R7 2.39 0 0.99 0.33 0 0.47 21.2
R8 0.90 51.08 97.34 99.5 99.33 99.05 99.79
R9 0 0 7.45 0 0 0 0
R10 92.48 99.5 100 67.77 98.67 100 100
R11 1.35 0.83 0.66 0.17 0.17 0 0.10
R12 37.56 74.88 93.90 12.15 73.26 35.49 82.20
R13 81.85 98.33 65.01 72.32 96.45 97.24 99.58
R14 17.12 49.83 90.83 94.86 100 100 100
R15 3.01 3.31 2.32 17.67 3.65 2.21 2.28
Figure 13 – Total average time percentage of detected emotions during the E7 stimulus presentation (attempting to trigger sadness from participants). There were statistically significant differences detected between the concealed and unconcealed group during this stimulus (p < 0.01, α = 0.05).
Discussion and Conclusion
The objective of this experiment was to analyze the differences between concealed and unconcealed
emotions when presented with an array of stimuli. The total average time (%) of detected emotions between
concealed and unconcealed groups was analyzed. There were no statistically significant differences seen in
terms of emotion detection between overall concealed and unconcealed participant averages in regard to time
(%) (p = 0.07, α = 0.05). This was contrary to our alternative hypothesis, which predicted that individuals who concealed their emotions during the presented stimuli would show significantly fewer detectable emotions than individuals who were not asked to conceal their emotions. We therefore fail to reject the null hypothesis: that there is no significant difference between the detectable emotions of the concealed group and the unconcealed group.
Male and female concealed and unconcealed groups were compared as well. Between male concealed and unconcealed groups, statistical significance was seen in participant averages in regard to time (%) (p = 0.02, α = 0.05). Between female concealed and unconcealed groups, no statistically significant difference in emotion detection was observed (p = 0.22, α = 0.05). Gender was also compared within the concealed and unconcealed categories. The total average time (%) of detected emotions between female and male concealed groups showed no statistically significant difference (p = 0.90, α = 0.05). Comparing females and males in the unconcealed category, there was likewise no statistically significant difference (p = 0.11, α = 0.05).
Looking at specific emotions elicited throughout the stimuli, there were statistically significant differences in certain emotions between concealed and unconcealed participants. Displays of contempt (p = 0.01), disgust (p < 0.01), surprise (p = 0.04), and sadness (p < 0.01) were shown to differ substantially between the two groups. Participants in the unconcealed group on average elicited sadness the most of all emotions, 59.51% of the time; participants in the concealed group also elicited sadness the most of all emotions measured, at 12.81%. The unconcealed group showed fear the least (6.13% of the time), while the concealed group showed contempt the least (1.12%). These differences relate to the findings of Lei, Sala, and Jasra5, who concluded that individuals react differently depending on the stimulus presented. This is important to consider when interpreting these data: personal biases and experiences may play a role in how a stimulus is processed and how emotion is expressed by an individual. Further, stimuli were selected to the best of our ability, but may not have been equally representative of the target emotions for every participant.
Obtaining a larger sample size may help increase the accuracy of this experiment, as only fourteen participants were taken into account; this could also explain the large standard error bars seen in all figures. Future research could further examine gender differences in emotion expression using the iMotions software when looking at concealed and unconcealed emotions, and could look more closely at the types of emotions elicited and measured by iMotions when comparing concealed and unconcealed participants.
Acknowledgements
I would like to thank the Forensic Science programs at the University of Windsor for providing access to
the iMotions software and all the resources needed. I would also like to thank my fellow researchers Aikede Aikebaier and Joel VanSteensel, who assisted in the intake of participant information and completed other analyses with the iMotions software. Thank you to Dr. Pardeep Jasra for stepping in as research supervisor
for Dr. Shashi Jasra during her absence. Finally, thank you to all participants for volunteering their time to help
with this study.
References
1. Jain, A. & Ross, A. (2015). Bridging the gap: From biometrics to forensics. The Royal Society Publishing. doi.org/10.1098/rstb.2014.0254
2. Farnsworth, B. (2017). What is biometric research? iMotions. Retrieved from https://imotions.com/blog/what-is-biometric-research/
3. iMotions. (n.d.). Automatic facial expression analysis. Retrieved from https://imotions.com/facial-expressions/
4. iMotions. (n.d.). Galvanic skin response (GSR): The complete pocket guide. Retrieved from https://imotions.com/blog/galvanic-skin-response/
5. Lei, J., Sala, J., & Jasra, S. (2017). Identifying correlation between facial expression and heart rate and skin conductance with iMotions biometric platform. Journal of Emerging Forensic Science Research. Retrieved from https://imotions.com/publications/identifying-correlation-between-facial-expression-and-heart-rate-and-skin-conductance-with-imotions-biometric-platform/