Advisor:
Dr. Heera Lee
Committee members:
Dr. Susannah Paletz
Dr. Ge Gao
Video conferencing usage has increased dramatically since the COVID-19 pandemic.
In multilingual teams, all members deferred to using English as the shared language
(Gao & Fussell, 2017).
Is there any way to interrupt the misunderstanding process between Step 1 and Step 2?
Current tools cannot fully support NNES-NES conversation. Can we address the first point of Step 3 in the misunderstanding process: "Confused expressions varied across cultures"?
Learning how people from different cultures behave and think before communicating (Terui & Hishiyama, 2014).
Time-consuming in real-life settings
There are still limitations. Can we then address the second point of Step 3 in the misunderstanding process: "NES attributed NNES' body language to the wrong reasons"?
· The line of the transcript the NNES is checking.
· The words the NNES is searching for in a dictionary.
· Showing the NES the dynamics of the NNES' confusion.
1. Mark questions on the presentation slides (Glassman et al., 2015)
2. Express confusion by submitting questions (Park & Cho, 2014)
3. Mark confusion at corresponding video content (Kim et al., 2021)
4. Report the extent of confusion via a scale (Rivera-Pelayo et al., 2013)
5. Press a comprehension-level button to display it on the speaker's Google Glass (Zarraonandia et al., 2019)
These require devices (e.g., smartphones), which might interrupt meeting participation (Park & Cho, 2014).
External devices need to be purchased and set up.
A facial expression recognition tool can detect NNES' confusion non-intrusively, without any external device.
I investigated how NES and a facial expression recognition tool identified NNES' confusion during video conferencing group meetings, and how awareness of NNES' confusion affected the communication approaches NES adopted in subsequent video conferencing.
RQ 1: To what extent do NES and a facial expression recognition tool recognize NNES' confusion in communication during video conferencing?
RQ 2-1: How does the awareness of NNES’ confusion, as perceived by NES during video conferencing, affect the communication approach of NES when interacting with NNES?
RQ 2-2: How does the awareness of NNES’ confusion, identified by the facial expression recognition tool, affect the communication approach of NES when interacting with NNES?
Procedures were adapted from He et al. (2017)
I used OpenFace to extract the action units (AUs) of NNES' confused faces (Baltrušaitis et al., 2018).
Cultural differences are valid, but an International Core Pattern (ICP) of AUs was found for the confused emotion
(Cordaro et al., 2018).
>> Test whether the ICP works for my NNES participants' ethnicities (e.g., South Asian and East Asian)
(Ekman et al., 2002)
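As an illustration only, flagging confusion from OpenFace's per-frame AU intensity output might be sketched as below. The specific AU set (brow lowering and lid tightening) and the intensity threshold are assumptions for this sketch, not the study's actual ICP parameters.

```python
# Hypothetical confusion AUs and threshold -- illustrative assumptions,
# not the ICP parameters used in the study.
CONFUSION_AUS = ["AU04_r", "AU07_r"]  # OpenFace intensity columns (0-5 scale)
THRESHOLD = 1.5

def confused_frames(rows):
    """Return frame numbers where every confusion AU exceeds THRESHOLD.

    `rows` are dicts keyed by OpenFace column names, one dict per line
    of the tool's per-frame CSV output.
    """
    flagged = []
    for row in rows:
        if all(float(row[au]) > THRESHOLD for au in CONFUSION_AUS):
            flagged.append(int(row["frame"]))
    return flagged

# Toy per-frame data standing in for a parsed OpenFace CSV
sample = [
    {"frame": "1", "AU04_r": "0.3", "AU07_r": "0.1"},
    {"frame": "2", "AU04_r": "2.4", "AU07_r": "1.8"},
]
print(confused_frames(sample))  # [2]
```

Requiring all listed AUs to co-occur (rather than any one) mirrors the idea of a core *pattern* of AUs rather than a single facial cue.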
I calculated accuracy as the true positive rate (TPR) from the confusion matrix (Buolamwini & Gebru, 2018).
This suggests NES tend to read emotion in a positive way.
NES tended to trust NNES' self-reports when they were shown together with the tool's data.
Compared with the tool, NES can judge based on more cues: verbal cues, non-facial body language, and contextual cues, leading to higher accuracy.
NC's accuracy is always higher than NE's.
Ask for clarification anonymously.
Develop it to assist attention-deficit populations.
Real-time notification of teammate's confusion.
(Buolamwini & Gebru, 2018)
Bias in self-identifying NES or NNES
Participants guessed the study purpose from the study title.
Participants may have assumed NNES would experience confusion in English.
The sample size was too small to claim validity.
Having only one coder for the thematic analysis decreases validity.