The research was conducted in collaboration with a team led by Professor Park Eun-ji of Chung-Ang University in South Korea and a team led by Professor James Diefendorf of the University of Akron in the United States.
The research team built a data set of customer consultations from workers who actually perform emotional labor by developing customer-response scenarios for call centers and collecting voice, behavioral, and biometric data from 31 workers. From the voice data, the researchers extracted a total of 176 features, including time, frequency, and tone characteristics.
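The article does not name the tools used for this step. As an illustration only, a minimal Python sketch, assuming the librosa library and a hypothetical extract_voice_features function rather than the team's actual pipeline, might summarize a call recording like this:

```python
# Hypothetical sketch; librosa and the feature choices are assumptions,
# not the tools or features reported by the research team.
import numpy as np
import librosa

def extract_voice_features(wav_path: str) -> dict:
    """Summarize a call recording as time-, frequency-, and tone-related features."""
    y, sr = librosa.load(wav_path, sr=16000)
    features = {}

    # Time-domain features: loudness and speech activity over the call.
    rms = librosa.feature.rms(y=y)[0]
    features["rms_mean"] = float(np.mean(rms))
    features["rms_std"] = float(np.std(rms))
    features["zcr_mean"] = float(np.mean(librosa.feature.zero_crossing_rate(y)[0]))

    # Frequency-domain features: spectral shape of the voice.
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]
    features["spectral_centroid_mean"] = float(np.mean(centroid))

    # Tone-related features: fundamental frequency (pitch) statistics.
    f0, _, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C7"), sr=sr
    )
    f0 = f0[~np.isnan(f0)]
    if f0.size:
        features["f0_mean"] = float(np.mean(f0))
        features["f0_std"] = float(np.std(f0))

    # MFCC summary statistics, commonly used in voice emotion work.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    for i, row in enumerate(mfcc):
        features[f"mfcc{i}_mean"] = float(np.mean(row))
        features[f"mfcc{i}_std"] = float(np.std(row))

    return features
```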
In addition, to predict workers' suppressed emotions, the team extracted a further 228 features from biosignals, including skin potential, brainwaves, electrocardiograms, and body temperature.
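As another hedged illustration rather than the team's actual method, simple biosignal summaries of this kind could be computed with NumPy and SciPy; the channel names, sampling rate, and peak-detection settings below are assumptions:

```python
# Hypothetical sketch of reducing raw biosignal traces to scalar features.
import numpy as np
from scipy.signal import find_peaks

def biosignal_features(ecg: np.ndarray, eda: np.ndarray, temp: np.ndarray,
                       fs: int = 256) -> dict:
    """Summarize ECG, electrodermal, and temperature traces as scalar features."""
    feats = {}

    # ECG: estimate heart rate and heart-rate variability from R-peak intervals.
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs), height=np.mean(ecg))
    rr = np.diff(peaks) / fs  # seconds between successive beats
    if rr.size:
        feats["heart_rate_bpm"] = 60.0 / float(np.mean(rr))
        feats["hrv_sdnn"] = float(np.std(rr))

    # Electrodermal activity and body temperature: level and trend features.
    for name, sig in {"eda": eda, "temp": temp}.items():
        feats[f"{name}_mean"] = float(np.mean(sig))
        feats[f"{name}_std"] = float(np.std(sig))
        feats[f"{name}_slope"] = float(np.polyfit(np.arange(sig.size), sig, 1)[0])

    return feats
```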
Nine AI models were then trained on the combined features, and their results were compared and evaluated. As a result, the model was able to distinguish situations in which workers' emotional suppression was high from those in which it was not with 87% accuracy.
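The nine models are not named in the article. The sketch below uses three scikit-learn classifiers as stand-ins to show how such a comparison by cross-validated accuracy might be set up:

```python
# Hypothetical comparison of candidate classifiers on the combined
# voice + biosignal features; the models shown are stand-ins, not the
# nine models used by the research team.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def compare_models(X: np.ndarray, y: np.ndarray) -> dict:
    """Cross-validate candidates; y is 1 when emotional suppression is high, else 0."""
    candidates = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
        "gradient_boosting": GradientBoostingClassifier(random_state=0),
    }
    scores = {}
    for name, model in candidates.items():
        pipe = make_pipeline(StandardScaler(), model)
        acc = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
        scores[name] = float(np.mean(acc))
    return scores
```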
Professor Lee said, "We will demonstrate the technology we have developed by linking it with an app for managing the mental health of people engaged in emotional labor."
2025/02/13 09:25 KST
Copyrights(C) Edaily wowkorea.jp