Abstract
The COVID-19 pandemic has forced a sudden shift from traditional office work to smart working models, which require many workers to stay at home and significantly increase sedentary lifestyles. Physical inactivity and chronic work-related stress can cause metabolic disorders, mental illnesses, and musculoskeletal injuries, threatening office workers’ physical and psychological health. In the modern vision of smart workplaces, cyber-physical systems play a central role in augmenting objects, environments, and workers with integrated sensing, data processing, and communication capabilities. In this context, we propose a work engagement system that monitors psycho-physical comfort and provides health suggestions to office workers. Recognizing worker activity, such as sitting postures and facial expressions, can help assess the level of work engagement; in particular, head and body posture can reflect a state of engagement, boredom, or a neutral condition. In this paper, we propose a method to recognize such activities with an infrared sensor array by analyzing sitting postures. The proposed approach senses worker activity unobtrusively and in a privacy-preserving way. To evaluate the performance of the system, we set up a working scenario and annotated the subjects’ activities by reviewing video recordings. We carried out an experimental analysis comparing Decision Tree and k-NN classifiers; both showed high recognition rates for the eight postures. For work engagement assessment, we analyze the sitting postures and suggest that users take a break when postures such as leaning left/right with or without arm support occur very frequently.
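The posture-based pipeline summarized above can be illustrated with a minimal sketch: a nearest-neighbor classifier over flattened thermal frames (a simplified stand-in for the Decision Tree and k-NN models evaluated in the paper) and a rule that suggests a break when leaning postures dominate a recent window. The posture labels, frame representation, and threshold below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the posture pipeline. The label names, template
# representation, and break threshold are assumptions for illustration only.

POSTURES = ["sit_upright", "lean_forward", "lean_back",
            "lean_left_arm", "lean_right_arm",
            "lean_left_no_arm", "lean_right_no_arm", "slouch"]

def nearest_neighbor(frame, templates):
    """Classify a flattened thermal frame as the posture whose template
    frame is closest in squared Euclidean distance (a 1-NN classifier)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda label: dist(frame, templates[label]))

def suggest_break(recent_postures, threshold=0.5):
    """Suggest a break when left/right leaning postures (with or without
    arm support) make up at least `threshold` of the recent window."""
    leaning = [p for p in recent_postures
               if p.startswith("lean_left") or p.startswith("lean_right")]
    return len(leaning) / max(len(recent_postures), 1) >= threshold
```

In practice, the templates would be mean thermal frames per posture collected during calibration, and `recent_postures` would be the classifier outputs over a sliding time window.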
Keywords: Work Engagement, Smart Office, Sitting Posture Recognition, Infrared Sensor
