Above the Clouds

Theory and Method of Affective Computing in Human Habitats

Advisor(s)

Xinyi Fu and Prof. Yingqing Xu

Tsinghua University Future Lab, Beijing

Status

Paper "Gesture-based fear recognition using non-performance dataset from VR horror games" accepted by International Conference on Affective Computing and Intelligent Interaction (ACII'21)

Duration

September 2020 - August 2021

Role

Team member (multi-camera system design and setup, sensing floor setup, paper drafting, project presentation)

With the growing ubiquity of smart home technologies that enable creative interactions between humans and the devices in their habitats, we envision the possibility of smart, sympathetic homes. FutureHome is a long-term project that seeks to build such habitats: homes that fully understand human behavior and the living environment, process that information through affective computing (passive input) and interaction-intention analysis (active input), and feed the results to a central control hub and information feedback system that commands household devices and helps inhabitants stay context-aware.
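As a rough illustration of this pipeline, the sketch below shows how a hub might fuse the passive (affect) and active (intention) inputs into device commands. All names and the toy policy are our own illustrative assumptions, not the actual FutureHome implementation.

```python
"""A minimal, hypothetical sketch of the central control hub described
above: fuse a passive affect estimate with an active intention signal,
then issue commands to household devices. Names and policy are
illustrative only."""
from dataclasses import dataclass


@dataclass
class Context:
    emotion: str    # passive input from affective computing
    intention: str  # active input from interaction-intention analysis


def plan_commands(ctx: Context) -> list[str]:
    """Toy policy: map the fused context to household device commands."""
    commands = []
    if ctx.emotion == "fearful":
        commands.append("lights: brighten")  # soothe the occupant
    if ctx.intention == "watch_movie":
        commands.extend(["lights: dim", "tv: on"])
    return commands


if __name__ == "__main__":
    ctx = Context(emotion="fearful", intention="none")
    print(plan_commands(ctx))  # a real hub would dispatch these to devices
```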

Figure: Human behavior detection → human context
Figure: Environment sensing → environment context
Figure: Central control hub for human activity recognition, affective computing, and coordinating smart home devices → control
Figure: Creating a context-aware FutureHome

We started by prototyping the FutureHome: sensors, smart home appliances, and computers are embedded in a real household. We plan to conduct long-term data collection from this house's inhabitants to construct a natural, multimodal, and comprehensive human behavior dataset.
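As a sketch of what such long-term multimodal logging could look like (the modality names and file layout below are assumptions for illustration, not our actual recording pipeline), each sensor stream might be timestamped and appended to a per-modality file so the streams can be aligned later:

```python
"""Hypothetical sketch of multimodal data logging in the prototype house:
each sensor stream is timestamped and appended to its own JSON-lines file
so modalities can be time-aligned afterwards. Sensor names and file
layout are illustrative assumptions."""
import json
import time
from pathlib import Path


def log_sample(root: Path, modality: str, sample: dict) -> None:
    """Append one timestamped sample to the modality's JSON-lines file."""
    record = {"t": time.time(), **sample}
    with open(root / f"{modality}.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")


if __name__ == "__main__":
    root = Path("futurehome_dataset")
    root.mkdir(exist_ok=True)
    # e.g. one frame of camera skeleton keypoints and one floor reading
    log_sample(root, "camera_skeleton", {"joints": [[0.1, 0.2, 0.3]] * 25})
    log_sample(root, "sensing_floor", {"pressure_grid": [[0] * 4] * 4})
```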


As a mid-phase exploration, we investigated gesture-based fear recognition. We set up a VR horror game environment to elicit fear and collected participants' skeleton-point data for model training.

Xinyi Fu, Cheng Xue, Qiuyi Yin, Yu Jiang, Ye Li, Yichen Cai, and Weilin Sun. 2021. Gesture-based fear recognition using non-performance dataset from VR horror games. In 2021 9th International Conference on Affective Computing and Intelligent Interaction (ACII). 1–8. DOI: 10.1109/ACII52823.2021.9597450.
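For a sense of how a classifier over skeleton sequences can be structured, here is a minimal PyTorch sketch: an LSTM over per-frame joint coordinates with a binary fear/no-fear head. The paper's actual model, features, and hyperparameters may differ; everything below is illustrative.

```python
"""A minimal sketch of one way to classify fear from skeleton-point
sequences (not the paper's actual architecture): an LSTM over flattened
per-frame 3D joints, classifying from the final hidden state."""
import torch
import torch.nn as nn


class SkeletonFearNet(nn.Module):
    def __init__(self, num_joints: int = 25, hidden: int = 128):
        super().__init__()
        # each frame: num_joints 3D points flattened to one feature vector
        self.lstm = nn.LSTM(num_joints * 3, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # logits: [no_fear, fear]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, frames, num_joints * 3)
        _, (h, _) = self.lstm(x)
        return self.head(h[-1])           # classify from final hidden state


if __name__ == "__main__":
    model = SkeletonFearNet()
    clip = torch.randn(4, 60, 25 * 3)     # 4 clips of 60 frames each
    print(model(clip).shape)              # -> torch.Size([4, 2])
```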
