Nazmi Sofian Suhaimi and James Mountstephens and Jason Teo (2021) Class-based analysis of Russell's four-quadrant emotion prediction in virtual reality using multi-layer feedforward ANNs.
Abstract
The following research examines the potential of classifying four emotion classes using a wearable EEG headset, with virtual reality (VR) used to induce emotional responses in users. Previous emotion-recognition studies have typically relied on medical-grade EEG devices paired with a 2D monitor screen to induce emotional responses. This approach can introduce additional artifacts, since the user's attention is not confined to the intended stimulation within the borders of the monitor screen, thereby reducing classification accuracy. Furthermore, the large and complex EEG machines used by medical professionals are sensitive equipment that must be operated by trained personnel, making permission to access such devices difficult to obtain. Hence, a small and portable wearable EEG headset was chosen for brainwave signal sampling, which favors researchers conducting experiments on human emotion recognition systems. The wearable EEG headset collects brainwave signals at the TP9, TP10, AF7, and AF8 electrode placements, sampled at 256 Hz across five frequency bands (Delta, Theta, Alpha, Beta, Gamma). The EEG headset was combined with a VR headset to induce emotional responses using prepared VR video stimuli. The VR videos were organized according to the Arousal-Valence Space (AVS) model, with each quadrant represented by four videos presented over 80 seconds, followed by a 10-second rest interval during transitions, for a total of 360 seconds from beginning to end. The collected samples were classified using a Feedforward Artificial Neural Network (FANN) with 10-fold cross-validation; the model was trained on 90% of the total dataset, with the remaining 10% used for validation. The highest average classification result obtained from the FANN was 41.04%. While the overall classification performance was low, the confusion matrices presented a different view of the four classes across different training epoch values. Observations at training epochs of 2000, 3000, and 5000 showed that the emotion classes happy, scared, bored, and calm achieved classification accuracies of 75.15%, 75.12%, 75.02%, and 74.24%, respectively.
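The classification pipeline described in the abstract (band-power features from four electrodes across five bands, a multi-layer feedforward ANN, 10-fold cross-validation) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: scikit-learn's `MLPClassifier` stands in for the paper's FANN, and the hidden-layer size, sample count, and random features are assumptions rather than values from the paper.

```python
# Sketch of a four-class emotion classifier on EEG band-power features.
# Assumptions (not from the paper): synthetic random features stand in for
# real band powers (4 electrodes x 5 bands = 20 features), and scikit-learn's
# MLPClassifier stands in for the multi-layer feedforward ANN (FANN).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_samples = 200
X = rng.normal(size=(n_samples, 20))       # band-power features per trial
y = rng.integers(0, 4, size=n_samples)     # 4 classes: happy, scared, bored, calm

# Feedforward network with one hidden layer (size is an assumption).
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)

# 10-fold cross-validation, as described in the abstract.
scores = cross_val_score(clf, X, y, cv=10)
print("mean 10-fold accuracy:", round(float(scores.mean()), 3))
```

With random labels the sketch hovers near the 25% chance level for four classes; the paper's reported 41.04% average reflects real, structured EEG features rather than noise.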
Item Type: Proceedings
Keyword: Machine Learning, Electroencephalography, Emotion Classification, Virtual Reality, Wearable Technology
Subjects: B Philosophy. Psychology. Religion > BF Psychology > BF1-990 Psychology > BF511-593 Affection. Feeling. Emotion; Q Science > QA Mathematics > QA1-939 Mathematics > QA71-90 Instruments and machines
Department: FACULTY > Labuan Faculty of International Finance
Depositing User: JUNAINE JASNI
Date Deposited: 13 Aug 2025 13:37
Last Modified: 13 Aug 2025 13:37
URI: https://eprints.ums.edu.my/id/eprint/44934