Research Article


Detection of EEG Patterns for Induced Fear Emotion State via EMOTIV EEG Testbench

Year 2022, Volume 7, Issue 2, 148-168, 08.08.2022
https://doi.org/10.28978/nesciences.1159248

Abstract

In this study, pictures from the International Affective Picture System (IAPS) were used as fear-evoking and neutral stimuli while EEG was recorded with the EMOTIV EPOC headset (n=15). During the experiments, EEG data were recorded with the TestBench program. To synchronize the EEG recordings with the IAPS pictures presented on the screen, a Python script written in the OpenSesame program provided a synchronized data flow over the input/output channels of an installed virtual serial port. Event-Related Oscillation (ERO) responses and Event-Related Potentials (ERPs) were then computed. In the ERP analysis, statistically significant differences (p<0.05) in mean amplitude between fear and neutral stimuli were observed in the P7, O1, F3, AF3, and P8 channels at 200-400 milliseconds, and in the Fast Fourier Transform (FFT) analysis, significant (p<0.05) differences were found in the alpha (α) and beta (β) bands relative to neutral stimuli. Following these evaluations, distinct time-spectral activity patterns emerged in the right frontal lobe (F4) in the α band and in the left parietal lobe in the β band, respectively.
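
As an illustration of the synchronization step described in the abstract, the sketch below shows how a Python script could send one-byte event markers over a virtual serial port at stimulus onset so that the marker stream can later be aligned with the TestBench recording. This is a minimal, hypothetical example: the port name, baud rate, and trigger codes are assumptions, not values reported in the study, and in practice the call would sit inside an OpenSesame inline script.

```python
# Minimal sketch (hypothetical values): write a one-byte event marker to a
# virtual serial port at stimulus onset so the EEG record can be aligned
# with the IAPS picture timeline. Requires the pyserial package.
import time
import serial

PORT = "COM3"         # assumed name of the installed virtual serial port
BAUD_RATE = 115200    # assumed baud rate
TRIGGER_FEAR = 1      # assumed marker code for fear-evoking pictures
TRIGGER_NEUTRAL = 2   # assumed marker code for neutral pictures

def send_trigger(ser: serial.Serial, code: int, pulse_s: float = 0.01) -> None:
    """Write a single trigger byte, hold it briefly, then reset the line to zero."""
    ser.write(bytes([code]))
    time.sleep(pulse_s)
    ser.write(bytes([0]))

if __name__ == "__main__":
    with serial.Serial(PORT, BAUD_RATE, timeout=1) as ser:
        # In the experiment this call would follow the presentation of each picture.
        send_trigger(ser, TRIGGER_FEAR)
```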
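
Likewise, the two analyses summarized above, mean ERP amplitude in the 200-400 ms window and FFT-based band power, can be outlined with plain NumPy/SciPy. The sketch below is illustrative rather than the authors' exact pipeline: it assumes epoched data stored as arrays of shape (n_trials, n_channels, n_samples) with time zero at stimulus onset, an assumed 128 Hz sampling rate, and an independent-samples t-test as a stand-in for the statistical comparison.

```python
# Illustrative sketch, not the authors' pipeline: mean ERP amplitude in the
# 200-400 ms post-stimulus window and FFT band power, compared channel-wise
# between fear and neutral epochs with an independent-samples t-test.
import numpy as np
from scipy import stats

FS = 128                 # assumed EMOTIV EPOC sampling rate (Hz)
T_MIN, T_MAX = 0.2, 0.4  # ERP window of interest in seconds after stimulus onset

def mean_erp_amplitude(epochs: np.ndarray, fs: int = FS) -> np.ndarray:
    """epochs: (n_trials, n_channels, n_samples) with time zero at stimulus onset.
    Returns the mean amplitude per trial and channel in the 200-400 ms window."""
    i0, i1 = int(T_MIN * fs), int(T_MAX * fs)
    return epochs[:, :, i0:i1].mean(axis=-1)

def band_power(epochs: np.ndarray, band: tuple, fs: int = FS) -> np.ndarray:
    """Mean FFT power per trial and channel within a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(epochs.shape[-1], d=1.0 / fs)
    power = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    mask = (freqs >= band[0]) & (freqs < band[1])
    return power[:, :, mask].mean(axis=-1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fear = rng.standard_normal((40, 14, FS))     # placeholder fear epochs, 1 s each, 14 channels
    neutral = rng.standard_normal((40, 14, FS))  # placeholder neutral epochs

    # Channel-wise comparisons on the ERP window and on alpha (8-13 Hz) band power.
    _, p_erp = stats.ttest_ind(mean_erp_amplitude(fear), mean_erp_amplitude(neutral), axis=0)
    _, p_alpha = stats.ttest_ind(band_power(fear, (8.0, 13.0)), band_power(neutral, (8.0, 13.0)), axis=0)
    print("channels with p < 0.05 (ERP window):", np.flatnonzero(p_erp < 0.05))
    print("channels with p < 0.05 (alpha power):", np.flatnonzero(p_alpha < 0.05))
```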

References

  • Abhang, P. A., Gawali, B. W., & Mehrotra, S. C. (2016). Introduction to EEG- and Speech-Based Emotion Recognition. In: Introduction to EEG- and Speech-Based Emotion Recognition. https://doi.org/10.1016/C2015-0-01959-1
  • Aftanas, L. I., Varlamov, A. A., Pavlov, S. V., Makhnev, V. P., & Reva, N. V. (2002). Time-dependent cortical asymmetries induced by emotional arousal: EEG analysis of event-related synchronization and desynchronization in individually defined frequency bands. International Journal of Psychophysiology, 44(1). https://doi.org/10.1016/S0167-8760(01)00194-5.
  • Altan, G., & Inat, G. (2021). EEG based Spatial Attention Shifts Detection using Time-Frequency features on Empirical Wavelet Transform. Journal of Intelligent Systems with Applications. https://doi.org/10.54856/jiswa.202112181.
  • Altan, G., & Kutlu, Y. (2018). Generative Autoencoder Kernels on Deep Learning for Brain Activity Analysis. Natural and Engineering Sciences, 3(3). https://doi.org/10.28978/nesciences.468978.
  • Altan, G., Kutlu, Y., & Allahverdi, N. (2016). Deep Belief Networks Based Brain Activity Classification Using EEG from Slow Cortical Potentials in Stroke. International Journal of Applied Mathematics, Electronics and Computers, 205–205. https://doi.org/10.18100/ijamec.270307
  • Altan, G., Yayık, A., & Kutlu, Y. (2021). Deep Learning with ConvNet Predicts Imagery Tasks Through EEG. Neural Processing Letters, 53(4). https://doi.org/10.1007/s11063-021-10533-7.
  • Badcock, N. A., Mousikou, P., Mahajan, Y., De Lissa, P., Thie, J., & McArthur, G. (2013). Validation of the Emotiv EPOC® EEG gaming system for measuring research quality auditory ERPs. PeerJ, 2013(1), 1–17. https://doi.org/10.7717/peerj.38.
  • Bălan, O., Moise, G., Moldoveanu, A., Leordeanu, M., & Moldoveanu, F. (2019). Fear level classification based on emotional dimensions and machine learning techniques. Sensors (Switzerland), 19(7), 1–18. https://doi.org/10.3390/s19071738.
  • Barrett, L. F., & Wager, T. D. (2006). The structure of emotion: Evidence from neuroimaging studies. Current Directions in Psychological Science, 15(2). https://doi.org/10.1111/j.0963-7214.2006.00411.x.
  • Basar, M. D., Duru, A. D., & Akan, A. (2020). Emotional state detection based on common spatial patterns of EEG. Signal, Image and Video Processing, 14(3). https://doi.org/10.1007/s11760-019-01580-8.
  • Bazgir, O., Mohammadi, Z., & Habibi, S. A. H. (2018). Emotion Recognition with Machine Learning Using EEG Signals. 2018 25th Iranian Conference on Biomedical Engineering and 2018 3rd International Iranian Conference on Biomedical Engineering, ICBME 2018. https://doi.org/10.1109/ICBME.2018.8703559.
  • Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1). https://doi.org/10.1016/0005-7916(94)90063-9.
  • Chabin, T., Gabriel, D., Haffen, E., Moulin, T., & Pazart, L. (2020). Are the new mobile wireless EEG headsets reliable for the evaluation of musical pleasure? PLoS ONE, 15(12 December). https://doi.org/10.1371/journal.pone.0244820.
  • Ciuk, D., Troy, A. K., & Jones, M. C. (2015). Measuring Emotion: Self-Reports vs. Physiological Indicators. SSRN Electronic Journal, October. https://doi.org/10.2139/ssrn.2595359.
  • Cruz-Garza, J. G., Brantley, J. A., Nakagome, S., Kontson, K., Megjhani, M., Robleto, D., & Contreras-Vidal, J. L. (2017). Deployment of mobile EEG technology in an art museum setting: Evaluation of signal quality and usability. Frontiers in Human Neuroscience, 11. https://doi.org/10.3389/fnhum.2017.00527.
  • Damasio, A. R. (1998). Emotion in the perspective of an integrated nervous system. Brain Research Reviews, 26(2–3), 83–86. https://doi.org/10.1016/S0165-0173(97)00064-7.
  • de Cesarei, A., & Codispoti, M. (2011). Affective modulation of the LPP and α-ERD during picture viewing. Psychophysiology, 48(10). https://doi.org/10.1111/j.1469-8986.2011.01204.x.
  • Di Flumeri, G., Aricò, P., Borghini, G., Sciaraffa, N., Di Florio, A., & Babiloni, F. (2019). The dry revolution: Evaluation of three different eeg dry electrode types in terms of signal spectral features, mental states classification and usability. Sensors (Switzerland), 19(6). https://doi.org/10.3390/s19061365.
  • Duvinage, M., Castermans, T., Petieau, M., Hoellinger, T., Cheron, G., & Dutoit, T. (2013). Performance of the Emotiv Epoc headset for P300-based applications. BioMedical Engineering Online, 12(1), 1–15. https://doi.org/10.1186/1475-925X-12-56.
  • EEGLAB download page. (n.d.). https://sccn.ucsd.edu/eeglab/download.php.
  • EEGLAB Plotting Channel Spectra Tutorial. (n.d.). https://eeglab.org/tutorials/08_Plot_data/Plotting_Channel_Spectra_and_Maps.html.
  • Eijlers, E., Smidts, A., & Boksem, M. A. S. (2019). Implicit measurement of emotional experience and its dynamics. PLoS ONE, 14(2), 1–15. https://doi.org/10.1371/journal.pone.0211496.
  • Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17(2). https://doi.org/10.1037/h0030377.
  • Emotiv home page. (n.d.). https://www.emotiv.com/.
  • EPOC user manual. (n.d.). https://emotiv.gitbook.io/epoc-user-manual/.
  • Fakhruzzaman, M. N., Riksakomara, E., & Suryotrisongko, H. (2015). EEG Wave Identification in Human Brain with Emotiv EPOC for Motor Imagery. Procedia Computer Science, 72. https://doi.org/10.1016/j.procs.2015.12.140.
  • Frantzidis, C. A., Bratsas, C., Papadelis, C. L., Konstantinidis, E., Pappas, C., & Bamidis, P. D. (2010). Toward emotion aware computing: An integrated approach using multichannel neurophysiological recordings and affective visual stimuli. IEEE Transactions on Information Technology in Biomedicine, 14(3). https://doi.org/10.1109/TITB.2010.2041553.
  • Harmon-Jones, E., Gable, P. A., & Peterson, C. K. (2010). The role of asymmetric frontal cortical activity in emotion-related phenomena: A review and update. Biological Psychology, 84(3). https://doi.org/10.1016/j.biopsycho.2009.08.010.
  • Hassouneh, A., Mutawa, A. M., & Murugappan, M. (2020). Development of a Real-Time Emotion Recognition System Using Facial Expressions and EEG based on machine learning and deep neural network methods. Informatics in Medicine Unlocked, 20, 100372. https://doi.org/10.1016/j.imu.2020.100372.
  • Iacoviello, D., Petracca, A., Spezialetti, M., & Placidi, G. (2015). A classification algorithm for electroencephalography signals by self-induced emotional stimuli. IEEE Transactions on Cybernetics, 46(10). https://doi.org/10.1109/TCYB.2015.2498974.
  • Joshi, V. M., & Ghongade, R. B. (2020). IDEA: Intellect database for emotion analysis using EEG signal. Journal of King Saud University-Computer and Information Sciences, xxxx. https://doi.org/10.1016/j.jksuci.2020.10.007.
  • Katsigiannis, S., & Ramzan, N. (2018). DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals from Wireless Low-cost Off-the-Shelf Devices. IEEE Journal of Biomedical and Health Informatics, 22(1), 98–107. https://doi.org/10.1109/JBHI.2017.2688239.
  • Klimesch, W. (2012). Alpha-band oscillations, attention, and controlled access to stored information. Trends in Cognitive Sciences, 16(12). https://doi.org/10.1016/j.tics.2012.10.007
  • Klug, M., & Gramann, K. (2020). Identifying key factors for improving ICA-based decomposition of EEG data in mobile and stationary experiments. European Journal of Neuroscience, May, 1–15. https://doi.org/10.1111/ejn.14992.
  • Kumar, N., Khaund, K., & Hazarika, S. M. (2016). Bispectral Analysis of EEG for Emotion Recognition. Procedia Computer Science, 84, 31–35. https://doi.org/10.1016/j.procs.2016.04.062.
  • Lakhan, P., Banluesombatkul, N., Changniam, V., Dhithijaiyratn, R., Leelaarporn, P., Boonchieng, E., Hompoonsup, S., & Wilaiprasitporn, T. (2019). Consumer grade brain sensing for emotion recognition. IEEE Sensors Journal, 19(21), 9896–9907. https://doi.org/10.1109/JSEN.2019.2928781.
  • Lang, P. J., Bradley, M. M., & Cuthbert, B. N. (1997). International affective picture system (IAPS): Technical manual and affective ratings. NIMH Center for the Study of Emotion and Attention, 39–58.
  • Lau-Zhu, A., Lau, M. P. H., & McLoughlin, G. (2019). Mobile EEG in research on neurodevelopmental disorders: Opportunities and challenges. Developmental Cognitive Neuroscience, 36. https://doi.org/10.1016/j.dcn.2019.100635.
  • Laufs, H., Krakow, K., Sterzer, P., Eger, E., Beyerle, A., Salek-Haddadi, A., & Kleinschmidt, A. (2003). Electroencephalographic signatures of attentional and cognitive default modes in spontaneous brain activity fluctuations at rest. Proceedings of the National Academy of Sciences of the United States of America, 100(19). https://doi.org/10.1073/pnas.1831638100.
  • LeDoux, J. E. (2000). Emotion circuits in the brain. Annual Review of Neuroscience, 23, 155–184. https://doi.org/10.1146/annurev.neuro.23.1.155.
  • Lee, H. W., Cho, H., Lasko, E., Kim, J. W., & Kwon, W. (2020). From knowing the game to enjoying the game: EEG/ERP assessment of emotional processing. International Journal of Sports Marketing and Sponsorship, 21(2), 305–323. https://doi.org/10.1108/IJSMS-11-2018-0119.
  • Lin, Y. P., Wang, C. H., Jung, T. P., Wu, T. L., Jeng, S. K., Duann, J. R., & Chen, J. H. (2010). EEG-based emotion recognition in music listening. IEEE Transactions on Biomedical Engineering, 57(7). https://doi.org/10.1109/TBME.2010.2048568.
  • Masood, N., & Farooq, H. (2019). Investigating EEG patterns for dual-stimuli induced human fear emotional state. Sensors (Switzerland), 19(3), 1–22. https://doi.org/10.3390/s19030522.
  • Mauss, I. B., & Robinson, M. D. (2009). Measures of emotion: A review. Cognition and Emotion, 23(2). https://doi.org/10.1080/02699930802204677.
  • Morris, J. D. (1995). OBSERVATIONS: SAM: The Self-Assessment Manikin - An Efficient Cross-Cultural Measurement of Emotional Response. Journal of Advertising Research, 35(6), 63–68.
  • Murugappan, M., & Murugappan, S. (2013). Human emotion recognition through short time Electroencephalogram (EEG) signals using Fast Fourier Transform (FFT). Proceedings - 2013 IEEE 9th International Colloquium on Signal Processing and Its Applications, CSPA 2013. https://doi.org/10.1109/CSPA.2013.6530058
  • Olofsson, J. K., Nordin, S., Sequeira, H., & Polich, J. (2008). Affective picture processing: An integrative review of ERP findings. Biological Psychology, 77(3). https://doi.org/10.1016/j.biopsycho.2007.11.006.
  • Patel, S. H., & Azzam, P. N. (2005). Characterization of N200 and P300: Selected studies of the Event-Related Potential. International Journal of Medical Sciences, 2(4), 147–154. https://doi.org/10.7150/ijms.2.147.
  • Pessoa, L. (2018). Understanding emotion with brain networks. Current Opinion in Behavioral Sciences, 19. https://doi.org/10.1016/j.cobeha.2017.09.005.
  • Pizzo, F., Roehri, N., Medina Villalon, S., Trébuchon, A., Chen, S., Lagarde, S., Carron, R., Gavaret, M., Giusiano, B., McGonigal, A., Bartolomei, F., Badier, J. M., & Bénar, C. G. (2019). Deep brain activities can be detected with magnetoencephalography. Nature Communications, 10(1). https://doi.org/10.1038/s41467-019-08665-5.
  • Ramirez, R., Planas, J., Escude, N., Mercade, J., & Farriols, C. (2018). EEG-based analysis of the emotional effect of music therapy on palliative care cancer patients. Frontiers in Psychology, 9(MAR). https://doi.org/10.3389/fpsyg.2018.00254.
  • Reuderink, B., Mühl, C., & Poel, M. (2013). Valence, arousal and dominance in the EEG during game play. International Journal of Autonomous and Adaptive Communications Systems, 6(1), 45-62.
  • Ros, T., Théberge, J., Frewen, P. A., Kluetsch, R., Densmore, M., Calhoun, V. D., & Lanius, R. A. (2013). Mind over chatter: Plastic up-regulation of the fMRI salience network directly after EEG neurofeedback. NeuroImage, 65. https://doi.org/10.1016/j.neuroimage.2012.09.046.
  • Sánchez-Reolid, R., García, A. S., Vicente-Querol, M. A., Fernández-Aguilar, L., López, M. T., Fernández-Caballero, A., & González, P. (2018). Artificial neural networks to assess emotional states from brain-computer interface. Electronics (Switzerland), 7(12), 1–12. https://doi.org/10.3390/electronics7120384.
  • Schupp, H. T., Stockburger, J., Bublatzky, F., Junghöfer, M., Weike, A. I., & Hamm, A. O. (2008). The selective processing of emotional visual stimuli while detecting auditory targets: An ERP analysis. Brain Research, 1230. https://doi.org/10.1016/j.brainres.2008.07.024.
  • Shu, L., Xie, J., Yang, M., Li, Z., Li, Z., Liao, D., Xu, X., & Yang, X. (2018). A review of emotion recognition using physiological signals. Sensors (Switzerland), 18(7). https://doi.org/10.3390/s18072074.
  • Simons, R. F., Detenber, B. H., Cuthbert, B. N., Schwartz, D. D., & Reiss, J. E. (2003). Attention to television: Alpha power and its relationship to image motion and emotional content. Media Psychology, 5(3). https://doi.org/10.1207/S1532785XMEP0503_03.
  • Singh, M. I., & Singh, M. (2021). Emotion recognition: An evaluation of ERP features acquired from frontal EEG electrodes. Applied Sciences (Switzerland), 11(9). https://doi.org/10.3390/app11094131.
  • Sokhadze, E. M., Casanova, M. F., Casanova, E., Lamina, E., Kelly, D. P., & Khachidze, I. (2017). Event-related potentials (ERP) in cognitive neuroscience research and applications. NeuroRegulation, 4(1). https://doi.org/10.15540/nr.4.1.14.
  • Soufineyestani, M., Dowling, D., & Khan, A. (2020). Electroencephalography (EEG) technology applications and available devices. Applied Sciences (Switzerland), 10(21), 1–23. https://doi.org/10.3390/app10217453
  • Suhaimi, N. S., Mountstephens, J., & Teo, J. (2020). EEG-Based Emotion Recognition: A State-of-the-Art Review of Current Trends and Opportunities. Computational Intelligence and Neuroscience, 2020. https://doi.org/10.1155/2020/8875426
  • Uusberg, A., Uibo, H., Kreegipuu, K., & Allik, J. (2013). EEG alpha and cortical inhibition in affective attention. International Journal of Psychophysiology, 89(1). https://doi.org/10.1016/j.ijpsycho.2013.04.020.
  • Warriner, A. B., Kuperman, V., & Brysbaert, M. (2013). Norms of valence, arousal, and dominance for 13,915 English lemmas. Behavior Research Methods, 45(4). https://doi.org/10.3758/s13428-012-0314-x.
  • Xu, H., Wang, X., Li, W., Wang, H., & Bi, Q. (2019). Research on EEG channel selection method for emotion recognition. IEEE International Conference on Robotics and Biomimetics, ROBIO 2019. https://doi.org/10.1109/ROBIO49542.2019.8961740.
  • Yuvaraj, R., Murugappan, M., Mohamed Ibrahim, N., Sundaraj, K., Omar, M. I., Mohamad, K., & Palaniappan, R. (2014). Detection of emotions in Parkinson’s disease using higher order spectral features from brain’s electrical activity. Biomedical Signal Processing and Control, 14(1), 108–116. https://doi.org/10.1016/j.bspc.2014.07.005.
  • Zhuang, N., Zeng, Y., Yang, K., Zhang, C., Tong, L., & Yan, B. (2018). Investigating patterns for self-induced emotion recognition from EEG signals. Sensors (Switzerland), 18(3). https://doi.org/10.3390/s18030841.

Details

Primary Language: English
Subjects: Clinical Medical Sciences
Section: Articles
Authors

Ahmet Ergun Gümüş 0000-0002-2044-5504

Çağlar Uyulan 0000-0002-6423-6720

Zozan Guleken 0000-0002-4136-4447

Publication Date: August 8, 2022
Submission Date: January 4, 2022
Published in Issue: Year 2022, Volume 7, Issue 2

Cite

APA Gümüş, A. E., Uyulan, Ç., & Guleken, Z. (2022). Detection of EEG Patterns for Induced Fear Emotion State via EMOTIV EEG Testbench. Natural and Engineering Sciences, 7(2), 148-168. https://doi.org/10.28978/nesciences.1159248

All published work is licensed under a Creative Commons Attribution 4.0 International License.