
A Fractal Approach to Characterize Emotions in Audio and Visual Domain: A Study on Cross-Modal Interaction

Sayan Nag, Uddalok Sarkar, Shankha Sanyal, Archi Banerjee, Souparno Roy, Samir Karmakar, Ranjan Sengupta, Dipak Ghosh

Abstract


It is already known that both auditory and visual stimuli are able to convey emotions in the human mind, though to different extents. The strength or intensity of the emotional arousal varies depending on the type of stimulus chosen. In this study, we investigate emotional arousal in a cross-modal scenario involving both auditory and visual stimuli while studying their source characteristics. A robust fractal analytic technique called detrended fluctuation analysis (DFA), along with its 2D analogue, has been used to characterize three (3) standardized audio and video signals, quantifying their scaling exponents corresponding to positive and negative valence. A significant difference was found between the scaling exponents corresponding to the two modalities. Detrended cross-correlation analysis (DCCA) has also been applied to decipher the degree of cross-correlation between the individual audio and visual stimuli. This is the first study of its kind to propose a novel algorithm with which emotional arousal can be classified in a cross-modal scenario using only the source audio and visual signals, while also attempting a correlation between them.
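A note on the method: the abstract names DFA (with a 2D analogue for the video signals) and DCCA, but the page carries no implementation details. The following is a minimal sketch of standard 1D DFA in Python/NumPy, offered only to illustrate how a scaling (Hurst-like) exponent is extracted from an audio time series; the function name, the linear (order-1) detrending, and the window-size range are assumptions for illustration, not the authors' pipeline.

import numpy as np

def dfa(signal, scales=None, order=1):
    """Minimal 1D detrended fluctuation analysis (DFA).

    Returns the scaling exponent alpha together with the window sizes
    and the fluctuation function used to estimate it.
    """
    x = np.asarray(signal, dtype=float)
    # Step 1: integrated profile of the mean-subtracted series.
    profile = np.cumsum(x - x.mean())
    n = len(profile)
    if scales is None:
        # Logarithmically spaced window sizes between 16 and n // 4 (an assumed range).
        scales = np.unique(
            np.logspace(np.log10(16), np.log10(n // 4), 20).astype(int)
        )
    flucts = []
    for s in scales:
        n_windows = n // s
        # Step 2: split the profile into non-overlapping windows of length s.
        segments = profile[: n_windows * s].reshape(n_windows, s)
        t = np.arange(s)
        rms = []
        for seg in segments:
            # Step 3: subtract a local polynomial trend (order 1 = linear)
            # and keep the RMS of the residuals in this window.
            coeffs = np.polyfit(t, seg, order)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        # Step 4: fluctuation function F(s) = average RMS over all windows.
        flucts.append(np.mean(rms))
    # Step 5: alpha is the slope of log F(s) against log s.
    alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
    return alpha, np.asarray(scales), np.asarray(flucts)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    alpha, _, _ = dfa(rng.standard_normal(10_000))
    print(f"DFA exponent for white noise: {alpha:.2f}")  # expected near 0.5

As a quick sanity check, white noise yields an exponent near 0.5, while exponents approaching 1 indicate long-range correlations; differences of this kind between the audio and visual scaling exponents are what the study reports.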

Keywords: Cross-modal valence, Emotions, Audio/visual stimuli, 2D-DFA, Hurst Exponent

Cite this Article: Sayan Nag, Uddalok Sarkar, Shankha Sanyal, Archi Banerjee, Souparno Roy, Samir Karmakar, Ranjan Sengupta, Dipak Ghosh. A Fractal Approach to Characterize Emotions in Audio and Visual Domain: A Study on Cross-Modal Interaction. Journal of Image Processing & Pattern Recognition Progress. 2019; 6(3): 1–7p.


