Variation of Facial Expression Produced by Acoustic Stimuli
Academic Article in Scopus
Overview
Abstract
© 2018 IEEE. In this paper we present the analyzed results of an experiment relating facial expression to acoustic stimuli. Music therapy can be used in treating autistic children and various diseases and disorders, as well as in software development and personalized musical systems [5]. The data collected with iMotions was processed in Matlab and Excel; Matlab's data-analysis tools were applied to the raw data in order to determine the impact of artificially synthesized versus human-composed audio. The data came from three experiments, each conducted under different conditions, with a different number of participants, and with audio segments of different lengths and types. The data from these three experiments were analyzed using the FACET module of iMotions. In the next stage we analyzed the Neutral emotion; this analysis is of utmost importance for future stages, in which a recurrent neural network will be developed to generate music sequences with potential applications in therapy, education, and productivity improvement.
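The per-condition comparison described in the abstract (synthesized versus human-composed audio, measured through FACET emotion evidence) can be sketched in a minimal form. This is an illustrative assumption, not the study's actual pipeline: the record layout, condition names, and score values below are hypothetical stand-ins for an iMotions/FACET export.

```python
import statistics

# Hypothetical FACET-style export: per-frame "Neutral" emotion evidence
# scores tagged by audio condition. All names and values are illustrative,
# not taken from the study's data.
frames = [
    {"condition": "synthesized", "neutral": 0.82},
    {"condition": "synthesized", "neutral": 0.74},
    {"condition": "human", "neutral": 0.61},
    {"condition": "human", "neutral": 0.55},
]

def mean_neutral_by_condition(rows):
    """Average the Neutral-emotion evidence per audio condition."""
    by_cond = {}
    for row in rows:
        by_cond.setdefault(row["condition"], []).append(row["neutral"])
    return {cond: statistics.mean(vals) for cond, vals in by_cond.items()}

print(mean_neutral_by_condition(frames))
```

In the paper's setting this aggregation would run over the full frame-level export of each of the three experiments, with the per-condition means then compared across synthesized and human-composed stimuli.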