A better understanding of observable, quantifiable psychophysiological outputs such as electroencephalography (EEG) during computer video gameplay has significant potential to support the development of an automated, emotionally intelligent system. Integrated into a game engine, such a system could close an effective biofeedback loop: accurately interpreting player emotions and adjusting gameplay parameters in response to the player's emotional state, opening exciting avenues in affective interactivity. This paper presents a crucial step towards that objective by examining statistical features of EEG that may relate to user experience during audio-centric gameplay. An audio-only test game ensures that game sound is the exclusive stimulus modality, while gameplay contextualisation and qualitative data collection allow the study to focus specifically on fear. Although the results require an unambiguous horror-game context, they identify several statistical features of EEG data that could differentiate fear from calm.
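To illustrate the kind of analysis involved, the sketch below computes simple per-window statistical features (mean, standard deviation, skewness, excess kurtosis) over a single EEG channel. This is a minimal, hypothetical example for orientation only; the function names, window parameters, and the synthetic signal are assumptions, not the method used in this study.

```python
# Illustrative sketch (assumed, not the paper's pipeline): windowed
# statistical features for one EEG channel given as a list of samples.
import math
import statistics

def window_features(samples):
    """Return mean, std, skewness, and excess kurtosis for one window
    of samples, using population (biased) formulas."""
    n = len(samples)
    mu = statistics.mean(samples)
    sd = statistics.pstdev(samples)
    if sd == 0:
        return {"mean": mu, "std": sd, "skew": 0.0, "kurtosis": 0.0}
    skew = sum((x - mu) ** 3 for x in samples) / (n * sd ** 3)
    kurt = sum((x - mu) ** 4 for x in samples) / (n * sd ** 4) - 3.0
    return {"mean": mu, "std": sd, "skew": skew, "kurtosis": kurt}

def sliding_windows(signal, size, step):
    """Yield fixed-size windows over the signal at the given step."""
    for start in range(0, len(signal) - size + 1, step):
        yield signal[start:start + size]

# Example: 1-second non-overlapping windows of a fake 128 Hz recording
# (a 10 Hz sine stands in for real EEG data).
fake_eeg = [math.sin(2 * math.pi * 10 * t / 128) for t in range(512)]
features = [window_features(w) for w in sliding_windows(fake_eeg, 128, 128)]
```

In practice, features like these would be computed per channel and per frequency band, then compared across gameplay segments labelled as fearful or calm.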