    Designing Game Audio Based on Avatar-Centred Subjectivity

    This chapter explores a selection of practical approaches for designing video game audio based on the subjective perception of a player avatar. The authors discuss several prototype video game systems developed as part of their practice-led research, which provide interactive audio systems that represent the aural experience of a virtual avatar undergoing an altered state of consciousness. Through the discussion of these prototypes, the authors set out a variety of possible approaches to sound design for representing the subjective perceptual experiences of a player avatar.

    In-Game Intoxication: Demonstrating the Evaluation of the Audio Experience of Games with a Focus on Altered States of Consciousness

    In this chapter, we consider a particular method for evaluating the user experience of game audio. To provide a domain of game audio to evaluate, we focus on an increasingly common phenomenon in games: the altered state of consciousness. Our approach seeks to evaluate the user experience of game audio in both normal gameplay and gameplay that features altered states. As such, a brief background to person-centered approaches to user experience evaluation is presented, followed by a detailed description of the method adopted in this chapter: the use of personal construct theory via repertory grid interviews.

    Digitized Direct Animation: Creating Materials for Electroacoustic Visual Music Using 8mm Film

    “Direct animation” (also called “drawn-on animation” or “camera-less animation”) involves the direct application of paints and other artistic materials to celluloid film in order to construct animated visual material. This article provides a brief overview of the historical use of this technique in the work of other experimental film-makers, before discussing the use of this process to create Mezcal Animations #1–3: a piece of psychedelic visual music with electroacoustic sound.

    The Sound of the Smell (and Taste) of My Shoes Too: Mapping the Senses Using Emotion as a Medium

    This work discusses the basic human senses (sight, sound, touch, taste, and smell) and the ways in which it may be possible to compensate for the lack of one or more of them by explicitly representing stimuli using the remaining senses. There are many situations in which not all five of these senses are being stimulated, whether because of a deliberate restriction or deficit, or because of a physical or sensory impairment such as loss of sight or touch sensation. Relatedly, there are scenarios in which sensory mismatches may occur: for example, a user immersed in a virtual environment may receive smells from the real world that are unconnected to the virtual world. In particular, this paper is concerned with how sound can be used to compensate for the lack of other sensory stimulation, and vice versa. As links between the visual, touch, and auditory systems are already well established, more attention is given to taste and smell and their relationship with sound. This work presents theoretical concepts, largely oriented around mapping other sensory qualities to sound, based upon existing work in the literature and emerging technologies, in order to discuss where gaps currently exist, how emotion could serve as a medium for cross-modal representation, and how these gaps might be addressed in future research. It is postulated that descriptive qualities, such as timbre or emotion, are currently the most viable routes for further study, and that this work may later be integrated with the wider body of research into sensory augmentation.
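
    The "emotion as a medium" idea can be made concrete with a small sketch. The following Python fragment is purely illustrative: the stimulus descriptors, their valence-arousal coordinates, and the sound-parameter ranges are all assumptions made here, not mappings proposed in the paper.

```python
# Hypothetical illustration of the 'emotion as a medium' idea: a taste or smell
# descriptor is mapped to a point in valence-arousal space, and that point is
# then mapped onto simple sound parameters. All descriptors, coordinates and
# parameter ranges below are invented for illustration.

EMOTION_OF_STIMULUS = {
    "sweet":  (0.8, 0.4),   # (valence, arousal), each in the range 0-1
    "bitter": (0.2, 0.5),
    "smoke":  (0.3, 0.7),
    "citrus": (0.7, 0.8),
}

def stimulus_to_sound(descriptor):
    """Cross-modal mapping: descriptor -> emotion -> sound parameters."""
    valence, arousal = EMOTION_OF_STIMULUS[descriptor]
    return {
        "tempo_bpm": 60 + arousal * 100,        # higher arousal -> faster
        "brightness_hz": 500 + valence * 4000,  # higher valence -> brighter timbre
        "mode": "major" if valence >= 0.5 else "minor",
    }

print(stimulus_to_sound("citrus"))  # {'tempo_bpm': 140.0, 'brightness_hz': 3300.0, 'mode': 'major'}
```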

    Simulating Auditory Hallucinations in a Video Game: Three Prototype Mechanisms

    In previous work the authors have proposed the concept of ‘ASC Simulations’: audio-visual installations and experiences, as well as interactive video game systems, which simulate altered states of consciousness (ASCs) such as dreams and hallucinations. Building on the authors’ previous paper, in which a large-scale qualitative study explored the changes to auditory perception reported by users of various intoxicating substances, here the authors present three prototype audio mechanisms for simulating hallucinations in a video game. These were designed in the Unity video game engine as an early proof of concept. The first mechanism simulates ‘selective auditory attention’ to different sound sources by attenuating the amplitude of unattended sources. The second simulates ‘enhanced sounds’ by adjusting perceived brightness through filtering. The third simulates ‘spatial disruptions’ to perception by dislocating sound sources from their virtual acoustic origin in 3D space, causing them to oscillate around a central location. In terms of programming structure, these mechanisms are implemented as scripts attached to the collection of assets that make up the player character. In future developments of this type of work, the authors foresee a more advanced, standardised interface that models the senses, emotions and state of consciousness of player avatars.
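
    As a rough, engine-agnostic illustration of the three mechanisms, the following Python sketch attenuates unattended sources, raises a filter cutoff to "brighten" a source, and oscillates a source's perceived position around its true origin. The class and function names, parameter values and oscillation formula are assumptions for illustration only; they do not reproduce the authors' Unity scripts.

```python
# Engine-agnostic sketch of the three prototype mechanisms described above.
# Names, parameters and formulas are illustrative assumptions, not the
# authors' implementation.
import math
from dataclasses import dataclass

@dataclass
class SoundSource:
    name: str
    gain: float        # linear amplitude, 0.0-1.0
    cutoff_hz: float   # low-pass cutoff used as a crude "brightness" control
    position: tuple    # (x, y, z) location in the virtual scene

def apply_selective_attention(sources, attended, attenuation_db=-12.0):
    """Mechanism 1: attenuate every source the avatar is not attending to."""
    factor = 10 ** (attenuation_db / 20.0)
    for s in sources:
        if s.name != attended:
            s.gain *= factor

def apply_enhancement(source, brightness=2.0):
    """Mechanism 2: 'enhanced sounds' - raise the filter cutoff to brighten timbre."""
    source.cutoff_hz = min(source.cutoff_hz * brightness, 20000.0)

def apply_spatial_disruption(source, t, radius=2.0, rate_hz=0.25):
    """Mechanism 3: dislocate the perceived position, orbiting the true origin."""
    x, y, z = source.position
    phase = 2 * math.pi * rate_hz * t
    return (x + radius * math.cos(phase), y, z + radius * math.sin(phase))

if __name__ == "__main__":
    scene = [SoundSource("radio", 0.8, 4000.0, (1.0, 0.0, 2.0)),
             SoundSource("footsteps", 0.6, 2000.0, (-3.0, 0.0, 1.0))]
    apply_selective_attention(scene, attended="radio")
    apply_enhancement(scene[0])
    print(apply_spatial_disruption(scene[0], t=1.5))
```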

    Sound Through The Rabbit Hole: Sound Design Based On Reports of Auditory Hallucination

    As video game developers seek to provide increasing levels of realism and sophistication, there is a need for game characters to be able to exhibit psychological states, including ‘altered states of consciousness’ (ASC), realistically. ‘Auditory hallucination’ (AH) is a feature of ASC in which an individual may perceive distortions to auditory perception, or hear sounds with no apparent acoustic origin. Appropriate use of game sound may enable realistic representations of these sounds in video games; however, achieving this requires rigorous approaches informed by research. This paper seeks to inform the process of designing sounds based on auditory hallucination by reporting the outcomes of analysing nearly 2000 experience reports describing drug-induced intoxication, many of which include descriptions of auditory hallucination. Through analysis of these reports, our research establishes a classification system, which we propose can be used for designing sounds based on auditory hallucination.

    Easter Eggs: Hidden Tracks and Messages in Musical Mediums

    An interactive music playlist generator that responds to user emotion and context

    This paper aims to demonstrate the mechanisms of a music recommendation system, and accompanying graphical user interface (GUI), that is capable of generating a playlist of songs based upon an individual’s emotion or context. This interactive music playlist generator has been designed as part of a broader system, intended for mobile devices, which aims to suggest music based upon ‘how the user is feeling’ and ‘what the user is doing’ by evaluating real-time physiological and contextual sensory data using machine learning technologies. For instance, heart rate and skin temperature, in conjunction with ambient light, temperature and global positioning system (GPS) data, could be used to a degree to infer one’s current situation and corresponding mood. At present, this interactive music playlist generator is able to demonstrate conceptually how a playlist can be formed in accordance with such physiological and contextual parameters. In particular, the affective aspect of the interface is visually represented as a two-dimensional arousal-valence space based upon Russell’s circumplex model of affect (1980). Context refers to environmental, locomotion and activity concepts, which are visually represented in the interface as sliders. These affective and contextual components are discussed in more detail in Sections 2 and 3, respectively. Section 4 demonstrates how an affective and contextual music playlist can be formed by interacting with the GUI parameters. For a comprehensive discussion of the development of this research, refer to Griffiths et al. (2013a, 2013b, 2015); for related work in these broader research areas, refer to Teng et al. (2013) and Yang et al. (2008).
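
    As a rough illustration of how such a playlist might be ordered, the following Python sketch ranks songs by their distance from a target point in the arousal-valence space, with context sliders contributing a weighted penalty. The song data, field names and weighting scheme are assumptions made here for illustration; they are not the system described by Griffiths et al.

```python
# Illustrative sketch: rank songs against a target point in a two-dimensional
# arousal-valence space, with context expressed as 0-1 slider values.
# All data and weights below are invented for illustration.
import math

songs = [
    {"title": "A", "valence": 0.8, "arousal": 0.7, "context": {"activity": 0.9}},
    {"title": "B", "valence": 0.2, "arousal": 0.3, "context": {"activity": 0.1}},
    {"title": "C", "valence": 0.6, "arousal": 0.9, "context": {"activity": 0.8}},
]

def rank_playlist(songs, target_valence, target_arousal, context_sliders, context_weight=0.5):
    """Return songs ordered by closeness to the requested affective/contextual state."""
    def score(song):
        # Euclidean distance in the arousal-valence plane
        affect_dist = math.hypot(song["valence"] - target_valence,
                                 song["arousal"] - target_arousal)
        # Mean absolute difference from each context slider
        context_dist = sum(abs(song["context"].get(k, 0.5) - v)
                           for k, v in context_sliders.items()) / max(len(context_sliders), 1)
        return affect_dist + context_weight * context_dist
    return sorted(songs, key=score)

# e.g. an energetic, positive state while exercising
for s in rank_playlist(songs, target_valence=0.7, target_arousal=0.8,
                       context_sliders={"activity": 0.9}):
    print(s["title"])
```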

    Holophonor: On the Future Technology of Visual Music

    This chapter discusses the progression of visual music and related audio-visual artworks through the 20th century and considers the next steps for this field of research. The principles of visual music are described, with reference to the films of early pioneers such as John Whitney. A further exploration of the wider spectrum of subsequent work in various audio-visual art forms is then given, including visualisations, light synthesizers, VJ performances, digital audio-visual artworks, projection mapping artworks, and interactive visual music artworks. Through consideration of visual music as a continuum of related work, the authors consider the Holophonor, a fictional audio-visual instrument, as an example of the ideal visual music instrument of the future. They conclude by proposing that a device such as the Holophonor could be constructed in the near future by utilising inter-disciplinary approaches from the fields of HCI and affective computing.

    EEG as a Controller for Psychedelic Visual Music in an Immersive Dome Environment

    Psych Dome (Weinel 2013a) is a short interactive piece of visual music, first presented in an immersive 'full dome' environment, that forms part of the authors' ongoing research into altered states of consciousness (ASC) as a basis for the design of computer-based artworks.