2012, Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering
British Library, 2017
ABSTRACT A growing body of work across research fields shows that body movement affects the way we perceive and evaluate ourselves and the environment. This raises new and challenging questions regarding embodiment in interaction, in virtual reality (VR) and data-live environments. New technologies have brought the body back to the centre of the interactive experience. This interdisciplinary research explores body-interactive audio-visuals in relation to embodied cognition, embodiment and body-perception theory, in the context of embodied interactive audio-visual art installations. It was practice-based research, conducted through a series of experimental studies culminating in a Pilot Study and a Main Study, designed to inform the aesthetic and technical design of embodied interactive audio-visuals useful to art research, digital research, and human-computer interaction (HCI). User experience was central to the creative design of the interactive digital audio-visual installation environments and technologies in this research. The aim was to uncover phenomena and perceptual experiences of users interacting with the technologies in these studies, in order to inform body-movement perception. Data were gathered from interviews and questionnaires, which were recorded, transcribed and analysed. This research contributes to knowledge of human-computer interaction in interactive audio-visual art environments, and to artists' research practice in interactive audio-visuals, from a phenomenological view of users' perception and consciousness in relation to sound and 3-D visual interactive technologies. The research found that users' perception oscillated between proprioceptive and movement-vision modes of perception and interaction.
These salient modes of perception during interaction divided as follows: the proprioceptive mode was more prominent in sound interaction, and movement-vision more prominent in 3-D visual interaction. I developed a series of software tools for body-interactive 3-D visuals and analogue-synthesizer modulation. These provide full-body-part tracking of motion (x, y, z co-ordinates in performance space, together with yaw, pitch and roll data for every body part tracked in motion) that can drive simultaneously interactive sound and 3-D visuals. A theory of embodied interactive audio-visual data-live environments was developed from the analysis of user experience. The research expands existing knowledge in the fields of human-computer interaction (HCI): digital media, augmented and virtual realities, embodied interaction, interactive audio-visuals, and new media theory.
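The tracking pipeline described above, in which per-joint position and orientation data drive sound and 3-D visual parameters at the same time, can be sketched roughly as follows. The joint fields, value ranges, and linear mapping here are illustrative assumptions, not the thesis's actual implementation:

```python
# Sketch: turning tracked body-part data (x, y, z position plus yaw,
# pitch, roll orientation) into normalized control values that could
# drive a 3-D visual parameter and an analogue-synth modulation input
# simultaneously. Joint fields and ranges are hypothetical.

def normalize(value, lo, hi):
    """Clamp value into [lo, hi] and scale it to the 0..1 control range."""
    value = max(lo, min(hi, value))
    return (value - lo) / (hi - lo)

def map_joint(joint):
    """Map one tracked joint to simultaneous visual and sound controls."""
    return {
        # Horizontal position might steer a 3-D object's rotation...
        "visual_rotation": normalize(joint["x"], -2.0, 2.0),
        # ...while height drives, say, a filter-cutoff modulation,
        "synth_cutoff": normalize(joint["y"], 0.0, 2.5),
        # and orientation (pitch here) modulates a further parameter.
        "synth_lfo_depth": normalize(joint["pitch"], -90.0, 90.0),
    }

# One frame of (hypothetical) skeleton data for a single joint:
frame = {"x": 0.0, "y": 1.25, "z": 2.0, "yaw": 10.0, "pitch": 0.0, "roll": 5.0}
controls = map_joint(frame)
```

Because both outputs are computed from the same frame of tracking data, sound and 3-D visuals stay synchronized to the same bodily movement.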
This paper presents users' communication behavior in a pseudo same-room videoconferencing system named the "Being Here System," in comparison with conventional videoconferencing. The system extracts the remote person's figure and superimposes it on the local site's front view on a large display in real time. This makes the local person feel as if the remote person were in front of them, within their own spatial environment. To investigate the system's influence on users' communication, recorded video from the system's evaluation experiment was analyzed. The analysis revealed that the system significantly affected users' communication behavior, including turn-taking, speech overlap, and gaze direction.
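The figure-extraction-and-superimposition step can be sketched as a simple background-subtraction composite. This is a toy grayscale illustration under assumed data shapes; the paper does not specify the system's actual segmentation method:

```python
# Sketch: compare each remote-camera pixel against a stored empty-room
# background; pixels that differ enough are treated as the remote
# person's figure and composited over the local front view. Frames are
# modeled as nested lists of grayscale ints for illustration only.

THRESHOLD = 30  # illustrative difference threshold

def composite(remote, background, local):
    """Overlay the remote person's pixels onto the local view."""
    out = []
    for r_row, b_row, l_row in zip(remote, background, local):
        out.append([
            r if abs(r - b) > THRESHOLD else l  # person-pixel vs. local view
            for r, b, l in zip(r_row, b_row, l_row)
        ])
    return out

background = [[10, 10, 10]]
remote     = [[10, 200, 10]]   # middle pixel: the remote person
local      = [[1, 2, 3]]
result = composite(remote, background, local)
```

A production system would run a far more robust segmentation per video frame, but the compositing principle is the same: person-pixels come from the remote stream, everything else from the local view.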
Proceedings of the 2014 Federated Conference on Computer Science and Information Systems, 2014
Humans can interact remotely with each other through computers, via systems such as teleconferencing, games and virtual environments. There are delays from when a human performs an action until it is reflected remotely. When delays grow too large, they result in inconsistencies between the states of the interaction as seen by each participant. Delays can be reduced, but they cannot be removed. When they become too large, the effects they create on human-to-human remote interaction can be partially masked to achieve an illusion of insignificant delay. The MultiStage system is a human-to-human interaction system meant to be used by actors at remote stages creating a common virtual stage. Each actor is represented remotely by a remote presence created from a stream of data continuously recorded about the actor and sent to all stages. We report in particular on the subsystem of MultiStage that masks the effects of delays. The most advanced masking approach has each stage continuously look for late data; when masking is determined to be needed, the system switches from the live stream to a pre-recorded video of the actor. The system can also use a computable model of an actor to create a remote presence substituting for the live stream. The present prototype uses a simple human skeleton model.
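The core of the masking approach described above — detect late data, then fall back to a pre-recorded substitute — can be sketched as a small source-selection check. The threshold value and the string labels are assumptions for illustration, not MultiStage's actual parameters:

```python
# Sketch of MultiStage-style delay masking: each stage checks whether
# the newest live data for an actor is too old; if so, it renders a
# pre-recorded substitute for that actor until fresh data arrives.

LATE_THRESHOLD = 0.150  # seconds; an illustrative cutoff

def choose_source(now, last_live_timestamp):
    """Decide which representation of the actor to render this frame."""
    if now - last_live_timestamp > LATE_THRESHOLD:
        return "prerecorded"   # mask the delay with recorded video
    return "live"              # the live stream is fresh enough

# Running this check every frame lets a stage switch back to the live
# stream automatically as soon as timely data resumes.
```

The same decision point is where a computable actor model (such as the prototype's skeleton model) could be substituted instead of recorded video.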
2007
In future Ambient Intelligence environments we assume intelligence embedded in the environment and in its virtual, sometimes visualized, agents (virtual humans). These environments support their human inhabitants or visitors in their activities and interactions by perceiving them through sensors. In this paper we look at our research on bodily and gestural interaction with environments equipped with simple sensors, application-dependent intelligence, and an embodied virtual agent employed to display reactive and pro-active activity. The virtual humans we discuss play roles such as dance partner, conductor or trainer. All of these roles require the perception and generation of bodily activity and other displays of nonverbal communication. The role of affect and persuasion in these ambient entertainment environments is also touched upon.
Body, Space & Technology
This paper summarizes my own arts practice as research on body-movement interaction in interactive digital 3D audiovisual installations. In this research I developed a series of interactive 3D audiovisual installations informed by body-perception theories, embodiment theory, new media philosophy, cultural theory and virtual reality. The interactive designs for these installations explored body-movement perception, a form of embodied movement awareness. They were created to alter an experiencer's body-movement sense perception, focusing on how body-movement perception affects creative imagination. I explored this to see how the changes and adjustments the body makes in response to the interactive environment might instigate a heightened awareness and perception, leading towards a more creative imagination in the experiencer. These interactive installations also explore human-computer interaction (HCI) from an embodied psychological perspective.
Anais do XVIII Simpósio Brasileiro de Computação Musical (SBCM 2021)
Camera-based audiovisual interactions are widely used in art installations, but they typically require specialized hardware and software. This restricts audiovisual installations to venues that can afford them financially and that the audience can physically access. In this work, we present a web-deployed composition tool for audiovisual interactions that runs on the client side and does not require installing any additional software. The tool allows quickly configuring areas for movement capture and customizing the corresponding audio feedback. Simultaneously, it provides visual feedback that can help the audience understand the experience. Consequently, the tool can be used to compose audiovisual interactions that reach a large audience via the web. Moreover, its virtual and ubiquitous nature can foster a diversity of artistic explorations related to the remote relationship between audience and artwork.
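The tool's central idea — configurable capture areas whose detected movement triggers audio feedback — can be sketched as a point-in-region check. The region format, names, and trigger logic below are illustrative assumptions (the actual tool runs in the browser on camera input):

```python
# Sketch: a composer configures rectangular capture areas; movement
# detected inside an area triggers that area's audio feedback. Movement
# is modeled here as a list of (x, y) points where motion was detected.

def make_region(name, x0, y0, x1, y1):
    """A named rectangular capture area (hypothetical format)."""
    return {"name": name, "box": (x0, y0, x1, y1)}

def triggered(regions, movement_points):
    """Return the names of regions containing any detected movement."""
    hits = []
    for region in regions:
        x0, y0, x1, y1 = region["box"]
        if any(x0 <= x <= x1 and y0 <= y <= y1 for x, y in movement_points):
            hits.append(region["name"])  # would fire this area's sound
    return hits

regions = [make_region("drum", 0, 0, 100, 100),
           make_region("pad", 200, 0, 300, 100)]
hits = triggered(regions, [(50, 50), (150, 50)])
```

In a browser deployment the same check would run per video frame on motion data from the camera, with each hit starting the region's configured audio and updating the on-screen visual feedback.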
2001
together, people have used them for communication. From email—one of the first network applications—to the original MIT Zephyr notification system to the now-popular AOL Instant Messenger system, people use computers to break down physical distances and provide a sense of awareness about another person in another place. Beyond basic text interaction, research—such as the Portholes project at Xerox PARC—has strived to bring people together by giving them the ability to, for example, share snapshots or videos of themselves in remote places. Although these systems use still shots or video, they still require the human to perform intelligent processing to determine whether people are present, busy, or available to talk. As interaction with computers moves away from the traditional domain of a fixed keyboard and mouse moving a pointer on a screen, it becomes possible for the computer itself to determine a person's state using perceptive presence applications. This state could include b...
This paper discusses two multichannel interactive audiovisual artworks, Action A/V and SoundLabyrinth, that explore approaches to the experience of gesture, sound and place. Both works were situated in a geodesic dome frame and built within the Max, Ableton and Max for Live computer programs, and produced ostensibly similar outcomes; however, the approaches taken by the two authors differ in intention, process, and philosophy. These approaches were presented and discussed in workshops delivered at ISEA2013, on Sunday June 9 and Monday June 10, 2013. In these workshops participants improvised with the two systems, both by moving in the dome and by operating the related software, and discussed approaches to and understandings of the three terms listed in the title of this paper.