References: [1] Tatsuo Unemi, “SBArt4 as Automatic Art and Live Performance Tool”, GA 2011 – XIV Generative Art Conference, Rome, 2011. [2] www.intlab.soka.ac.jp/~unemi/sbart/4/ Abstract: As is well known, sound effects and background music in a motion picture are effective at emphasizing what the author wants to express. However, in the case of a fully automated generative animation, it is difficult to apply such a method to the design of accompanying sounds because the generator has no intention behind the process. This paper introduces a method to synthesize sound waveforms by computer for an automated evolutionary animation [1, 2]. To emphasize the emotional effect on viewers, the method was designed to fit the psychological effect of the sounds to the visuals under some intuitive correspondences between the two modalities, such as a brighter image being associated with a higher pitch and a more complex texture inspiring a noisier or more solid tone. The other mappings be...
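The visual-to-acoustic correspondences can be illustrated with a small sketch. This is not the paper's actual synthesis procedure; it only assumes a grayscale frame normalized to [0, 1], and the function names and scaling constants are hypothetical.

```python
import numpy as np

def brightness_to_pitch(frame, low_hz=110.0, high_hz=880.0):
    """Map mean frame brightness (0..1) to a pitch in Hz on a log scale."""
    b = float(np.clip(frame.mean(), 0.0, 1.0))
    return low_hz * (high_hz / low_hz) ** b

def complexity_to_noise(frame):
    """Use mean gradient magnitude as a crude texture-complexity measure,
    mapped to a noise-mix ratio in [0, 1]; the scale factor is arbitrary."""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.clip(np.hypot(gx, gy).mean() * 4.0, 0.0, 1.0))

def synthesize(frame, duration=0.1, sr=44100):
    """Render a short tone whose pitch and noisiness follow the frame."""
    t = np.arange(int(duration * sr)) / sr
    tone = np.sin(2 * np.pi * brightness_to_pitch(frame) * t)
    noise = np.random.uniform(-1.0, 1.0, t.size)
    mix = complexity_to_noise(frame)
    return (1 - mix) * tone + mix * noise
```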
References: [1] Tatsuo Unemi, “SBArt4 Breeding Abstract Animations in Real time”, Proc. of CEC 2010, pp. 4004-4009, Barcelona, Spain, 2010. Abstract: Recent innovations in the graphics processing unit (GPU) have improved calculation performance enough to realize breeding of animations in real time on a personal computer. SBArt4 compiles each genotype expression into a type of shading language, the Core Image kernel language, that runs directly on the GPU. Even when rendering each frame of the animation in real time, it achieves enough speed for users to evaluate the resulting abstract animation immediately. The compiled code can be exported to other applications that utilize the Core Image framework on Mac OS X. Four types of video effect plug-ins for Final Cut Pro and an independent application for slide presentation were examined. The approach is useful not only for creating an abstract animation of arbitrary size and duration but also for making a transition effect by deformation and/or discoloration.
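As a rough illustration of the compilation step, the sketch below walks a tiny genotype expression tree and emits a Core Image kernel source string. The node format, the operator set, and the emitted skeleton are assumptions made for illustration, not SBArt4's actual genotype representation or compiler.

```python
TERMINALS = {'x': 'p.x', 'y': 'p.y', 't': 't'}   # coordinates and time

def emit(node):
    """Recursively convert a nested tuple such as
    ('add', ('x',), ('sin', ('y',))) into an expression string."""
    op, *args = node
    if op in TERMINALS:
        return TERMINALS[op]
    if op == 'sin':
        return f'sin({emit(args[0])})'
    if op in ('add', 'mul'):
        sym = '+' if op == 'add' else '*'
        return f'({emit(args[0])} {sym} {emit(args[1])})'
    raise ValueError(f'unknown op {op!r}')

def to_kernel(genotype):
    """Wrap the expression in a minimal Core Image kernel skeleton."""
    return (
        'kernel vec4 sbart(float t) {\n'
        '  vec2 p = destCoord();\n'
        f'  float v = {emit(genotype)};\n'
        '  return vec4(v, v, v, 1.0);\n'
        '}\n'
    )

print(to_kernel(('add', ('x',), ('sin', ('y',)))))
```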
Identity SA is an interactive and generative installation that combines a swarm-based simulation with real-time camera-based interaction. The agents’ distributions are transformed into “painterly” images by employing a variety of visualization techniques and styles, such as texture surfaces, short line segments, font glyphs, and curved lines. Camera-based interaction relies on a simple motion detection algorithm that affects the agents’ movements as well as their coloring. Normally, an agent’s color is determined solely by its orientation, but whenever the tracking system detects a visitor’s motion, the agent’s color is additionally affected by the corresponding pixel color in the camera frame. The installation acts as a visual and acoustic mirror that distorts the continuity of the visitor’s physical existence into ephemeral patterns and flowing motions.
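A minimal sketch of the coloring rule described above, assuming RGB camera frames as (height, width, 3) uint8 numpy arrays and an agent represented as a dict with a position and heading; the motion threshold and blend factor are hypothetical.

```python
import colorsys
import numpy as np

def agent_color(agent, prev_frame, frame, motion_threshold=0.05, blend=0.5):
    """By default the color depends only on the agent's heading; when frame
    differencing detects motion at the agent's position, the camera pixel
    color is blended in."""
    # Orientation (radians) mapped to hue.
    hue = (agent['heading'] % (2 * np.pi)) / (2 * np.pi)
    base = np.array(colorsys.hsv_to_rgb(hue, 1.0, 1.0))

    x, y = int(agent['x']), int(agent['y'])
    diff = np.abs(frame[y, x].astype(float)
                  - prev_frame[y, x].astype(float)).mean() / 255.0
    if diff > motion_threshold:                 # motion detected at this pixel
        cam = frame[y, x].astype(float) / 255.0
        return (1 - blend) * base + blend * cam
    return base
```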
Recent innovations in the graphics processing unit (GPU) have improved calculation performance enough to realize breeding of animations in real time on a personal computer. SBArt4 compiles each genotype expression into a type of shading language, the Core Image kernel language, that runs directly on the GPU. Even when rendering each frame of the animation in real time, it achieves enough speed for users to evaluate the resulting abstract animation immediately. The compiled code can be exported to other applications that utilize the Core Image framework on Mac OS X. Four types of video effect plug-ins for Final Cut Pro and an independent application for slide presentation were examined. The approach is useful not only for creating an abstract animation of arbitrary size and duration but also for making a transition effect by deformation and/or discoloration.
Cycles is an interactive installation that establishes an intimate relationship between the visitor's physical body and simulated organisms. It explores notions of transience and identity that draw inspiration from Buddhist philosophy. Cycles creates a situation that causes visitors to experience their own bodies in a state of mutability and transience. It merges the appearance of the visitor's hand with a visual representation of a swarm simulation. By bridging the gap between the virtual and the physical, a hybrid entity comes into existence whose rapidly changing body blends artificial and natural properties. This hybrid entity progresses through a life cycle that reenacts the four Buddhist sufferings.
This paper presents an interactive installation that employs flocking algorithms to produce music and visuals. The user's motions are captured by a video camera and influence the flock's behaviour. Each agent moving in a virtual 3D space controls a MIDI instrument whose playing style depends on the agent's state. In this system, the user acts as a conductor influencing the flock's musical activity. In addition to gestural interaction, the acoustic properties of the system can be modified on the fly by using an intuitive GUI. The acoustic and visual output of the system results from the combination of the flock's and the user's behaviour; it therefore creates, on the behavioural level, a mixing of natural and artificial reality. The system has been designed to run on a variety of computational configurations, ranging from small laptops to exhibition-scale installations.
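One way the agent-to-MIDI mapping could look, sketched under assumptions of our own (a pentatonic scale, vertical position selecting the scale degree, speed setting velocity); the paper does not specify this exact scheme, and sending the resulting event is left to whatever MIDI library the installation uses.

```python
import numpy as np

def agent_to_midi(agent, scale=(0, 2, 4, 7, 9), base_note=48):
    """Map one flocking agent's state to a MIDI note event (hypothetical
    mapping): normalized height picks a scale degree, speed sets velocity."""
    pos, vel = np.asarray(agent['pos']), np.asarray(agent['vel'])
    degree = int(np.clip(pos[1], 0.0, 1.0) * (len(scale) * 4 - 1))
    note = base_note + 12 * (degree // len(scale)) + scale[degree % len(scale)]
    velocity = int(np.clip(np.linalg.norm(vel) * 127, 1, 127))
    return {'note': note, 'velocity': velocity}

# Example: an agent near the top of the space, moving moderately fast.
print(agent_to_midi({'pos': (0.2, 0.9, 0.4), 'vel': (0.3, 0.1, 0.0)}))
```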
IEEE International Conference on Systems, Man and Cybernetics
This paper proposes a method of Interactive Evolutionary Computation (IEC) for large-scale target domains whose structure is well organized. In general, it is effective to divide a large problem into a number of sub-problems in order to solve it efficiently. For IEC tools, this becomes possible by adding a GUI-based facility that lets the user mark parts of the genotype as protected against random initialization and mutation. The user can then try alternative candidates only on the non-protected parts, as a sub-problem, through the breeding process: an iteration of subjective selection, mutation, and crossover. This method was invented during the development of a musical composition support application, but it is also useful for other domains, such as graphics.
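A minimal sketch of breeding with protected loci, assuming a real-valued genotype stored as a list; the mutation rate, Gaussian step size, and representation are illustrative rather than the paper's actual encoding.

```python
import random

def mutate(genotype, protected, rate=0.1, sigma=0.2):
    """Mutate only the loci the user has NOT marked as protected.
    `protected` is a boolean mask of the same length as `genotype`."""
    child = list(genotype)
    for i, locked in enumerate(protected):
        if not locked and random.random() < rate:
            child[i] += random.gauss(0.0, sigma)
    return child

# Example: breed alternative candidates that vary only the last two loci.
parent    = [0.3, 0.8, 0.1, 0.5]
protected = [True, True, False, False]
offspring = [mutate(parent, protected) for _ in range(4)]
```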
Premise "Identity SA" is an interactive and generative installation that combines a swarmbased si... more Premise "Identity SA" is an interactive and generative installation that combines a swarmbased simulation with real time camera based interaction that we presented at GA conference 2007. We extended it by embedding two methods to generate sounds and music from pre-recorded sampled sounds. One is to let agent trigger sounds at intervals that are synchronized to a particular musical rhythm. The other one is to generate sounds whose transposition and timing is purely related to the agent's properties. The probability that an agent triggers a sound is proportional to the square of its angular velocity for both cases. By mixture of these two methods, it generates a rich variety of attractive sounds that react with the visitor's motion.
Cooperation or defection and participation or withdrawal are well-known behavioral options in game-like activities in free societies, yet the coevolutionary dynamics of these behavioral traits at the individual level are not well understood. Here we investigate the continuous voluntary public good game, in which individuals have two types of continuous-valued options: a probability of joining the public good game and a level of cooperative investment in the game. Our numerical results reveal hitherto unreported phenomena: (i) The evolutionary dynamics are initially characterized by oscillations in individual cooperation and participation levels, in contrast to the population-level oscillations that have previously been reported. (ii) Eventually, the population's average cooperation and participation levels converge to and stabilize at a center. (iii) Then, a most peculiar phenomenon unfolds: the strategies present in the population diversify and give rise to a "cloud" of tinkering individuals, each of whom tries out a different strategy, and this process...
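A toy simulation sketch of the game described above, under assumed parameter values and a simple pairwise-imitation update with small mutations on both traits; it is not the paper's actual model or update rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def play_round(p, c, r=3.0, loner_payoff=1.0):
    """One round: p[i] is individual i's probability of joining,
    c[i] their cooperative investment. Non-participants get the loner payoff."""
    joined = rng.random(p.size) < p
    payoff = np.full(p.size, loner_payoff)
    if joined.sum() >= 2:                       # the game needs >= 2 players
        pot = c[joined].sum() * r
        payoff[joined] = pot / joined.sum() - c[joined]
    return payoff

def evolve(p, c, generations=1000, mu=0.01):
    """Pairwise imitation (Fermi rule) plus Gaussian mutation on both traits."""
    for _ in range(generations):
        f = play_round(p, c)
        models = rng.integers(p.size, size=p.size)
        copy = rng.random(p.size) < 1.0 / (1.0 + np.exp(f - f[models]))
        p = np.where(copy, p[models], p) + rng.normal(0, mu, p.size)
        c = np.where(copy, c[models], c) + rng.normal(0, mu, c.size)
        p, c = np.clip(p, 0, 1), np.clip(c, 0, 1)
    return p, c

p_final, c_final = evolve(rng.random(100), rng.random(100))
```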
The project MediaFlies realizes an interactive multi-agent system that remixes live and pre-recorded audio and video material. Agents engage in flocking and behavior synchronization and thereby control the material's continuously changing fragmentation and rearrangement. Visitors can influence the agents' behaviors via a video tracking system and thus shift the ratio of disorder to recognizability in MediaFlies' acoustic and visual feedback.
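A toy sketch of the fragmentation-and-rearrangement idea: each agent copies a small patch of the current input frame from its source coordinate to its current position. The patch size, agent fields, and coordinate handling are invented for illustration and are not MediaFlies' actual pipeline.

```python
import numpy as np

def remix_frame(frame, agents, patch=16):
    """Rebuild the output image from small patches of the input frame,
    one per agent; more scattered agents yield a more fragmented image."""
    out = np.zeros_like(frame)
    h, w = frame.shape[:2]
    for a in agents:
        sy, sx = int(a['src'][0]) % (h - patch), int(a['src'][1]) % (w - patch)
        dy, dx = int(a['pos'][0]) % (h - patch), int(a['pos'][1]) % (w - patch)
        out[dy:dy + patch, dx:dx + patch] = frame[sy:sy + patch, sx:sx + patch]
    return out
```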