Experimentation #1 is a project that brings together Kinect, Processing, and Second Life. It centres on a gestural interface through which three components (avatar, sound, visual) are combined in real time in a mixed-reality performance. The gestural interface, built with Kinect and programming in Processing, removes the need for pre-programmed (pre-animated) avatar movements and pre-recorded sound and visual sequences.
The Kinect captures real-life body movement, which controls the avatar's movement in Second Life. The Kinect also serves the real-life audio-visual part of the performance: it captures live imagery of the performer and her surroundings, which is manipulated in real time through Processing. The sound, too, is generated through the gestural interface. This audio-visual performance is then streamed into Second Life and combined with the avatar's movement in real time. Because the avatar is moved gesturally and the real-life surroundings are folded in live, each performance becomes truly unique.
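The post doesn't include the actual patch, but a minimal Processing sketch in this spirit might look like the following. It assumes a recent SimpleOpenNI library for the Kinect skeleton and the Minim library for sound; the joint choice, tint effect, and frequency range are illustrative assumptions, not the settings used in the performance.

// Hypothetical sketch: the performer's right hand drives a colour wash
// over the live Kinect image and the pitch of a sine tone.
import SimpleOpenNI.*;
import ddf.minim.*;
import ddf.minim.ugens.*;

SimpleOpenNI kinect;
Minim minim;
AudioOutput out;
Oscil tone;

void setup() {
  size(640, 480);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableRGB();
  kinect.enableUser();                  // skeleton tracking
  minim = new Minim(this);
  out = minim.getLineOut();
  tone = new Oscil(220, 0.4, Waves.SINE);
  tone.patch(out);                      // gesture-driven tone
}

void draw() {
  kinect.update();
  for (int userId : kinect.getUsers()) {
    if (!kinect.isTrackingSkeleton(userId)) continue;
    PVector hand = new PVector();
    kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, hand);
    // Hand height (real-world mm) -> colour wash and pitch.
    float t = map(hand.y, -600, 600, 0, 255);
    tint(255, t, 255 - t);
    tone.setFrequency(map(hand.y, -600, 600, 110, 880));
  }
  image(kinect.rgbImage(), 0, 0);       // live imagery of performer and surroundings
}

// SimpleOpenNI calls this when a new user appears in front of the sensor.
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}

Streaming the resulting window into Second Life (for instance as parcel media) is a separate step outside the sketch itself.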
Experimentation #1 is my first real-time mixed-reality virtual performance, and it was made possible with support from HUMlab, Umeå University, Sweden. This machinima was made from documentation material recorded in HUMlab's H3 space; the Second Life location is the HUMlab sim.
Tuesday, 17 January 2012
Work in Progress
From my new project using Kinect, linking Second Life and real life. The avatar's movements and the projected image are both performed live, driven by my own movements in real time.
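The post doesn't say how the avatar is actually driven, but one common Kinect-to-Second-Life bridge is to synthesize the viewer's ordinary movement keys from skeleton data. A hypothetical Processing version, again assuming SimpleOpenNI, with an illustrative joint and threshold:

// Hypothetical bridge: stepping toward the sensor makes the avatar walk.
// Uses java.awt.Robot to send OS-level key events to the focused
// Second Life viewer; the joint and distance threshold are assumptions.
import SimpleOpenNI.*;
import java.awt.Robot;
import java.awt.event.KeyEvent;

SimpleOpenNI kinect;
Robot robot;

void setup() {
  size(200, 200);
  kinect = new SimpleOpenNI(this);
  kinect.enableDepth();
  kinect.enableUser();
  try {
    robot = new Robot();
  } catch (Exception e) {
    exit();                            // no synthetic input available
  }
}

void draw() {
  kinect.update();
  for (int userId : kinect.getUsers()) {
    if (!kinect.isTrackingSkeleton(userId)) continue;
    PVector torso = new PVector();
    kinect.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_TORSO, torso);
    if (torso.z < 1500) {              // performer within ~1.5 m of the sensor
      robot.keyPress(KeyEvent.VK_W);   // hold "walk forward" in the viewer
    } else {
      robot.keyRelease(KeyEvent.VK_W);
    }
  }
}

void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}

Because the synthesized key events go to whichever window has focus, the Second Life viewer has to stay foregrounded while the sketch runs in the background.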