Since following an online course on ‘machine learning’ I have been able to use some of what I learned to manipulate Kinect and Leap Motion data, creating visuals that respond to an environment through the 3D depth data provided by the Kinect, to gestures captured by the Leap Motion, and to the level and pitch of audio input. Here is a sample of one of the presets I have generated, projected onto a makeshift environment made from found materials.
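As a rough illustration of the audio side of this kind of setup, here is a minimal sketch of extracting level (RMS) and a crude pitch estimate (strongest FFT bin) from a mono audio buffer and mapping them to visual parameters. This assumes NumPy and normalised float buffers; the function names, the 2 kHz hue clamp, and the brightness boost are all illustrative choices, not the actual code behind the preset.

```python
import numpy as np

SAMPLE_RATE = 44100  # Hz; assumed audio input rate

def analyse_audio(buffer, sample_rate=SAMPLE_RATE):
    """Return (level, pitch_hz) for one mono audio buffer.

    level: RMS amplitude (0..1 for normalised input).
    pitch_hz: frequency of the strongest FFT bin -- a crude pitch estimate,
    enough to drive visuals, though not a true pitch tracker.
    """
    level = float(np.sqrt(np.mean(buffer ** 2)))
    spectrum = np.abs(np.fft.rfft(buffer))
    freqs = np.fft.rfftfreq(len(buffer), d=1.0 / sample_rate)
    pitch_hz = float(freqs[np.argmax(spectrum)])
    return level, pitch_hz

def audio_to_visual(level, pitch_hz):
    """Map level to brightness and pitch to hue, both clamped to 0..1."""
    brightness = min(1.0, level * 4.0)   # boost so quiet signals still register
    hue = min(1.0, pitch_hz / 2000.0)    # anything above 2 kHz pins the hue
    return hue, brightness

# Example: a 440 Hz test tone at moderate amplitude
t = np.arange(4096) / SAMPLE_RATE
tone = 0.25 * np.sin(2 * np.pi * 440.0 * t)
level, pitch = analyse_audio(tone)
hue, brightness = audio_to_visual(level, pitch)
```

The same pattern applies to the Kinect and Leap Motion streams: reduce each incoming frame to a few scalar features, then map those features onto parameters of the visual preset.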

https://vimeo.com/166276820
