Metamekanix

I began Metamekanix in January 2016. I had already completed a number of pieces in which the current position of an on-screen element determined its future direction, trajectory and speed. This approach grew out of a desire to impart “freedom” and self-determination to a hierarchy of relations by means of object-oriented programming.

While studying for an MA in Electronic Arts at Middlesex University, I began to apply this approach to groups, or families, of objects.

Past Exhibitions

The most fully developed applications of this were:

  • “Loc Reverb” (2002) Deluxe Gallery, Hoxton (Site Soundings + Digital Terrains) with Jeremy Gardiner and John Foxx
  • “Image Recoder” (2007, 2011), a real-time webcam data piece, shown at Watermans (Sense Detectives) and the Federal University of Rio de Janeiro (Arte Forum). https://www.youtube.com/watch?v=-eQtqflcpYc (“Image Recoder” was rewritten in 2014 on the OpenFrameworks platform using its C++ library; this version has not yet been exhibited.)
  • “Air Painting” (2014) Now Gallery, Greenwich

In “Air Painting” the position of the family of objects was determined by hand positions gathered via a Kinect games controller.

Applying programming to motor movement

I had introduced students to the control of motors by means of the Arduino microcontroller platform. What really interested me was finding ways to transfer the control techniques I had used for screen graphics into the physical world, creating similar kinds of movement with electric motors.

I see Metamekanix as the outcome of the physical application of algorithmic, procedural movement. The challenge for me has been to implement what I wanted in a field unfamiliar to me, where other researchers are more focused on robotics, motion control of cameras and other commercial applications. My guides in my own mind were not these but rather Rebecca Horn, Jean Tinguely, Tony Oursler and Rafael Lozano-Hemmer. I wanted the movements to be strange and mysterious but still graceful, in the manner of Edward Ihnatowicz.

It was also important that I engage with finding the solutions myself, rather than employing a motion control expert. This would mean that I could capitalise on unforeseen possibilities. I decided to spend the major part of the funding on fabrication rather than on mechanical consultancy. However, Metamekanix also offered an opportunity to employ research assistants and I have been ably supported by Jason Taylor and Leticia Barzilai.

Jason has assisted greatly with construction issues, with 3D printing and with other logistics. Leticia pointed the way in the programming field. Also in my mind has been the example of LAB212 in France. This company included programmers, electronics engineers, fabricators and ergonomics designers. I was inspired by the constant interchange between the analogue and digital aspects of their work.

Construction problems cannot all be solved on the drawing board. The physicality of materials imposes its own directives, and no matter how many possible problems are anticipated in the planning, the reality is always slightly at odds with any prediction.

The aim of the research was to answer the question of whether screen-based programming techniques can feasibly be applied to electric motor movement. I know I have come somewhere near to being able to answer this question positively. In the process, I have sometimes watched the movement of the elements of Metamekanix and sensed that it has become something different from what I planned. Explanations by artists are not always in the best interest of a clear elucidation of intentions. I sense there is a level here where words try to overreach their rightful domain.

Richard Colson, London, July 2016

 



I have spent about four days programming, both in the Arduino IDE and in Processing.

My aim has been to use the measurement of time passing as the basis for creating acceleration and deceleration of the motors.

The motor speeds up between 0 and 8 seconds, slows down between 8 and 16 seconds, and so on.

If you look at this Processing example of easing, https://processing.org/examples/easing.html, you can see the element slowing down. That code uses the present position of the element to determine its speed.

I have been using the amount of time that has passed to do the same thing:

#include <SoftwareSerial.h>   // included for later use; not needed yet

// Placeholders kept from earlier experiments
int val1, myPos1, myPos2, speed2, testSpeed;
int speed1, speed1a, xInt;
int count = 0;

// Phase boundaries in milliseconds (declared unsigned long: values above
// 32767 would overflow a 16-bit Arduino int)
unsigned long timeExtent1 = 10000;
unsigned long timeExtent2 = 18000;
unsigned long timeExtent3 = 18000;
unsigned long timeExtent4 = 26000;
unsigned long timeExtent5 = 27000;
unsigned long timeExtent6 = 35000;
unsigned long timeExtent7 = 35000;
unsigned long timeExtent8 = 43000;
float easing = 0.05;

unsigned long startTime, currTime, ratioTime, targetX, dx, x, sensVal;

void setup() {
  // runs once
  Serial.begin(9600);
  startTime = millis();
  currTime = millis();
  speed1 = 0;
}

void loop() {
  // First phase: active while elapsed time is between timeExtent1 and
  // timeExtent2 after start-up
  while (currTime >= startTime + timeExtent1 && currTime <= startTime + timeExtent2) {
    currTime = millis();

    ratioTime = currTime - timeExtent1;
    targetX = timeExtent2;
    dx = targetX - currTime;   // time remaining in this phase
    x += dx;
    xInt = int(x);
    // Serial.println("doing it 1");
    // Serial.println(ratioTime);
    // sensVal = constrain(x, 0, 36000);
    sensVal = map(dx, 0, 8000, 0, 127);
    speed1 += sensVal * easing;

    // speed1 = (testSpeed * easing) + 127;
    speed1a = map(speed1, 0, 4200, 0, 127) + 127;
    Serial.println(speed1a);
  }
  currTime = millis();
}

