MA Fine Art: Site and Archive Intervention
I’ve now managed to translate the x,y coordinates for each paving slab into a series of sine waves. This was done using a Max patch which Dan Wilkinson created for me.
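The real translation happens inside the Max patch, but the underlying idea can be sketched in a few lines of Python. This is only an illustration under assumed figures – the frequency range, space dimensions, and the choice of mapping x to pitch and y to loudness are my assumptions, not the patch’s actual design:

```python
import math

SAMPLE_RATE = 44100

def slab_to_sine(x, y, duration=0.5, x_max=20.0, y_max=10.0,
                 f_lo=110.0, f_hi=880.0):
    """Map a slab's (x, y) position to a sine tone (illustrative mapping).

    x (0..x_max metres) sets the frequency between f_lo and f_hi Hz;
    y (0..y_max metres) sets the amplitude between 0 and 1.
    Returns the tone as a list of float samples.
    """
    freq = f_lo + (x / x_max) * (f_hi - f_lo)   # position across -> pitch
    amp = y / y_max                             # position along -> loudness
    n = int(SAMPLE_RATE * duration)
    return [amp * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            for i in range(n)]

tone = slab_to_sine(10.0, 5.0)   # a slab in the middle of the space
```

Each slab then contributes one sine wave, and the set of slabs becomes a bank of tones the installation can draw on.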
We are installing on Wednesday, and the installation will run from Thursday 28th June – Saturday 30th June 2012 as part of the Lancashire Science Festival.
I’ve attached an image showing the proposed arrangement of IR sensors and laser triggers. It took a while to sort the triggering out, but this is now working and uses a Max patch linked to an Arduino.
We may use more lasers when the project is repeated for Digital Aesthetic 3 in October.
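The trigger logic itself lives in the Max patch and the Arduino, but the core of it – fire once when a beam is broken, not continuously while someone stands in it – can be sketched in Python. The threshold value and the 0–1023 reading range (what an Arduino analogRead would give) are assumptions for illustration:

```python
def beam_triggers(readings, threshold=512):
    """Edge-detect a broken laser beam from raw light-sensor readings.

    readings: analogue values (0-1023); below `threshold` means the
    beam is interrupted. A trigger fires only at the moment of
    interruption, not for every sample while the beam stays blocked.
    """
    triggers = []
    beam_broken = False
    for i, value in enumerate(readings):
        if value < threshold and not beam_broken:
            triggers.append(i)      # falling edge: beam just broken
            beam_broken = True
        elif value >= threshold:
            beam_broken = False     # beam restored, re-arm the trigger
    return triggers

# beam interrupted twice in this stream of readings
events = beam_triggers([900, 880, 100, 90, 870, 860, 50, 900])  # → [2, 6]
```

In the installation each trigger event would be sent on to Max to start a sound rather than collected in a list.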
Had a great meeting yesterday with Daniel and Wei from the School of Computing and Engineering at UCLAN. They are up for helping with the project, and we discussed a variety of ways the space could be made interactive. The resolution of the grid really depends on money, and a compromise may have to be made. I think it’s doable though: since the distribution of gum is greatest near the entrance, that area can have a finer-resolution grid than the area furthest from the door.
It’s also a question of the resolution of a listener’s hearing and the resolution of movement – half a step might be enough?
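One way to think about the compromise is to let the gum density in each zone choose the cell size, so the budget goes where the detail is. Every figure here is an illustrative assumption, not the real budget – the fine end roughly matches half a step:

```python
def cell_size(density, d_min=0.0, d_max=5.0, s_fine=0.3, s_coarse=1.2):
    """Pick a sensing-grid cell size (metres) from gum density (splats/m^2).

    Dense zones (near the entrance) get s_fine-metre cells; sparse
    zones get s_coarse-metre cells; in between, interpolate linearly.
    All figures are assumptions for illustration.
    """
    t = max(0.0, min(1.0, (density - d_min) / (d_max - d_min)))
    return s_coarse + t * (s_fine - s_coarse)
```

So a zone with 5 splats per square metre would get roughly 0.3 m cells, an empty zone 1.2 m cells, and a middling zone something in between.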
So the next step is to get a few passive infrared (PIR) sensors and start testing their range and coverage. Most PIR sensors have quite a wide beam – say 110 degrees.
I’ve got an idea for a way to narrow the beam that doesn’t involve sticking tape over the sensor.
I think if the sensor is at one end of a tube then this will significantly narrow the sensing diameter. Increasing the length of the tube should decrease the sensing area.
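A rough bit of geometry backs this up. Treating the tube as a simple aperture (and ignoring the PIR’s Fresnel lens, which will complicate things in practice), a sensor at the back of a tube of diameter d and length L sees a cone with half-angle atan(d/L), so a longer tube gives a smaller footprint:

```python
import math

def sensing_diameter(tube_diameter, tube_length, distance):
    """Approximate sensing footprint of a sensor recessed in a tube.

    The sensor sees a cone with half-angle atan(d / L); at `distance`
    metres from the tube mouth the footprint diameter is
    d + 2 * distance * tan(half_angle). A crude geometric model that
    ignores the sensor's own lens optics.
    """
    half_angle = math.atan(tube_diameter / tube_length)
    return tube_diameter + 2 * distance * math.tan(half_angle)

# 3 cm tube, footprint 2 m away: doubling the tube length
# from 15 cm to 30 cm roughly halves the sensed diameter
narrow = sensing_diameter(0.03, 0.30, 2.0)
wide = sensing_diameter(0.03, 0.15, 2.0)
```

Under these assumed dimensions the 15 cm tube gives a footprint of about 0.83 m at 2 m, and the 30 cm tube about 0.43 m – worth checking against real measurements once the sensors arrive.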
Last week was the UCLAN MA Fine Art Interim show. This was an opportunity for all the artists who finish in September 2012 to have a test run and see how their work is looking in the gallery space.
It was a great show and really well attended.
Those of us on the Site & Archive pathway aren’t required to show anything if we don’t want to, and for the final degree show we usually put up a small display of documentation relating to our project.
I showed three pieces of work that came about through the mapping of the space outside the Victoria building:

A plan view of the space made up of individual paving slab photos, with x marks indicating the locations of chewing gum splats.

A screenprint using the cross marks and musical staves, which I hoped would be a way for the audience to engage with the idea.

Finally, some sheet music based upon the position of the chewing gum in relation to an imaginary piano keyboard along one axis of the space; the other axis became time, and half a paving slab indicated one bar. This was displayed on a music stand facing out of the gallery towards the mapped space.
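The sheet-music mapping can be written down quite directly. This sketch assumes a standard 0.6 m paving slab, a 12 m wide space, and a full 88-key keyboard (MIDI notes 21–108) laid along the width – all of those figures are my assumptions here, not measurements from the actual site:

```python
SLAB = 0.6          # assumed paving-slab edge length, metres
SPACE_WIDTH = 12.0  # assumed width of the mapped space, metres

def splat_to_note(x, y):
    """Map a gum splat's (x, y) position to (midi_pitch, bar_number).

    x runs across an imaginary 88-key piano keyboard laid along one
    axis of the space (MIDI notes 21-108); y is time, with half a
    paving slab per bar.
    """
    key = int((x / SPACE_WIDTH) * 88)
    pitch = 21 + min(key, 87)            # clamp to the keyboard
    bar = int(y / (SLAB / 2)) + 1        # bars numbered from 1
    return pitch, bar

# a splat halfway across the space, just under a slab from the origin
note = splat_to_note(6.0, 0.9)
```

A splat at the origin would land on the lowest A in bar 1; the one above lands mid-keyboard in bar 4.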
I got some interesting feedback about the work – many people wanted to be able to hear the music. I was able to have a good conversation with Dan, who is helping out with some technical aspects, about different ways the mapped data could be interpreted and how we can set the space up to sense the presence of audience members and trigger sounds.
The screenprint seemed to be the least successful of the three pieces, perhaps made redundant by the sheet music, which was completed after the print.
Since the show I’ve been doing more research into infrared sensors and also multi-camera sensing. I came across the work of David Rokeby, who has been using multi-camera interactivity since the 90s; more recently KMA have been making complex interactive zones with light projections.
I’m meeting a couple of guys tomorrow who work in UCLAN’s Computing and Engineering department; hopefully they will have some suggestions for how we can make the space interactive and handle the programming side of things.
At the end of June it’s the Lancashire Science Festival, and I have managed to secure provisional funding to have the piece installed for that weekend. This is dependent on the next few meetings and cost, but it’s looking good.
As for the music to be played on automatic piano or sequencer, that’s progressing slowly. I managed to get a set of x,y values out of the Illustrator document which maps the space. I’ve put these values into a spreadsheet and have been trying to figure out how to get them into a sequencer. I want to place them very accurately, as I don’t want to disrupt the integrity of the mapping, which is faithful to the chewing gum splats. For this reason I don’t want to quantise or round the coordinates to the nearest 16th note.
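One way to keep that fidelity, sketched here under assumed figures (960 ticks per quarter note, 4/4 bars, a 0.6 m slab with half a slab per bar), is to convert a position straight to MIDI ticks and round only to the nearest tick rather than to the nearest 16th note:

```python
PPQ = 960            # ticks per quarter note; a high, common resolution
BEATS_PER_BAR = 4    # assuming 4/4 time
SLAB = 0.6           # assumed slab edge length, metres; half a slab = 1 bar

def metres_to_ticks(y):
    """Convert a splat's position along the time axis to MIDI ticks.

    Half a paving slab equals one bar, so y metres covers
    y / (SLAB / 2) bars. Rounds only to the nearest tick (1/960 of a
    beat), not to a 16th-note grid, so the mapping stays faithful.
    """
    bars = y / (SLAB / 2)
    return round(bars * BEATS_PER_BAR * PPQ)
```

At this resolution a 16th note is 240 ticks, so a splat 5 cm along lands at 640 ticks – between the 16th-note gridlines, exactly the kind of position quantising would destroy.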
I managed to make a rough version yesterday using some software designed for scanning piano rolls so that they can be restored. This involved changing my .ai file into a 1-bit bitmap, converting this to a CIS file, and then using that file to create a MIDI file – I’ll post a bit more about that process soon. It’s certainly one solution but needs tinkering with.
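The piano-roll software does the real work, but the idea underneath it is simple enough to sketch: read the 1-bit image as a roll, where each row is a pitch, each column a time step, and a horizontal run of black pixels becomes one note. This is my own illustrative version, not the actual software’s algorithm:

```python
def roll_to_notes(bitmap, base_pitch=21):
    """Read a 1-bit piano-roll image as note events.

    bitmap: list of rows (top row = highest pitch), each a list of
    0/1 pixels; columns are time steps. A horizontal run of 1s in a
    row becomes one note: (pitch, start_column, length_in_columns).
    """
    notes = []
    n_rows = len(bitmap)
    for r, row in enumerate(bitmap):
        pitch = base_pitch + (n_rows - 1 - r)   # top row = highest pitch
        start = None
        for c, px in enumerate(row + [0]):      # trailing 0 closes any run
            if px and start is None:
                start = c                       # run of black pixels begins
            elif not px and start is not None:
                notes.append((pitch, start, c - start))
                start = None
    return notes

# two-row roll: one longer note above, two short notes below
events = roll_to_notes([[0, 1, 1, 0],
                        [1, 0, 0, 1]])
```

Turning those (pitch, start, length) events into an actual MIDI file is then a bookkeeping step rather than a conceptual one.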
I’m looking forward to finishing that piece and being able to start experimenting with sounds for the interactive installation.
I’ve written a few posts already in projects unedited, but I decided to move the writing about my MA onto degrees unedited in the run-up to September, when my course finishes.
I’m currently working on translating the distribution of chewing gum outside the PR1 Gallery into music.
Over the last few months I’ve been photographing all the paving slabs to create a detailed aerial view of the space so that I could precisely map the position of all the chewing gum splats.
Through discussions with Dan Wilkinson, who is a tutor on the music production course at UCLAN, the piece has developed from being a composition to be played on automatic piano into an interactive sound installation.
The plan is to use the x,y coordinates of each piece of gum along with either Xbox Kinect sensors or an infrared network. If all goes well – and it’s looking increasingly like a lot of work and testing – then as the audience moves around the space they will trigger samples (I’m currently working out what they will be of) related to where they are, which will play from four speakers set in the corners of the space.
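However the sensing ends up being done, the lookup at the heart of it could be as simple as this sketch: take the visitor’s sensed position, find the nearest mapped gum splat, and play the sample assigned to it. The splat coordinates, sample names, and trigger radius below are all placeholders – the real sample material is still undecided:

```python
import math

# hypothetical mapped splats and placeholder sample names
SPLATS = {(1.2, 0.8): "sample_a.wav",
          (4.5, 3.1): "sample_b.wav",
          (7.9, 2.2): "sample_c.wav"}

def sample_for(x, y, max_dist=1.0):
    """Return the sample for the gum splat nearest the visitor's
    sensed (x, y) position, or None if no splat lies within
    max_dist metres."""
    nearest, d = None, float("inf")
    for (sx, sy), name in SPLATS.items():
        dist = math.hypot(sx - x, sy - y)
        if dist < d:
            nearest, d = name, dist
    return nearest if d <= max_dist else None
```

A visitor standing near the first splat would trigger its sample; someone far from any splat would trigger nothing, which leaves silence as part of the piece.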