There are so many options to choose from when starting out with interactive artworks that it can seem overwhelming. Arduino, Pico, Python, MicroPython, Raspberry Pi… where to start? It is worth taking some time to orientate yourself and do a little homework before committing to one.

I went along to the Feeling Machines Weekender in Bristol in the spring to get a flavour of what was possible and to talk to others about what they were doing and how they got started. The key seems to be to find a community you can join, where you can get help and advice when you need it. There is a thriving creative coding community in Bristol, but that is just too far from me in Cambridge. So I joined the Raspberry Pi club at my local makespace instead.

The good thing about Raspberry Pi as a starting point is that it is relatively cheap (around £40), its windowed desktop interface feels familiar, and it is designed as an entry-level product to get you coding. The CamJam EduKit #2, for just £9, gives you step-by-step instructions for wiring up a simple motion sensor and writing the Python code that goes with it. I was on my way.

Inside a Raspberry Pi


My aim was to create an artwork that responded to the presence of the viewer, and I wanted to mimic the way in which social media keeps people hooked.

So my project was going to show some content on a screen, and every time the viewer started to move away to look at something else, the artwork would make some sort of sound and refresh the content in an attempt to lure the viewer back.
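The core of that behaviour is a tiny state machine: while the sensor reports presence, do nothing; the moment presence flips to absence, play a sound and refresh the content. A minimal sketch in Python is below. The sensor readings and the sound/refresh actions are stand-ins of my own for illustration; on the Pi itself the readings would come from the motion sensor (for example gpiozero's `MotionSensor`) and the actions would drive a speaker and the screen.

```python
def lure_loop(presence_samples, play_sound, refresh_content):
    """Watch a stream of presence readings; each time the viewer
    starts to leave (presence flips True -> False), make a noise
    and swap in fresh content."""
    was_present = False
    for present in presence_samples:
        if was_present and not present:
            # Viewer is drifting away: try to lure them back.
            play_sound()
            refresh_content()
        was_present = present

# Stand-in actions that just record what happened, so the logic
# can be exercised without any hardware attached.
events = []
lure_loop(
    [True, True, False, False, True, False],  # simulated sensor readings
    play_sound=lambda: events.append("ding"),
    refresh_content=lambda: events.append("refresh"),
)
print(events)  # two True->False transitions fire the lure twice
```

Keeping the trigger logic separate from the hardware like this also makes it easy to test at a desk before wiring anything up.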

The end result was Catatonic, which I showed as a prototype at Cambridge Makespace Open Studio in July.

I thought this would be a relatively straightforward first project: use a motion sensor to detect presence, then trigger a response accordingly. It turned out to be surprisingly difficult to achieve. Read on to find out how I got on.

Catatonic at Cambridge Makespace