I met with optometrists Ritz Cannell and her colleague Claire at The Specky Wren, Brighton, and showed them the Quizzer. Both found it easy to use and confirmed what Ritz had told me: that it works through binocular vision, which most people have, so most people should be able to see the effect. Claire, who had specialised in the subject, confirmed what I had noticed while making work:

Movement will trump stillness and something travelling towards the viewer will trump all other movement.

They suggested that putting a cross in the real world and on the image would help people to match the image with the live environment.

I asked them about the difficulties people had looking through the Quizzer while wearing glasses. They suggested building something into the design so that the glasses are anchored against the viewfinder and held parallel to it.

Paul Graham put me in touch with a colleague, Lucas Wilkins, who spoke to me from his lab via Skype, where he was watching a parasitic wasp hatch. I was looking for the scientific vocabulary to describe what I had observed. He pointed me to the terms binocular fusion and binocular rivalry, which explain how the video and the real world are merged in our brains, and how our brains prioritise certain information over other information. So we ‘see’ someone move in a video over someone standing still in the real world. I watched a parkour sequence of my son through the Quizzer while a man was having lunch on a wall where we had shot the footage. When my son leapt over the wall in the video, the seated man momentarily disappeared. The effect was reliable even when I knew this was happening and watched for it. Perhaps I could make a sequence with live performers which plays with this phenomenon.

Fusebox residents and the wider XR Brighton community were invited to the School of Engineering and Informatics at Sussex University to view current demonstrations of work in progress. I met Gianluca Memoli, who lectures in Novel Interfaces and Interactions, and who showed me 3D printed structures that direct sound. Outshift is a supporting company for his “Aurora: Controlling sound like we do with light” project. I am looking forward to getting his input on some of the phenomena I have observed with the Quizzer.


The stories I tell about places come from those locations. I try to make the resonances of a place visible, and part of those resonances are to do with the people who frequent that location. I wanted to pilot ways of making a participatory AR documentary using the Quizzer. Through an ONCA-hosted peer mentoring group using the Action Learning Set method run by Helena Joyce, I met Sara Fernee, an artist, teacher and forest play leader with an established after-school group that met weekly in the woods in Bevendean. She was interested in the Quizzer as a possible way of evidencing the magic that happens during free play in an outdoor space, as well as a way to encourage older kids to take part. We were both curious as to whether the Quizzer could be a bridge between real-world and digital play and learning. We were both passionate about ‘being in the zone,’ a particular quality of attention that comes about when playing and exploring an outdoor natural space. Would the Quizzer draw kids into or distract them from the place itself?

Filming free play

Sara invited me to come along to the after-school club for a few sessions and take things from there. I observed the children’s play and the ways they inhabited the woods, getting drawn into their games (and trying not to influence them). In the fourth session I introduced the 360 camera and my small handheld camera, staying aware of changes in how the children behaved when the camera was there. A common response was for them to approach it as though they were presenting a YouTube video.

Testing 360 in the Quizzer

I wanted to test the 360 footage in the Quizzer. Would the real world and the video match up as the viewer panned around the scene? The 360 camera was a good way of recording the children without affecting their behaviour. I was struck by the compelling nature of the sound as they walked towards and away from the camera. Trying to match the 360 image with the real world, however, was difficult. The scale of the foreground image and the background image did not match, and the image would not stay still enough to make the initial match.

Pilot workshop

Sara and I planned and delivered a pilot summer workshop for 9-12 year olds exploring the use of tech in natural spaces. After exercises led by Sara to open our eyes to the space, in pairs the participants identified a route through the wood and made a series of film sequences that someone could use to navigate that trail. Other pairs then tested the sequences to see if they could follow the trail.
Despite heavy rain and wind, the exercises worked really well, though the cameras needed to be higher spec. The participants enjoyed the process and had lots of ideas for sequences and for experimenting with types of routes and movements.

Participants tested the Quizzer on site using footage I had shot at an earlier session. Different light levels affected the merging of recorded and live images in the Quizzer. The footage had been shot on a bright sunny day and was viewed on a dark rainy day, which made the brightness of the image overpower the background.

Demoing at primary schools

At Sara’s invitation, I demoed the Quizzer at the Bev Fest community festival as a way of showing the community what we were up to, and at two primary schools: Bevendean Primary and Downs Juniors. It generally works well with young people, as they are open to the experience and have good eyesight! Under-7s don’t quite have the focus or coordination to do the matching.

A two-year collaborative project?

I had been chatting with Paul Graham of Sussex University about human and animal navigation after meeting him at a Science Festival open day. He had been interested in the flick-book navigated walks I had made, as they tied in with how some animals, including humans, store image sequences in order to remember routes.

I brought Sara and Paul together to brainstorm ideas for a two-year project based in the woods in Bevendean. We shaped the initial ideas and our roles, looking at the Wellcome Trust Public Engagement Fund as a possible funding source.



I was picked to demo at the Caravan Marketplace at Brighton Festival, with a small grant for expenses. I decided to develop a way of showing existing sequences off-location, as well as making a short bespoke sequence for the room where the demo would take place, the Founder’s Room at Brighton Dome.

Fellow resident Maf’j Alvarez wanted to see if the kanban method of project management would work for me as an artist. Maf’j would work through one of her own projects at the same time. We drew up a kanban board and stood up at 10 am each morning to chat through current tasks and blocks. It was helpful to break down each larger task into smaller action points and have that contact each morning. After a two-week sprint you have a ‘retrospective’ where you review what you have achieved and what it took to get there. Maf’j had to force me to do this, as it fell just before the Caravan deadline, but it was useful nonetheless and showed me how much had gone into getting the thing done.


Robin’s Eye

Robins see the Earth’s magnetic field in one eye. I like the idea of placing a dynamic robin’s-eye view of the magnetosphere into one eye using the Quizzer, like a heads-up compass. The visualisation above was created by academics at the University of Illinois.

I decided to try this as my first Unity project. It would also be a good way of demonstrating how the Quizzer can be used with AR apps which overlay onto a smartphone’s camera feed. The Quizzer makes the experience more ‘in the world,’ rather than at arm’s length on a smartphone screen.

With the help of Chris Chowen at the Immersive Lab I was able to use Magic Paintbrush to draw a 3D egg-shaped twinkling line to mimic how a robin ‘sees’ the Earth’s magnetic field and put this asset into Unity, placing the camera/viewer inside the ring so that the viewer can turn their body to look around the ring that surrounds them.

After consulting Neil and researching webcam textures, I cut and pasted some code to attach to a screen in the app so that the ring is overlaid on the camera feed.
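The webcam-texture approach can be sketched as a short Unity C# script. This is a minimal sketch only, assuming a full-screen RawImage on a Canvas behind the 3D ring; the script and field names here are my own illustration, not the exact code I pasted in:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Streams the device camera onto a UI RawImage so that
// 3D objects (the ring) render in front of the live feed.
public class CameraFeedBackground : MonoBehaviour
{
    // Assumed: a RawImage stretched across the Canvas, behind the ring
    public RawImage background;

    private WebCamTexture camTexture;

    void Start()
    {
        // Default constructor picks the device's default camera
        camTexture = new WebCamTexture();
        background.texture = camTexture;
        camTexture.Play();
    }

    void OnDisable()
    {
        // Release the camera when the screen is closed
        if (camTexture != null) camTexture.Stop();
    }
}
```

The key design point is that the camera feed lives on the UI layer while the ring is ordinary scene geometry, so no compositing is needed: Unity draws the feed first and the ring on top.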

The next step is to anchor the ring to the phone’s magnetometer and export it as an Android app.
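In Unity that anchoring step could look something like the sketch below: rotate the ring against the compass heading each frame so it stays fixed relative to magnetic north as the phone turns. Again, this is an assumed implementation, not the finished code; on Android the compass needs location services enabled to report a heading:

```csharp
using UnityEngine;

// Keeps a world-anchored object (the ring) aligned with magnetic
// north by counter-rotating it against the device's compass heading.
public class CompassAnchor : MonoBehaviour
{
    void Start()
    {
        // trueHeading only reports once location services are running
        Input.location.Start();
        Input.compass.enabled = true;
    }

    void Update()
    {
        // Counter-rotate around the vertical axis so the ring
        // appears fixed in the world as the phone rotates
        float heading = Input.compass.trueHeading;
        transform.rotation = Quaternion.Euler(0f, -heading, 0f);
    }
}
```

A low-pass filter on the heading would likely be needed in practice, as raw magnetometer readings jitter.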

Volumetric Capture Field Trip

George from Mutiny Media organised a Fusebox field trip to Dimension, a volumetric 3D video capture studio in Wimbledon, and to the DoubleMe start-up based at Ravensbourne University.

The Dimension studio makes high-quality 3D video and images using hundreds of cameras, Kinects and infrared cameras arrayed in a cylindrical studio, with banks of computers to crunch the data.

The DoubleMe software works with two Kinect depth cameras, which take a subject’s 3D data, process it and transport it over Wi-Fi, enabling real-time 3D holograms of the subject to be viewed through a HoloLens in another location.

Basically we saw high- and low-res ways of making 3D assets. Thanks George!




I wanted a demo piece that would work against a live background, so I arranged with the Dome for access to the Founder’s Room in advance and took some photos. Initial ideas to do with the heritage of the Dome included levitating suffragettes and horses (the Dome was formerly a stables). As the deadline drew near, horse footage was more easily available, so I bought some footage and Neil composited it into the backgrounds. I had meant to learn to rotoscope and composite using the trial version of Creative Suite, but the deadline was approaching and Neil, as the expert, took over. I love the short sequence that resulted.

We demoed at the Marketplace and had great conversations with the delegates, producers from around the world. A producer from the Dome suggested I talk to the team working on interpretation for the current restoration project.