During a residency at Blast Theory, I had discovered a new way of merging video with a live environment – a serendipitous discovery that came directly from my artistic inquiry: how to animate a landscape for a person walking through it. I made a prototype device, The Quizzer, with technologist Neil Manuell, and a proof-of-concept sequence during the Reframed Project Development Residency at Lighthouse. On the back of that I became an XR Resident at Fusebox Brighton, a collaborative R&D space for tech start-ups and artists with ideas around virtual, augmented and mixed reality.

I had stumbled on an exciting solution to the fundamental contradiction of my practice: how to use video or image sequences to heighten a live environment for a walker without distracting them from the here and now. But I didn’t know how this new invention worked. I didn’t know whether differences in individual eyesight would affect the experience. I was massively excited about its potential and wanted to find out its limitations, so I could understand what I could do with it.

At Fusebox I called it an AR hack, as I fought my impostor syndrome, realising I didn’t know how to talk about the discovery in this new context. People around me were developing products and use cases. Was it a product? I had never developed a product before. Did I want to focus on making it a product or was it more important to make work with it first?

I have worked on the idea from the beginning with developer Neil Manuell, but his availability to work on the project is limited. The a-n Bursary would fund the residency at Fusebox for ten months, which meant having access to the Immersive Lab and the expertise of fellow residents. I wanted to learn the basics of Unity and how to process 360 images and video, so I could rough out and test ideas before getting Neil involved.

My aim was to find out what I could do with the invention, how it could intersect with XR, and to become more independent with the digital processes needed to develop work with it.



My work on the Robin’s eye was a way of testing whether you could navigate an audience using a visual effect over the camera feed. At first I looked for AR compass apps, then Peter from VR Craftworks suggested I try Instagram filters. They work! Revelation! I have been trying to test this for ages and thought it would mean full-on app development, but no. The most interesting ones mutate the feed itself. I noticed how people in the real world appear clearly but with a halo of the filter effects around them – my brain deciding what is most important for me to see.

The only problem is that the mirrorless Quizzer reduces the binocular overlap by around 20 degrees, because the phone gets in the way.
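As a rough sanity check on that figure, the angle the phone blocks can be estimated from its width and its distance from the eye. This is a back-of-envelope sketch with assumed dimensions, not measurements of the actual Quizzer:

```python
import math

# Rough estimate of the angle a hand-held phone subtends at the eye.
# Both dimensions below are assumptions, not measured values.
phone_width_mm = 70.0    # assumed width of a typical phone
eye_to_phone_mm = 180.0  # assumed distance from the eye to the phone

# The phone subtends 2 * atan(half-width / distance) at the eye.
blocked_deg = 2 * math.degrees(
    math.atan((phone_width_mm / 2) / eye_to_phone_mm))
print(f"The phone blocks roughly {blocked_deg:.0f} degrees of view")
```

With those guesses the blocked angle comes out in the region of 20 degrees, which at least agrees with what we observed.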

Ways of solving this new problem: find a lens that allows us to hold the phone closer to the eye, add an extra mirror, make a periscope, use a separate tiny monitor…



Following up my initial tests with Sara and the forest club, I made some more footage out and about in Hollingdean. I chose the old skatepark, because I love it, and the huge meadow that leads to the dew pond. My kids provided the action.

It was still horrendously difficult to line up, especially in the meadow, and I struggled to define the exact nature of the problem. There were still too many variables: keeping the Quizzer still enough to stay matched up with the x, y and z orientation, finding the right viewing position, and scaling the image. I had to zoom in a lot to approximate the right scale, and this meant distorting the image, which affected the match-up.

I decided to try some high-res footage Nick Driftwood had made using six Panasonic Lumix cameras by the West Pier and bandstand on the seafront. Neil made buttons which would fix the x, y and z axes. This made it easier to make the initial match, but it was still tricky to match up the position and scale of near and far objects in relation to each other.

Why was this?

I had thought that taking a picture with a wide-angle lens distorts the perspective of the image, but I found out from photographers’ forums that this is not the case. If you take a very wide photo and a very zoomed-in one from the same position, objects will appear in exactly the same positions in relation to each other. You can prove this by blowing up the wide photo to the same field of view as the zoomed-in one and comparing the two. Perspective depends only on where the camera is, not on the lens.
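You can also convince yourself of this with a toy pinhole-camera model. This is a minimal sketch, nothing to do with the Quizzer code: a point’s image position is the focal length times (x/z, y/z), so changing focal length scales every point by the same factor and the relative layout of near and far objects never changes.

```python
# Toy pinhole-camera model: focal length only scales the image
# uniformly; relative positions depend solely on camera position.
points = [(1.0, 0.0, 2.0),   # near object: x, y, depth in metres
          (3.0, 0.0, 10.0)]  # far object

def project(x, y, z, f):
    """Pinhole projection: a point lands at f * (x/z, y/z)."""
    return (f * x / z, f * y / z)

for f in (0.5, 2.0):  # a wide lens vs a 4x zoomed-in lens
    u_near, _ = project(*points[0], f)
    u_far, _ = project(*points[1], f)
    # The ratio is independent of f: blow the wide shot up 4x and
    # it reproduces the zoomed shot exactly.
    print(f"f={f}: near {u_near:.2f}, far {u_far:.2f}, "
          f"ratio {u_near / u_far:.3f}")
```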

However, the nearer an object is in the foreground, the more accurate you have to be with the viewing position. A slight shift of position, up, down or side to side, will change the position of the foreground object in relation to the background. So viewing position is key.
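A quick calculation with the same toy model shows how sharply this bites for near objects. The numbers here are assumptions for illustration:

```python
import math

# Shifting the eye sideways by d swings an object at depth z through
# an angle of roughly atan(d / z) against the distant background.
shift_m = 0.05  # assumed 5 cm sideways drift of the viewing position
for depth_m in (0.5, 2.0, 10.0, 50.0):
    drift_deg = math.degrees(math.atan(shift_m / depth_m))
    print(f"object at {depth_m:>4} m drifts ~{drift_deg:.2f} degrees")
```

An object half a metre away drifts by more than five degrees for a five-centimetre shift, while one fifty metres away barely moves, so near objects punish small errors in viewing position.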

Very wide lenses used in 360 photography do distort the image, but not in the way I thought. More research led me to tilt-shift lenses, which correct perspective by physically tilting or shifting the lens. If you take a photo of a tall building, the image will show it narrowing towards the top, something our brains correct for, which means that to the naked eye this narrowing effect is not in evidence. A tilt-shift lens will make the image look more like how we see it.
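The same toy model illustrates the keystoning a tilt-shift lens corrects. This is a sketch with assumed dimensions, not a lens simulation: tilt the camera up at a building and its vertical edges converge in the image; keep the sensor vertical, which is effectively what a shift lens does, and they stay parallel.

```python
import math

f = 1.0            # focal length (arbitrary units)
half_width = 5.0   # assumed building edges at x = +/- 5 m
depth = 20.0       # assumed distance to the building face in metres
pitch = math.radians(30)  # camera tilted 30 degrees upward

for y in (0.0, 30.0):  # bottom and top of the building face
    # Camera coordinates after pitching the camera up by `pitch`.
    y_c = y * math.cos(pitch) - depth * math.sin(pitch)
    z_c = y * math.sin(pitch) + depth * math.cos(pitch)
    u_tilted = f * half_width / z_c     # edge position, tilted camera
    u_shifted = f * half_width / depth  # edge position, sensor kept vertical
    print(f"y={y:>4.0f} m: tilted camera edge at {u_tilted:.3f}, "
          f"vertical sensor edge at {u_shifted:.3f}")
```

The tilted camera puts the top edge nearer the centre than the bottom edge, so the building narrows; with the sensor kept vertical the edge sits at the same horizontal position at every height.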

Maybe this was the key?

I took Neil down to the bandstand to test Nick’s 360 footage.

He agreed with me about the variables and suggested a test we could do. We would place crosses at different distances, take photos from different positions, and view them through the mirrorless Quizzer to see where the distortion happened.

To my surprise, even when we viewed a photo of a very near object, it lined up perfectly with the background in position and scale if we viewed it from exactly the position it was taken from; it was just much harder to find that precise viewing position. Of course, from a user-experience point of view it’s probably better to make sequences without very near objects.

Michael Danks from 4ground Media looked into the event room and suggested we try a 360 image taken with his “two-eyed” camera, which takes very high-quality photos from the left-eye and right-eye viewpoints. The next task is to test them in the Quizzer.

 

 



The case has to fit all sizes of phone, and it would be great if it made the phone no longer look like a phone. I like this flexy silicone Lego…

Strip it back a bit.

And a bit more…

Found this super-wide elastic! The simplest solution is the best.



Heard the news that I was accepted for a Developing Your Creative Practice grant from Arts Council England. Fantastic.

Now I can get on with the work and have a rest from funding applications!

This will enable me to:

Learn the skills and commission bespoke tools for a smooth and independent workflow.

Continue to find out what kind of work I can make with the Quizzer.

Iterate the physical and app design to make it audience ready.

Explore ways this new tool can intersect with XR technologies to create a low cost, accessible and truly immersive way of experiencing visual place-based stories.

 

 



Along with Fusebox residents Maf’j Alvarez, Andy Baker, George Butler and Iona Scott, I was invited to demo the Quizzer at TOMtech’s intensive four-day VRlab, part of Brighton Digital Festival. At first I was unsure why I needed to do any more demoing – it’s quite disruptive when you are in a research phase – but realised that showing to a high volume of people would give me the opportunity to gather user data. So I made up a questionnaire, and 86 testers filled it in. Questions concentrated on how easy it was to line up and merge the images, the ease of using the app controls, and how it felt to hold something up to your eye rather than wear a headset. I also asked about their eyesight: whether they were long- or short-sighted or had other sight peculiarities.

I still need to work out how to quantify the results. These were some of the comments:

Victorian

Cool

Excited. I have never experienced anything like it.

Engaged. Fascinated.

The swimmer made a wonderful effect, especially with the rewind and fast-forward controls.

Experimental – testing out the way my eyes worked together. Ethereal and beautiful.

Briefly in quite a spooky world.

It made me feel like I was there and that was very cool and interesting.

Very clever and unique design, good to experience something where you can still see the real world.

Surreal and delightful!

(It felt like) it was coming to life.

Compelled to know/experience more, poetic blend of imagery and sound creating intriguing narratives. Nostalgic (a lot of memory in spaces.)

What a joyous experience. Do more. Brilliant.

Immersed in the experience.

Fascinated. Entranced. I like the backwards and forwards and suspending the image.

Would have liked longer clip.

I like the contrast between the background and the moving image, which felt like a more old-fashioned type of image.

(It felt like I was) immersed in the image.

Very different to any other AR/VR I have done.

Awestruck

Relaxed

Involved

