It works against a printed background!!

So, how could I show existing sequences off-location? I made a low-cost print of the background photo and viewed the sequence against it. It worked really well! So I decided to make two 180° desktop stands to view the sequences against. These stands needed to be portable and easy to set up.

I made a card template and searched the Brighton Woodstore shop opposite, and various kitchen shops, for inspiration for the stand.

Exhibit Printing along the corridor in New England House said they could print a vinyl sticker if I sourced a piece of plastic. Brighton and Hove Plastics in Portslade had exactly what I needed in terms of size and bendiness.

Now I needed some 180° panoramas of the backgrounds. I borrowed the 360° camera from the Immersive Lab and took some photos. The resolution of these turned out to be too low for the print, so I took a series of photos with my Olympus and found some software to stitch them. Unfortunately the council were building flats in Kensington Street, the site of my first sequence, and it was unrecognisable, so I took photos of a different, but similar, graffitied street and asked for Neil's help to merge the two.
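For anyone who wants to try the stitching step themselves, tools like OpenCV can do it in a few lines. This is just a sketch of the general approach, not the software I actually used, and the filenames are invented:

```python
# A minimal sketch of panorama stitching with OpenCV's high-level
# Stitcher class. Not the software used for the project; the
# filenames are made up for illustration.
import cv2

# Overlapping photos taken from a single spot, left to right.
frames = [cv2.imread(name) for name in
          ["street_left.jpg", "street_middle.jpg", "street_right.jpg"]]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("street_panorama.jpg", panorama)
else:
    print("Stitching failed, status code:", status)
```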

Lucy from Lucid Design came up with the graphic design, and before long I had two beautifully printed 180° backdrops on bendy plastic, plus some oiled wood blocks and withy to make the stand. Thanks to Chris Willatt for making the blocks.



The invention works only when the viewer stands in a particular location. I wanted a way of showing it at the Fusebox, so I made a sequence for the corridor outside the office. New England House, built in 1963, was one of the first purpose-built industrial complexes; its wide corridors and industrial lifts are characteristically brutalist. Neil and I used footage of the performer Mim King swimming underwater that I'd shot in 2011 for Flickers: Under the Water for White Night, Brighton. Now we had a woman swimming down the corridor and disappearing through the door.

I was given the opportunity to speak and demo at the Virtual Reality and Heritage or GLAM (Galleries, Libraries, Archives and Museums) VR Meetup at Fusebox. This would happen in the evening, but how could I demo at night? Light levels affect how the image merges with the real environment. Joe from Wired Sussex suggested I talk to resident Dr Esin Yavuz, Co-Founder and Chief Scientist of Cyanapse, who are building an automated image editing platform that includes AI-powered photo filters. Would it be useful for the viewer to set the lighting of the scene, depending on when they viewed it? Though this will be something really useful for the future, for the time being we went low-tech: I took a night-time photo of the corridor and Neil composited the swimmer into the new background.
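Even a crude version of that viewer-controlled lighting is easy to sketch. Here is a toy example that darkens a daytime photo to approximate evening light using the Pillow library; it is a simple global brightness change, nothing like Cyanapse's AI filters, and the filenames and factor are illustrative:

```python
# A crude global brightness adjustment with Pillow -- a stand-in for
# the idea of matching the background to the light level at viewing
# time. Filenames and the 0.35 factor are invented for illustration.
from PIL import Image, ImageEnhance

background = Image.open("corridor_day.jpg")

# Factors below 1.0 darken the image; 1.0 leaves it unchanged.
evening = ImageEnhance.Brightness(background).enhance(0.35)
evening.save("corridor_evening.jpg")
```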

I wanted to make a series of sequences that viewers could follow around the building, so I set about trying different settings for our existing footage of underwater Mim. We tried her swimming across the lightwell towards the viewer and floating in the massive industrial lift. While there are all the problems of scaling the figure to look right in the setting, which Neil managed to solve as usual, the biggest challenge was that I couldn't get the foreground and the background to line up in terms of scale. Both these new sequences had window frames in the near foreground, and though I took photos at many different focal lengths it remained difficult to match the photos with what I could see with my live eye. As I understood it, a 50mm lens should mimic our vision. But this wasn't working.
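A little lens arithmetic hints at why. "A 50mm mimics our vision" is really a rule of thumb for full-frame (35mm) cameras: the angle of view also depends on sensor size, and the relative scale of near and far objects is fixed by where the camera stands, not by the focal length. A quick sketch (the sensor widths are standard figures; I'm assuming the Olympus has a Micro Four Thirds sensor, which isn't stated above):

```python
# Horizontal angle of view for a rectilinear lens:
#   fov = 2 * atan(sensor_width / (2 * focal_length))
# Sensor widths are standard figures; the Olympus being
# Micro Four Thirds is an assumption.
import math

def horizontal_fov_degrees(focal_length_mm, sensor_width_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

print(horizontal_fov_degrees(50, 36.0))   # full frame: ~39.6 degrees
print(horizontal_fov_degrees(50, 17.3))   # Micro Four Thirds: ~19.6 degrees
```

So the same 50mm lens gives a much narrower view on a smaller sensor, which would explain at least some of the mismatch between the photos and the live eye.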

Attempting to make these sequences had revealed a couple of excellent and interesting problems.

How can a photograph accurately match how our eyes see the scale of near foreground and background objects?

How can the viewer change the lighting of the background to fit the light levels when they view it?

I spoke at the VR Meetup (49:51) and demoed the corridor sequences. People had difficulty lining up the lightwell sequence, as expected, but the original one in the corridor seemed to work. I chatted to fellow speaker Dr Abigail Wincott of the University of Brighton, who spoke about her project Towards a Museum of Sound: Using location based apps to create a collaborative, creative methodology to develop and evaluate sonic heritage.

