
It's hitting 9pm and I'm still working. Gordon couldn't deliver the screens, so I have mocked some up; when they arrive tomorrow it won't take long to install them. The open exhibition period came and went pretty quickly. We have the computer running two screens, which is great, as the guy at the Mac store told me this wasn't possible – that the graphics card wouldn't be able to handle it. The resolution is OK. I still haven't worked out how this is working with the camera, but anyway, it's a great progression.

I don't like the fact that Pixy asks you to stand outside the room to interact with it, or to hunt for its viewing spot. At the same time, I hate the way my computer is sitting in front of the screen – people become so aware of it that they forget about the screen. Tomorrow we will fine-tune it again and move the systems outside the room, so we can still control everything without having everyone looking at it. This will be a good step forward.

The current version of the Chameleon Project demands that you stand in front of the mind reader for it to work. I don't think this works with Pixy, so I am wondering if a small change of narrative could happen: for example, the character keeps walking back and forth until you engage with it, and only then does the face come clearer. This would let people walk around the room more (a rough sketch of the idea is below). I will ask Jeff about it tonight.
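Purely to think the idea through for myself, here is a minimal sketch of that engagement-driven behaviour in Python. The face_detected() stub, the clarity value and the pacing/emerging states are hypothetical placeholders of my own, not anything from Jeff's video engine or the face reader.

```python
import time

# Hypothetical sketch of the proposed narrative change: the character keeps
# pacing until someone engages, and the face only sharpens while they do.

def face_detected() -> bool:
    """Placeholder for whatever engagement signal the face reader would give."""
    return False  # stub: no viewer engaged in this demo

def update_clarity(clarity: float, engaged: bool) -> float:
    """Blend the face in while a viewer is engaged, fade it back otherwise."""
    step = 0.05
    if engaged:
        return min(1.0, clarity + step)   # face becomes clearer
    return max(0.0, clarity - step)       # drift back to the pacing figure

def run_demo(frames: int = 10) -> None:
    clarity = 0.0
    for _ in range(frames):
        clarity = update_clarity(clarity, face_detected())
        state = "face emerging" if clarity > 0.0 else "walking back and forth"
        print(f"clarity={clarity:.2f} ({state})")
        time.sleep(0.1)

if __name__ == "__main__":
    run_demo()
```

The point is simply that the work would no longer need you planted on one spot: the face resolves only while someone is engaged and drifts back to the pacing figure when they wander off.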

I haven't heard from Rana, Youssef or Abdehlrahmen about the crashes. Jeff Mann, who developed the video engine, has said it's not on his end. Tomorrow I will test it with the old version of the face reader and see what happens.

Kim from the UCL interaction lab arrived today. I have been trying to gather participants for the evaluation; today we gathered another three or four. It may take longer than planned. Thank god we didn't start it yesterday.

Michael Roy seems to have the better luminosity sorted out now. Natacha has been spending the day getting the next Pixy display together. This one is going to be hung as a curtain. I was hoping it might be hung as a curve, but that can't happen, which is a shame. Anyway, the next one should be up tomorrow.

I have asked Jeff to make Pixy run from the one computer. We need some more computers now, but it takes time to set them up and I sort of need Jeff to do that.

We are using a new camera that seems to respond better to the infra-red, but it doesn't latch onto your face as easily as the other one. The Logitech camera stopped working altogether; Michael from Fabrica tried pulling out its infra-red shield, and it still didn't work. Strange.



KARL BROOME

Chameleon sets out to explore the scientific foundations of emotional contagion. Utilising the six key emotions of disgust, happiness, anger, neutrality, sadness and surprise originally identified by Paul Ekman, the 'face reading' software attempts to identify the emotional expressions of the participant. As the individual emotes, both their facial expression and, potentially, their 'mind' are 'read'. How should we understand these emotions in the context of the 'emotional dialogue' that Prototype 8 affords: where do they sit on the explanatory continuum with biological explanations at one end and social explanations at the other?

Sociological debates about emotions have been characterised by conceptualisations that vary across a continuum, with 'organismic' approaches at one end, 'social-constructionist' accounts at the other, and 'social-interactionist' accounts somewhere in the middle. At the 'organismic' end we find the likes of Charles Darwin and Paul Ekman, emphasising the innate, biological and 'pre-cultural' basis of emotions and their expression – the causes of emotion are wired into the brain for instinct and survival. At the other end of the continuum we find the 'social-constructionist' accounts, which stress the 'social' nature of human emotions, understanding their emergence in terms of their social, cultural and historical variability, meaning and experience, with the biological treated as largely irrelevant. For social interactionists, emotions are recognised as having biological substrates, but as socially shaped and subject to hierarchical manipulation. In contradistinction to constructionist accounts, interactionists recognise the importance of biological processes and the 'embodied nature' of emotions.

Thinking about visitors' experiences of interacting with the Chameleon project provides an interesting opportunity to revisit some of these polarities in sociological theorising. The social ecology of the space in which works such as Chameleon are exhibited significantly shapes the affective experience of the work. Mundane material, physical and spatial elements, and their affordances in terms of movement, interaction, proximity, distance and visibility, all play their part in how people engage with the work and in the various forms of social interaction taking place.

Observing and interacting with people visiting the exhibit today, I became aware of their reluctance to stand in front of the face-reading camera for any extended period. Visitors appeared much more comfortable entering the dark room, where the only illumination was the relatively small amount of light produced by the Pixy 'screen' – they seemed far more comfortable and interested watching Pixy from inside the room. I heard various people comment that they felt Pixy was the most interesting part of the work, even though they did not quite understand how it worked. I feel that at this testing/evaluation stage it takes a bit of active engagement and time to really experience what makes Chameleon so special – at the moment visitors seem somewhat distracted by the presence of the computer monitor and the face-reading camera. Undoubtedly, with a few more tweaks, visitors will experience a collaborative work of exceptional intensity.



So far, the face reader seems to be crashing. I am still working out whether this happens only when the Pixy screen is added. When I simulate the face reader, there is no crash. When I have the real face reader running with the Pixy screen, we get a 'timeout'. When I run the video engine without the Pixy screen, so far there has been no crash.

There are so many different parts to this project – the face reader, the algorithms, now Pixy, the video engine – it's hard to know how to get to the bottom of it all.

I am hoping Gordon comes in today with more screens… it's looking unlikely so far… not a great start. However, after 1pm I will start setting up next door – Jeff sent the new video engine that runs two screens.



In June, when I was in Brighton working with Fabrica on how we might approach the autumn exhibition, I shot Alice-Gale. She wasn't available for a third shoot, which was a shame. Below, Alice writes about the experience of being in front of the camera and being asked to express different emotional states.

Alice-Gale:

For me it was a real eye-opener: becoming aware of the unsaid emotions and feelings that have been pushed to the back of my mind after years of new experiences and new memories.
The project was very prominent in my mind for the two days of my involvement, and I am still thinking about my responses, what I could have said, etc. What I found most consuming were the thoughts about my emotions, or lack of emotions, and what this meant about me as a person.
During the shoot, I felt I sometimes tried to convince myself of sadness or happiness as my mind almost went blank in front of the camera (an unnatural setting in which to expose yourself). I then became aware that I had not felt really strong emotion for a long time, or thought I hadn't; in the attempt to recreate the emotion, the feelings flooded back.

I felt quite moved both during and after, as it brought up feelings about friends and relatives that I had forgotten or moved on from in the busyness of my mind and life. Sometimes I felt regret about thoughts and feelings. I feel I've learnt about the necessity of holding back from having to vocalize feelings when it could trigger a consequential negative response in myself.
I was surprised that I found it difficult, but equally surprised at how liberating it was; I am more inspired to be honest with myself about my emotions from now on.



I shot Amy in the studio last week. It was our third shoot together, and it's been great to get to know her a little more. It's interesting to look at the footage: the more we got to know each other, the more at ease her face and body became. In the end, she revealed a lot of body movement without particularly realizing it. It looked like she could have been a dancer in another life.

Amy:

I'd never been in front of a camera before, so I had to adjust fairly quickly to talking to someone and thinking about things whilst looking down a very close lens and being lit by a spotlight. I did manage to feel the emotions I was asked to recall a little as I spoke, but being in such an unfamiliar environment it was hard to remove myself from it, although by the second day things were somewhat easier. How easy an emotion was to act out depended, for me, on how recently the incident I was talking about had occurred: the event I talked about for "sad" was only days old and very fresh in my mind, but those for "disgusted" and particularly for "angry" were years old, so although they were both very strong instances of those emotions, I felt more as if I was distantly remembering how I felt rather than truly recalling it.

One peculiar thing was that although I produce some artwork myself, which I have always wanted to be quite personal (I am currently doing an art foundation at the age of 24), I found recalling emotions when asked very hard. I do have a wealth of experiences to draw from emotionally, but (possibly because I have only done so under therapy conditions) I am incredibly guarded about talking about, and particularly about recalling, such things.

