VR and 360° filmmaking are very different processes from traditional TV and cinema, and with them comes the baggage of learning new techniques for filming. Last year I described 2016 as the ‘Wild West’ period of the industry, when rules were fewer and filmmaking techniques less established. This new, largely untravelled territory forced some filmmakers to learn techniques on the fly, adapting to the situations they wanted to portray.
One of the people researching the field is Alia Sheikh, a Senior Development Producer in BBC Research and Development, who experiments to find out what works and what doesn’t. In a 360-degree scene, a viewer can look wherever they like; Alia’s technical research covers the psychological techniques broadcasters can use to subtly nudge viewers in the right direction.
To Alia, these subtle ways of directing the viewer can take a number of forms: a voice to the left of the scene; the people at the centre of the scene pointing to the left; or a person walking across the scene towards the left, among many other dynamic prods. The same effect can be achieved with light – like moths to a flame, viewers tend to gravitate towards light over dark.
All of these ideas are lifted from traditional theatre techniques, and with eye-tracking software Alia observed how effective they can be for viewers in 360°. Even so, she found that some people still dart around the scene, preferring to scan their surroundings as a scene plays out rather than stare at one area for long. Others choose one area of interest and remain fixed on it until very strongly directed to shift attention. People don’t stop being highly individual.
Scene transitions can still happen, and they do not need to be complicated. Fading to black and then into a new scene still works, for example, and this matches the use of lighting effects in theatrical productions. Not unexpectedly, spinning the camera into a new scene tends to make people feel nauseated; however, if the footage is sped up fast enough, the motion blur disappears and the spin becomes a more acceptable ‘blinking’ effect which, surprisingly, sometimes works for transitions.
The level of viewer engagement can be affected by how much the viewer is included in the scene. Alia created a scene in which two people were having an argument, with the viewer listening in, and then an almost identical scene in which the people arguing attempted to involve the viewer. There was far less engagement when the viewer remained unacknowledged. Yet even in the second example, although the actors attempted to draw the viewer into the scene, the viewer was not able to affect it in any way. But when the viewer was given a name and brought into the argument – with continual side glances from the arguers towards the viewer – attention spiked upwards. The viewer became a part of the argument, not just a passive listener. Alia calls this the ‘illusion of interaction’.
Another factor is the camera’s distance from the action, and how it affects the viewer. Alia staged a fight between two people and situated the camera two, three, and four feet away from it. What she found was that the viewer’s level of comfort closely mirrors the distances of real life: if the fight was two feet away, viewers felt overwhelmed, whereas at four feet they felt safer watching the conflict. Distance affected the emotional impact of the scene, so the ‘correct’ distance between camera and actors depends on the desired emotional state for the viewer – while also taking into account that the low resolution of current 360° systems makes it hard to see people clearly if they are too far away.
The team found that changing the distance of scene objects from the camera also lets them very convincingly play with size using traditional perspective effects. Working with the Familia De La Noche team on their 2016 production of Gulliver’s Travels, Alia and Trainee R&D Engineer Sam Nicholson found that it was possible to approximate a ‘giant’ talking to ‘Lilliputians’. The giant effect was created by placing one actor on a stepladder and having him lean forward over the 360° camera, which had been placed close between his hands. The Lilliputians were created by having the actors stand a few metres from the camera, looking artificially ‘up’ at the giant (who himself exaggerated his gaze downwards). There is a perception that a 360° camera’s perspective is somehow a more ‘honest’ representation of the world, as the entire panorama can be viewed and there is no hidden area ‘behind’ the camera. However, these tests show that the camera can, as was always the case, very effectively lie.
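The forced-perspective trick works because a 360° camera, like any camera, only records angular size: an object of height h at distance d subtends roughly 2·atan(h / 2d). As a rough illustration (this is standard geometry, not code from the production), the same actor can fill most of the viewer’s vertical field of view or shrink to a fraction of it purely by changing distance:

```javascript
// Apparent angular size (in degrees) of an object of height h metres
// seen from d metres away: 2 * atan(h / (2d)).
function angularSizeDeg(heightM, distanceM) {
  return (2 * Math.atan(heightM / (2 * distanceM)) * 180) / Math.PI;
}

// A 1.8 m actor leaning in to 0.5 m from the camera towers over the scene...
angularSizeDeg(1.8, 0.5); // ≈ 122°
// ...while the same actor a few metres away looks Lilliputian by comparison.
angularSizeDeg(1.8, 4);   // ≈ 25°
```

The numbers are illustrative, but they show why a stepladder and a close camera are enough to sell a giant.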
When the team went to the Edinburgh Fringe to experiment with filming street performers, they reached the same conclusion – camera position is vital. There is a sweet spot between performer and audience where a viewer can watch the performance without being part of the crowd, and is therefore well placed to observe both. This position only really makes sense if it is important for the viewer to be able to turn away from the performer and observe the crowd, or if being at the front of the crowd would offer a poor view of the performance. Once the action is even a few metres away, detail is lost on current playback hardware, and in that case it is preferable to move the camera closer to the performance at the expense of feeling ‘part of’ an audience.
For one-on-one performances, however, where the viewer is the only audience member, more interesting camera positions present themselves: in one case, a performer juggled over the camera, making the viewer look up and feel a sense of peril as well as providing an alternative view. In another, a performer thrust a flaming sword near the camera and gave a cheeky smile – a view close to what the viewer would experience during a genuine one-on-one interaction. These performances were made accessible via BBC R&D’s WebVR player, which was coded in JavaScript and used on the BBC Taster site to give an impression of the variety of street performances on show at the Edinburgh Festivals. Crucially, having a bespoke player meant that content did not have to be made available via an external platform, and the way that audiences navigated around the player (on either a headset or non-headset device) could be customised to the content on offer.
More information here: https://www.bbc.co.uk/taster/pilots/edinburgh70/trailer
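BBC R&D’s player code isn’t reproduced here, but to give a flavour of the kind of navigation logic a non-headset 360° player needs, here is a hypothetical sketch (names and parameters are my own, not from the BBC player). On a flat screen, pointer drags are accumulated into a yaw/pitch look direction, with pitch clamped so the viewer can’t flip upside down and yaw wrapped around the full panorama:

```javascript
// Hypothetical drag-to-look controls for a non-headset 360° player.
// Not the actual BBC R&D WebVR player code – an illustrative sketch only.
function createLookControls({ sensitivity = 0.25, maxPitch = 85 } = {}) {
  let yaw = 0;   // degrees, horizontal look direction (0–360)
  let pitch = 0; // degrees, vertical look direction (clamped)
  return {
    // dx/dy: pointer movement in pixels since the last move event
    drag(dx, dy) {
      // Dragging right pans the view left, wrapping around the panorama.
      yaw = (((yaw - dx * sensitivity) % 360) + 360) % 360;
      // Clamp pitch so the camera never tips past straight up/down.
      pitch = Math.max(-maxPitch, Math.min(maxPitch, pitch + dy * sensitivity));
    },
    direction() {
      return { yaw, pitch };
    },
  };
}
```

In a real player these angles would drive the rotation of a sphere-mapped video texture; on a headset the same direction would come from head tracking instead, which is exactly the sort of per-device customisation a bespoke player makes possible.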
From arguing friends to talking giants to dancers at the Fringe – at the heart of it all is narrative. No good story works on structure alone – it needs a tale to weave it together. After much experimentation, Alia came away with three key questions that any producer of a 360° experience needs to be able to answer:
- Who is the audience? What is their role?
- How do you want to make them feel?
- What do you want them to know / understand?
Overall, Alia is excited for the future. ‘The language of filmmaking is still there, it’s just in a different dialect.’
Originally written by me for VRFocus.