
You may not have heard of him yet, but director Eran Amir is sure to become a household name soon. Known for his mind-boggling ideas, his desire to experiment and his meticulous dedication to a project – which often involves filming himself in unusual surroundings while he works out how to realise his vision – Amir has now been signed to Nexus' roster, the studio has announced.

Amir’s debut project is a VR/360 music video for Chapita by electro-pop group Mind Enterprises, made through Nexus VR Studio. Amir’s breakthrough spot defies the rules and confuses viewers by combining choreographed loops, clever cinemagraphs, clonemotion technology and a single 6K-resolution camera on a custom-built rig. shots spoke with Amir, Nexus producer Becky McCray, DOP Martin Testar and Nexus’ technical director/VR artist Elliot Kajdan to find out why this approach is so different to the usual multiple-camera rig and how they went about realising the video.

What was the brief on this video?

Becky McCray: Because Music wanted to do a live-action VR film and they wanted to see something interactive. The dancer, acclaimed performer Mimi Jeong, is caught in a kind of loop; her movements become cinemagraphs, like live-action GIFs. Eran wanted to confuse viewers so they wouldn’t know if the effects were created in post or if it was actually being danced that way. In fact, some of them are on 20-second loops, some are 2-second loops and some are done in post, so it’s really difficult to tell which bits are where.
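The live-action GIFs McCray describes depend on a clip wrapping back on itself invisibly. As a minimal illustration of one common technique for hiding the seam of such a loop – not Nexus' actual pipeline – this sketch cross-fades the clip's tail into its head with NumPy; the clip here is synthetic noise and the frame counts are illustrative:

```python
import numpy as np

def make_seamless_loop(frames: np.ndarray, blend: int = 8) -> np.ndarray:
    """Cross-fade the clip's tail into its head so playback wraps invisibly.

    frames: (T, H, W, C) array; blend: number of overlapping frames.
    """
    head = frames[:blend].astype(np.float64)
    tail = frames[-blend:].astype(np.float64)
    alpha = np.linspace(0.0, 1.0, blend)[:, None, None, None]
    mixed = (1.0 - alpha) * tail + alpha * head   # tail fades into head
    return np.concatenate([frames[blend:-blend], mixed], axis=0)

# Tiny synthetic "clip": 40 frames of 4x4 single-channel noise.
rng = np.random.default_rng(0)
clip = rng.random((40, 4, 4, 1))
loop = make_seamless_loop(clip, blend=8)
print(loop.shape)  # (32, 4, 4, 1): one blend window shorter, but it now loops cleanly
```

The last blended frame lands on the material just before the new first frame, so repeating the clip end-to-end produces no visible jump.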

 

How did you come up with the idea?

Eran Amir: It started with the music and the fact that I’d begun dabbling with 360 video and VR about two years ago. It’s about Mimi conquering the warehouse space and simultaneously demanding the viewer’s attention. To begin with, there’s only one dancer and you can look at her or look away. But by the end, there are so many clones of her, it should feel a bit overwhelming, almost threatening. It starts very intimately with Mimi up close, but then an army of Mimis takes over.

How did you decide how to guide the viewer's attention?

Amir: I wanted it to feel like it was taken in one long take, so the seamless movements, the colours and the fact that the space is conquered chunk by chunk help to guide the viewer. Initially, the action happens right in front of you and then it widens so the viewer can explore new spaces. The way I see it, VR has two modes; there is the ‘guided tour’ mode where the viewer is told where to look and the ‘explorative’ mode where you can choose where and what to look at. In this video, I tried to play with both.

How important was the planning process on this project and what did it involve?

Amir: It was vital. I know everything about this sequence but, of course, there’s always an element of surprise in the production process of my work. It was incredibly important to have the space planned perfectly.

How did you stay organised throughout the shoot?

Amir: Elliot prepared a one-to-one 3D virtual model of the space and I played around with it via a 360 virtual camera and did a few tests on a 360 rig (below). That’s why I feel I know the space so well, even though we were only actually on location for one day. It required planning as much as possible beforehand and then trusting that all the parts would come together on the day.
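The one-to-one virtual model lets a 360 camera be previsualised entirely in software. As a rough sketch of the underlying maths (not Nexus' actual tooling), this snippet maps a 3D point seen from a virtual camera to pixel coordinates in an equirectangular 360 frame; the axis convention (+x forward, +y left, +z up) and frame size are assumptions:

```python
import numpy as np

def world_to_equirect(point: np.ndarray, width: int, height: int) -> tuple[int, int]:
    """Map a 3D point (camera at origin) to equirectangular pixel coords.

    Longitude spans the full width (360 deg), latitude the height (180 deg).
    Assumed convention: +x forward, +y left, +z up.
    """
    x, y, z = point
    lon = np.arctan2(y, x)                       # [-pi, pi], 0 = straight ahead
    lat = np.arcsin(z / np.linalg.norm(point))   # [-pi/2, pi/2], 0 = horizon
    u = int((0.5 - lon / (2 * np.pi)) * width) % width
    v = int((0.5 - lat / np.pi) * height) % height
    return u, v

# A point straight ahead lands in the centre of the panorama.
print(world_to_equirect(np.array([1.0, 0.0, 0.0]), 4096, 2048))  # (2048, 1024)
```

Sweeping a virtual camera through the 3D model and sampling it this way gives exactly the kind of pre-shoot rehearsal Amir describes.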

 

 

Why did you decide to use a single camera rig and what’s the benefit of this?

McCray: We were thinking of ways to improve the image quality, elevate Eran’s art direction and increase the production value while still retaining his idea. We knew it would help to shoot on a higher-spec camera, as we wanted to capture as much information as possible.

Tell us about the camera and why you thought it was suitable for this job.

Martin Testar: The camera has a very high resolution and shoots 6K, whereas most cameras only shoot 4K. By using the 6K functionality, we could capture as high a resolution as possible. Before the actual shoot, we did some tests with an 8mm lens, then Elliot took the footage to Nexus, composited it all together and removed some of the distortion. We were also using a very low-distortion lens, so the quality is extremely high. Shooting on a full cinema camera, we took around eight shots at 6K around the space, so we had a 48K panorama file in total.
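As a back-of-the-envelope sketch of the numbers Testar quotes – eight 6K-wide views assembled into a roughly 48K panorama – the following deliberately naive stitch just concatenates the views side by side, assuming each covers exactly 360/8 = 45 degrees of yaw after undistortion. A real pipeline (and presumably Nexus') warps each shot into equirectangular space and blends the overlaps; the pixel dimensions here are assumptions:

```python
import numpy as np

# Assumed figures: eight "6K"-wide views (6144 px each), one per 45-degree
# yaw step, giving a panorama strip roughly 48K pixels wide.
SHOT_W, SHOT_H, N_SHOTS = 6144, 3160, 8

def stitch_panorama(shots: list) -> np.ndarray:
    """Naive stitch: if each undistorted shot covers exactly 360/N degrees
    of yaw with no overlap, the panorama is a horizontal concatenation.
    Real stitchers warp into equirectangular space and blend seams."""
    assert len(shots) == N_SHOTS
    return np.concatenate(shots, axis=1)

shots = [np.zeros((SHOT_H, SHOT_W, 3), dtype=np.uint8) for _ in range(N_SHOTS)]
pano = stitch_panorama(shots)
print(pano.shape[1])  # 49152 -> roughly the "48K" width Testar mentions
```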

Elliot Kajdan: This camera shoots full raw footage, which is uncompressed and has a high dynamic range in terms of colour. It gives a lot of colour and high resolution, and the lens gives a very nice, crisp image, which is what the camera is made for.

Testar: This camera was ideal for this shoot, as we wanted to control the shutter speed precisely. Every image was exposed for a very short time to ensure that every frame would be sharp. We always kept Mimi in the centre of the frame and rotated around her. The whole camera revolves around the lens, rather than the sensor, which keeps the image stable and parallax-free.
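Testar's point about revolving the camera around the lens rather than the sensor is the classic no-parallax-point rule for panorama shooting. This small 2D geometry sketch (not tied to any real camera, units arbitrary) shows why: rotating about the optical centre shifts near and far points by identical angles, while rotating about an offset pivot does not – and that differential shift is exactly the parallax that breaks stitching:

```python
import numpy as np

def bearing_in_camera(point, yaw, pivot):
    """Bearing (radians) of a 2D scene point seen by a camera that started
    at the origin facing +y and was rotated rigidly by `yaw` about `pivot`."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    centre = pivot + R @ (-pivot)        # new optical-centre position
    v = R.T @ (point - centre)           # point in the rotated camera frame
    return np.arctan2(v[0], v[1])        # angle off the optical axis

near, far = np.array([0.0, 2.0]), np.array([0.0, 10.0])  # same ray, two depths
yaw = np.deg2rad(30)

nodal = np.zeros(2)                      # pivot at the lens (no-parallax point)
offset = np.array([0.0, -0.1])           # pivot 0.1 units behind, e.g. at the sensor

a = bearing_in_camera(near, yaw, nodal) - bearing_in_camera(far, yaw, nodal)
b = bearing_in_camera(near, yaw, offset) - bearing_in_camera(far, yaw, offset)
print(abs(a) < 1e-9, abs(b) > 1e-3)  # True True: parallax only with the offset pivot
```

With the pivot at the optical centre the two depths stay perfectly aligned after rotation, which is what lets the eight views stitch into one clean panorama.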


What was the editing process like?

McCray: It wasn’t too bad. Because we were compositing her into reality, we could cut her out quite roughly, as the camera hadn’t changed position. When a clone overlapped with another clone it got tricky, because then she needed to be cut out more precisely, as she’s not really there.

