The Making of 31 Days

Super 8 centred in an HD frame

I placed the 31 seconds of edited 4:3 footage in the centre of an HD frame, preserving image quality by displaying its pixels at a 1:1 ratio. However, the image seemed a little ‘lost’ in the HD frame and would likely be smaller than other work in the compilation.

Arrangement of panes across the HD canvas.

In search of a solution that would use more of the screen space, I composited a further two frames across the 16:9 HD screen. As seen in Figure 21, the outer two SD frames extend beyond the edges of the HD frame, shown as a black rectangle.

One Second a Day compositing with one-second offset on each pane.

I chose this arrangement in preference to three smaller complete panes. Offsetting the footage in each pane by one second, from right to left, meant that consecutive clips were displayed simultaneously across the three panes. The three video tracks in Figure 22 above are the right, centre and left panes, reading from top to bottom. During playback, each one-second clip appears partially in the right-hand pane, then fully in the centre pane, then partially in the left-hand pane, alluding to past, present and future across the composition, reading left-to-right.

One Second a Day film still – three Super 8 frames across an HD screen
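The pane geometry and offsets described above can be sketched in a few lines of code. This is a minimal illustration only: it assumes square-pixel SD panes of 768×576 centred on a 1920×1080 HD frame, and the function and variable names are mine, not values or tools taken from the project.

```python
# Illustrative sketch of the three-pane arrangement in One Second a Day.
# Assumed dimensions: 1920px-wide HD frame, 768px-wide square-pixel SD panes.

HD_W, PANE_W = 1920, 768

def pane_layout(hd_width=HD_W, pane_width=PANE_W):
    """Return (name, x_position, offset_seconds) for each pane.

    A positive offset means the pane plays ahead of the centre pane,
    so each clip enters on the right, sits in the centre, then exits left.
    """
    centre_x = (hd_width - pane_width) // 2
    return [
        ("left",   centre_x - pane_width, -1),  # one second behind: the past
        ("centre", centre_x,               0),  # the present
        ("right",  centre_x + pane_width, +1),  # one second ahead: the future
    ]

layout = pane_layout()
# With these assumed dimensions the left pane starts at x = -192,
# so the outer panes overhang the HD frame on both sides.
```

Under these assumed dimensions, each outer pane extends 192 pixels beyond the edge of the HD frame, matching the cropped outer panes visible in Figure 21.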

The footage used in One Second a Day is silent. To create a soundtrack, I recorded a Super 8 film projector being started and running, syncing this audio to the moving images, making an auditory connection to the performance of film projection. The machine used to digitise the footage was almost silent, its operation not disturbing the hushed, reverential space of the telecine suite in Soho. The sound of the Chinon projector being operated in my work room brings the footage home – literally – as the progress of a 50ft reel through the machine is sonically coloured by that space. Brandon LaBelle describes his act of clapping in a room as “more than a single sound and its source, but rather a spatial event” that is “inflected not only by the materiality of space but also by the presence of others, by a body” (LaBelle, 2015, p.xii). Recording in a domestic-sized room, standing and holding the microphone, shapes the sound, and to me it speaks of the times when I first saw each reel freshly returned from processing.

Super 8 cameras usually have automatic exposure, so filming can be freewheeling in execution: frame, focus and press the trigger. The cost of the film and processing encourages frugality, so individual shots are fairly brief. Despite the consequent brevity of most of the filming – meaning more frequent in-camera edits – 30 of the one-second selections contain a single shot. Close inspection of One Second a Day reveals that one of the one-second clips has an edit within it, which then repeats across the three panes, as shown in Figure 24.

An existing edit in a one-second clip

The short film was compiled with the other students’ videos to form a collection entitled 3D3: 1 Second a Day Showreel, with a running time of approximately eight minutes. The compilation is displayed by 3D3 on its website.

My Super 8 films had undergone a complicated series of remediations on their passage to inclusion in the group project. The film frames were mostly shot at 18 frames per second, digitised to standard-definition video at 50 interlaced fields per second, colour-corrected by a telecine operator and recorded to Sony DigiBeta half-inch digital tape. This tape was played out and captured to hard disk, then de-interlaced and placed in a 25 progressive frames-per-second high-definition video editing project. The SD telecine and video edit used ProRes QuickTime files with 4:2:2 chroma subsampling. When the edit was complete, the subsampling was downgraded to 4:2:0 and the video compressed using the H.264 codec for playback on an LCD television.
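The downgrade from 4:2:2 to 4:2:0 in that final step can be quantified with a small sketch. This is an illustration of how the two schemes reduce chroma resolution, using the 720×576 PAL SD frame of the DigiBeta stage as an example; the function and names are mine, for illustration only.

```python
# Chroma subsampling reduces colour resolution while leaving luma intact.
# In practice, 4:2:2 halves the chroma resolution horizontally, while
# 4:2:0 halves it both horizontally and vertically.

def sample_counts(width, height, scheme):
    """Return (luma, chroma) sample counts per frame for one chroma channel."""
    factors = {
        "4:4:4": (1, 1),  # full chroma resolution
        "4:2:2": (2, 1),  # chroma halved horizontally
        "4:2:0": (2, 2),  # chroma halved horizontally and vertically
    }
    h, v = factors[scheme]
    return width * height, (width // h) * (height // v)

# A 720x576 PAL SD frame, as recorded at the telecine stage:
luma, chroma_422 = sample_counts(720, 576, "4:2:2")
_, chroma_420 = sample_counts(720, 576, "4:2:0")
# For the same frame, 4:2:0 carries half the chroma samples of 4:2:2,
# while the luma sample count is unchanged.
```

The point of the sketch is that each remediation step is lossy in a specific, countable way: the final delivery copy holds the same brightness detail but only half the colour detail of the edit masters.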

31 Days vs The News

Multi-screen editing

Non-linear video editing systems (NLEs) typically offer multiple ways to view material, with options to display the editing interface across several screens. Figure 2 shows Final Cut Pro (FCP) in action. The main iMac Pro screen at the top left displays panes containing material selected for the next edit, with the timeline and its thumbnail images below; the clip beneath the playhead in the timeline is displayed both above the timeline and on a separate monitor. The iPad next to the keyboard displays ‘filmstrips’ with thumbnail representations of material available for editing.

Other NLEs, such as DaVinci Resolve, offer even more flexibility, allowing editors to arrange the interface across many screens to suit their needs. Along with the image content, FCP can display metadata such as timecode and clip names, along with technical data such as video scopes to indicate the brightness and colour of the video being edited. This hypermediacy delivers simultaneous streams of information useful to the editor, such as seeing the approach of the next video edit on the timeline before the cut is displayed.

When ‘immediacy’ is required, FCP can be switched to ‘full screen’, so the interface disappears, leaving only the video content playing across the whole monitor. When reviewing an edited sequence in full screen there is still the temptation to pause the playback and return to the editing interface to address any points of interest or concern. The editor can leave the edit suite and watch the programme played out to a television to access the immediacy that will be experienced by the eventual audience.

Live television news broadcasting is an everyday example of hypermediacy that viewers may access via a range of screen-based devices – mobile phones, tablets, laptops, computer monitors and televisions – used in settings from the domestic to the workplace and outdoor public spaces. Within the frame of the news programme, several ‘features’ vie for the viewer’s attention: the studio-based lead journalist or anchor; live or recorded footage from the location of the story, which may include a reporter, interviewees and other filming; a live discussion with guest commentators who may be present in the studio or elsewhere; a window in which a miniaturised sign language interpreter translates what is said for D/deaf people; subtitling for people who are hearing-impaired and for others watching on a muted television; various text-based news crawls that slide across the bottom of the screen; and the logo of the television company. National television weather forecasts sometimes present on-screen QR codes that, once scanned, give viewers access to more localised meteorological information on a mobile phone.

31 March – ghosts beneath us

31-2016-03-31-09-59.png
Wildlife pond at CCANW

The Sun reflected in the surface of a strange pool in the Haldon Forest car park, near the now-defunct CCANW – Centre for Contemporary Art and the Natural World. I think the centre was showing an exhibition by local artist Tabitha Andrews. As I was filming we were approached by Emily Allan – an ex-foundation student from PCAD – who introduced us to Paula Orrell, who was about to take up a curatorial post at Plymouth Arts Centre.

The CCANW website has been archived.

29 March – the past dissolves

29-2016-03-29-09-57.png

Grass caught in a fence on Dartmoor. We parked behind the Plume of Feathers pub in Princetown and walked towards South Hessary Tor. It was bitterly cold, Kayla carrying her heavy Bolex. I filmed with the weighty Canon 1014, using its lap-dissolve facility. The dried grass looked beautiful in the wind – it’s molinia and a real nuisance on the moorland.

What year was this? I’ve no idea. Maybe there is metadata with the film strip – not date and time, but implied metadata such as the film type. Perhaps shots before or after on the roll would give a clue. The lap-dissolves on the grass tell me that it was filmed with the 1014, because its predecessor – in the sense that I traded one camera for the other – a Canon 814, didn’t have that facility. Now when did I buy the camera? Lees Cameras in London, and I remember using the 1014 at Beaumont Avenue before we moved out in 1994. So it could be any time after 1989…

28 March – dark reflecting pool

28-2016-03-28-09-56.png

The Sun reflected in the surface of our garden pond, in the 90s or 2000s. The pond is formed from the foundations of a WW2 air-raid shelter to the rear of the house. The front door is a little crooked because a large bomb exploded 5m away in the street during the Blitz. Plymouth has a ‘bomb book’ in which the impacts were mapped each day by the City Council. Our house was already 50 years old at the time. It has flawed glass panes in the front windows that were probably replaced after the blast. The next road has two newer ‘infill’ houses, as two adjacent homes in the terrace were destroyed by a bomb.

When we first introduced fish to the pond they were given names.

27 March – zephyr

27-2016-03-27-09-55.png

The muslin curtain blown by the breeze through the open bedroom window at Beaumont Avenue. To the left was a ‘vintage’ pink moulded plastic splash-back with chrome metal fixtures for two toothbrushes and his-n-hers water tumblers. It’s probably still there, as the landlord moved back in after we left, having been ousted from his role as director by a coup at Plymouth Arts Centre, where he’d lived and worked.

26 March – spectacular

26-2016-03-26-09-53.png

Beautiful filming of sparklers through a photographic enlarger’s condenser lens. The tiny house on Beaumont Avenue in Plymouth was the studio space for a lot of experimentation in the 1990s. This was filmed in the upper front room which served as studio and office space.

I had been experimenting with lenses to create an aerial image in front of a projector for transferring to video without filming the projected image on a white wall or piece of paper. The filming is spectacular; what’s less obvious are the small autobiographical elements. At the time I was spending a lot of time in photographic darkrooms, developing, printing and teaching. Some of the paraphernalia made it home, such as the enlargers’ condenser lenses.

25 March – what goes around

25-2016-03-25-09-51.png

A very early piece of Super 8 filming, on St Andrew’s Cross roundabout. I had walked up the hill from Plymouth Video Workshop, based in Plymouth Arts Centre circa 1987/8.

I had learned photography at college as part of my biology course, and developed video skills in the workshop, but Super 8 was a new adventure. I recall playing this film on a Bolex back-projector and hearing Annette Kemp, the workshop director, commenting on the beautiful footage to someone on the phone. At some stage the word ‘film’ was added to the name, becoming Plymouth Film and Video Workshop. This was the beginning of an understanding that there was something of a schism between filmmaking and the newer independent video production. This line was blurred for me as I’d come to video via photography which was all ‘analogue’ at the time.

[Having selected the first shot, the following second of footage appended to the editing timeline contains an in-camera edit, that I discovered at the end of the month when reviewing the whole sequence.]

24 March – lapsing time

24-2016-03-24-09-50.png

The living room mantelpiece in Beaumont Avenue with daffodils and artwork from Plymouth Arts Centre, where I worked in the film workshop, but also as a gallery technician for Artangel’s James Lingwood and then Rosie Greenlees.

Working at the Arts Centre was a wonderful education, having come from a science background: contemporary visual art and artists, huge events like TWSA 3D, driving exhibitions around the country and a whole new world of film to absorb in the cinema.