The Making of 31 Days

Super 8 centred in an HD frame

I placed the 31 seconds of 4:3 edited footage in the centre of an HD frame, preserving image quality by displaying its pixels at a 1:1 ratio. The image seemed a little ‘lost’ in the HD frame and would likely appear smaller than the other work in the compilation.

Arrangement of panes across the HD canvas.

In search of a solution that would use more of the screen, I then composited two further panes of the footage across the 16:9 HD frame. As seen in Figure 21, the two outer SD panes extend beyond the edges of the HD frame, shown as a black rectangle.

One Second a Day compositing with one-second offset on each pane.

I chose this arrangement in preference to having three smaller complete panes. Offsetting the footage in each pane by one second, from right to left, displays consecutive clips across the three panes. The three video tracks in Figure 22 above are the right, centre and left panes, reading from top to bottom. When playing, each one-second clip appears partially in the right-hand pane, then fully in the centre pane, then partially in the left-hand pane, alluding to past, present and future across the composition when read left to right.
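To make the mechanics of the arrangement concrete, the sketch below recreates the three-pane, one-second-offset composite using the moviepy 1.x Python library. It is a minimal illustration rather than the software or settings used for the project: the 768 × 576 square-pixel pane size, the 1920 × 1080 canvas and the file names are assumptions.

# A minimal sketch of the three-pane composite: the same edited sequence is
# placed three times across an HD canvas, with the centre pane delayed by one
# second and the left pane by two, so each clip travels right -> centre -> left.
from moviepy.editor import VideoFileClip, CompositeVideoClip

PANE_W, PANE_H = 768, 576            # assumed square-pixel size of one 4:3 SD pane
HD_W, HD_H = 1920, 1080              # 16:9 HD canvas
y = (HD_H - PANE_H) // 2             # panes vertically centred
centre_x = (HD_W - PANE_W) // 2      # centre pane fully visible

src = VideoFileClip("one_second_edit.mov")   # hypothetical 31-second edited sequence

right = src.set_position((centre_x + PANE_W, y))               # partly beyond the right edge
centre = src.set_position((centre_x, y)).set_start(1)          # same footage, one second later
left = src.set_position((centre_x - PANE_W, y)).set_start(2)   # two seconds later, partly off the left edge

composite = CompositeVideoClip([right, centre, left],
                               size=(HD_W, HD_H), bg_color=(0, 0, 0))
composite.write_videofile("three_pane_composite.mp4", fps=25)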

One Second a Day film still – three Super 8 frames across an HD screen

The footage used in One Second a Day is silent. To create a soundtrack, I recorded a Super 8 film projector being started and running, syncing this audio to the moving images to make an auditory connection to the performance of film projection. The machine used to digitise the footage was almost silent, its operation not disturbing the hushed, reverential space of the telecine suite in Soho. The sound of the Chinon projector being operated in my work room brings the footage home – literally – as the progress of a 50ft reel through the machine is sonically coloured by that space. Brandon LaBelle describes his act of clapping in a room as “more than a single sound and its source, but rather a spatial event” that is “inflected not only by the materiality of space but also by the presence of others, by a body” (LaBelle, 2015, p. xii). Recording the audio in a domestic-sized room, with me standing and holding the microphone, shapes the audio and, to me, it speaks of the times when I first saw each reel freshly returned from processing.

Super 8 cameras usually have automatic exposure, so filming can be freewheeling in execution: frame, focus and press the trigger. The cost of the film and processing encourages frugality, so individual shots are fairly brief. Despite the consequent brevity of most of the filming – meaning more frequent in-camera edits – 30 of the one-second selections contain a single shot. Close inspection of One Second a Day reveals that a single one-second clip has an on-screen edit within it that then repeats across the three panes, as shown in Figure 24.

An existing edit in a one-second clip

The short film was compiled with the other students’ videos to form a collection with a running time of approximately eight minutes, entitled 3D3: 1 Second a Day Showreel. The compilation is displayed by 3D3 on its website.

My Super 8 films had undergone a complicated series of remediations on their passage to inclusion in the group project. The film frames were mostly shot at 18 frames per second, digitised to standard definition video at 50 interlaced fields per second, colour-corrected by a telecine operator and recorded to Sony DigiBeta half-inch digital tape. This tape was played out and captured to hard disk, then de-interlaced and placed in a 25 progressive frames-per-second high-definition video editing project. The SD telecine and video edit used ProRes codec QuickTime files with 4:2:2 chroma subsampling. When the edit was complete, the subsampling was downgraded to 4:2:0 and the video compressed using the H.264 codec for playback on an LCD television.
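As an indication of what the last stage of that chain involves, the short sketch below uses ffmpeg, called from Python, to re-encode a ProRes 4:2:2 master as H.264 with 4:2:0 chroma subsampling for television playback. It is an illustrative assumption about tooling, not a record of the workflow actually used, and the file names are hypothetical.

# Final delivery step sketched with ffmpeg (must be installed on the system path):
# a 25p ProRes 4:2:2 master is compressed to H.264 with 4:2:0 chroma subsampling.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "one_second_a_day_master.mov",   # assumed ProRes 4:2:2 HD master
    "-c:v", "libx264",                      # H.264 encoder
    "-pix_fmt", "yuv420p",                  # downsample chroma to 4:2:0
    "-r", "25",                             # keep 25 progressive frames per second
    "-c:a", "aac",                          # compress the soundtrack
    "one_second_a_day_delivery.mp4",
], check=True)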

(Re)Remediation

Kodachrome Super 8 frame

Sea Front serves as a case study for the remediation choices that were made for the Super 8 practice experiments that follow. The first digitisation was a home telecine using a standard definition miniDV digital video camera to film the projected image. As described earlier, I set the projector to 18 fps and slowed it slightly to reduce flicker in the captured video. The footage was edited in its native 50i interlaced format. The finished film won an award at the London Short Film Festival, despite, or perhaps because of, the ‘unprofessional’ low-resolution rendition of the Super 8 footage, which pulses – an effect caused by the flash rate of the rotating projector shutter blades beating against the camera’s capture rate.
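The arithmetic behind that pulsing, and behind the decision to slow the projector, can be sketched as follows. The three-bladed shutter and the 50 fields-per-second PAL capture rate are assumptions typical of Super 8 projectors and miniDV cameras, not measurements of this particular equipment.

# Beat between the projector's flash rate and the camera's field rate:
# as the projector slows, the flash rate approaches 50 Hz and the pulse fades.
CAMERA_FIELDS_PER_SECOND = 50    # PAL miniDV capture rate
SHUTTER_BLADES = 3               # assumed three-bladed Super 8 projector shutter

for projector_fps in (18.0, 17.5, 16.7):
    flash_rate = projector_fps * SHUTTER_BLADES            # flashes per second
    beat = abs(flash_rate - CAMERA_FIELDS_PER_SECOND)      # visible pulse rate in Hz
    print(f"{projector_fps:4.1f} fps -> {flash_rate:4.1f} Hz flicker, ~{beat:3.1f} Hz pulse")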

Digitised Super 8 frames

A screen grab from this home transfer is seen on the left in Figure 19. The professional telecine conducted by Deluxe in London, described earlier, is in the centre. Although this was standard definition (SD), it was more detailed and did not exhibit the previous iteration’s pulsing. I recreated the video edit of Sea Front using the Deluxe scan, which was carried out at 25 fps; during editing I slowed the footage to preserve the ‘feel’ of the first award-winning edit. Technology has developed rapidly, making high-quality scanning much more affordable, but it would still be expensive and time-consuming to re-digitise my whole archive at HD or higher resolution. I did take a selection of Super 8 material to Bristol in 2019 for an HD scan in which each film frame is captured to a progressive video file. The Super 8 film passed in front of a lens on the scanner, rather than across the physical gate of a projector or telecine machine. This process is gentle on the physical film and allows the whole width of the film to be scanned, as shown in Figure 18, or zoomed in to the picture area, as in the right-hand image in Figure 19.

Womad (2016) was edited using the Deluxe SD telecine footage and was screened on Sony Cube monitors. These obsolete 4:3 ratio CRT televisions displayed the SD material at its best. The difficulties of screening SD interlaced video on an HD flatscreen, and the strategies to cope with the mismatch, are discussed in Skimming the Archive on pages 93-98.

31 Days vs The News

Multi-screen editing

Non-linear video editing systems (NLEs) typically offer multiple ways to view material, with options to display the editing interface across several screens. Figure 2 shows Final Cut Pro (FCP) in action. The main iMac Pro screen at the top left displays panes containing material selected for the next edit, with the timeline below showing thumbnail images; the clip beneath the playhead in the timeline is displayed both above the timeline and on a separate monitor. The iPad next to the keyboard displays ‘filmstrips’ with thumbnail representations of material available for editing.

Other NLEs, such as DaVinci Resolve, offer even more flexibility, allowing editors to arrange the interface across many screens to suit their needs. Alongside the image content, FCP can display metadata such as timecode and clip names, as well as technical data such as video scopes that indicate the brightness and colour of the video being edited. This hypermediacy delivers simultaneous streams of information useful to the editor, such as seeing the approach of the next video edit on the timeline before the cut is displayed.

When ‘immediacy’ is required, FCP can be switched to ‘full screen’, so the interface disappears, leaving only the video content playing across the whole monitor. When reviewing an edited sequence in full screen there is still the temptation to pause the playback and return to the editing interface to address any points of interest or concern. The editor can leave the edit suite and watch the programme played out to a television to access the immediacy that will be experienced by the eventual audience.

Live television news broadcasting is an everyday example of hypermediacy that viewers may access via several screen-based devices, such as mobile phones, tablets, laptops, computer monitors and televisions, which can be used in a range of settings, from the domestic to the workplace and outdoor public spaces. Within the frame of the news programme, the viewer has several ‘features’ vying for attention: for example, the studio-based lead journalist or anchor; live or recorded footage from the location of the story, which may include a reporter, interviewees and other filming; a live discussion with guest commentators who may be present in the studio or elsewhere; a window in which a miniaturised sign language interpreter translates what is said for D/deaf people; subtitling for people who are hearing-impaired and for others watching on a muted television; various text-based news crawls that slide across the bottom of the screen; and the logo of the television company. National television weather forecasts sometimes present QR codes on-screen that, once scanned, allow viewers to view more localised meteorological information on a mobile phone.

Personal vs Institutional

Some personal archives may find their way into institutional archives, through donation by the family, and so may come to form part of our social memory – but how ‘safe’ is this material? Many institutional film archives, such as the British Film Institute archive, have digitised parts of their collections to make the material more accessible to the public and for educational and research purposes – and have preserved the original celluloid material. However, other archives may hold only digital copies of the film material. Will the ‘thin images’ of incidental moments of day-to-day lived experience be weeded out from the public archive?

Laura U. Marks’ comment seems relevant here: “The less important the film or tape (and by extension, its potential audience) was considered, the less likely that it will have been archived with care” (Marks, 2002, p. 169) – or, I suggest, the more likely that it will be considered insignificant to the construction of our shared, socio-cultural history by a dutiful archivist, who may then remove the material.