31 Days vs The News

Multi-screen editing

Non-linear video editing systems (NLEs) typically offer multiple ways to view material, with options to display the editing interface across several screens. Figure 2 shows Final Cut Pro (FCP) in action. The main iMac Pro screen at the top left displays panes containing the material selected for the next edit, with the timeline below showing thumbnail images; the clip under the playhead in the timeline is displayed above and on a separate monitor. The iPad next to the keyboard is displaying ‘filmstrips’ with thumbnail representations of material available for editing.

Other NLEs, such as DaVinci Resolve, offer even more flexibility, allowing editors to arrange the interface across many screens to suit their needs. Along with the image content, FCP can display metadata such as timecode and clip names, along with technical data such as video scopes to indicate the brightness and colour of the video being edited. This hypermediacy delivers simultaneous streams of information useful to the editor, such as seeing the approach of the next video edit on the timeline before the cut is displayed.

When ‘immediacy’ is required, FCP can be switched to ‘full screen’, so the interface disappears, leaving only the video content playing across the whole monitor. When reviewing an edited sequence in full screen there is still the temptation to pause the playback and return to the editing interface to address any points of interest or concern. The editor can leave the edit suite and watch the programme played out to a television to access the immediacy that will be experienced by the eventual audience.

Live television news broadcasting is an everyday example of hypermediacy that viewers may access via several screen-based devices, such as mobile phones, tablets, laptops, computer monitors and televisions, which can be used in a range of settings, from the domestic to workplace and outdoor public spaces. Within the frame of the news programme, the viewer has several ‘features’ vying for attention: for example, the studio-based lead journalist or anchor, live or recorded footage from the location of the story that may include a reporter, interviewees and other filming, a live discussion with guest commentators who may be present in the studio or elsewhere, a window in which the miniaturised sign language interpreter translates what is said for D/deaf people, subtitling for people who are hearing-impaired and others watching on a muted television, various text-based news crawls that slide across the bottom of the screen, and the logo of the television company. National television weather forecasts sometimes present QR codes on-screen that, once scanned, allow viewers to view more localised meteorological information on a mobile phone.

Deluxe telecine

The facilities house I selected for the work had a long history in UK filmmaking and broadcasting at an address on Wardour Street, previously having been Soho Images and other businesses, but it was called Deluxe when I visited in July 2011. It was a strange experience having my ‘private’ Super 8 films laced up on an expensive Ursa Diamond telecine machine by a professional more used to working for television, music videos and advertising. The films had only been seen by myself and close friends, either projected or on a small hand-cranked viewer, now they (and I) were being treated like royalty, no expense spared. The telecine operator reported he had been digitising the film collection of Amsterdam’s Rijksmuseum for months prior to my booking. The realisation that my films were worthy of the same treatment as a nation’s collection perhaps affected my relationship to my Super 8 films. It is hard to recall with certainty what I felt about the film collection before this time, but the grant application to have it digitised suggests that I considered it justified the attention, even if the thinking had not been fully articulated.

In preparation for the telecine, I had joined most of the 50ft rolls using a tape splicer to make larger 400 and 800ft spools to speed up the transfer process, because Deluxe was charging by the hour. This was not an entirely orderly process, as many of the rolls were unlabelled and uncatalogued. Only when the mailer envelopes had project identification could the films be collated so that the larger reels had films running in a logical order. This splicing had changed the nature of the collection. Now there were longer running films with strange and interesting subject juxtapositions and jumps in time, instead of the camera rolls that would previously have been loaded into a projector one at a time, each one experienced as a discrete visual piece reflecting, in a sense, the way they had been shot. The cartridge is loaded into the camera, exposed over a certain length of time – maybe minutes, maybe weeks – and then removed, so the filming is, in a sense, compartmentalised. The running time for 50ft at 18 frames per second is around 3 minutes 20 seconds, which affords a particular viewing experience compared with ‘normal’ cinema – the screening is necessarily stop-start, with the lights coming up to allow the next roll to be loaded into the projector. The compiled reels had created portmanteau movies which had previously not existed, and these were transferred to standard definition DigiBeta tape and later captured to external hard disks. The transfer process was smooth once the spliced film spool was laced on to the telecine machine.
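The running-time figure above follows from Super 8’s frame pitch – 72 frames per foot is the standard figure for the format. A minimal sketch of the arithmetic:

```python
# Super 8 film runs at 72 frames per foot of stock
FRAMES_PER_FOOT = 72

def running_time_seconds(feet: float, fps: float = 18) -> float:
    """Playback duration in seconds for a length of Super 8 film."""
    return feet * FRAMES_PER_FOOT / fps

# A 50ft cartridge at the silent speed of 18 fps:
secs = running_time_seconds(50)           # 200 seconds
minutes, seconds = divmod(int(secs), 60)  # 3 minutes 20 seconds
```

The same calculation shows why the compiled 400 and 800ft spools run for roughly 27 and 53 minutes respectively at 18 fps.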

The telecine suite was soundproofed and dark, coolly air-conditioned in contrast to the sweaty heat of the Soho summer outside. The films appeared on the expensive video monitors as the technician adjusted the colour balance and exposure, smoothly rewinding if I requested any changes. This was an entirely new way for me to experience the footage, and my understanding of the material, and my relationship to it had changed. This professional, hands-off, somewhat rarefied experience of the footage underlined the shift from physical film to digital, ‘amateur’ to professional, private to public.

As the years passed, there were huge advances in video technology which had two contradictory ramifications. First, the professional use of film workflows rapidly diminished, causing Deluxe Soho, among others, to be reconfigured as dry-hire Avid edit suites, losing all its telecine facilities before being shut down entirely by its American owners.

Ghost Vessels

Frame from Teign Spirit

At the moment the sea off Torbay and Teignmouth is filled with cruise ships parked up waiting for Covid to pass. These ghost ships are manned by skeleton crews, behemoths at anchor on the horizon.

https://www.devonlive.com/news/devon-news/cruise-ships-stay-torbay-over-5071384

This reminded me of filming for a project in Teignmouth in 2008. At that time, tankers could be seen anchored out at sea for long periods in the sheltered waters of Babbacombe Bay, waiting for the price of oil and petrol to rise before discharging their cargo at refineries and storage depots.

I had filmed around the town but the project came alive when some home movie footage came to light, beautifully shot by a family member, of successive summer holidays in the town in the years running up to WW2. I imagined the last holiday as tensions were rising in Europe prior to Hitler’s invasion of Poland on 1 September 1939.

The outcome of the filming, Teign Spirit, is an archive film – or, in Jaimie Baron’s taxonomy, a found footage film. I made no attempt to reveal the context of the cine film, leaving it to audiences to decipher the interleaved footage that alluded to the gathering clouds of climate change that threaten Teignmouth.

The film demonstrates Baron’s ‘archive effect’ in both of its registers: intentional and temporal disparity. Intentional disparity, because the Brown family’s home movies were never intended to be seen at Tate Modern’s Starr Auditorium on 3 December 2009, introduced by Stuart Comer, Tate Modern’s film curator. Teign Spirit has also had screenings all over the world and can be viewed online. Temporal disparity, because the old black-and-white footage appeared on screen mixed in with modern-day colour digital video.

Teign Spirit – pronounced teen spirit, yes, like the Nirvana song. It was the name of a racing pilot gig we filmed on the water.
River Teign – river teen
Teignmouth – tin muth
Kingsteignton – kings tane tun

Sounds distant

A number of films in my collection have accompanying audio on formats which changed over the years: audio cassette, 1/4″ (very few), DAT and solid state recording devices.

When searching for original audio for the Sea City project recorded on DAT (Digital Audio Tape) I discovered my DAT recorder had expired – it turned on but would not play the tape. I contacted various friends and local organisations to borrow a player but all the machines had problems of one sort or another, seven machines in total!

DA-P1 innards
DA-P1 with non-spinning head drum

A Panasonic SV-3800 DAT machine bought on eBay chewed up a tape, so it was returned for a refund. It had played a few tapes OK, and these were captured via S/PDIF to the laptop.

Hearing the Past

Having been re-editing Sea Front with the newly-digitised HD footage, my mind has moved to audio and whether to revamp the soundtrack. The audio for the larger project Sea City was recorded on DAT and MiniDisc (MD), which are both digital formats but require the recording to be played back in real-time.

As I write this, audio from around 2009 is being ingested over an optical digital cable into the laptop and I’m hearing the sound of swifts, blackbirds – so a summer recording – and a distant police siren from an MD labelled ‘Garden ambience 28 May 09’. The disc holds 80 minutes of audio and by no means all of it matches the label.

Sony MiniDisc


Super 8 HD scan

Super 8 reel on an HD scanner in Bristol.
Womad being rescanned

A few rolls of Super 8 were scanned at Geneva Stop in Bristol. The MWA HD scanner can be set up to overscan, i.e. to include the edges of the film, which were excluded from the Deluxe scan in Soho years earlier. It’s amazing to remember that in just a few years video has transitioned from broadcast interlaced 4:3 standard definition to 16:9 HD progressive and beyond.
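The scale of that transition is easy to underestimate. A minimal sketch of the arithmetic, assuming PAL frame sizes (as used for the earlier DigiBeta transfer) against full HD:

```python
# Frame dimensions in pixels
sd_w, sd_h = 720, 576     # PAL standard definition (4:3, interlaced)
hd_w, hd_h = 1920, 1080   # full HD (16:9, progressive)

sd_pixels = sd_w * sd_h   # 414,720 pixels per frame
hd_pixels = hd_w * hd_h   # 2,073,600 pixels per frame

factor = hd_pixels / sd_pixels  # each HD frame carries 5x the pixels
```

Quite apart from the change in aspect ratio and the move from interlaced fields to progressive frames, each HD scan captures five times the picture information of the earlier standard-definition telecine.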

Taking stock of Womad

Greens Super 8 editor

A quick check of the Super 8 reels before tomorrow’s telecine/scan at Geneva Stop in Bristol showed me that Womad was shot on a couple of – well, two and a bit – 50ft rolls of Agfa Moviechrome 40. I remember this as being slightly cheaper than Kodachrome, with white envelopes sent to Deer Park in west London.

DRHA 2017 DataAche exhibition

Skimming the Archive
Single channel HD video work comprising three panes of Super 8 footage

The work interrogates the ‘digitised materiality’ of personal Super 8 film, contrasting the tactile presence of the celluloid archive with the malleable temporality of its digital afterlife, and more particularly the accessibility afforded by skimming many gigabytes of filmed material in a non-linear editing program.

The film was developed across the 31 days of March 2016 using a process-based methodology – each day I skimmed across the hours of footage until a single image arrested my attention (cf. Barthes’ punctum) then I appended the following second of film footage to a timeline. The three panes allude to the past, present and future.
This presentation includes the original One Second a Day and its three subsequent iterations, where the camera footage clips are extended to two, seven and 20 seconds. As the clip-length extends, the repeated image progression across the panes becomes less obvious. Skimming the Archive simultaneously celebrates the boundless possibilities of digital postproduction while lamenting the feeling that with ‘digital’ a work is never fully finished.
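The running times of the four iterations follow directly from the 31-day structure. A minimal sketch of the arithmetic, assuming one clip per day of March:

```python
DAYS = 31  # one clip selected for each day of March 2016

# Clip lengths in seconds: the original one-second version
# and the three extended iterations
clip_lengths = [1, 2, 7, 20]

# Total running time of each iteration, in seconds
totals = {length: DAYS * length for length in clip_lengths}
# The 20-second iteration runs 620 s, a little over ten minutes,
# against the original's 31 s
```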

This project is part of my 3D3 practice-led PhD based at Digital Cultures Research Centre, UWE funded through the AHRC.

U-matic


Experiments digitising from a lo-band U-matic player came to an abrupt halt when the machine expired with a loud bang and acrid smoke. The player went on its final journey to the council recycling centre.