Working With John Sanborn                                                         November 2016

 

"MEANDRES & MEDIA, L'CEUVRE DE JOHN SANBORN" was an exhibition in Paris (2106) and a book about the work of Video and Media artist John Sanborn curated by Stephen Sarrazin. These are my recollections of working with John in the 1980s, as published in the book.

-------------------------------------------------------------------------------------------------------------------

 

New York City in 1982 was a very different place than NYC today.  Lower Manhattan “pioneers” were able to get large loft spaces for very little money; MTV, MIDI, CDs, the explosion of cable TV and the club scene (remember Danceteria?)  were about to drive demand for visual music through the roof.  And there was a little bit of danger and an electric energy in the air. As a result, artists of all disciplines and generations were drawn together in a heady mix of creativity and community.  Also, the city was awash in blow. That was the year I met John.

 

I had spent my college years at Rensselaer Polytechnic Institute working on getting artists and engineers to collaborate. We built hardware, created video art and generally tried not to kill each other -- all with mixed success. Upon graduating I made a beeline for NYC and went to work for VCA Teletronics, the first independent post production facility.  My day job was designing machines to electronically edit video but my weekends were spent “testing” the equipment with video artists on personal projects.  If memory serves, John approached me about working with him and Kit Fitzgerald on a piece they were creating for Adrian Belew called “Big Electric Cat.” That seemed like a logical extension of the work at RPI, and so our partnership began.

 

Unlike today, getting access to video equipment back then was a big deal. If you’ve never known a world where you couldn’t edit cat videos on your iPhone you might have trouble picturing this, but video edit suites took up lots of space and cost millions of dollars to build and operate.  John was willing to spend his weekends working – we did have the keys to all the toys – and I was delighted to have a new collaborator.  Over the course of the next eight years we ended up working on numerous artistic and commercial projects together.

 

Three things made working with John special: 1) he didn’t struggle to make creative decisions, which was especially important as we were layering images in a linear editing suite without an “undo” button; 2) although not an engineer, he grasped technical concepts very quickly; and, most important, 3) John understood that you could be serious about your work and still maintain a rollicking sense of humor while doing it.  I can’t overstate how important that was and still is.

 

The process we used to create images was based on “discovering” visuals. We would take what were then state-of-the-art post production tools, e.g., analog switchers and digital video effects devices (DVEs), and feed them back into themselves. This digital feedback in an analog signal path produced unpredictable results. When we discovered things we liked we recorded them onto videotape.  These recorded images would then be layered on top of each other – often as much as fifty or sixty analog generations down.  We created a more detailed explanation of this process for a segment called “The Video Artist” on “Night Flight,” a 1980s TV series on the USA cable network covering downtown Manhattan art and culture.  You can view it by clicking the photo of us below.

 

Our process of improvising, capturing and layering imagery is analogous to jazz. By contrast, the rigid formalism of computer graphics imagery is more like classical music. And we liked jazz. That said, John and I were also fans of the avant-garde, which led us to our next piece, “ACT III,” set to the music of Philip Glass.  Glass’ music and John’s energy were a good combination.  ACT III managed to transcend the barriers of abstract video art and find widespread acceptance.

 

I’ll always be grateful to John for our next moment of cosmic synergy. One of my musical heroes was composer Robert Ashley.  Unbeknownst to me, a few years earlier John had directed the pilot for “Perfect Lives (Private Parts),” Ashley’s brilliant opera for television. When John asked me if there was any music I’d like to work with for our next project I immediately replied, “Robert Ashley.” “Funny you should say that…” said John. After the project had languished for several years, Britain’s Channel Four had just approved the funding to create all seven episodes with John directing. Would I like to work on it with him? Trick question?? Of all the work we did together ACT III and Perfect Lives are my favorites. And Bob Ashley went from hero to my friend and mentor until he passed away on March 3, 2014. Eternal thanks to John for that.

 

In addition to commercial projects we went on to collaborate on four more video art pieces: “Renaissance” (1984) for the Computer Museum in Boston,  “Video Wallpaper” (1984) a 50 minute ambient background video for a distributor I’ve long since forgotten, “Luminaire” (1985) for Expo ’86 in Vancouver and “Infinite Escher” (1990) an early analog high definition work for Sony.  We no longer had to steal weekend time to work on these, although we could only work the night shift. I have fond memories of watching John run around the facility shrieking jokes and doing shticks at 4:00 AM.  And, fortunately, most everyone else there at that hour was amused as well.

 

Things are different today. John moved to Berkeley. I’ve stayed in lower Manhattan and watched as artists got displaced by hedge fund kids. Sitting down at our respective Mac workstations we each have more computer power than filled all of Teletronics (and then some).  And I surely don’t miss the two hours I had to spend aligning all of the analog videotape machines and signal processors to get ready for an evening’s work.  Much to be said for double clicking and having a project come up just as you left it. But there was an energy that came from working “in the studio” in general and with John specifically that I do miss.  Skype just isn’t the same.

Sanborn (right) and Winkler (left) in Edit III at VCA Teletronics, NYC 1984

Preserving Nam June Paik's Work                                                   February 2016

 

Nam June Paik (July 20, 1932 - January 29, 2006) was a Korean American artist who is rightly considered to be the father of modern video art. Nam June was a hero, mentor and friend who taught me many things.  Perhaps the most important was that creating art and keeping a sense of humor about it go hand in hand. (A philosophy embraced by the composer Robert Ashley, who is also sorely missed.)

 

From 1981 to 1997 Nam June worked with video artist and social activist Paul Garrin. Paul took a gig as Nam June's assistant and ended up becoming one of his most important collaborators; together they produced hundreds of works. We basically gave Nam June and Paul a key to Post Perfect to work on art projects -- one of the charms of working the night shift was running into them. The picture below is of Nam June (left) and Paul (right) having a video jam session in Post Perfect's linear edit suite number two.

 

Unfortunately, many of Nam June's pieces are falling into disrepair. This is particularly true for the multi-monitor sculptures, or "robots" as he called them, as they were created with vintage analog consumer television gear. These are failing and must either be repaired (very hard) or replaced with digital displays that maintain the look of the originals (even harder). Paul presented a paper on this topic in Seoul last week -- click on the picture below to read the text of it. Bottom line: time is running out to preserve these magnificent works. If you'd like to help the restoration effort please contact Paul at: pg(at)freethe.net

Nam June Paik and Paul Garrin at Post Perfect, circa 1992

The First Non-Linear Edit System                                                 September 2015

 

In 1969 SMPTE released standard 12M – a specification for applying a universal time code to video. Assigning a unique number to every video frame was critical to the development of electronic editing, as it enabled a list of all the edits in a program to be compiled. Thus was born the Edit Decision List, or EDL. CBS and Memorex formed a company called CMX Systems to build editing systems using time code and EDLs. If you’ve never known a world where you couldn’t edit cat videos on your iPhone your reaction might be “meh.” But this was some radical engineering at the time.
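To make the idea concrete, here is a small sketch in Python -- my illustration, not anything CMX ever shipped -- of what frame-accurate time code buys you: once every frame has a unique number, an edit event is just arithmetic, and a whole program is just a list of such events.

    # A minimal sketch of the idea behind SMPTE 12M time code: every frame gets
    # a unique number, so an edit can be described as "copy source frames X..Y
    # to record frames A..B". Assumes non-drop-frame 30 fps for simplicity.
    FPS = 30

    def timecode_to_frame(tc: str, fps: int = FPS) -> int:
        """Convert 'HH:MM:SS:FF' into an absolute frame count."""
        hh, mm, ss, ff = (int(part) for part in tc.split(":"))
        return ((hh * 60 + mm) * 60 + ss) * fps + ff

    # One hypothetical EDL event: source in and out points.
    src_in, src_out = "01:00:10:00", "01:00:12:15"
    duration = timecode_to_frame(src_out) - timecode_to_frame(src_in)
    print(f"Event duration: {duration} frames")  # 75 frames at 30 fps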

 

In 1971 CMX released their first product: the CMX-600 light pen random access editor. Wildly ahead of its time, it stored monochrome video in analog format on Memorex computer drives and used a DEC PDP-11 mini-computer to control the system via a light pen interface. The system only held 30 minutes of low quality video, the disk drives took up a few hundred square feet of floor space and it cost the equivalent of three million bucks in 2015 dollars. But this revolutionary machine was the forerunner of all modern non-linear editing systems.

 

Of the first five systems CMX sold, three went to CBS, one to CFI in LA and one to Teletronics in NYC. I had the privilege of sharing an office with the CMX-600 disk farm at my first full time job as an engineer at Teletronics. Typing this post on my Mac workstation, with its 82 Terabytes of attached storage, I can’t say I miss the big multi-platter Memorex disk packs, each of which held only 5 minutes of video. But I do have a fondness for the CMX-600 and a great appreciation of the monumental effort it took to create it. And there’s much to be said for a heavily air conditioned machine room.

 

Thanks to Robert Lund for the pictures below. Lundo left Bell Labs to join Teletronics as one of the first digital engineers in post production. He maintained the CMX-600, wrote custom software for it and had the temerity to hire me in 1981 to design and build hardware for him.

 

The CMX-600, as shown in the original CMX product brochure (1971)

The real "Mad Men" edit a commercial at Teletronics using the CMX-600 (circa 1976)

On Analog Computing                                                                          June 2015             

Joost Rekveld wrote a fascinating and detailed blog post about analog computing. Worth a look just for the vintage big-iron eye candy.  Click the image below to hop over to it (please do come back when you're done).

How Much Resolution is Enough?                                                  February 2015             

If you attended or read about this year’s Consumer Electronics Show it was hard to escape the hype about the new high resolution televisions. 4K! 8K! Unbelievable amounts of K! But how much display resolution does one actually need?

 

The short answer: it all depends on viewing distance from the screen.

 

The long answer: resolution is one of several factors that need to be considered together when evaluating capture, post production and display technology. In much the same way that signal-to-noise ratio, frequency response and distortion are always looked at together when evaluating audio systems, resolution needs to be weighed along with many other factors when evaluating image systems, including dynamic range, contrast, brightness, color gamut and frame rate. My friend Mark Schubin gave an excellent (as always) presentation about this in November 2014, which is pasted below.

 

The research WCI did for the large scale immersive films the Doha Film Institute is creating showed that the key driver of how much resolution is enough is viewing distance from the screen. To wit, if you’re thinking about replacing your 50 inch TV with a 4K model and your couch is 150 inches back from the screen, don’t bother. You won’t be able to perceive the difference in resolution. However, increasing the dynamic range of the image (as Dolby is proposing) and increasing the contrast and brightness of the display will have a very, very noticeable effect and will yield greater benefits than increasing resolution.
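If you want to check the couch math yourself, here is a back-of-the-envelope sketch. It leans on the common rule of thumb that a viewer with 20/20 vision resolves detail of roughly one arcminute; the numbers are illustrative, not results from the DFI study.

    import math

    # Angle subtended by one pixel of a 16:9 display, in arcminutes.
    def pixel_angle_arcmin(diagonal_in, h_pixels, distance_in, aspect=16/9):
        width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
        pitch_in = width_in / h_pixels          # horizontal pixel pitch
        return math.degrees(math.atan(pitch_in / distance_in)) * 60

    # 50 inch set viewed from 150 inches -- the couch example above.
    for label, pixels in [("HD (1920)", 1920), ("UHD (3840)", 3840)]:
        print(f"{label}: {pixel_angle_arcmin(50, pixels, 150):.2f} arcmin per pixel")

    # HD pixels already subtend only ~0.5 arcmin, below the ~1 arcmin acuity
    # limit, so the extra pixels of a 4K panel are not resolvable from that couch.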

 

On the other hand, at very close distances, e.g., less than one screen height away, resolution does become critical.  Continuing the above example, if you like to watch your 50 inch TV from 40 inches away, a 4K set will indeed change your life, although probably not as much as a new pair of glasses.

 

An unexpected corollary was that at very short viewing distances display resolution is actually more important than source resolution. We compared 2K and 4K projectors viewed from a distance of half a screen height, using both 2K and 4K source material. The perceived quality improvement in going from a 2K projector to a 4K projector, with 2K source material on both, was greater than the perceived improvement in going from 2K source material to 4K source material on the same 4K projector.

 

Looking beyond display, there are, obviously, other reasons for capturing and posting at high resolution. Generally speaking, more resolution means more flexibility, e.g., the ability to zoom and reposition an image in post, the ability to add stabilization, the ability to reproduce material, etc. But this is not free. Storage and processing time climb steeply with resolution, since pixel count grows with the square of the linear resolution. So if one is creating a webisode for YouTube perhaps it’s best not to shoot 100 hours of 6K Red Dragon files.
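For a sense of scale, here is some rough arithmetic on uncompressed storage. The bit depth and frame rate are my own illustrative assumptions (10-bit 4:2:2 at 24 fps), not the data rates of any particular camera codec, but the scaling is the point.

    # Uncompressed storage scales with pixel count, i.e. with the square of the
    # linear resolution. Assumptions: 10-bit 4:2:2 (20 bits/pixel), 24 fps.
    def gb_per_hour(width, height, bits_per_pixel=20, fps=24):
        bytes_per_sec = width * height * bits_per_pixel / 8 * fps
        return bytes_per_sec * 3600 / 1e9

    for name, w, h in [("HD", 1920, 1080), ("4K", 4096, 2160), ("6K", 6144, 3160)]:
        print(f"{name}: ~{gb_per_hour(w, h):,.0f} GB per hour")
    # Roughly 450 GB/hr for HD, 1,900 GB/hr for 4K and 4,200 GB/hr for 6K --
    # and that is before any handles, alternate takes or intermediate renders.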

Video Facility Migration to an IT Based Infrastructure                     February 2015

 

Video facilities have always used specialized, expensive hardware to interconnect equipment. From the equalized coaxial cable runs required for NTSC and PAL, to multi-wire analog component, to parallel digital (which I seriously don’t miss – remember white video speckles?), to today’s serial digital coax, the professional video environment has always required dedicated interconnection schemes.  These schemes demanded specialized hardware and were very format specific, i.e., each signal path could only carry one type of signal to one place.

 

The evolution and ubiquity of computer networking is about to change this. As information technology continues to improve, it is able to handle more complex signals with higher bandwidth and stricter latency requirements. Why build expensive, dedicated signal paths if “generic” Ethernet switches can handle video interconnections with format and routing flexibility?
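The bandwidth arithmetic is what makes that question more than rhetorical. Here is a rough check using my own assumptions about a 10-bit 4:2:2 payload; these figures are back-of-the-envelope, not numbers from the SMPTE presentation.

    # Approximate uncompressed active-video payload, in gigabits per second.
    def video_gbps(width, height, fps, bits_per_pixel=20):  # 10-bit 4:2:2
        return width * height * fps * bits_per_pixel / 1e9

    formats = {
        "1080i59.94 (HD-SDI)": (1920, 1080, 29.97),
        "1080p60 (3G-SDI)":    (1920, 1080, 60),
        "2160p60 (UHD)":       (3840, 2160, 60),
    }
    for name, (w, h, fps) in formats.items():
        print(f"{name}: ~{video_gbps(w, h, fps):.1f} Gb/s of active video")

    # Roughly 1.2, 2.5 and 10 Gb/s of payload respectively (SDI line rates run a
    # bit higher because of blanking and ancillary data). A single 10 GbE port is
    # already marginal for one uncompressed UHD stream once packet overhead is
    # added; 25, 40 and 100 GbE switches are what will make an all-IP plant practical.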

 

Unfortunately, we’re still a few years away from this being practical. But it is the direction technology is heading. The January SMPTE NYC Chapter meeting was devoted to this subject, a recording of which is posted below.

 

Ironically, the world of video projection is heading – with good reason – in the other direction. All professional projectors now offer the option of coaxial serial digital input connections. And while there may be a next generation of display-port technology that ultimately replaces it, for now we believe serial digital interconnection is the highest quality, most reliable method of feeding projectors. By contrast, if someone is setting up a projection system for you – particularly if it involves multiple projectors – and they want to use computer DVI interconnections, run away. Far, far away.