Creative Computing Publishes Interview with the Guy Behind the Death Star Trench Run (1978)
An Interview With Star Wars Animator Larry Cuba
It’s that time of the month when we look at a computing-related interview. I found this one in an issue of Creative Computing. It doesn’t follow the usual question-and-answer form of an interview, and it was originally published in an issue of Starlog, but it involves computer talk, so it counts. Enjoy.
from Creative Computing Magazine (May 1978) Volume 04 Number 03
The Digital Brush
By DAVID HUTCHISON
On the shifting sands of Tatooine nestles the small cottage of "Old Ben" Kenobi. Inside, Luke Skywalker and Ben listen to Princess Leia's plea for help via a holographic recording implanted in R2-D2. Also within the feisty 'droid's memory banks are the technical read-outs of the battle station Death Star. These plans may sway the balance of survival for Princess Leia's people in the fight against the Empire!
The man responsible for the physical creation of the little 'droid's memory readout is Larry Cuba. The sequence in the briefing room in which the schematic view of the Death Star appears on a huge electronic screen, displaying a simulated point of view of a pilot maneuvering straight down a trench on the surface of the Death Star to a two-meter wide thermal exhaust port, was accomplished by means of computer animation.
Computer animation is a process whereby the illusion of movement is bestowed upon inanimate objects by electronic means. In cel animation, an artist must draw each frame of film by hand. Here the computer creates each frame which is then photographed and projected. (Or videotaped and televised.)
With Star Wars already in production, George Lucas issued a call for bids from companies and individuals to produce various bits of instrumentation animation — in particular the briefing room sequence. A number of computer artists and cel animators responded.
Some of the computer people had very sophisticated equipment capable of producing colored and shaded planes and forms. One computer artist even wanted to do most of the model sequences entirely on computers. George spoke with each of the artists and viewed their work, but Larry seemed to understand the kind of look that George wanted for the film.
When Larry was assigned the computer realization of the Death Star plans for the briefing room scene, he was asked to have the sequence photographed on 35mm film so the plans could be rear-projected during the filming of the briefing room scene with the rebel pilots. At UICC Larry would be using the Vector General 3D3I display and a PDP-11/45 minicomputer. The sequence would be filmed off the Vector General screen with a standard Mitchell 35mm camera rigged with an animation motor. The only thing lacking was the trench. John Dykstra's crew had not yet gotten around to building it.
John Dykstra and his team of modelmakers at Industrial Light & Magic (ILM) had begun to assemble the basic modular molds from which they would construct the model of the trench. The basic molds, each about two feet square, came in six different types. From these molds hundreds of casts were made in polyurethane foam. These modular sections were then cut up and assembled in a variety of basically random configurations to establish the sides and bottom of the trench as well as part of the Death Star's surface area.
Larry took samples of each of the six to Chicago to construct his own computer trench. "There was no reason to have the computer sequence match the actual model precisely, since the audience would perceive the trench more in terms of a texture rather than an absolute configuration," Larry explains. "ILM was chopping up the modular pieces to assemble the trench, so I did the same thing— building up the trench in the computer memory just like they were doing with the real thing.
"I photographed the six modules and traced them onto the Vector General data tablet with its electronic pen. By pressing the pen to the various points on the photographs, the modules were digitized — their x and y components entered into the computer." (The x component refers to the horizontal axis and the y to the vertical axis.) The z coordinate was entered manually.
The z coordinate (depth) was limited to about four or five different levels, so when entering the x and y components on the electronic tablet, Larry punched one of five buttons that he had programmed to represent the z coordinate at various levels.
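To make that digitizing step concrete, here is a minimal sketch in present-day Python (the function name, button count and depth values are invented for illustration; the real work was done on the Vector General's data tablet with GRASS, not Python):

# A sketch of digitizing a module: each pen tap supplies an (x, y) position,
# and one of five pre-programmed buttons supplies a quantized depth (z).

Z_LEVELS = [0.0, 0.5, 1.0, 1.5, 2.0]   # hypothetical depth levels, arbitrary units

def digitize_point(pen_x, pen_y, button_index):
    """Combine a 2-D pen position with a button press into a 3-D vertex."""
    return (pen_x, pen_y, Z_LEVELS[button_index])

# Tracing one edge of a photographed module, tap by tap:
module_edge = [
    digitize_point(0.0, 0.0, 0),
    digitize_point(1.0, 0.0, 0),
    digitize_point(1.0, 0.3, 2),   # a raised detail on the trench wall
    digitize_point(2.0, 0.3, 2),
]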
"Then a program was written so that I could call up (from the computer's memory) the raw sections and combine them into the trench." The computer trench consisted of about fifty U-shaped sections (the two sides and bottom of the trench make a U). Larry called up sections of the modules, stretched or moved them around to build up the trench bit by bit. "The trench information was stored away and another program written that would call up the sections sequentially, in the perspective of a pilot flying down the trench, and cue the camera to photograph a frame. I managed to get about thirty frames an hour into the camera once the program was running smoothly."
On the screen the Star Wars audience sees the computer realization of the trench sequence in the form of a "wire-cage" model rather than as a series of solid forms and planes. One of the early problems in computer graphics was the wire cage versus solid form display. At first computer programs could only call up figures in wire cage format. It was only a few years ago that programs were devised to remove the "hidden lines"; the program had to determine which lines would be "hidden" by a front surface or plane and remove those lines.
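As a rough illustration of one ingredient of hidden-line removal (not the full algorithm, which also has to clip edges that are only partly covered by nearer surfaces): a face whose outward normal points away from the viewer cannot be seen, so its edges need not be drawn. A small Python sketch of that test, with invented names:

def subtract(a, b):
    return tuple(ai - bi for ai, bi in zip(a, b))

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def faces_viewer(face, eye):
    """face: three 3-D points in counter-clockwise order (seen from outside).
    Returns True if the face is turned toward the eye point."""
    normal = cross(subtract(face[1], face[0]), subtract(face[2], face[0]))
    return dot(normal, subtract(eye, face[0])) > 0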
"When George Lucas specified the kind of animation he wanted for the scene, he knew enough about computer animation to ask for a true perspective without the 'hidden lines' removed. He wanted the trench and the Death Star to appear as wire cage figures with all lines and vertices visible. George thought that this sort of image would suggest 'computer animation' by having a very mechanical look."
Science fiction as a genre often projects into the world of future technology. Larry Cuba suggests that in the future computers will be able to generate pictures of such quality that they will look as though they had been photographed by a camera. In the case of Star Wars, it was thought that such photographic realism might be confusing to the audience, so a wire cage model was specified so that the audience would readily understand that the images had been created by a machine.
From start to finish, the entire sequence lasts only about 40 seconds on the screen. It took Larry and his two assistants, T.J. O'Donnell and Tom Chomicz, about two months to supply two minutes of animation.
The enormous number of points and lines on the wire cage figures that make up the representation of the trench seem to flow with almost simultaneous precision. The computer doesn't handle all of these points simultaneously, but rather sequentially. It happens very fast, certainly, and it can appear to the eye to be happening all at the same time, which would be the case while observing a real-time system. A real-time system means that the computer is drawing successive frames as fast as thirty per second, which is what is needed to see the thing move smoothly on a TV screen. "There is a limit to how many of those points a computer can draw in a thirtieth of a second, and in the case of the Star Wars animation with its true perspective image as opposed to parallel projection (one without depth cuing), I went way beyond that limit. Consequently, you take longer than a thirtieth of a second to put an image on a frame of film. Since the Star Wars sequence was being filmed, it didn't need to exist in real time anyway. In this case it took about two minutes to complete each frame."
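The difference between the two projections Larry mentions is easy to see in code. Parallel projection simply drops the depth coordinate, so nothing shrinks with distance; true perspective divides by depth, so far-away points crowd toward the center, and that extra divide for every point is part of what pushed the trench past the display's real-time budget. A minimal sketch (focal length and names are illustrative, not from the original program):

def parallel_project(x, y, z):
    """Parallel projection: no depth cuing, every point keeps its size."""
    return (x, y)

def perspective_project(x, y, z, focal_length=1.0):
    """True perspective: one divide per point, points shrink as z grows."""
    return (focal_length * x / z, focal_length * y / z)

At thirty frames per second the display has roughly a thirtieth of a second per refresh; with more points than it can transform and draw in that time, the practical answer is to let each frame take as long as it needs and photograph the result, which is exactly what was done here.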
There are, of course, displays more sophisticated than the Vector General that could have computed the perspective more readily and probably done the flight down the trench in real time; the perspective transformation would be wired into the hardware itself, rather than generated by a separate program.
There are systems today that can generate shaded color planes in real time. One such system was developed by General Electric and built at a cost of $2,000,000 to train astronauts to land on the Moon. Similar systems are used to train airline pilots to land under a variety of emergency conditions.
Basically, Larry's system consisted of a $50,000 Vector General 3D3I graphics terminal with its dials and electronic data tablet, a $30,000 PDP-11/45 minicomputer and a standard alpha-numeric keyboard. "I set up a Mitchell 35mm camera with an animation motor in front of the screen and connected it to the computer so that a signal from the program could trigger the animation motor when the image was complete.
"The full length of the trench consisted of about fifty of these U-shaped sections. Well, you couldn't bring all fifty of these sections up on the screen at the same time. The computer brought up five sections at a time and it would take about 24 frames (one second) to go through one U-shaped section of the trench.
"So it was this continual shuffle of sections; never having more than five on at any one time. Now, of course, this means that ones at the back just sort of pop on. I had hoped to be able to just fade them in, bit by bit, by manipulating the intensity control to make them appear more slowly. But there wasn't enough time.
"The entire sequence was shot once, and that was it. Early on, I had a deadline of June first, but in early April the deadline was moved up to May fifth — lopping off three weeks. I had anticipated another six. I suggested that they wait and shoot the sequence in England blue screen; they could print the computer effects in later and have the thing perfect. But no, they wanted to rear project it so that the guys in the briefing room would play to the images while they were talking. Well, my first take worked. There were a couple of problems, but they edited around them."
The briefing room sequence is the only scene in Star Wars in which digital computer animation was used — other than for occasional background displays as part of the Death Star set. The effect was programmed in Tom DeFanti's GRASS language. GRASS (GRAphics Symbiosis System) was written by Tom as part of his doctoral thesis at Ohio State. "It takes advantage of all the things that the Vector General does. The Vector General has a lot of image transformation hardware built into it, which allows you to do a lot of things in real time (with no processing delay). The language is designed for non-computer people. GRASS consists of very simple, straightforward commands which allow the students to work with the Vector General 3D3I directly and manipulate the image by means of various dials and buttons.
"GRASS as a language makes it super easy for an educator or student to come in and call up a stored image (a crystal, molecule, etc.) and by means of the language manipulate the image, say rotation by a single dial, programmed in GRASS.
Suppose it is necessary to look at a particular molecule, a simple sugar for example, which has been named SUGAR. The molecule must be called up from the memory disk, shown on the screen, made larger or smaller and rotated for study. The commands would be typed out on the alpha-numeric keyboard in GRASS:
GET DISK SUGAR
SCALE SUGAR, D0
ROTATE SUGAR, X, D1
By means of these three commands the required molecule appears on the screen, its size can be changed by turning dial number "0," and it can be rotated around the x-axis (horizontal) by means of dial number "1."
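For a feel of what those dial-driven commands actually do to the picture, here is a rough modern-Python equivalent of the two transformations (this is only an illustration; GRASS itself hands these operations to the Vector General's transformation hardware):

import math

def scale(points, s):
    """SCALE SUGAR, D0: uniform scale, with s read from dial 0."""
    return [(s * x, s * y, s * z) for x, y, z in points]

def rotate_x(points, angle):
    """ROTATE SUGAR, X, D1: rotation about the x-axis, angle read from dial 1."""
    c, si = math.cos(angle), math.sin(angle)
    return [(x, c * y - si * z, si * y + c * z) for x, y, z in points]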
Sounds easy? It is. And what fun it must be to sit there and play with shapes and movement!
"The display can then be handled by an image processor — colored, mixed and recorded on standard videotape, 3/4 inch cassette or what have you." The...mathematics, medicine and computer programs.
Additionally, since the system operates in real time, it has been used in performance at live concerts. Various monitors were spotted around the concert hall and one large Advent video projector rigged. There are three performers: one programs the computer and operates the dials of the Vector General, creating the original image; the second manipulates the image processor and colorizes the image; and the third creates music on an audio synthesizer to complete the video picture. A number of tapes have been made of these concerts and are generally available. PBS has broadcast a number of them.
But is it art? Mr. Cuba maintains that the computer and its peripherals are tools, like brushes and pigments to a painter, and that the manipulation of these tools is guided by the mind of man, just as selectively controlled as any other fine art. "The computer as a tool gives us a new way to explore motion, movement and the kind of imagery that we have never really had the power to explore."
Will we see more computer animation in motion picture making? So far it has had a very limited use. There were sequences in UFO: Target Earth and Futureworld. All of the visuals aboard the ships in 2001 were cel animation masquerading as computer graphics. There were some in Demon Seed — one of the background display monitors ran a computer-generated model of an earthquake.
Ultimately, there is the possibility that the technology of producing curved surfaces, details, colored and shaded...complicated special effects that can be created only by photography and optical effects.
Already, computer-controlled cameras could usher in the era of setless cinematography, in which the actors will work on giant blue-screen sets with all of the details added by computer (see Magicam in STARLOG #9).
Computer video technology has found its way into commercial television. Numerous commercials and logos have made use of sophisticated video synthesizers to create, without the photographic camera or lengthy cel animation, the images required.
In New York City, Dolphin Productions uses the Scanimate video synthesizer to produce a good many of Madison Avenue's television commercials.
There are only five such machines in the world — originally built by Computer Image Corp. in Denver. The essence of the machine is that you can put down any picture or image and move it, transform it, distort it, flip it, color it right in front of your eyes and record the result on video tape.
The images can be saved, mixed or composited with other images and backgrounds so that little by little a completed sequence can be built up. Much of the credit must go to the enormous advances in recent years in computer-controlled videotape editing. With the Scanimate equipment and the IVC 9000 video editing equipment, a complete thirty-second commercial may be produced in eight hours. The going rate, however, is $8,000 a day and up.
The process starts with an image, either a Kodalith on a light box scanned by a TV camera or a TV studio camera image. The image is then transformed in...into a ball, colored and positioned on the screen.
Then the image can be moved and rolled in any manner around the screen. The Scanimate is operated by patching the video signal through various transforming modules in much the same manner as an audio synthesizer. The movements are watched and tested at various settings until the client sees what he likes. Then it is recorded. Eventually a foreground and background reel is generated. At the end of the day the reels are composited, a sound track laid in, and the client goes away with a complete TV spot tucked under his arm.
The advantage of the system is that the client can immediately see what he is getting without waiting for various laboratories and optical houses to process film and create effects.
Dolphin's use of the Scanimate equipment allows them to have almost any job out in two days at half the cost of the average commercial. Certainly if the effects of figures twisting, stretching, zooming, strobing, or squeezing against a "three-dimensional" background were attempted with cel animation, the cost would be prohibitive.
The Scanimate, however, isn't intended to compete with cel animation, but to produce visually effective animation on the spot, with the client watching.
Certainly the potentials of computer animation have only been suggested. Much is still unrealized, waiting for the man with the ideas and visions to use these new tools.
What computer ads would you like to see in the future? Please comment below. If you enjoyed it, please share it with your friends and relatives. Thank you.