In 2016, cinematographer David Stump, ASC did his best to change the way movies are made.
That was the year he shot a short film using a prototype from Lytro Cinema, then the leading developer of light-field capture technology. Lytro’s movie camera was like no other. A light-field camera captures more than just a picture of a scene from a single fixed perspective. It doesn’t just record the intensity of light hitting the sensor; it captures information about the direction light is traveling.
In fact, it captures enough information about the way light is being reflected off objects in the scene to allow the actual 3D space in front of the camera to be partially reconstructed. The camera can’t see through walls or all the way around objects in the frame. But it captures enough information to completely change the nature of production and post.
For example, it allows image characteristics like plane of focus and depth of field, which are normally baked into the image based on the optical characteristics of the camera and lens, to be manipulated after the fact. And it can dramatically ease the burden on VFX teams by eliminating the tedious roto work or imprecise green-screen keying that usually goes into isolating people or objects in a scene.
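The refocus-after-capture idea can be illustrated with the classic shift-and-sum algorithm for 4D light fields: each sub-aperture view is translated in proportion to its offset within the lens array, then all views are averaged, which places the synthetic focal plane at a chosen depth. This is a toy NumPy sketch of the general technique, not Lytro's actual pipeline; the array sizes and the `alpha` parameter are illustrative assumptions.

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthetically refocus a 4D light field L[u, v, y, x] by
    shift-and-sum: each sub-aperture view is shifted in proportion
    to its (u, v) offset from the array center, then all views are
    averaged. alpha selects the synthetic focal depth
    (alpha = 0 leaves the focus where it was captured)."""
    U, V, H, W = light_field.shape
    cu, cv = (U - 1) / 2, (V - 1) / 2
    out = np.zeros((H, W), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Integer-pixel shifts for simplicity; a real pipeline
            # would interpolate to sub-pixel accuracy.
            dy = int(round(alpha * (u - cu)))
            dx = int(round(alpha * (v - cv)))
            out += np.roll(light_field[u, v], (dy, dx), axis=(0, 1))
    return out / (U * V)

# Toy 5x5 grid of 64x64 views, refocused at two synthetic depths.
lf = np.random.rand(5, 5, 64, 64)
near = refocus(lf, alpha=1.5)
far = refocus(lf, alpha=-1.5)
print(near.shape)  # (64, 64)
```

Averaging many slightly shifted views is also what creates the synthetic aperture: the more views contribute, the larger the effective pupil and the shallower the resulting depth of field.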
“I had read innumerable white papers, theses and historical documents about light-field capture, and I had one of Lytro’s consumer cameras, so it wasn’t foreign to me,” Stump says. He knew that, in addition to capturing a photographic image, the camera would track the position of each visible pixel in space. “Inside the 3D space, I could do VFX work effortlessly.”
At the time, Stump was chief imaging scientist at The Virtual Reality Company (VRC), whose founders included Chris Edwards, CEO of previs studio The Third Floor, and Robert Stromberg, a director, VFX artist and Oscar-winning production designer. “I was brainstorming stuff with them,” Stump says. “I was, and still am, the chairman of the ASC’s camera committee, and I have that reputation: ‘Let him try it out. If he can make it work, then we’ll stick our toes into the water.’”
As it happens, Stump, who was at the time VFX supervisor on the Starz series American Gods, already had a scene in mind that he and VFX designer Kevin Tod Haug knew could be transformed by light-field techniques. "What I was really thinking was, 'I'm going to use this on the second season of American Gods next year,'" he recalls. "There was a big scene that was to be set on a spinning carousel, and I wanted to use it for that. So I was kind of selfishly motivated."
The short-film project was a five-minute drama titled "Life," developed and directed by Stromberg with cinematography by Stump. He described it at the time as a "visual poem" that followed a boy and girl through their lives from youth to old age. Since the Lytro Cinema camera was being positioned as more of a VFX tool than a general-purpose cinema camera, Stump captured about half of the shots with the ARRI Alexa SXT to show that they could be intercut seamlessly with footage from the Lytro.
The first thing to reckon with was the camera's sheer size: its length extended from around six feet to as much as 11 feet, depending on the type of shot being captured. The camera was large mainly because of the unusual nature of its capture mechanism. It held a micro-lens array with millions of tiny lenses that redirected light to the camera's image sensor, which captured an astonishing 755 megapixels with 16 stops of dynamic range at up to 300fps.

Fortunately, the camera's tether to its required storage array was a 100-meter fiber cable, so moving it around wasn't an issue beyond the sheer bulk of the thing. "Believe it or not, I muscled it around and panned and tilted it using a gargantuan old Raby wheel head," Stump says.
One of the most transformative aspects of working with a light-field camera is its lack of a traditional lens aperture. “It doesn’t have an entrance pupil per se,” Stump explains. “Depending on how many sensors and photosites you have, it can have thousands of entrance pupils all over the front of the lens.”
That means Stump wasn’t limited by the same laws of physics that bedevil lens manufacturers. Instead of hunting down an incredibly esoteric and expensive large-aperture lens for a desired depth-of-field effect, he could specify a synthetic aperture.
Look at it this way. For cinematography, any lens with an f-stop of f/1.4 or lower is generally considered an exceptionally "fast" lens, transmitting large amounts of light and capturing images with very shallow depth of field. The fastest lenses ever made have f-stops in the range of f/0.95 or, in rare cases, f/0.8.
For one shot in “Life,” Stump specified a synthetic f-stop of f/0.5, which is beyond the manufacturing capabilities of traditional optical lens-makers. The image shows a boy holding a baseball; the boy’s hands are in sharp focus, but his arms and body immediately recede into soft focus behind the ball.
“It was crazy — an impossible shot otherwise,” Stump remembers with a chuckle.
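The thin-lens approximation shows why that stop matters so much: for a subject well inside the hyperfocal distance, total depth of field scales linearly with the f-number, so halving the f-number halves the zone of sharp focus. A quick worked example, using illustrative assumed values (a 50mm focal length, a subject at 1.5 meters, a 0.025mm circle of confusion), not the actual "Life" shooting parameters:

```python
# Approximate total depth of field for a subject at distance s:
#   DoF ~ 2 * N * c * s^2 / f^2
# where N is the f-number, c the circle of confusion, and f the
# focal length (valid when s is much less than the hyperfocal
# distance). All values below are assumptions for illustration.

def depth_of_field_mm(f_number, focal_mm=50.0, subject_mm=1500.0,
                      coc_mm=0.025):
    return 2 * f_number * coc_mm * subject_mm**2 / focal_mm**2

for n in (1.4, 0.95, 0.5):
    print(f"f/{n}: {depth_of_field_mm(n):.1f} mm of total sharp focus")
```

At these assumed settings, a synthetic f/0.5 leaves only a couple of centimeters in focus, which is consistent with hands sharp on a baseball while the arms just behind them fall away into softness.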
Asked whether the footage from the Lytro camera, with its apparent optical properties dialed in during post-production rather than baked in at capture, really looked just the same as footage captured with the Alexa, Stump says it did, at least until you pushed into clearly uncanny territory. "When you're working at a theoretical stop, it's obvious that something looks different," he says.
Another scene that demonstrated the power and flexibility of light-field acquisition was a wedding under an arbor: a couple stood at the altar in front of a preacher while the camera looked over the backs of guests in the foreground. The entire background of the scene needed to be replaced, but the crew didn't make any special allowances on set. In fact, they deliberately allowed a grip to walk through the background of the scene carrying a ladder, just to prove how easy it would be for a production to remove any unwanted elements from a shot.
“We shot the whole scene on stage without even trying to put up a blue screen or green screen,” Stump says. “We used depth maps from the light-field capture to discriminate between all of the objects in the scene. We removed the stage wall and the grip with the ladder walking through behind them, just from depth information.”
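Isolating elements from depth information alone amounts to thresholding a per-pixel depth map instead of pulling a chroma key. This is a minimal NumPy sketch of the idea, standing in for the kind of depth-based isolation done in a compositor; the distances and scene layout are invented for illustration and have nothing to do with Lytro's actual plug-ins.

```python
import numpy as np

def depth_matte(image, depth, near_mm, far_mm):
    """Keep only pixels whose depth falls inside [near_mm, far_mm];
    everything nearer or farther is matted out to black."""
    mask = (depth >= near_mm) & (depth <= far_mm)
    return image * mask[..., np.newaxis], mask

# Toy 4x4 frame: couple at ~3 m, a grip with a ladder at ~8 m,
# and the stage wall at ~10 m.
h, w = 4, 4
image = np.ones((h, w, 3))
depth = np.full((h, w), 3000.0)
depth[:, 2] = 8000.0   # grip walking through the background
depth[:, 3] = 10000.0  # stage wall
fg, mask = depth_matte(image, depth, near_mm=2500.0, far_mm=4000.0)
# Everything deeper than 4 m (grip, wall) drops out of the composite.
```

In practice the matte edges would need filtering and sub-pixel refinement, but the principle is the same: depth replaces color as the keying signal, so nothing in the scene has to be painted green.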
As a flourish, the team made one more significant alteration to the shot that they knew nobody would ever catch if it weren’t described to them. “We subtracted the Mylar confetti everyone threw into the scene and then replaced it with different Mylar confetti” from a different take, Stump says. The results are seamless.
All of these modifications were made in industry-standard compositing software — in this case, Foundry’s Nuke 3D compositor — with special plug-ins to accommodate the light-field data.
"Life" premiered to a packed house at NAB 2016, with more than 600 attendees crammed into a meeting room that wasn't meant to hold nearly that many people. (Stump remembers the presenters being scolded by NAB Show representatives because so many people had crowded into the room that it constituted a near-fire hazard.) Stump and Stromberg described their work on the film, Lytro representatives answered questions about the camera, and the overflow crowd spilled out into the hallways afterward to compare notes on the mind-expanding demo they had just seen.
But, even though the potential for light-field capture remains tantalizing, the Lytro Cinema camera couldn't quite make it a reality.
An infamous upheaval behind the scenes of American Gods led to a changing of the guard, and Stump didn’t get to stay with the series long enough to shoot his magical carousel. Lytro Cinema didn’t stick around long, either — though the camera was intended to be production ready in the third quarter of 2016, the leadership team announced plans to shut down the company a year and a half later, in March 2018. (Today, key technologists from Lytro Cinema are driving the development of light-field display technology at Light Field Lab.)
But Stump still thinks about what he could have done if he were able to apply the technology on a real-world production. He was planning to use the camera’s depth information to find and isolate characters on American Gods’ spinning carousel and perform moving morphs, digitally altering them as they traveled.
“If I had gotten to use Lytro Cinema Camera for the carousel shots, I would have been able to do 3D visual effects directly on the light-field images,” Stump says. “Kevin Tod Haug and I had an elaborate plan for how we were going to do that scene that got blown up when that Hollywood thing, ‘creative differences,’ happened.”
Stump still feels that light-field technology, or something very similar in its ability to easily and reliably capture detailed depth information about a movie scene, will change Hollywood, but he's reluctant to predict when or how it might happen. "I don't want to put myself in the business of being a futurist," he says. "There are other techniques that can get you partway there. But the part they can get you is such a small percentage of what we could get from the Lytro. It's unsatisfying.
“When we finally get it, and get it for a practical size and expense, it’s going to be like, ‘How did we ever do without this?’”