‘Lightyear’ VFX Supervisor Jane Yen On Making An IMAX Blockbuster From Home, And How VR Helped
Pixar’s Lightyear touched down in theaters last week, the studio’s first feature to receive a theatrical release since Onward hit cinemas on March 6, 2020.
Of course, the reason for Pixar’s theatrical sabbatical was a global pandemic that shut down or severely limited the capacity of cinemas for almost two years. And it wasn’t only distribution that was impacted. Production also transformed overnight, and studios adapted on the fly to new work-from-home models.
Pixar VFX supervisor Jane Yen, who has been with the company since 2001 and boasts an epic resume, spoke with us about producing an IMAX-ready science fiction blockbuster from home, how virtual reality (VR) helped the crew visualize what they were doing, creating a new pipeline for IMAX, and which films and eras inspired the Lightyear aesthetic.
You’ve said in talks before that for Lightyear it was necessary to create a new pipeline to get the film ready for IMAX. Can you talk about that process?
I had been on the film for maybe two or three months when one day, Angus MacLane, our director, came in and said that since we wanted to make the most epic sci-fi film possible, he would love to pursue IMAX. Well, Pixar had never created a plan to do that before. So we met with the technical team and pretty quickly felt that the solution that could work for us would be to render each frame once, composed as if it were for IMAX at the full, squarish 1.43:1 aspect ratio, and then crop that larger image down for the more widescreen formats.
The tricky parts were in our day-to-day workflows and in explaining to the teams that they needed to stage things in such a way that they would work in every ratio, with the center of the action in the right place for each format. To save themselves work, they would simply center-crop everything, moving the camera to fit everything in frame. But that meant our set teams had to build and dress sets that went all the way to the edge of every ratio.
It all seemed like it should just work, but we were holding our breath a little, because anytime you do something different with the pipeline, issues pop up. In the end, when we were reviewing the film in each aspect ratio, we would catch small issues that needed fixing, but it was always manageable.
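To make the render-once-and-crop idea concrete: it is essentially a center-crop from the tall 1.43:1 IMAX frame down to each wider delivery ratio. The sketch below is a hypothetical illustration of that framing math in Python, not Pixar’s actual tooling; the resolution numbers and function name are assumptions for the example.

```python
# Hypothetical sketch of center-cropping a full 1.43:1 frame to a wider
# delivery ratio, as described in the interview. Illustrative only.

def center_crop(width: int, height: int, target_ratio: float):
    """Return (x, y, w, h) of a centered crop matching target_ratio (w/h)."""
    source_ratio = width / height
    if target_ratio >= source_ratio:
        # Target is wider than the source: keep full width, trim top and bottom.
        crop_w = width
        crop_h = round(width / target_ratio)
    else:
        # Target is narrower than the source: keep full height, trim the sides.
        crop_h = height
        crop_w = round(height * target_ratio)
    x = (width - crop_w) // 2
    y = (height - crop_h) // 2
    return x, y, crop_w, crop_h

# Example: a frame rendered at roughly 1.43:1 (assumed resolution),
# cropped for a 2.39:1 widescreen presentation.
full_w, full_h = 4096, 2866
print(center_crop(full_w, full_h, 2.39))  # -> (0, 576, 4096, 1714)
```

The arithmetic is the easy part; the staging guidelines Yen describes are the real work, since the action has to read inside the smallest common area shared by every ratio, and the sets have to be dressed out to the edges of the tallest one.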
So, is Lightyear a movie people should try to see in IMAX?
Oh, absolutely. One funny result of working from home during the pandemic for about two years is that we’d all been looking at the images of this film on our home computers or laptops. Then we finally saw it in IMAX for the first time. I remember it was the sequence where Buzz goes into hyperspeed for the first time and the frame opens up; we were all blown away and our hearts were racing. It felt like we were on a ride, and it was a really wonderful experience, especially after being confined at home for so long.
Lightyear is a film that in the Toy Story universe is supposed to have come out in 1995. While looking for influences, stylistic or narrative, did your team limit itself to content from before that time?
That’s a good question… I don’t think we ever had any concrete discussions about limiting the scope of references to a time period. We wanted to stick with what is cool in the genre and do whatever we could to make the most epic sci-fi animated film possible. But I do think that in the end we found a zone where, with those hyperspace effects for example, the effects team settled on a classic look. I can’t say for sure whether Angus was focused on that very specific slice of time around 1995, or whether it was more about the classic sci-fi we know and love and being referential to that.
And what about the original Toy Story films? What influence did they have on this production?
Well, this movie is the thing that inspired the creation of the toy that starts the story of those films. But we really needed to create a world where Buzz is his own character, and he’s human. In the Toy Story films, the Buzz toy would be a cartoony version of the character. So we included very specific parts of Buzz’s costume but made the character more human. The art team did a tremendous job of nerding out and examining every nook and cranny of the toy in the initial films and figuring out what would work on our Buzz’s suit.
In terms of the world that the film takes place in, we didn’t want it to look like Toy Story. We wanted it to be clear that this is its own film. So we focused on making the world as cohesive as possible. There is a kind of spectrum of looks for animated films. They can look very cartoony or very realistic. Angus wanted to find a style that wasn’t too cartoony, but also not completely real. So the art team spent a tremendous amount of time on creating the style of this world, then making it as cohesive as possible.
You’ve been with Pixar for many years and have some incredible films on your resume. How has your role at the company evolved over that time?
I started with Pixar right after grad school in 2001. Seems like forever ago. At the time I was in the software group and I remember, we were working on Finding Nemo and one of our researchers was looking for help with programming the water physics. We didn’t know how to do it back then. It was really heavy computationally, and we were starting from scratch.
In those early days, there were so many problems that just hadn’t been solved yet. Pixar has always been at the forefront of developing a lot of the technologies we use today, but in the early days we weren’t always sure we could get an image on the frame. Fast forward to today and there are packages that offer solutions to nearly every problem. So now the challenge is putting your own special touch and features into it, but the foundation really has changed.
In terms of my role, the pipeline today is smooth, and on this film in particular the only real technical problem that needed solving was the IMAX pipeline. So my major responsibility now is facilitating that cohesive world I was talking about. We need the best collaboration possible between individual departments: sets, characters, shading, lighting, effects… So I felt like, in some ways, my role was to help all the teams work together seamlessly. Early on, I planned ways to bring in the leadership of the teams and get everyone working in a smaller, more intimate environment, figuring out problems and building relationships. I think that really paid off too, because when the pandemic hit, the tightness we’d forged in those early days really helped. I think it was an enjoyable experience for everyone as well.
Speaking of the pandemic, how did that impact production?
At first, we were in… I wouldn’t quite say survival mode, but we had to figure out how we were going to make a feature film working from home. I think if you’d asked people before the pandemic, “Can we make movies working from home?,” I don’t think most people would have said yes with any confidence. Having gone through it now, I think the answer would be, “Absolutely!” In those early days of the pandemic though, we had so many questions. How do we schedule meetings? What’s Zoom? Can my home computer handle the software tools I need to use and is it fast enough to be efficient? Thankfully, the studio was extremely supportive and, with the help of a lot of smart people, we figured it out.
What steps did the studio take to make remote production possible? Did the crew get new equipment to use at home?
Before the pandemic everyone had their stations at the studio, but only a handful of people had home computers set up for this kind of work. Once the pandemic hit, everybody kept their campus computer and then got another unit at home. These are called Teradici, and the way they work is that you basically have a monitor and a keyboard at home, but the computer that’s running all the software is still back at Pixar. Also, early on in the pandemic we couldn’t even get on campus to evaluate what the film looked like on the big screen. Then we learned that Disney has a VR app that lets you visualize sitting in a theater and looking up at screens in all the different formats. So we had a set of maybe 15 people with VR headsets who could evaluate how [the film was] looking on the big screen.