Greetings and salutations readers!
In this week of BSG production, I primarily worked on pipeline management, creating systems to efficiently move animation in and out of Unreal Engine and Nuke. To start off, I worked in Unreal Engine's Levels editor to create a referencing system similar to Maya's.
For our project, with our sets constantly changing and our animations all living on the same rig, I struggled with how we were going to organize our sets without the Unreal project becoming massive. We didn't want to place fifty grasses in fifty levels with fifty different cameras; that would really bog down the project with redundant uassets. After doing some searching around through the version control, I came across the Levels editor in Unreal Engine. This tool allows multiple levels to be loaded concurrently within one persistent level, while also letting each person work in one specific level without overwriting someone else's work in a different level.
So, here is what I set up for our BSG short. In our gas station set, we have a master level where all of our foliage, pretty assets, and terrain live: everything cosmetic that is shared from shot to shot. Then I created an individual level for each camera shot, and in each level I added the rigs, the shot's animations, a labeled camera sequence, and a CineCamera actor with a matching name. After that, I can go to my master scene, add the level in the Levels editor, and bring in only the shot-specific content I need, when I need it!
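To keep the per-shot levels, sequences, and cameras in sync, it helps to derive every name from the shot code. Here is a minimal sketch of that idea; the `LVL_`/`SEQ_`/`CAM_` prefixes are hypothetical examples, not our project's actual conventions:

```python
# Hypothetical naming helper for per-shot sub-levels. The prefixes are
# illustrative; the point is that one shot code drives all three names,
# so the camera, sequence, and level always match.

def shot_asset_names(shot_code: str) -> dict:
    """Given a shot code like 'A-055', return matching names for the
    sub-level, the level sequence, and the CineCamera actor."""
    tag = shot_code.replace("-", "_")
    return {
        "level": f"LVL_{tag}",
        "sequence": f"SEQ_{tag}",
        "camera": f"CAM_{tag}",
    }

print(shot_asset_names("A-055"))
# {'level': 'LVL_A_055', 'sequence': 'SEQ_A_055', 'camera': 'CAM_A_055'}
```

With a helper like this, a teammate creating a new shot level never has to guess what the matching CineCamera should be called.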
The only technical hiccup I had at first was that the renderer was not respecting the sub-levels as present, and would default back to the perspective camera with no animation. The solution was changing each level's streaming method from Blueprint to Always Loaded, which fixed all of the problems. Here is a screen grab of my setup featuring shot A-055 in Unreal:
While the shot is definitely missing some lighting, I can bring whatever I need from that single shot into the main set with ease.
This week, I also had to play around with Advanced Skeleton and Maya to get the assets out of Maya correctly. There was a period of time where I was trying to export my UFO's animation into Unreal with no luck. Most of the time, Unreal wouldn't recognize that there was an animation attached to the rig. Through my own experimentation, I found out that the animation has to be baked into the FBX for Unreal to recognize an animation asset. However, when I exported it and reopened the FBX in Maya, the rig was static.
With the help of my production and animation professor and an alumnus a year ahead of me, I was able to find out why my animation wasn't working on import. When I selected the whole rig group, just the joints, or used Advanced Skeleton's FBX plugins, the animation keyframes were missing. The solution turned out to be selecting the FitSkeleton, deformation group, and geo groups to export the rig and geometry, and just the FitSkeleton and deformation group for the animation. The only other issue was that when an animation was imported alongside the rig for the first time, the skeleton option had to be set to None, and then the rig's skeleton had to be assigned for all subsequent animation imports. After that, everything matched between Unreal and Maya. Well, almost.
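Since the correct groups differ between a rig export and an animation export, it's easy to grab the wrong selection at the end of a long day. A tiny sketch of that selection logic, assuming typical Advanced Skeleton-style group names (the exact names in your outliner may differ):

```python
# Sketch of the FBX export selection rule described above. Group names
# are assumptions modeled on Advanced Skeleton conventions; swap in
# whatever your rig's top-level groups are actually called.

def fbx_export_selection(kind: str) -> list:
    """Return the top-level groups to select before an FBX export.
    kind is 'rig' (rig plus geometry) or 'anim' (baked animation only)."""
    base = ["FitSkeleton", "DeformationSystem"]
    if kind == "rig":
        return base + ["Geometry"]
    if kind == "anim":
        return base
    raise ValueError(f"unknown export kind: {kind}")

print(fbx_export_selection("anim"))  # ['FitSkeleton', 'DeformationSystem']
```

In Maya this list would feed straight into a `select` call before invoking the FBX exporter with animation baking enabled.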
In thesis films from the year prior, there are certain shots where you can see the animated characters teleport randomly at the start or end of a shot. In my first render, my UFO snapped back to its default position on the last frame, creating a slight but noticeable hitch. My assumption is that between Maya, Unreal, and Nuke, there is a slight variation in which frame is truly the first, whether that is frame 1 or frame 0. Because of that, if an animation starts or ends before the camera sequence does, the rig reverts to its default position. To prevent that, I am going to ensure that our animations always have a five-frame buffer before and after, so a 60-frame shot would be animated from frames 1-70, and then we would grab frames 6-65 for the actual cut. Thankfully, the other pipeline step I developed will facilitate that easily.
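The buffer math is simple enough to automate so nobody has to redo it per shot. A minimal sketch (the function name and default are mine, not part of our pipeline yet):

```python
# Hypothetical helper for the frame-buffer rule: animate the shot with
# a buffer on both ends, then cut out only the middle for the render.

def buffered_ranges(shot_length: int, buffer: int = 5):
    """Return (animation_range, cut_range) as inclusive frame tuples.
    The animation starts at frame 1 and extends one buffer past each
    end of the shot; the cut is the shot itself, inside the buffers."""
    anim = (1, shot_length + 2 * buffer)
    cut = (buffer + 1, buffer + shot_length)
    return anim, cut

print(buffered_ranges(60))  # ((1, 70), (6, 65))
```

That matches the 60-frame example: animate 1-70, cut 6-65, so any frame-zero disagreement between programs lands harmlessly in the buffer.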
The other major part of the pipeline that I worked on this week was creating a Nuke Studio cut of our film. For those unfamiliar with Nuke Studio, it is a Foundry program that resembles other editing software such as After Effects and Premiere, but has the capability of generating Nuke scripts per shot. To create my cut, I took our existing storyboard animatic, cut it up every time the camera angle changes, and named each clip after its pre-existing numbering. Then, with a demonstration and setup from my other production professor, we created a file hierarchy that organizes all of the render locations and compositing files. With this system, my teammates can render from Unreal into a pre-existing folder, work on the comp in Nuke, and export using a template Write node. Afterwards, all I have to do is point Nuke Studio at the export folder, and I can automatically version up any time there is a newer Nuke export. This will be a considerably easier process than manually re-stitching footage together every time there is a change in a program like After Effects.
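Nuke Studio handles the version-up itself, but the underlying idea is just scanning the export folder for the highest version number. As a rough sketch of that logic (the `_v###` filename pattern is an assumption modeled on common comp naming, not necessarily our exact template):

```python
import re

# Sketch of version-up logic: scan a shot's exports for the highest
# _v### suffix. The naming pattern here is an assumed convention.

def latest_version(filenames):
    """Return (version_number, filename) for the highest _v### version
    found, or None if no filename matches the pattern."""
    pattern = re.compile(r"_v(\d{3})\.")
    best = None
    for name in filenames:
        m = pattern.search(name)
        if m:
            version = int(m.group(1))
            if best is None or version > best[0]:
                best = (version, name)
    return best

renders = ["A055_comp_v001.mov", "A055_comp_v003.mov", "A055_comp_v002.mov"]
print(latest_version(renders))  # (3, 'A055_comp_v003.mov')
```

As long as every Write node sticks to the same versioned naming, the newest comp is always unambiguous, which is what makes the automatic version-up in the timeline trustworthy.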
Having done all of this setup, I created a rough blocked animation in Maya and put it through the whole pipeline I worked on this week. Here is the transition from Maya to Nuke Studio:
While the scene definitely has a long way to go, it is good that we have a precedent and an understanding of the workflows, so that we don't have to deal with trial and error during a time crunch.
Finally, in addition to the pipeline work, I also animated several different rough prototypes for the spaceship doors, and landed on a shape and movement that the team likes. Here is that playblast:
That was a lot going on, but I appreciate everyone who has made it this far in the post! Until next time readers!