- Lightwave (low-res realtime environments)
- Houdini (Hell's Gate scenes, interiors)
- Autodesk 3ds Max (space shots, control room screens and HUD renderings)
- Autodesk MotionBuilder (real-time 3d visualisations)
- The Foundry Nuke compositor (previz image compositing)
- Adobe After Effects (compositing, real-time visualizations)
- PF Track (motion tracking, background replacement)
- Adobe Illustrator (HUD and screens layout)
- Adobe Photoshop (concept art, textures)
- Adobe Premiere (proofing, rough compositing with AE)
- Countless plugins for each platform, among them Ocula for Nuke, Krakatoa for 3ds Max, and Sapphire for Combustion/AE

The list of tools is not exhaustive, and you can see there's overlap in capabilities depending on each company's pipeline.

I spent several months working in the VAD (Virtual Art Dept) at Lightstorm on Avatar during the design phase. Lightwave was the main workhorse to build the virtual sets, mainly because we had a ton of assets that had to be created daily based on concept designs given to us by the concept artists. The fast workflow enabled us to keep up with the massive workload.

Much of what we created in 3d was inspected by James Cameron himself; it would be approved by him and put into the sets under his direction. The virtual sets had to look as photoreal as possible, and those were passed along to WETA for uprezzing and placement. The movie itself was really made in the VAD, and the shots we did were also passed on to WETA.

Lightwave was definitely the workhorse during this phase. We also used Maya and MotionBuilder, but most of what was done in Maya at that stage was mainly to help translate assets from LW into MotionBuilder. Whether or not anything else could have kept up with the workload is debatable to some degree; I know that the Maya artists had a difficult time keeping up with a fraction of what we were doing with LW. The point here being that, despite the hype, I know firsthand what was used and how much it was used on one of the most important phases of the film.

Here's a screen capture of Pandora in the realtime version.

Our basic workflow:

1. Receive a stack of concept art for pretty much each and every plant.
2. Usually it was a ton of material, and we had to get through it quickly so that James Cameron could see it in the realtime environment.
3. Render out that asset and print it out in color.
4. Put the printout on the wall for JC to critique.
5. JC would use the virtual camera to create shots in the RT environment with mocapped creatures/Na'vi.
6. RT sets would be altered or pieces moved to work with JC's shots.
7. Realtime assets were sent to WETA along with motion data for uprezzing, etc.

Basically the movie was shot on our end of the production process. As I said before, the sets had to look as close to final render as possible, because everything we did directly drove what WETA produced in the end, as opposed to them interpreting what we did.
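The LW-to-MotionBuilder handoff described above is, at its core, a daily batch-conversion problem: every asset authored in LightWave had to be re-exported in a format MotionBuilder could load (typically FBX). The post doesn't say how that step was automated, so the following is only a minimal sketch of how such a batch could be planned in Python, assuming a hypothetical command-line converter named `lwo2fbx`:

```python
from pathlib import Path

# Hypothetical converter name -- the post never names the tool that
# performed the LW -> MotionBuilder translation, only that Maya helped
# with it. This sketch just plans the batch without running anything.
CONVERTER = "lwo2fbx"

def conversion_jobs(src_dir: str, dst_dir: str) -> list[list[str]]:
    """Build one converter command per LightWave object (.lwo) file.

    Returns command argument lists suitable for subprocess.run(),
    mirroring the source directory layout under dst_dir (a dry run).
    """
    src, dst = Path(src_dir), Path(dst_dir)
    jobs = []
    for lwo in sorted(src.rglob("*.lwo")):
        fbx = dst / lwo.relative_to(src).with_suffix(".fbx")
        jobs.append([CONVERTER, str(lwo), "-o", str(fbx)])
    return jobs
```

In a real pipeline each job would then be dispatched with `subprocess.run` and checked for failures before the resulting FBX files were handed over to MotionBuilder.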