James Cameron started writing “Avatar” in 1994. Then he waited over a decade to start filming it because the technology he needed to tell his story hadn’t been invented yet. But even then, Cameron and co. found themselves developing new filmmaking technology to transport audiences to the lush 3D world of Pandora.

Sometimes filmmakers work with the tools they’ve got. Other times they look at the script they’ve written and realize that to tell that story, they need to invent new tech.

Robert Zemeckis wanted a moving-camera shot of Michael J. Fox playing three different characters. Jim Sturgess needed to interact with actors pulled by an opposite gravitational force in “Upside Down.” The big screen adaptations of “The Lord of the Rings” books had to tackle ambitious, epic battle sequences. To pull all this off, technical wizards invented new software and camera equipment.

Check out the gallery below for a rundown of 10 movies that pushed their filmmakers to be technical innovators.

  • “Avatar” (2009): Fusion Camera System
    Photo Credit: 20th Century Fox

    To immerse audiences in the blue and bioluminescent world of Pandora, James Cameron wanted to up his 3D game. He did that by developing a hi-def 3D camera system that fused two Sony HDC-F950 HD cameras mounted 2½ inches apart to mimic the stereoscopic separation of human eyes. He created the Fusion Camera System with Vince Pace, who had also developed new underwater lighting for Cameron’s 1989 undersea sci-fi adventure flick “The Abyss.” Older 3D rigs likewise used two cameras placed side by side, but those cameras were so big that their lenses ended up far apart rather than close together like human pupils, as the lenses of Cameron’s compact hi-def cameras were. The tech has since been used in such films as “Tron: Legacy,” “Hugo” and “Life of Pi.”
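That 2½-inch spacing matters because the distance between the two lenses (the “interaxial”) controls how far objects shift between the left-eye and right-eye images. A minimal sketch of the standard parallel-rig stereo model, with illustrative numbers that are not actual Fusion Camera System specs:

```python
# Illustrative parallel-rig stereo model: the horizontal shift of a
# point between the left and right images grows with lens spacing and
# shrinks with distance. All numbers here are hypothetical examples.

INTERAXIAL_M = 0.0635          # 2.5 inches in meters, ~human eye spacing
FOCAL_LENGTH_PX = 2000.0       # assumed focal length expressed in pixels

def disparity_px(depth_m: float) -> float:
    """Pixel offset of a point between the two camera images."""
    return FOCAL_LENGTH_PX * INTERAXIAL_M / depth_m

for depth in (1.0, 5.0, 50.0):
    print(f"object at {depth:5.1f} m -> disparity {disparity_px(depth):6.1f} px")
```

Nearer objects shift more between the two views, which is exactly the cue the brain reads as depth; oversized rigs with widely spaced lenses exaggerate that shift unnaturally.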

  • “Star Trek V: The Final Frontier” (1989): Previsualization
    Photo Credit: Paramount Pictures

    Digital previsualization is used throughout the film industry today to plan and visualize scenes before filming, much like storyboarding, though it’s typically reserved for complex action and effects scenes. “Star Trek V: The Final Frontier” was an early pre-viz pioneer. VFX artists Brad deGraf and Michael Wahrman proposed using 3D computer graphics to previsualize scenes in “The Final Frontier,” and animator Lynda Weinman then used the Swivel 3D software to plan shots of the Enterprise that matched director William Shatner’s vision for the iconic starship.

  • “Upside Down” (2012): “master” and “slave” camera technology
    Photo Credit: Millennium Entertainment

    In “Upside Down,” a two-planet world has dual gravity — Jim Sturgess looks up not at a sky but at another, incredibly close planet whose gravity pulls in the opposite direction. For scenes where characters on the two planets interact, director Juan Diego Solanas wanted the actors to play opposite each other in real time, but he also wanted the camera controlled manually by the crew, not by a computer with pre-programmed movements. Sturgess and Timothy Spall would be on separate sets on one stage, with a monitor positioned above each actor’s head displaying his co-star. The cameras on the twin sets remained in sync thanks to new technology that linked a “master” camera on a dolly to a “slave” camera on the other set, which it controlled, so the two cameras could make identical movements in real time.
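The core of a master/slave linkage like this is simple: every manual change to the master camera’s pose is immediately pushed to its linked slave. A toy sketch of that idea, with all class and field names invented for illustration:

```python
# Toy master/slave camera linkage: the operator moves only the master,
# and any linked slave camera mirrors the pose change in real time.
from dataclasses import dataclass

@dataclass
class CameraPose:
    x: float = 0.0       # dolly position along the track
    y: float = 0.0       # boom height
    pan: float = 0.0     # degrees
    tilt: float = 0.0    # degrees

class SlaveCamera:
    def __init__(self):
        self.pose = CameraPose()

class MasterCamera:
    def __init__(self):
        self.pose = CameraPose()
        self.slaves = []              # cameras on the other set

    def link(self, slave):
        self.slaves.append(slave)

    def move(self, **deltas):
        # The crew moves the dolly by hand; each axis delta is applied
        # to the master, then copied to every linked slave.
        for axis, d in deltas.items():
            setattr(self.pose, axis, getattr(self.pose, axis) + d)
        for s in self.slaves:
            s.pose = CameraPose(**vars(self.pose))

master, slave = MasterCamera(), SlaveCamera()
master.link(slave)
master.move(x=1.5, pan=10.0)
print(vars(slave.pose))
```

A real rig would transmit motor commands rather than copy a data object, but the one-way control relationship is the same.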

  • “Ender’s Game” (2013): rigs for zero gravity stunts
    Photo Credit: Summit Entertainment

    Much like the team behind “Upside Down,” the team behind “Ender’s Game” tackled its zero-gravity scenes with a lot of wire work and some newly invented tech. One new rig, created by stunt coordinator Garrett Warren, placed an actor inside a ring on a lollipop-arm suspension system that allowed a full range of motion while in the air for the Battle Room scenes.

  • “Back to the Future Part II” (1989): VistaGlide motion-control camera
    Photo Credit: Universal Pictures

    For 1961’s “The Parent Trap,” Hayley Mills appeared in frame playing two characters at once. But filmmakers had yet to devise a way to pull off that trick with a moving camera. The “Back to the Future” sequel set out to solve that problem: several actors play multiple characters, including characters who encounter their own past and future selves. In one shot, the camera pulls back and pans up from a pizza to reveal three members of the McFly family, all played by Michael J. Fox, sitting at the kitchen table. The VistaGlide was developed for shots like that. The robotic, motion-controlled camera dolly system allowed the filmmakers to repeat a camera move from a previous take exactly, so separate takes of Fox could be composited into a single moving shot.
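The essential motion-control trick is record-and-replay: sample the camera’s position over time during the first take, then drive the rig through the identical samples on every later pass so the takes line up frame for frame. A minimal sketch (names and numbers are illustrative, not VistaGlide internals):

```python
# Toy motion-control record/replay: store (timestamp, position) samples
# from take one, then play back the identical move for later takes.

class MotionControlDolly:
    def __init__(self):
        self.recorded = []            # list of (seconds, track position)

    def record_sample(self, t, position):
        self.recorded.append((t, position))

    def replay(self):
        """Yield the stored move sample-for-sample for another take."""
        for t, position in self.recorded:
            yield t, position         # a real rig would drive motors here

dolly = MotionControlDolly()
for frame in range(3):                # record a short pull-back move
    dolly.record_sample(frame / 24, frame * 0.1)

take2 = list(dolly.replay())
print(take2)
```

Because every pass traces the same path at the same speed, footage of the same actor from different takes can be layered into one seamless moving shot.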

  • “The Lord of the Rings” trilogy (2001-2003): MASSIVE software
    Photo Credit: New Line Cinema

    The grand battle scenes in “The Lord of the Rings” films called for new technology. New Zealander Stephen Regelous created MASSIVE (Multiple Agent Simulation System in Virtual Environment) to populate scenes with bigger, more realistic and more dynamic crowds than previous films had managed. The program enabled the “LotR” effects team to create battles among tens of thousands of orcs, humans and elves using autonomous agent animation. Previously, visual effects artists had to animate individual creatures and copy them. That didn’t look as good as what MASSIVE could do: build characters, program a fighting style into them, and essentially say “go” — and off each individual warrior would go, “choosing” from a library of motions based on the changing terrain or the different enemies it encountered. The software has since been used for crowd scenes in such films and TV shows as “World War Z,” “300,” “Doctor Who” and “WALL-E.”
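The idea of autonomous-agent crowds can be shown in a few lines: each warrior independently picks a motion clip from a shared library based on what it perceives around it. This is a deliberately tiny sketch in that spirit; the rules, names and probabilities are invented, and real MASSIVE agents use far richer fuzzy-logic “brains”:

```python
# Toy autonomous-agent crowd: each agent independently "chooses" a
# motion clip from a library based on simple perceptions. Illustrative
# only; not how MASSIVE is actually implemented.
import random

MOTION_LIBRARY = {
    "advance": "run toward the enemy line",
    "swing":   "attack an adjacent enemy",
    "stumble": "react to rough terrain",
}

class Warrior:
    def __init__(self, name, rng):
        self.name = name
        self.rng = rng

    def choose_motion(self, enemy_distance, terrain_rough):
        # Per-agent decision rules, evaluated independently each frame.
        if terrain_rough and self.rng.random() < 0.3:
            return "stumble"
        return "swing" if enemy_distance < 2.0 else "advance"

rng = random.Random(42)
army = [Warrior(f"orc_{i}", rng) for i in range(5)]
for w in army:
    motion = w.choose_motion(enemy_distance=rng.uniform(0, 5),
                             terrain_rough=True)
    print(w.name, "->", motion, "-", MOTION_LIBRARY[motion])
```

Even with identical rules, randomness and differing local conditions make each agent behave slightly differently, which is why simulated armies look less uniform than copy-pasted animation.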

  • “Jurassic Park” (1993): Dinosaur Input Device
    Photo Credit: Universal Pictures

    At the time “Jurassic Park” was made, there weren’t nearly as many well-trained computer animators in the industry as there are today. But there were plenty of talented animators who knew how to use traditional tools. To bridge that gap, the Dinosaur Input Device (DID) was created for the film. Stop-motion animators posed a mechanical model dinosaur for each frame of animation. The angle of each joint in the model was monitored by sensors, and the data from those sensors was used to animate a computer model of the dinosaur. The system, renamed the Direct Input Device, was later used for “Starship Troopers” and “Three Wishes.”
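The data flow the DID pioneered is straightforward: sample one angle per physical joint each frame, then copy those angles onto the matching joints of the digital skeleton. A minimal sketch of that pipeline, with all joint names and readings invented:

```python
# Toy version of the DID pipeline: a posed physical armature is sampled
# once per frame, and the measured joint angles drive a CG skeleton.
import math

def read_armature_sensors():
    # Stand-in for hardware: one encoder reading (degrees) per joint.
    return {"neck": 15.0, "jaw": 30.0, "tail_base": -10.0}

class CGDinosaur:
    def __init__(self):
        self.joint_angles = {}        # joint name -> angle in radians

    def apply_pose(self, sensor_angles):
        # Copy each measured angle onto the digital skeleton, converting
        # degrees from the encoders into radians for the animation system.
        self.joint_angles = {joint: math.radians(deg)
                             for joint, deg in sensor_angles.items()}

rex = CGDinosaur()
rex.apply_pose(read_armature_sensors())   # one frame of animation
print(rex.joint_angles["jaw"])
```

Repeating this frame by frame let stop-motion animators keep their familiar hands-on workflow while the output became a fully digital creature.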

  • “The Matrix Reloaded” (2003): Universal Capture
    Photo Credit: Warner Bros.

    As films like “Tron: Legacy” demonstrate, creating a believable CG version of a recognizable actor is still a work in progress, even over three decades after 1981’s “Looker” delivered the first computer-generated human character in a feature film. But “The Matrix Reloaded” made strides in this area with John Gaeta’s invention of Universal Capture. Blending lens-based cameras with synthetic 3D CG, this facial animation system captured an actor’s expressions from multiple angles and stored them in a library that could be used to create multiple characters. The technology was essential for moments such as the scene in which Neo fights hordes of Agent Smiths.

  • “Edge of Tomorrow” (2014): new Autodesk Maya plug-in
    Photo Credit: Warner Bros.

    Autodesk Maya is a popular 3D computer graphics application that’s been used for films like “Avatar” and “Finding Nemo” as well as TV, including “Game of Thrones.” Over the years, different artists have made various additions and improvements to the software, and that’s what the team behind time-loop war movie “Edge of Tomorrow” did. For the alien race of Mimics, a technical animator created a Maya plug-in that made each of the aliens’ 20 tentacles move independently while also reacting to the other tentacles. As VFX supervisor Dan Kramer explained, “The plug-in knew the radius of every tentacle. If one on the inside grew or slipped, the tentacles above it moved in reaction.”
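One way to read Kramer’s rule is as a layered-bundle constraint: when an inner tentacle’s radius grows, the tentacles layered above it get pushed outward. A toy interpretation of that rule (my own simplification, not the actual plug-in’s math, and all numbers are invented):

```python
# Toy interpretation of the tentacle-reaction rule: growing one tentacle
# in a nested bundle pushes every tentacle layered above it outward.

def react(radii, grown_index, growth):
    """Grow one tentacle's radius and shift the outer layers in reaction."""
    radii = list(radii)                    # don't mutate the caller's list
    radii[grown_index] += growth
    for i in range(grown_index + 1, len(radii)):
        radii[i] += growth                 # outer layers move out by the same amount
    return radii

tentacles = [1.0, 1.2, 1.4, 1.6]           # bundle radii, inner to outer
print(react(tentacles, grown_index=1, growth=0.5))
```

The payoff of encoding a rule like this in a plug-in is that animators pose one tentacle and the other nineteen respond automatically instead of being keyframed by hand.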

  • “Gravity” (2013): light box
    Photo Credit: Warner Bros. Pictures

    Alfonso Cuarón pushed the VFX and cinematography envelopes with 2006’s “Children of Men,” but his 2013 movie about two astronauts adrift in Earth’s orbit may be his most impressive cinematic feat. Before “Gravity,” no one had attempted to make an entire film set in zero gravity. Sandra Bullock spent most of the shoot in a giant mechanical rig. One big challenge of achieving the authentic look of zero gravity in Earth’s orbit was simulating the sunlight that bounces off the dayside of the planet. As the characters are whipped around in the catastrophe that damages their space shuttle, the filmmakers needed a way to move light sources just as rapidly. The film’s effects team created an LED light box that could cast the appropriate light on the actors inside and change it instantly. The box was 20 feet tall and 10 feet wide, and its interior walls were constructed of 196 panels, each containing 4,096 tiny LED lights. “Gravity” went on to earn a Best Picture Oscar nomination and win seven Academy Awards.
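The figures above imply a striking total, which a quick multiplication confirms:

```python
# Quick check on the light box numbers: 196 interior panels,
# each holding 4,096 tiny LEDs.
panels = 196
leds_per_panel = 4096
total_leds = panels * leds_per_panel
print(f"{total_leds:,} LEDs lining the box")   # 802,816
```

With that many individually addressable lights, the rig could effectively play a moving image of Earth and sun around the actor instead of physically moving lamps.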

An enthusiast of time travel stories, film scores, avocados and Charades, Emily Rome is an alumna of Loyola Marymount University and a native of beautiful Washington State. Emily’s writing has also appeared in the Los Angeles Times, Entertainment Weekly and The Hollywood Reporter. Follow her on Twitter @EmilyNRome.