Isn’t it easy for us to be convinced by the stunning visuals on the big screen? Aren’t the alien beings in Avatar or the sandworm in Dune real enough to leave the audience awestruck? Well, they look real because they are crafted to look real by experts sitting behind computer screens.
The technology we are going to discuss here is CGI in films. Perhaps this is the best time to take a look at the vast and ever-expanding world of digital creations. With new trends emerging every day, we can spot CGI in almost every film of the 21st century.
CGI was prevalent even in earlier films, but its extensive use is reaching a peak today. Before we head straight to the large number of films using CGI to enhance the cinematic ambiance, though, we must get a clear idea of what CGI means.
What is CGI?
CGI stands for Computer-Generated Imagery, and it is used to create digital graphics either on top of existing footage or on a blank canvas. With technological advancement, CGI has made both 2D and 3D graphics possible in moving pictures.
Everything from animation and VFX to 3D filmmaking is part of computer-generated imagery. In simple terms, CGI is the technology used to build animated, enhanced, or unrealistic images on a digital screen. The images are often made on top of an existing piece of art, such as a photograph, drawing, or video, and are created using computer software.
How is CGI actually done?
As intriguing as it may seem, the art of CGI is not easy. Computer-generated imagery is on the rise, and the demand for perfection stands at its peak. To create successful CGI, a team actively handles different departments, each contributing to a step-by-step production process:
- Step 1: Initial Visualisation
- Step 2: Animation
- Step 3: Matchmove
- Step 4: FX Simulation
- Step 5: Lighting
- Step 6: Rotoscopy
- Step 7: Compositing
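The final compositing step above has a simple idea at its core: blending a foreground layer over a background using an alpha matte. The following is a minimal illustrative sketch of that "over" blend, not code from any particular compositing package; the function and parameter names are assumptions chosen for clarity.

```python
# A minimal sketch of the "over" blend used in compositing:
# a foreground element with an alpha matte is laid over a background.
# Pixel values here are floats in [0, 1]; fg, bg, and alpha are
# illustrative names, not part of any real compositing software.

def composite_over(fg, bg, alpha):
    """Blend one foreground pixel over one background pixel."""
    return fg * alpha + bg * (1.0 - alpha)

def composite_row(fg_px, bg_px, matte):
    """Apply the over blend across a row of pixels."""
    return [composite_over(f, b, a) for f, b, a in zip(fg_px, bg_px, matte)]

# Where the matte is fully opaque (alpha = 1) the foreground shows;
# where it is transparent (alpha = 0) the background shows through.
row = composite_row([1.0, 0.5], [0.0, 0.0], [1.0, 0.0])  # -> [1.0, 0.0]
```

Real compositing software performs this same weighted average per channel across millions of pixels per frame, layer after layer.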
This process is used in modern-day, evolved computer-generated graphics. But before we could witness the seamless experience of CGI in films, the technology went through a patient evolution. So let’s proceed to the history of CGI and dig into the pioneers who made it possible for modern-day filmmakers to use it as an effective storytelling tool.
History Of CGI in films
We can trace the history of CGI in films back to 1958, with the release of Alfred Hitchcock’s mystery thriller Vertigo. Although the graphics appeared only in the title sequence rather than in the film itself, Vertigo was a milestone in forming a new perspective on how technology could enhance a cinematic narrative.
Experimentation with computer graphics began in the 1940s with the development of mechanical animation techniques on customized devices built from anti-aircraft machinery. The pioneering name in these experiments was John Whitney, who later collaborated with the American graphic designer Saul Bass to produce the state-of-the-art opening credits of Vertigo.
The opening sequence consisted of a spiral animation that aptly depicted the feeling of vertigo, a sensation of whirling leading to loss of balance. It was made possible by combining the rotation of a gun director with a pendulum carrying a pen that painted the spirals as it rotated. With this, the first successful use of machine-generated imagery in films was documented.
In 1961, John Whitney produced the first computer-animated experimental film, Catalog. Technically, it cannot be called a film in the traditional sense, but it was a milestone in the evolution of CGI on the big screen. In all honesty, it resembles a modern-day computer screensaver, yet it was an awe-inspiring visual back in the day.
A few years later, in 1967, a new type of animation appeared in which lines blended into images without any cuts in the scene. The film was called Hummingbird, animated by Charles Csuri and James Shaffer. It hinted at the present-day morphing technique, a perspective made even more explicit in the film Sine Curve Man.
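The morphing idea hinted at here can be boiled down to a cross-dissolve: each output pixel is a weighted average of the two source images, with the weight sliding from one image to the other. The sketch below is a deliberately simplified illustration (a full morph also warps the images so that features line up, which is omitted here); all names are illustrative assumptions.

```python
# A minimal sketch of the cross-dissolve at the heart of morphing:
# each output pixel mixes the same pixel from image A and image B,
# with t moving from 0 (all A) to 1 (all B) over the animation.
# Real morphing also warps both images so features align; that
# warping step is omitted in this simplified illustration.

def cross_dissolve(a, b, t):
    """Blend two pixel lists; t is the mix weight in [0, 1]."""
    return [(1.0 - t) * pa + t * pb for pa, pb in zip(a, b)]

# Halfway through (t = 0.5) each pixel is the average of the two.
frame = cross_dissolve([0.0, 1.0], [1.0, 0.0], 0.5)  # -> [0.5, 0.5]
```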
The 1970s were a landmark decade in the evolution of CGI in films. Inspired by the early experiments and advancing technology, CGI appeared in quite a few important feature productions.
In 1972, Ed Catmull produced a short film demonstrating computer-generated hands and faces. This was the world’s introduction to 3D animation, made possible by modeling the subject with 350 triangles and polygons and then digitizing it in an animation program.
Following this, in 1973, the widely appreciated film Westworld incorporated a scene (Gunslinger Vision) that came to be known as the first use of 2D computer animation in a feature film. The scene used 2D graphics to depict the vision of a robot. It was so popular that a sequel, Futureworld, was produced, which advanced the use of CGI further.
Futureworld used 3D animation for hands and faces along with 2D compositing. Its successful and convincing use of CGI led to the creation of an Academy Award for Scientific and Engineering achievement in films.
With the advent of digital filmmaking in Hollywood, the era of CGI began in earnest. Star Wars was among the first Hollywood blockbusters to use CGI. The trench-run briefing sequence used 3D wireframe animation to give comprehensive last-minute training to the Rebel Alliance, and, ultimately, to the audience. Filmmakers used similar wireframe-rendering techniques in films like The Black Hole and Alien.
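The wireframe look of those sequences comes from a simple perspective projection: each 3D vertex is mapped to a 2D screen point by dividing by its depth, and lines are drawn between the projected vertices. Below is a minimal illustrative sketch of that projection, not the actual technique used on those productions; the focal-length parameter `f` is an assumption for the example.

```python
# A minimal sketch of the perspective projection behind wireframe
# rendering: a 3D vertex (x, y, z) maps to a 2D screen point by
# dividing by depth z. Connecting the projected vertices with lines
# produces the wireframe. The focal length f is illustrative.

def project(vertex, f=1.0):
    """Project one 3D vertex onto the 2D screen plane."""
    x, y, z = vertex
    return (f * x / z, f * y / z)

# A vertex twice as far away lands proportionally closer to center,
# which is what gives wireframe models their sense of depth.
corner = project((2.0, 4.0, 2.0))  # -> (1.0, 2.0)
```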
Rolling into the 1980s, CGI went through extensive growth. Several films opted for some form of CGI, however minor, such as morphing, wireframe rendering, and 2D trick graphics. A landmark film was Tron, which featured roughly 15 minutes of computer-generated sequences, including the light-cycle scene.
Several other films overcame the complexities of CGI and explored it widely. This resulted in extensive use of CGI in the 80s and a seamless experience in the 90s. Films such as The Last Starfighter, Labyrinth, Luxo Jr., Rendez-vous in Montreal, Tin Toy, The Abyss, and Indiana Jones and the Last Crusade are among those that made remarkable achievements with CGI technology.
The 90s were all about seamless CGI. By then, many films worldwide were using CGI, and graphics companies were being founded in abundance. Audiences now found it difficult to differentiate real scenes from CGI, proving the flawless use of the new technology. Notable films include Terminator 2: Judgment Day, Jurassic Park, Radioland Murders, Toy Story, Marvin the Martian, Fight Club, and The Matrix.
With many CGI firsts, the 90s brought about a significant change in the perspective of films and cinema narratives. Filmmakers could see a bright future for CGI in the upcoming decades, but little did they anticipate just how frequently it would come to be used.
CGI in the 21st Century
It is no surprise that things evolve with the passing of time; they grow, just like human beings or trees. In the case of technology, though, the concept never dies; it keeps growing and spreading its branches. Such is the technology of CGI.
Today, in the third decade of the 21st century, CGI has evolved enormously. A billion-dollar industry stands upon the trunk of computer-generated graphics. Name a film, and one can scarcely separate its real scenes from the computer-generated ones. I like to take this as a boon of creativity and the essence of changing times.
With the advent of the 21st century, CGI technology went through exponential growth. One of the first films to rely almost entirely upon CGI was Avatar. Director James Cameron led the creation of photorealistic 3D characters and a photorealistic 3D world, and the film’s production originated the first virtual art department working in real time. Avatar became a hugely successful film worldwide, developing a reputation as a landmark of 3D cinema.
Films such as The Planet of the Apes, Life of Pi, the Avengers films, Gravity, Blade Runner 2049, Dune, and many more are strong examples of seamless, realistic computer-generated imagery. These films not only used CGI to develop characters but also conveyed emotion through it. Such work naturally amazes audiences, and it also makes them forget the difference between the real and the photorealistic, thereby marking its success.
To this day, CGI remains the go-to option for many filmmakers because of its cost-effectiveness and creative liberty. As a result, almost every mainstream film relies on CGI to some extent. For example, the Marvel Cinematic Universe and the DC Universe use CGI heavily and primarily shoot against green screens. In addition, the rising number of science fiction films creates more opportunities for CGI; as mentioned earlier, though, films of every genre use CGI in some amount.
Such is the state of the 21st century that CGI is effectively used to manipulate reality within a fictional world. Films use CGI to de-age actors, create identical twins, and give characters the physical features of their fictional parents when the narrative demands it. In such cases, this sanctioned manipulation for fictional purposes works wonders.
It is not always possible to assemble a cast that both shows the required resemblance to their fictional relations and offers outstanding acting skills. Making this possible used to be the job of makeup artists, but with changing times and advancing technology, CGI has made the work easier.
Along with that, several CGI practices make a stunt double accurately resemble the actor’s features. It is thus evident that every film of the 21st century, whether sci-fi, war, drama, Afro-futurism, or rom-com, uses CGI in some form or another, convincingly enough to leave audiences amazed.
Why did CGI become so popular?
One of the biggest reasons for CGI’s popularity is its cost-effectiveness. Filmmakers saw that an extraordinary cinematic experience could be delivered to the audience without building elaborate film sets. Such sets are not only expensive but also demand a substantial amount of time for construction and maintenance.
Filmmakers, and especially production companies, saw this as an opportunity to spend less and earn more. As audiences had already fallen in love with photorealistic visuals on screen, the rise of CGI into the 21st century was inevitable; it had to take over one day. Today most films are made using CGI because the audience loves it.
Another major factor in the adoption of CGI was the development of science fiction narratives. Most fictionalized science simply does not exist in the real world, so for creating those elements on the big screen, CGI stood as an easy answer.
With the advent of CGI, filmmakers also gained a new kind of creative liberty. Thoughts and ideas never cease to expand, and making them a reality became easy with CGI. The alien beings in Avatar, the highly expressive apes in The Planet of the Apes, Davy Jones in Pirates of the Caribbean, and Hulk and Thanos in the Avengers movies are all examples of theoretical ideas being converted into practical subjects on screen.
For visuals that could otherwise swallow millions of dollars and resources, CGI emerged as a low-cost, highly effective alternative. CGI has made all of this possible, thereby earning its popularity among production houses and audiences alike.
Future of CGI
We know that CGI has evolved substantially over the years, and the world has witnessed plenty of firsts in films with respect to CGI. It is tempting to think of this as the final stage, one where every technology deriving from CGI has been explored, but the world keeps moving and bringing new challenges. So it shouldn’t be a surprise if the future brings us a new derivative of CGI: a better, more convincing one.
Nonetheless, analysts have argued that the exploratory stage of CGI is over. Because CGI is a cost-effective alternative to building real sets, it has become a rigorously used tool for boosting profit margins. Moreover, with the rise of digitalization, anyone can learn CGI, especially when all you need is a computer, software, and the will to learn.
This has certainly catered to the growing demands of the CGI industry, but it also poses a risk. Our eyes are losing their grip on reality: every time we see a film, our senses urge us to hunt for hidden computer-generated graphics, even when no synthetic graphics were used.
Whether you are a trained filmmaker or an ordinary high school student, CGI cannot escape you. A trained filmmaker may make the greatest CGI film ever, but it is equally possible that the high school student creates a fake video to spread falsehoods.
Taking note of all these aspects, however, the decline of CGI seems far away, well beyond the horizon. The world has witnessed the magic of CGI and isn’t ready to let it go yet.
Many technological advances have inspired the world to bring out the best in them, and with CGI, perhaps the best is yet to come. The world of CGI is exciting and in high demand, and hordes of students enroll in courses to learn the comprehensive aspects of CGI and 3D filmmaking.
However, it is best not to forget that the technology persists only as long as storytellers choose it. It is the director’s decision what CGI will be used for and how, because eventually, everything comes down to telling a story: a story that scriptwriters write and filmmakers visualize.