Isn’t it easy to be convinced by the stunning visuals on the big screen? Don’t the alien beings in Avatar or the sandworm in Dune feel real enough to captivate the audience? In truth, they only look real, crafted to appear natural by experts sitting behind computer screens.
The technology we will discuss is CGI, or Computer-Generated Imagery, in films. There is no better time to survey the vast and ever-growing world of digital creation: with new trends emerging constantly, we can spot CGI in almost every film of the 21st century.
Computer-generated images appeared even in earlier films, but their use is reaching its peak today. Before we look at the many films that use computer graphics to enhance their cinematic ambiance, we must clearly understand what CGI animation means.
What is CGI in movies?
CGI stands for Computer-Generated Imagery: the practice of creating digital graphics, either on top of existing footage or on a blank canvas. With technological advancement, CGI has made both 2D and 3D graphics possible in motion pictures.
Everything from animation to VFX to 3D filmmaking falls under computer-generated graphics. In simple terms, CGI is the technology used to build animated, enhanced, or otherwise impossible images on a digital screen. The images may be layered over an existing piece of art, such as a photograph, drawing, or video, and are created using computer software.
How is Computer Generated Imagery done?
As intriguing as it may seem, the art of CGI is not easy. As computer-generated imagery rises, so does the demand for perfection. A team of specialized departments contributes to a step-by-step production process:
Step 1: Initial Visualisation
Step 2: Animation
Step 3: Matchmove
Step 4: FX Simulation
Step 5: Lighting
Step 6: Rotoscoping
Step 7: Compositing
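The steps above form an ordered pipeline, each shot passing from one department to the next. A toy sketch of that flow (the `Shot` record and stage handling here are invented for illustration, not real production software):

```python
from dataclasses import dataclass, field

# Stage names follow the step list above, in production order.
STAGES = [
    "visualisation", "animation", "matchmove",
    "fx_simulation", "lighting", "rotoscoping", "compositing",
]

@dataclass
class Shot:
    name: str
    completed: list = field(default_factory=list)

def run_pipeline(shot):
    """Pass a shot through every stage in order."""
    for stage in STAGES:
        # A real pipeline hands the shot to a specialised department here;
        # this sketch just records that the stage was done.
        shot.completed.append(stage)
    return shot

shot = run_pipeline(Shot("shot_042"))
```

The point of the ordering is that each department builds on the previous one: animation needs the initial visualisation, lighting needs the simulated effects, and compositing comes last because it assembles everything into the final frame.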
This process drives modern, evolved computer-generated graphics. But before audiences could enjoy the seamless experience of CGI in films, the technology underwent a long evolution. So let’s turn to the history of CGI and the pioneers who made it possible for modern filmmakers to use it as an effective storytelling tool.
History of Computer-Generated Images in Films
We can trace the history of CGI in films back to 1958, with the release of Alfred Hitchcock’s mystery thriller Vertigo. Although the computer-generated imagery appeared only in the title sequence rather than the film itself, it was a milestone that opened a new perspective on how technology could enhance a narrative.
Experimentation with computer graphics began in the 1940s, with mechanical animation techniques built on customized devices repurposed from anti-aircraft machinery. The pioneer of these experiments was John Whitney, who later collaborated with American graphic designer Saul Bass to produce the state-of-the-art opening credits of Vertigo.
The opening sequence consisted of a spiral animation that aptly evoked the feeling of vertigo, a sensation of whirling that leads to loss of balance. It was made by combining a gun director’s rotation with a pendulum carrying an attached pen, which painted the spirals as the mechanism turned. With this, the first successful use of CGI in films was documented.
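The mechanics described above, a steady rotation combined with a swinging pendulum-mounted pen, can be approximated with two parametric terms: a growing radius for the rotation and a sinusoidal wobble for the pendulum. A rough sketch, with parameter values invented purely for illustration:

```python
import math

def spiral_points(steps=2000, turns=8, swing=0.1, swing_freq=9):
    """Approximate a Whitney-style title spiral.

    The rotation sweeps the angle; the pendulum adds a small sinusoidal
    oscillation to the pen's radius, producing nested, wobbling spirals
    reminiscent of the Vertigo credits.
    """
    points = []
    for i in range(steps):
        t = i / (steps - 1)
        theta = 2 * math.pi * turns * t               # mechanical rotation
        r = t + swing * math.sin(swing_freq * theta)  # pendulum wobble
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Plotting these points (for instance with matplotlib) yields a spirograph-like curve; changing `swing_freq` changes how many lobes the spiral develops per revolution.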
In 1961, John Whitney produced Catalog, the first computer-animated experimental film. Though it can hardly be called a film in the conventional sense, it was a milestone in the evolution of CGI on the big screen. In all honesty, it resembles a modern computer screensaver, but it was an awe-inspiring visual in its day.
A few years later, in 1967, Charles Csuri and James Shaffer animated Hummingbird, a film in which lines blend into images without any cuts in the scene. It hinted at the present-day morphing technique, a perspective made more explicit in the film Sine Curve Man.
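The morphing idea hinted at here, one drawing flowing into another without a cut, can be reduced to interpolating between two sets of corresponding points. A minimal sketch, with the shapes invented for the example:

```python
def morph(shape_a, shape_b, t):
    """Blend two shapes given as equal-length lists of (x, y) points.

    At t = 0 the result is shape_a, at t = 1 it is shape_b; intermediate
    values of t give the in-between frames of the morph.
    """
    return [((1 - t) * xa + t * xb, (1 - t) * ya + t * yb)
            for (xa, ya), (xb, yb) in zip(shape_a, shape_b)]

# Four corresponding points of a square morphing into a diamond.
square = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
diamond = [(0, -1), (1, 0), (0, 1), (-1, 0)]
halfway = morph(square, diamond, 0.5)
```

Rendering `morph(...)` for a sequence of `t` values between 0 and 1 produces the smooth line-blending effect; modern morphing adds correspondence matching and image warping on top of this same interpolation.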
The 1970s were a landmark in the evolution of CGI in films. Inspired by the early experimentation and advancing technology, CGI appeared in quite a few significant feature productions.
In 1972, Ed Catmull produced a short film demonstrating a computer-generated hand and face. This was the world’s introduction to 3D animation, made possible by drawing roughly 350 triangles and polygons onto a physical model and then digitizing them in an animation program.
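The technique described above, approximating a surface with triangles and digitizing their coordinates, is still how 3D models are stored today: a shared vertex list plus triangles that index into it. A minimal sketch using an invented four-triangle tetrahedron rather than the original 350-polygon hand:

```python
# Shared vertex list: each vertex is an (x, y, z) coordinate.
vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
            (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]

# Triangles reference vertices by index, so shared corners are stored once.
triangles = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]

def face_normal(tri):
    """Cross product of two edges gives the triangle's (unnormalised)
    normal vector, which renderers use to shade the surface."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = (vertices[i] for i in tri)
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    return (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
```

Storing indices instead of repeating coordinates is what made it practical to digitize a model point by point: each measured vertex is entered once and reused by every triangle that touches it.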
Following this, in 1973, a widely appreciated film, Westworld, incorporated a scene (Gunslinger Vision) that came to be known as the first use of 2D computer animation in a film. The scene used 2D graphics to depict the vision of a robot. It was so popular that a sequel, Futureworld, was produced that advanced the use of CGI.
Futureworld used 3D animation for hands and faces alongside 2D compositing. Its successful and convincing use of CGI helped establish computer graphics as an award-worthy scientific and engineering achievement in filmmaking.
With the advent of digital filmmaking in Hollywood, the era of CGI truly began. George Lucas’s Star Wars was among the first Hollywood films to use digital CGI: the trench-run briefing sequence used 3D wireframe animation to give last-minute training to the Rebel Alliance and, ultimately, the audience.
Rolling into the 1980s, CGI went through extensive growth. Several films adopted CGI techniques, however minor, such as morphing, wireframe rendering, and 2D trick graphics. Tron was a landmark film, with fifteen minutes of computer-generated sequences including the famous light-cycle race.
Several other films overcame the complexities of CGI and explored it widely, leading to extensive use in the ’80s and a seamless experience in the ’90s. The Last Starfighter, Labyrinth, Luxo Jr., Rendez-vous à Montréal, Tin Toy, The Abyss, and Indiana Jones and the Last Crusade are among the titles that made remarkable achievements with CGI technology.
The ’90s were all about seamless CGI and visual effects. Many films now used CGI, and graphics companies were founded in abundance. Audiences found it increasingly challenging to distinguish natural scenes from CGI animation, proving how flawless the new technology had become.
Notable films are Terminator 2: Judgment Day, Jurassic Park, Radioland Murders, Toy Story, Marvin the Martian, Fight Club, and The Matrix.
Animated movies like Toy Story were exceptional achievements: it was a feature film created entirely from computer-generated imagery, building a virtual world out of CGI elements and dynamic images.
With so many CGI firsts, the ’90s brought a significant change in how films and cinematic narratives were perceived. Filmmakers could see a bright future for CGI in the coming decades, but few anticipated just how frequently they would come to use it.
Computer Graphics in the 21st Century
It is no surprise that things evolve; they grow like human beings or trees. Technology, however, never dies: it keeps growing and spreading its branches. Such is the technology of CGI.
In the third decade of the 21st century, CGI has evolved beyond recognition. A billion-dollar industry stands upon the trunk of computer-generated graphics. Name a film, and you will struggle to separate its natural scenes from the computer-generated ones. I like to see this as a boon of creativity and the essence of changing times.
With the advent of the 21st century, CGI technology went through exponential growth. One of the first live-action films to rely so extensively on CGI was Avatar, in which director James Cameron led the creation of photorealistic 3D characters in a photorealistic 3D world.
The film’s production originated the first virtual art department working in real time. Avatar became hugely successful worldwide and earned a reputation as a landmark of 3D cinema.
Films such as Planet of the Apes, Life of Pi, the Avengers films, Gravity, Blade Runner 2049, Dune, and many more are strong examples of seamless, realistic computer-generated imagery, using CGI to build characters and convey emotion through them.
Naturally, such work amazes the audience, but it also makes them forget the difference between the real and the photorealistic, thereby marking the technology’s success.
CGI, As Seen Today
Currently, CGI and animated films are go-to options for many filmmakers because of their cost-effectiveness and creative liberty. As a result, almost every mainstream film relies on CGI to some degree.
For example, the Marvel Cinematic Universe and the DC Universe heavily use CGI and primarily shoot with a green screen.
In addition, the rising number of science-fiction films gives CGI many opportunities, though, as mentioned earlier, films of every genre use CGI to some extent.
Such is the case in the 21st century, when CGI can effectively manipulate reality within a fictional world. Films use CGI to de-age actors, create identical twins, and give characters a family resemblance essential to the narrative. Such manipulation, deployed for fictional purposes, does wonders.
It is not always possible to assemble a cast that both resembles its fictional relations and offers outstanding acting skills. This work once fell to makeup artists, but with changing times and advancing technology, CGI has made it far easier.
Additionally, several CGI practices make a stunt double accurately resemble the actor’s features. It is evident that every film of the 21st century, whether sci-fi, war, drama, Afrofuturism, or rom-com, uses CGI in some form, convincingly enough to leave audiences amazed.
Why did CGI become so popular?
One of the biggest reasons for CGI’s popularity is its cost-effectiveness. Filmmakers found that an extraordinary cinematic experience could be delivered without building elaborate film sets, which are expensive and demand substantial time for construction and maintenance.
Production companies in particular saw this as an opportunity to spend less and earn more. Since audiences had already fallen in love with photorealistic visuals on screen, the rise of CGI into the 21st century was inevitable; it had to take over one day. Today, most films are made using CGI because audiences love it.
Another major factor in the adoption of CGI was the growth of science-fiction narratives. The worlds and devices of science fiction generally do not exist in reality, so CGI stood as the easy answer for creating them on the big screen.
With the advent of CGI, filmmakers also discovered a new kind of creative liberty. Ideas never cease to expand, and making them a reality became easy with CGI.
The alien beings in Avatar, the highly expressive apes in Planet of the Apes, Davy Jones in Pirates of the Caribbean, and Hulk and Thanos in the Avengers films are all examples of theoretical ideas converted into practical subjects on screen.
For visuals whose physical construction could swallow millions of dollars and vast resources, CGI became a low-cost, highly effective alternative, claiming its popularity among production houses and audiences alike.
The Future of CGI
We know that CGI has evolved substantially over the years, and the world has witnessed plenty of firsts in film because of it. It is tempting to think of this as the peak, a level at which every technology deriving from CGI has been explored, but the world keeps moving and bringing new challenges.
So it shouldn’t be a surprise if the future brings us a new derivative of CGI, a better, more convincing one.
Nonetheless, analysts have argued that the exploratory stage of CGI is over. As a cost-effective alternative to building real sets, it has become a tool used rigorously to protect profit margins.
Moreover, with the rise of digitalization, anyone can learn the craft of CGI; all you need is a computer, the software, and the will to learn.
This has met the growing demands of the CGI industry, but it also carries a threat: our eyes are slowly losing their grip on what is real.
Every time we watch a film, our senses hunt for hidden computer-generated graphics, even when no synthetic graphics are used.
Whether you are a trained filmmaker or a high-school student, CGI cannot escape you. A trained filmmaker may make the most fantastic CGI film ever, but it is equally possible for a high-school student to create a fake video that spreads falsehoods.
Taking all of this into account, however, the decline of CGI seems far off, somewhere beyond the horizon. The world has witnessed the magic of CGI movies and isn’t ready to let it go yet.
Many technological advances have inspired the world to bring out their best, and with CGI, perhaps the best is yet to come. The world of CGI remains exciting and in high demand, and hordes of students enroll in courses to learn the comprehensive aspects of CGI and 3D filmmaking.
It is best not to forget, however, that the technology persists only as long as storytellers choose it. The director decides what CGI will be used and how, because everything ultimately comes down to telling a story that scriptwriters write and filmmakers visualize.