The History of Computer-generated Imagery

By Shane Staret on 2017-11-11

Computers haven’t even been around for all that long, with the first practical computers being built in the 1940s. That is just over seventy years ago, meaning there are tons of people (including my grandparents) who were around before modern computing devices were a thing. Yet now we have supercomputers that can beat anyone at chess and AI that we can talk to. It is remarkable how quickly humans have managed to advance technology. One of the most notable things that computers can do is create 3D animations.



The above picture is almost entirely computer generated: the audience, the background, the water, and of course the shark and the dinosaur themselves.

Computer-generated imagery (CGI) has been incredibly influential over the past few decades, changing the way TV shows, movies, and games are made. Hell, even advertisements use CGI, as basically anything you can imagine can be created with a computer. But where did it all start? Clearly, major advancements had to be made for computer-generated images to become this realistic, and there has not been much time to make them. So just how has CGI managed to advance so far in such a short amount of time?

Well, to find that out we have to go back to the 60s. Computers were still in a fairly primitive state, but many advancements had been made. In 1961, the first computer-generated animation was produced: a rendering of a planned highway, created at the Swedish Royal Institute of Technology. Judging by the surviving footage, it was not visually impressive, but it was revolutionary considering that a computer had been used to animate an entire highway. Not much more was done with computer animation in the 60s, with the only other notable breakthrough coming from the Soviet Union, where computer scientists managed to animate a 2D cat on their BESM-4 computer.

Then, in the early 70s, raster graphics were developed. This made it much easier to create graphic images on a computer, as an image could be represented as a 2D grid, with each 1x1 box holding a single “pixel” of the image. Your computer monitor is almost certainly using some kind of raster system to display this article right now. It is crazy to think that something created in the 70s is still being used in 2017, but that just shows that raster graphics were simply ahead of their time.
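The raster idea is simple enough to sketch in a few lines of code. This is only a minimal illustration (in Python, with made-up names like `set_pixel`), not how any real display driver works: an image is just a 2D grid of cells, and drawing means setting individual cells.

```python
# A minimal sketch of the raster idea: an image is a 2D grid
# (rows x columns) where each cell holds one pixel's value.
# Here 0 means "off" and 1 means "on"; a real display stores color values.

WIDTH, HEIGHT = 8, 5

# Start with a blank raster: every pixel off.
raster = [[0 for _ in range(WIDTH)] for _ in range(HEIGHT)]

def set_pixel(grid, x, y, value=1):
    """Turn on the pixel at column x, row y."""
    grid[y][x] = value

# "Draw" a diagonal line by setting individual pixels.
for i in range(5):
    set_pixel(raster, i, i)

# Render the grid as text: '#' for an on pixel, '.' for off.
for row in raster:
    print("".join("#" if px else "." for px in row))
```

Running this prints a small grid with a diagonal of `#` characters, which is exactly what a monitor does at much higher resolution: it scans the grid and lights each pixel according to the value stored there.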


The 80s saw many advances in CGI. In 1985, Lucasfilm’s effects division created what is widely considered the first fully computer-generated character to appear in a feature film: a stained-glass knight in the movie Young Sherlock Holmes. The character was on screen for only a few seconds, but it was completely computer generated. Below is that rather convincing “knight”.


Many movie producers and developers were beginning to see the advantages of using computer-generated animation for effects rather than relying solely on practical effects. Certain effects (like blood splatter) could easily be added through CGI in post-production, whereas practical versions might have to be shot over and over again to get a scene up to standard. The 90s then saw many “firsts” for CGI, most notably Toy Story (1995), the very first full-length computer-animated film. Over the past couple of decades, CGI has advanced to the point where it can be nearly indistinguishable from reality, as in the image from Jurassic World (2015) at the top of this article.

But CGI does not just belong in the film industry. The invention and further innovation of computer generated imagery is the only reason why video games exist. Considering the video game entertainment industry has been booming for quite some time now and grossing billions of dollars every year, it is safe to say that CGI has been a very important modern invention.

CGI has also been used to model architecture, design medical devices, and even visualize both the smallest elements of the universe and the largest ones. Essentially, CGI can be used to visualize anything that you can imagine. Literally. There is virtually no limit to what can be animated with computers, which is astounding considering there are people still alive who were born before computers existed.

Just how far can we go? This video shows a recent CGI sample of a super realistic human. I think it is safe to say that we are past the uncanny valley stage of animating humans.

Personally, I think it would be amazing if we could push virtual reality to the point where it is nearly impossible to tell whether you are inside it or not. Maybe it will advance so far that we can live our lives in a virtual reality...where all our dreams come true and anything is possible. I know that to some people that would sound like a dystopia, with society collectively giving up on improving real life in favor of a fantasy one...but you have to admit that it sounds kind of cool.

