Monday, November 24, 2014

Computer generated imagery.

Since computers were invented, they have been used to display mathematical formulae. I remember getting my first computer--a little palm-sized sucker that had a whopping 3 KB of memory and could store something on the order of 100 lines of code. It was really just a fancy calculator, but it did print out a neat visualization of a wormhole. Then came the personal computers--the Apple II+, the TI-80s, and the IBMs. It was a great time to watch graphics grow more and more impressive. I especially remember two pictures created on Silicon Graphics workstations: one of a billiard table with amazingly rendered balls (they even developed a special process to put scuffs and dirt on the surfaces) and another of a wet mountain road. Such simple pictures are nothing these days, but they were incredibly impressive back then.
 
Since those early days, computer graphics have advanced to the point of near-perfect realism. Given a bit of rendering time, modern graphics can replicate reality almost flawlessly. Even console gaming systems, like the PS4 or Xbox One, can display live images that feel pretty real--they're even making the eyes, long the fakest-looking part, seem far more convincing now. The time will come when we finally won't be able to tell the difference.
 
Probably the final frontier is when movies have computer-generated actors who are just as convincing as real-life actors. Motion capture is highly developed now, giving us the phenomenal Gollum as well as the Na'vi from Avatar. So the foundation is there, and it'll only take a bit longer before we can create literally anything we can imagine. Virtual reality (still very far from being good) is the step after that, and then we'll have the Star Trek Holodecks we've all been craving! I can't wait. Pardon me while I do my happy dance...
 
- M
