CGI is getting uncomfortably real for me.
It may shock you, but the forefront of 1968’s electronic innovation was a 2D image of a cat that could move. At the time, the ability to completely computer-generate and move an image in real time was nothing short of a scientific marvel. Fast forward fifty years, and the technical innovations born out of computers are simply overwhelming.
You might think “hey, this is pretty exciting!” But we are actually living through the worst period of CGI…
Enter the uncanny valley.
Before we delve much further, let’s just point out what the ‘Uncanny Valley’ actually is. Have you ever looked at something that is made to look lifelike, yet looks unsettlingly not real?
No examples come to mind? Take a look at this cutscene from a videogame called “Medal of Honor: Warfighter”:
Now, there’s obviously a lot of effort behind this to make it look as real as possible, but don’t you just think “ahh, something’s not quite right”?
Want a more extreme example? Take this video of an “ultra-realistic” dental training robot:
So, why is this so darn weird to look at?
In short, our brains are great at projecting human characteristics onto objects that we know are not human. Apparently, we do this because it gives us a sense of familiarity and allows us to accept what we are seeing.
But when the object is already ‘humanized’ for us, our brains go into overdrive, searching for imperfections and rejecting those familiar characteristics.
This is why a car’s headlights, body shape and detailing will give it personality, but a porcelain doll will give you nightmares.
Now, what’s great about all this is that you can pinpoint the spot that makes you uncomfortable. Where is this point? Please examine this very scientific graph from filmscape.com:
Based on this graph, you might think the valley doesn’t seem too large a gap, but keep in mind there is no sense of scale here. The simple fact is, we don’t exactly know how wide the valley is, meaning we have a lot of cringeworthy attempts to get through.
This isn’t an entirely new concept either.
Humans have long been fascinated with replicating humans, or human-like behaviour, in order to take the burden off our species. In fact, the whole robotics industry is centred around that very principle.
Sci-fi media has long publicised the innate fear we humans have of technology overtaking us. Gray and Wegner discussed the possibility that this fear is the direct cause of our rejection of overly realistic CGI.
So where does that leave us?
Historically, graphic modellers have gone one of two ways – stylization or realism.
Stylization consists of limiting detail in order to emphasise things like colour, shape and form. As consumers, we are generally more receptive to this type of graphic modelling, and it is considered a safe bet. Think this guy:
On the other hand is realism, which attempts to mimic lifelikeness and pushes us closer to finally leaving the uncanny valley. However, the pursuit of realism is a) time-consuming, b) grotesquely expensive and c) ages horribly. Not to mention, as we’ve discussed, it is super easy to stuff up making something look realistic.
But there is light on the horizon.
Recent blockbuster movies have edged dangerously close to finally climbing out of the valley.
A recent example of this is the complete recreation of the long-deceased actor Peter Cushing in the 2016 film “Rogue One: A Star Wars Story”. Animators spliced his entire face onto another actor (Guy Henry) and gave the character a central role in the film.
Take a look.
I know what you’re thinking – it’s actually not that bad to look at. It highlights just how close we are to achieving true realism.
But this also opens a whole other can of worms (e.g. digitally resurrecting a dead actor to play a part in a movie), which will be discussed in Part 3 of the “Electronic Arts” series.