Dev Blog: Virtually Skinning Virtual Elephants

This is Albert Fung, and I’m responsible for generating the 3D models in the upcoming Quench game. I’d like to start with an overview of our 3D workflow. Generally speaking, I receive concept sketches and pre-production specifications from our team members and start building rough 3D models in Cinema4D, our 3D production software of choice. I then add lights and colors to the model (as seen in Fig. 1) to enhance its appearance, and import the model into our game engine, Unity3D.

Fig 1. Test renders made in Cinema4D – these will be hard to replicate in Unity3D

The problem is that the lights and colors do not necessarily translate across platforms. Cinema4D can produce fascinating renders for linear animations, based on complex calculations that simulate physical light in a virtual environment. These calculations, however, are too resource-intensive for real-time rendering – imagine having to simulate the light interaction between a herd of animated elephants and a vast terrain object, all at a rate of 60 frames per second. If we tried to do that, the game simulation would grind to a halt: gamers would experience lag or, in the worst case, a complete crash.

Fig 2. Flat 2D files that document the rendering/lighting data seen in Fig 1.

For that reason, I’m experimenting with baking textures and importing them as flat 2D files into Unity3D. The intense calculations are done in Cinema4D, and the finalized data is brought into Unity3D. The game engine wouldn’t need to do the heavy-duty calculations, but our models will retain the high...
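To make the "too resource-intensive" point concrete, here is a back-of-the-envelope sketch of the frame budget at 60 fps. The per-frame cost figures are illustrative assumptions for the sake of the comparison, not measurements from Quench or either tool:

```python
# Frame-budget arithmetic: why offline-quality light simulation
# cannot run in real time, while sampling baked textures can.
# All cost figures below are illustrative assumptions.

TARGET_FPS = 60
frame_budget_ms = 1000.0 / TARGET_FPS  # ~16.7 ms available per frame

# Hypothetical per-frame lighting costs (milliseconds) for a herd of
# elephants plus a vast terrain object, under two strategies:
offline_style_lighting_ms = 900.0  # simulated physical light (assumed cost)
baked_texture_lookup_ms = 0.5      # sampling pre-baked 2D textures (assumed cost)

def fits_budget(cost_ms: float) -> bool:
    """True if the lighting cost leaves room inside one frame."""
    return cost_ms < frame_budget_ms

print(f"Frame budget at {TARGET_FPS} fps: {frame_budget_ms:.1f} ms")
print("Simulated lighting fits:", fits_budget(offline_style_lighting_ms))
print("Baked textures fit:", fits_budget(baked_texture_lookup_ms))
```

Under these (assumed) numbers, the simulated-lighting path overshoots the budget more than fifty-fold, which is exactly the stall described above; the baked-texture path is effectively free at runtime because the expensive work already happened in Cinema4D.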