Forest Creatures & Anatomical Features

This week I thought I’d tell everyone a little bit about the background of the Axon artists/designers, including myself (Tabby), and talk about a recent event we participated in. Kristina, Albert and I all graduated from the Biomedical Communications Master’s program at the University of Toronto back in 2010. Cool, you’re thinking, but what the heck is “biomedical communications”? We like to tell people we’re medical illustrators – meaning we are proficient in textbook (particularly anatomical) illustration – but that really only scratches the surface of what we do (and has very little to do with our roles at Axon). Literally the first assignment is a medical illustration.

The BMC program is divided into two streams. Kristina and I were in the first stream, interactive media, while Albert studied the second, 3D animation. Everything we studied focused on visual communication for medicine and science, but the two streams go about it differently. For our Master’s Research Projects, Kristina and I designed and programmed Flash games (hers was about arthropods, meant for a museum kiosk, while mine focused on food-borne illness and was directed at teens). Albert made an animation describing axillary lymph node dissection, a surgical procedure. In the interactive stream we studied web design, information design, UI/UX, and educational design. Meanwhile, the animation students learned about cinematography, compositing, scriptwriting, storyboarding, and 3D modelling and animation.

Screenshots of our MRPs (L to R: Kristina’s, Tabby’s, Albert’s).

With our technical art backgrounds, you can see why our focus at Axon is on designing visual solutions, mostly for medical education – it’s our specialty! …But we also have a whimsical side. After all,...
Dev Blog: Virtually Skinning Virtual Elephants

This is Albert Fung, and I’m responsible for generating the 3D models in the upcoming Quench game. I’d like to start with an overview of our 3D workflow. Generally speaking, I receive concept sketches and pre-production specifications from our team members and start building rough 3D models in Cinema4D, our choice of 3D production software. I then add lights and colors to the model (as seen in Fig. 1) to enhance its appearance, and import the model into our game engine, Unity3D.

Fig 1. Test renders made in Cinema4D – these will be hard to replicate in Unity3D.

The problem is that the lights and colors do not necessarily translate across platforms. Cinema4D can produce fascinating renders for linear animations, based on complex calculations that simulate physical light in a virtual environment. These calculations, however, are far too resource-intensive for real-time rendering – imagine having to simulate the light interaction between a herd of animated elephants and a vast terrain object, all at a rate of 60 frames per second. If we tried to do that, the game simulation would grind to a halt, players would experience lag, or, in the worst case, the game would crash outright.

Fig 2. Flat 2D files that document the rendering/lighting data seen in Fig 1.

For that reason, I’m experimenting with baking textures and importing them as flat 2D files into Unity3D. The intense calculations are done in Cinema4D, and the finalized data is brought into Unity3D. The game engine wouldn’t need to do the heavy-duty calculations, but our models will retain the high...
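To give a rough idea of what this looks like on the Unity3D side, here is a minimal C# sketch of applying a pre-baked texture to a model with an unlit material, so the engine simply samples the baked lighting instead of recomputing it every frame. The script name, texture path, and the use of Unity’s built-in Unlit/Texture shader are illustrative assumptions, not our actual project setup.

```csharp
using UnityEngine;

// Minimal sketch: apply a texture baked offline (e.g. in Cinema4D) to this object.
// An unlit shader displays the baked lighting directly, with no per-frame light math.
public class ApplyBakedTexture : MonoBehaviour
{
    // Hypothetical path under a Resources folder, e.g. Assets/Resources/Baked/elephant_diffuse.png
    public string bakedTexturePath = "Baked/elephant_diffuse";

    void Start()
    {
        Texture2D baked = Resources.Load<Texture2D>(bakedTexturePath);
        if (baked == null)
        {
            Debug.LogWarning("Baked texture not found: " + bakedTexturePath);
            return;
        }

        // Swap in an unlit material so the baked map is shown as-is at runtime.
        Renderer rend = GetComponent<Renderer>();
        rend.material.shader = Shader.Find("Unlit/Texture");
        rend.material.mainTexture = baked;
    }
}
```

The point is simply that all the expensive light simulation happens offline; at runtime the engine only has to display the result.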