by Jeff | Jun 12, 2015 | Axon, Design/Development, Featured, Games, Our Games, Quench |
Since entering into production for Quench, we’ve been building out all of the gameplay by making the world much more dynamic and complex than our prototype ever was. Not only can you use your rain power to create water and bring the map to life, but wind can shift sands to uncover hidden objects, and lightning can start fires that quickly become uncontrollable under the wrong circumstances. Many of these features are brand new and still very programmer-art-iffic. We have a Grand Plan™ to soon move all of these systems’ rendering into 3D with awesome particle effects, but at the moment everything is a black-and-white demonstration of data flowing through the map that we use to help us debug. Don’t let that scare you, though. We just want to give an insider’s look at how we get our work done before we have the assets to see the final product in all of its shimmering polygonal glory. Until then, we’d love for you to take a look at a couple of short videos about fire and how it behaves, and a short discussion of some of the tools we’re using to help us design the behaviour of our environment even with so few assets to render what’s going on.

Only YOU Can Prevent Forest Fires

In this video, Jeff shows off the environment simulation with regard to fire and the spread of heat in a level, including how fire reacts with water (rain, surface and groundwater), how fire moves with wind, and how ash is produced. Currently the simulation is not associated with 3D assets...
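The videos show the simulation rather than the code, but for readers curious how heat spread, wind bias, moisture, and ash production of this kind are usually wired together, here’s a minimal cellular-automaton sketch in C#. Everything below — the names, constants, and update rules — is our illustrative guess, not Quench’s actual implementation.

```csharp
// Hypothetical sketch of grid-based fire/heat spread -- not Quench's real code.
// Each tick: heat diffuses to neighbours (biased downwind), moisture suppresses
// ignition, and burning cells consume fuel until only ash remains.
using System;

enum Ground { Vegetation, Burning, Ash, Water }

class FireSim
{
    const float IgnitionHeat = 0.6f; // heat at which dry vegetation ignites (made-up tuning value)
    const float BurnRate = 0.1f;     // fuel consumed per tick while burning
    const float Diffusion = 0.15f;   // fraction of heat shared with each neighbour

    readonly int w, h;
    public Ground[,] Type;
    public float[,] Heat;     // 0..1 heat level per cell
    public float[,] Fuel;     // remaining burnable material
    public float[,] Moisture; // rain/surface water raises this, suppressing ignition

    public FireSim(int width, int height)
    {
        w = width; h = height;
        Type = new Ground[w, h];
        Heat = new float[w, h];
        Fuel = new float[w, h];
        Moisture = new float[w, h];
    }

    public void Step(float windX, float windY)
    {
        var next = new float[w, h];
        for (int x = 0; x < w; x++)
            for (int y = 0; y < h; y++)
            {
                // Burning cells generate heat and consume fuel; spent fuel becomes ash.
                if (Type[x, y] == Ground.Burning)
                {
                    Heat[x, y] = 1f;
                    Fuel[x, y] -= BurnRate;
                    if (Fuel[x, y] <= 0f) Type[x, y] = Ground.Ash;
                }

                // Diffuse heat to the four neighbours, weighted toward downwind cells.
                foreach (var (dx, dy) in new[] { (1, 0), (-1, 0), (0, 1), (0, -1) })
                {
                    int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    float windBias = 1f + Math.Max(0f, dx * windX + dy * windY);
                    next[nx, ny] += Heat[x, y] * Diffusion * windBias;
                }
                next[x, y] += Heat[x, y] * (1f - 4f * Diffusion);
            }

        for (int x = 0; x < w; x++)
            for (int y = 0; y < h; y++)
            {
                // Water cools completely; elsewhere moisture damps the heat and
                // slowly boils off before a cell can catch fire.
                Heat[x, y] = Type[x, y] == Ground.Water
                    ? 0f
                    : Math.Max(0f, next[x, y] - Moisture[x, y] * 0.05f);
                Moisture[x, y] = Math.Max(0f, Moisture[x, y] - Heat[x, y] * 0.02f);
                if (Type[x, y] == Ground.Vegetation &&
                    Heat[x, y] > IgnitionHeat && Moisture[x, y] <= 0f)
                    Type[x, y] = Ground.Burning;
            }
    }
}
```

The appeal of this structure for debugging matches what the post describes: every quantity in the simulation is just a number per cell, so it can be rendered directly as the black-and-white data view long before any 3D assets exist.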
by Kristina Neuman | May 29, 2015 | Axon, Design/Development, Featured, Games, Our Games, Quench |
Kristina here! I’m responsible for a lot of the 2D art and 2D animation for Quench. In particular, I’m working on the animated cinematics that will play at the beginning and end of the game, as well as between some of the levels. In this blog post I’ll describe my process for making these cinematics, focusing mainly on storyboarding.

Overall Process

There are many steps involved in creating the cinematics for Quench. Before I delve into more detail about storyboarding, here’s a quick overview of the whole process:

Figure 1: Flow chart of how we’re making the cinematics for Quench.

While I’ll likely go into more detail about the final few steps of this process in a future blog post, for now let’s focus on steps 1-3.

What’s Storyboarding?

Storyboards are graphic organizers that visually tell the story of an animation (or film) panel by panel, kind of like a comic book. Rather than jumping directly into making the animation, starting with storyboards helps us pre-plan the major actions, time everything out, and (at least in our case) get an idea of the assets that will have to be made for the animation. Storyboarding saves money in the long run because it is much easier to make changes and fix mistakes at this stage than once the animation has been put together. A storyboard will likely convey some of the following information:

What characters and objects are in the frame, and how are they moving?
What are the characters saying to each other, if anything? Is there any narration?
How much time has...
by Albert Fung | Apr 10, 2015 | Design/Development, Featured, Quench |
This is Albert Fung, and I’m responsible for generating the 3D models for the upcoming Quench game. I’d like to start with an overview of our 3D workflow. Generally speaking, I receive concept sketches and pre-production specifications from our team members and start building rough 3D models in Cinema4D, our choice of 3D production software. I then add lights and colors to the model (as seen in Fig. 1) to enhance its appearance, and import the model into our game engine, Unity3D.

Fig 1. Test renders made in Cinema4D – these will be hard to replicate in Unity3D

The problem is that the lights and colors do not necessarily translate across platforms. Cinema4D can produce fascinating renders of linear animations, based on complex calculations that simulate physical light in a virtual environment. These calculations, however, are far too resource-intensive for real-time rendering – imagine having to simulate the light interaction between a herd of animated elephants and a vast terrain object, all at a rate of 60 frames per second. If we tried to do that, the game simulation would grind to a halt, gamers would experience lag, or, in the worst case, the game would crash outright.

Fig 2. Flat 2D files that document the rendering/lighting data seen in Fig 1.

For that reason, I’m experimenting with baking textures and importing them as flat 2D files into Unity3D. The intense calculations are done once in Cinema4D, and the finalized data is taken into Unity3D. The game engine wouldn’t need to do the heavy-duty calculations, but our models will retain the high...
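The Unity side of this workflow isn’t shown in the post, but the payoff of baking is that the engine can draw the model with a cheap unlit shader, because the lighting is already stored in the texture. A minimal sketch, assuming Unity’s built-in “Unlit/Texture” shader; the script and field names are our own, not the project’s:

```csharp
// Hypothetical Unity-side sketch: apply a texture baked in Cinema4D to a model
// via an unlit shader, so no lighting is computed at runtime.
using UnityEngine;

public class ApplyBakedTexture : MonoBehaviour
{
    // Baked lighting texture exported from Cinema4D (assigned in the Inspector).
    public Texture2D bakedTexture;

    void Start()
    {
        // "Unlit/Texture" is a built-in Unity shader that samples the texture
        // directly and ignores scene lights entirely -- the light interaction
        // was already "painted in" during the bake.
        var material = new Material(Shader.Find("Unlit/Texture"));
        material.mainTexture = bakedTexture;
        GetComponent<Renderer>().material = material;
    }
}
```

The trade-off is the usual one for baked lighting: the result is only correct as long as the lights and the geometry stay where they were when the bake was run.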
by Tabby | Nov 9, 2013 | Design/Development, Events, Featured |
Recently I’ve been working on an iPad app at my second (third?) job, and I wanted to share some of the stuff I learned about Flash UI design, particularly for those working in Flash Professional CS6. Adobe hasn’t done the most excellent job of creating components or built-in classes for some really important UI standbys (scrollbars, I’m looking at you), so I went looking for alternatives. What I ended up going with is a quite excellent open-source UI library called Feathers, which runs on the also open-source Starling framework. Unfortunately, most of the documentation out there is for folks using Flex or FlashDevelop, which are great tools, but I use Flash Professional at work (and before you ask, my work computer is a Mac, so I can’t get FlashDevelop). It took me a while to figure out even the simplest configuration in CS6, so I thought I’d make a short presentation/tutorial to pass the knowledge on and hopefully help someone out. As an aside, I should point out a few things that may have made this more difficult for me than it would be for someone looking into Feathers for a new project. I’m a fairly advanced, but largely self-taught, Flash programmer. I don’t have much outside experience with programming, so I find some of the design patterns that Feathers uses new and confusing. If you are coming from a design background, this is going to be a bit of an uphill battle, but it’s totally doable. If you’re a pro programmer already (like Jeff!) then you’ll probably be just fine. The other problem was that...
by Jeff | Oct 19, 2013 | Axon, Design/Development, Featured, Games, Our Games, Quench |
We’ve been working on the Quench pipeline tools for a couple of months now, and despite the apparently endless barrage of school assignments that keep slowing things down, we’ve managed to reach our first major milestone! As I mentioned in this post, our plan has been to closely integrate a hex-based pixel art editor called Hexels with Unity as a 2D map editor, to let our map designers do their work more easily. We’ve got the initial stages of this process working, and from this point forward we’ll be making more and more map features in Unity editable from Hexels. I figured I would walk through the stages we’ve passed on the way to a working (but still pretty unstable) product.

First Steps

When this integration process began, we planned to use Hexels’ XML output feature to pass data back and forth between it and Unity, but a short email conversation with Ken Kopecky of Hex-Ray Studios (the developers behind Hexels) made it clear that Hexels doesn’t actually read its own XML output format back in. With that in mind, we realized we’d have to bite the bullet and decode the Hexels binary .HXL file format. Thankfully, Ken is amazing and happily provided us with a specification to follow while building a C# .HXL reader/writer (Hexels itself is written in C++). Ultimately, the power of Hexels has been well worth the effort, as it provides a ton of map editing features that would have been a mountain of work to implement from scratch as Unity editor extensions....
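The .HXL specification Ken shared isn’t reproduced in the post, so the skeleton below only illustrates the general shape of a BinaryReader-based format decoder in C#. The magic number, field order, and cell layout are entirely made up for illustration:

```csharp
// Purely illustrative .HXL-style reader skeleton -- the real Hexels format
// differs; every field below (magic, version, grid size, cell data) is a
// placeholder, not the actual specification.
using System.IO;

class HxlDocument
{
    public int Version;
    public int Width, Height;
    public uint[,] CellColors; // hypothetical per-hex color data

    public static HxlDocument Load(string path)
    {
        using (var reader = new BinaryReader(File.OpenRead(path)))
        {
            // Validate a (hypothetical) magic number before trusting the rest.
            if (new string(reader.ReadChars(4)) != "HXL\0")
                throw new InvalidDataException("Not a .HXL file");

            // Fixed-size header fields are read in the order the spec lists them.
            var doc = new HxlDocument
            {
                Version = reader.ReadInt32(),
                Width = reader.ReadInt32(),
                Height = reader.ReadInt32()
            };

            // Then the per-cell payload, row by row (e.g. RGBA8888 per hex).
            doc.CellColors = new uint[doc.Width, doc.Height];
            for (int y = 0; y < doc.Height; y++)
                for (int x = 0; x < doc.Width; x++)
                    doc.CellColors[x, y] = reader.ReadUInt32();

            return doc;
        }
    }
}
```

A matching writer simply mirrors each Read call with the corresponding Write call, which is why a single spec document is enough to build the round-trip reader/writer the post mentions.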
by Jeff | Sep 27, 2013 | Axon, DataVis, Design/Development, Featured, Games, Our Games, Quench |
We’ve been working on the Quench editor and pipeline for about a month now, and before anything else, I want to know what I can and can’t get away with doing on an Android tablet. Over this past week I’ve performed a semi-scientific study to identify how best to use the A* algorithm in Quench. AI can be incredibly demanding on CPU resources. Having spent time studying robotics, I know the kind of computational power that often goes into academic robot designs with goals no more noble than ensuring even coverage of a surface by a Roomba. It can be surprising how much computation it takes to do something that seems trivial in the human experience. Quench is going to require that groups of animals have herd-level group-think AI and individual-animal-level AI that together produce flocking/swarming behaviour: animals move as a group, avoid enemies, and find their way through a map of hazards to reach a goal location. Our plan is to implement the herd-level AI as an A* algorithm that runs at intervals to identify a clear path to follow. Flocking requires some further study before we can say exactly how it will work. With these goals in mind, I dove into the first assignment for my AI class at Humber to answer some important questions for myself. I wanted to know which factors cause the computational complexity of A* to grow most quickly, so that I could plan to mitigate or sidestep them. And so I wrote myself a simple A* testbed as a C# Console Application that utilized interfaces to specify the necessary...
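The excerpt cuts off just as the testbed comes up, so here is our own minimal sketch of what an interface-driven A* console testbed could look like in C#. The interface name, its members, and the overall structure are hypothetical, not the actual assignment code:

```csharp
// Hypothetical sketch of an interface-driven A* testbed in the spirit of the
// console application described above; names and structure are our own.
using System.Collections.Generic;
using System.Linq;

// Abstracting the map behind an interface lets the same search run on
// square grids, hex grids, or arbitrary graphs -- handy for comparing how
// the search space shape affects cost.
interface ISearchSpace<TNode>
{
    IEnumerable<TNode> Neighbours(TNode node);
    float Cost(TNode from, TNode to);        // actual edge cost
    float Heuristic(TNode from, TNode goal); // admissible estimate to goal
}

static class AStar
{
    public static List<TNode> FindPath<TNode>(ISearchSpace<TNode> space, TNode start, TNode goal)
    {
        var open = new HashSet<TNode> { start };
        var cameFrom = new Dictionary<TNode, TNode>();
        var gScore = new Dictionary<TNode, float> { [start] = 0f };
        var fScore = new Dictionary<TNode, float> { [start] = space.Heuristic(start, goal) };

        while (open.Count > 0)
        {
            // Linear scan for the lowest f-score; replacing this with a real
            // priority queue is one of the first optimizations profiling suggests.
            TNode current = open.OrderBy(n => fScore[n]).First();
            if (EqualityComparer<TNode>.Default.Equals(current, goal))
                return Reconstruct(cameFrom, current);

            open.Remove(current);
            foreach (TNode next in space.Neighbours(current))
            {
                float tentative = gScore[current] + space.Cost(current, next);
                if (!gScore.TryGetValue(next, out float g) || tentative < g)
                {
                    cameFrom[next] = current;
                    gScore[next] = tentative;
                    fScore[next] = tentative + space.Heuristic(next, goal);
                    open.Add(next);
                }
            }
        }
        return null; // no path exists
    }

    static List<TNode> Reconstruct<TNode>(Dictionary<TNode, TNode> cameFrom, TNode current)
    {
        var path = new List<TNode> { current };
        while (cameFrom.TryGetValue(current, out current))
            path.Add(current);
        path.Reverse();
        return path;
    }
}
```

Because the map, cost function, and heuristic all live behind the interface, the testbed can vary them independently, which is exactly what you want when measuring which factors make A*’s running time blow up.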