As school starts up again for the year (my final year!) my attention has been focused on the grand-scale project that lies before me: my capstone. It's the project that no teacher at Humber will let you forget from the first moment you sit down in a classroom. It is intended to be your greatest achievement before graduating, and to mark your transition into professional video game development.
I’ve been dying to get started. And so much thought and care has gone into my plan — and Axon’s plan — to bring my capstone project to life as a commercial game called Quench.
Quench is a top-down hex-based puzzle game in which the player controls the weather to assist herds of animals through desolate landscapes and the dark spirits of the past. The player uses earthquakes, lightning, wind, rain and simple psychology to guide their flock to safety, and eventually restore the world.
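Since the whole game lives on a hex grid, it's worth noting how little code that actually takes. The post doesn't say which coordinate scheme Quench uses, so this is just an illustrative Python sketch using axial coordinates, one common choice for hex-based games:

```python
# Illustrative sketch: hex-grid basics in axial coordinates.
# Quench's real coordinate scheme isn't described here; axial
# coordinates are simply a common, convenient choice.

# The six axial direction offsets around any tile.
HEX_DIRECTIONS = [(+1, 0), (+1, -1), (0, -1), (-1, 0), (-1, +1), (0, +1)]

def hex_neighbors(q, r):
    """Return the six tiles adjacent to the tile at axial (q, r)."""
    return [(q + dq, r + dr) for dq, dr in HEX_DIRECTIONS]

def hex_distance(a, b):
    """Grid distance (in tiles) between two axial coordinates."""
    aq, ar = a
    bq, br = b
    return (abs(aq - bq) + abs(ar - br) + abs((aq + ar) - (bq + br))) // 2
```

With neighbours and distance in hand, things like herd movement and weather spread reduce to ordinary graph traversal over the grid.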
Since Axon is working on this project as a whole, I find myself in the enviable position of having a team of some of the most talented people I know from a wide range of fields to bring to bear in creating Quench. Furthermore, Axon will be bringing another Humber Game Programming student named James Zinger onto the team, and so Quench will be shared as our student work masterpiece. All told, the Axon team for Quench comprises six people:
- Myself (Jeff Rose) – Programming and Technical Direction
- James Zinger – Programmer
- Tabby Rose – UI Design and Creative Direction
- William (Bill) Nyman – Game Design, Level Design and QA
- Albert Fung – 3D Modelling and Animation
- Marty Bernie – Music and Sound Design
While Axon’s team is well-rounded, this presents us with a new challenge: How do we keep everyone productive in a team of this size?
Seeing as this is intended to be graded as a programming project, James and I must exhibit some serious technical geekery (there’s even a detailed rubric to outline the score of our geekery based on implemented features), but our actual goal at the end of the day is a polished product. Therefore, we need to take the time to provide facilities for rapid feedback and innovation as well as a separation of concerns for everyone on the team before we get ahead of ourselves. We have given a great deal of thought to our pipeline and how the tools that we employ can both accelerate our workflows, and perhaps most importantly, turn out a better product for us at the end of the day. Below is a very simple diagram outlining our tools and how they fit together:
To build Quench we’re using Unity as our game engine, primarily because we can get much more done throughout the year by taking advantage of the workflows that we’ve learned in our 2 Unity courses. Our modeller is well-versed in Cinema 4D, so we’ll be bringing models directly into Unity by exporting them as .FBX files. To capitalize on Tabby’s excellent knowledge of Adobe Flash UI development, we’ll be employing Autodesk Scaleform to provide us with UI components that are built, animated and coded in Flash using ActionScript 3. We are also taking advantage of a Kickstarter-funded 2D armature-based rigging and animation system called Spine (by Esoteric Software) to perform character animation in Flash and to assist in producing cutscenes for the game, which can also be imported into Unity using Scaleform.
All of this means that we’ll be able to compartmentalize our efforts into separate domains. 3D modelling can be done in a suitable rendering environment separate from outside concerns. UI development and 2D animation can be done in its own environment. Game logic can be coded without other groups needing constant access to the central project files (which will exhibit varying levels of stability as development is underway). We should be able to focus on our own individual concerns without unexpected interruption and stay inside our preferred workflows most of the time, except when we need to integrate for a build.
Lastly, James and I are building a custom interface with Hexels (by Hex-Ray Studios), a hexagon-based pixel (hexel) art editor, which gives us a convenient suite of level-editing tools far more smoothly than building equivalent Unity extensions from scratch (and at much less cost in development time!).
With our Hexels interface and Unity extensions in place, Hexels and Unity will become the primary level development environments for Quench, and moving back and forth between them will form the basis for an intuitive and powerful level creation process. To be clear about where I am trying to take this process in the long run (because I loves me some UX design!), I put together the diagram below to describe the target workflow that we want to promote for level creation:
Note: The parallelograms on the left represent assets or data being created/edited. The boxes on the right represent actions of the user that may result in changes to those assets.
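To make the Hexels-to-Unity hop concrete, here's a hedged sketch of what the hand-off could look like. The real Hexels file format and Quench's actual tile types aren't covered in this post, so both the colour palette and the terrain names below are invented for illustration:

```python
# Hypothetical sketch of a Hexels -> Unity hand-off: map a grid of
# hex colour codes (as a hexel art tool might export) to tile records
# a level importer could consume. Palette and terrain names invented.

PALETTE = {
    "#2a6f3c": "grass",
    "#c2b280": "desert",
    "#3a7bd5": "water",
}

def grid_to_tiles(colour_rows):
    """Convert rows of colour codes into (col, row, terrain) records."""
    tiles = []
    for row, colours in enumerate(colour_rows):
        for col, colour in enumerate(colours):
            terrain = PALETTE.get(colour.lower(), "unknown")
            tiles.append((col, row, terrain))
    return tiles
```

The point of the sketch: as long as the hexel art maps deterministically onto tile data, the level designer can stay in the art tool and let the importer do the bookkeeping.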
A central part of level design is the iterative loop in the middle of the diagram, in which the level designer moves back and forth between Unity, Hexels and the simulator (which just runs in the Unity Editor when you press play). Basically, the simulator is the game logic without any of the player-facing UI and control systems. Its purpose is to let the map created by the level designer reach steady-state conditions (since the initial map is unlikely to arise from human input alone) and to give the level designer some foresight into how the initial conditions of the level will play out over time. The simulator will include controls to fast-forward and rewind the map state, allowing the level designer to stage the initial conditions for levels with much greater accuracy than trying to do it manually, and to quickly view the long-term consequences of any starting configuration.
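One cheap way to get that fast-forward/rewind behaviour is to keep a snapshot of the map per tick and simply move an index through the history. This is a minimal Python sketch of the idea, not our Unity implementation; the step rule (each cell's water level draining by one per tick) is a stand-in for Quench's real game logic:

```python
# Minimal fast-forward/rewind sketch: store one map snapshot per tick
# and scrub an index through the history. The drain-by-one step rule
# is a placeholder for the actual game logic.
import copy

class Simulator:
    def __init__(self, initial_state):
        self.history = [copy.deepcopy(initial_state)]  # one snapshot per tick
        self.tick = 0  # index of the currently-viewed snapshot

    @property
    def state(self):
        return self.history[self.tick]

    def _step(self, state):
        # Placeholder rule: each cell's water level drains one unit per tick.
        return {cell: max(0, water - 1) for cell, water in state.items()}

    def fast_forward(self, ticks=1):
        for _ in range(ticks):
            if self.tick == len(self.history) - 1:
                self.history.append(self._step(self.state))
            self.tick += 1

    def rewind(self, ticks=1):
        self.tick = max(0, self.tick - ticks)
```

Because snapshots are never discarded on rewind, scrubbing backward and then forward again replays the exact same states, which is precisely what a level designer staging initial conditions wants.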
Once the level designer has crafted the initial conditions of the map, tasks like decoration, light baking and nav mesh placement can be performed to complete the level design (or they can be done during the iteration process too) and the final product is ready to be exported as a level.
I didn’t say anything about it so far, but the Quench Editor is a stand-alone Unity project, separate from the Quench Game. By establishing this boundary, we can pack up the scene containing the level (automagically stripping out any debug components we were using with an Editor extension), and export the whole thing as a .unitypackage along with any dependencies it requires. Simply import this into the Quench Game project, and the new level is included with no extraneous clutter or bloat in the project that gets built as an .APK for Android mobile devices. If you’re interested, the Quench Editor and Quench Game scene graphs are intentionally very similar (to make exporting/importing go smoothly), and are structured something like the following:
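As an aside on the debug-stripping step: in Unity this would be a C# Editor extension walking real GameObjects, but the idea is just a recursive filter over the scene graph. Here's a toy Python version over nested dicts (the `debug` flag and node shape are invented for illustration):

```python
# Toy version of "strip debug components before export": recursively
# copy a scene-graph node, dropping any child flagged as debug-only.
# The dict shape and the "debug" flag are invented for this sketch.

def strip_debug(node):
    """Return a copy of the scene node with debug-tagged children removed."""
    clean_children = [
        strip_debug(child)
        for child in node.get("children", [])
        if not child.get("debug", False)
    ]
    clean = {k: v for k, v in node.items() if k != "children"}
    if clean_children:
        clean["children"] = clean_children
    return clean
```

Running the filter as part of the export keeps the designer free to litter the editor scene with debug helpers, since none of them can leak into the shipping .APK.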
At this point we have a reasonable chunk of these Unity Editor extensions up and working. We’re hoping to have a more complete Quench Editor available to show off in a couple of weeks. From that point onward, iterating upon our game design will be much smoother and the whole team can get involved more easily. I’m really looking forward to it!
I really can’t imagine anyone actually reading this mess through to completion, but if you made it this far, you must have terrible taste in blog writers. Keep it up. I don’t mind the company.