Mocap & Gameplay Setup
I directed a small team of designers and programmers who implemented gameplay and mocap sequences for 1979 Revolution. I handled task scheduling and training, and coordinated directly with both internal and external members of the 1979 team to see the project through to completion.
Tech Used: C#, Unity, Maya, iOS
Things I Worked on:
-Environment and design whiteboxing.
-Tool and pipeline development.
-Project management, training, and scheduling.
Working with other designers, we developed a pipeline for breaking down camera and animation direction, then implementing it in-game with frame-by-frame accuracy against the mocap footage.
Along with directing mocap implementation for cinematic scenes, I set up the majority of in-game action sequences, or directed others in their setup.
Fight scenes were mocapped, with limited hand animation for fingers and faces. Mocap animations for fights often included props, multi-character interactions, and very short animation sequences, all of which had to be choreographed into a fluid experience.
Crowd Rendering Tech
One of the design requirements that I was slated to tackle on 1979 was the need to render a large crowd reminiscent of what someone would experience in a street protest. There was also the added difficulty of having to handle both wide shots of hundreds of people, alongside close-up shots where our main characters would be in the midst of a mass of people.
On PC we were able to get away with simply pooling and placing multiple animated meshes where they were needed in-game, as long as the total number of characters wasn't too large. On iOS, we had to get creative - the app was spending too much time animating hundreds of characters. So, I created a system that instantiated a handful of animated characters and baked down the meshes every frame. The meshes were distributed out to hundreds of static crowd spots throughout the scene, and were culled based on visibility. We were able to get away with maintaining similar crowd numbers on both iOS and PC using this technique.
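The bake-and-distribute idea above can be sketched roughly as follows. This is an illustrative reconstruction, not the shipped code - the class, field, and method names are assumptions - but it shows the core trick: animate a handful of source characters normally, bake their skinned meshes to static meshes once per frame, and draw those meshes at many crowd spots.

```csharp
using UnityEngine;

// Sketch of the bake-and-distribute crowd approach (names are illustrative).
public class CrowdBaker : MonoBehaviour
{
    public SkinnedMeshRenderer[] sources;   // handful of animated characters
    public Material crowdMaterial;
    public Transform[] crowdSpots;          // hundreds of static placements

    Mesh[] baked;

    void Start()
    {
        baked = new Mesh[sources.Length];
        for (int i = 0; i < baked.Length; i++)
            baked[i] = new Mesh();
    }

    void LateUpdate()
    {
        // Bake each animated source to a static mesh once per frame...
        for (int i = 0; i < sources.Length; i++)
            sources[i].BakeMesh(baked[i]);

        // ...then draw the baked meshes at every crowd spot. The real
        // system also culled spots by visibility before drawing.
        for (int s = 0; s < crowdSpots.Length; s++)
        {
            Mesh mesh = baked[s % baked.Length];
            Graphics.DrawMesh(mesh, crowdSpots[s].localToWorldMatrix, crowdMaterial, 0);
        }
    }
}
```

Because the skinning cost is paid only once per source character rather than once per crowd member, hundreds of visible characters cost little more than a handful of animated ones.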
Tech Used: Unity, C#, Maya, iOS
Things I Worked on:
-Character object pooling and allocation.
-Simple crowd navigation.
-Crowd system setup in-editor.
I set up the animation, dialog, and crowds for this sequence. The crowd system intermixed 3D characters and distant crowd planes well enough to convincingly portray a large crowd.
Crowd planes could be dynamically batched and drawn with low overhead, which allowed huge crowds to be rendered even on mobile devices.
I wrote the majority of the code involved in crowd rendering. Crowd setup in the scene was designed to be simple and mostly automatic.
I also worked on simple crowd navigation, which allowed for inter-entity collision avoidance and limited obstacle navigation.
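The inter-entity avoidance can be sketched as simple separation steering - a minimal sketch under assumed names, using System.Numerics rather than Unity types, not the shipped code. Each agent is pushed away from neighbors inside its personal-space radius, weighted by proximity:

```csharp
using System;
using System.Numerics;

// Minimal separation-steering sketch for crowd collision avoidance.
static class CrowdAvoidance
{
    public static Vector2 SeparationSteer(
        Vector2 self, Vector2[] neighbors, float radius)
    {
        Vector2 steer = Vector2.Zero;
        foreach (var other in neighbors)
        {
            Vector2 away = self - other;
            float dist = away.Length();
            if (dist > 0f && dist < radius)
                steer += away / dist * (1f - dist / radius); // closer = stronger push
        }
        return steer;
    }
}
```

The steering vector is added to each agent's desired velocity per frame, which is enough to keep a milling crowd from interpenetrating without a full navigation mesh.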
Blood And Water
1979 presented multiple opportunities to test my shader-writing abilities. Fluids were a particular challenge: blood and water needed to be represented in-game on multiple occasions, and also had to respond to user input. It took many cross-discipline skills to pull off some of the effects used in 1979, even if each was only in-game for a few seconds.
Tech Used: Unity, C#, ShaderForge, Maya
Things I Worked on:
-Skinned mesh creation.
I created a blood shader that layered multiple texture alphas on a mesh animated with a character model. The system detected touch or mouse input and adjusted the vertex alphas of the animated mesh to fade blood in and out.
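The input-driven fade might look something like the sketch below - an assumed reconstruction with illustrative names, not the shipped code. Vertex alpha is scaled toward a target each frame, and the shader multiplies its layered blood textures by that alpha:

```csharp
using UnityEngine;

// Illustrative sketch of input-driven vertex-alpha blood fading.
public class BloodFade : MonoBehaviour
{
    public float fadeSpeed = 1.5f;
    float target;            // 0 = no blood, 1 = fully visible
    float current;
    Mesh mesh;
    Color[] colors;

    void Start()
    {
        mesh = GetComponent<SkinnedMeshRenderer>().sharedMesh;
        colors = mesh.colors.Length > 0 ? mesh.colors
                                        : new Color[mesh.vertexCount];
    }

    void Update()
    {
        // Touch or click fades the blood in; it fades back out on release.
        target = Input.GetMouseButton(0) ? 1f : 0f;
        current = Mathf.MoveTowards(current, target, fadeSpeed * Time.deltaTime);

        for (int i = 0; i < colors.Length; i++)
            colors[i].a = current;
        mesh.colors = colors;
    }
}
```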
Working with a programmer, I built one of the most challenging effects in the game - an interactive pool of developing fluid for a photography minigame.
The water system took user input and semi-realistically added waves and ripples to the water mesh. User input simultaneously deformed the mesh and generated a texture that was fed into the water shader. This effect runs even on iOS, but it is extremely expensive!
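The ripple propagation can be sketched with the classic two-buffer height-field wave scheme - an assumed reconstruction, not the shipped code. Input pokes the height field, and each step averages a cell's neighbors and subtracts the previous height to propagate waves outward, with damping so ripples settle; in-game, those heights would drive both the mesh deformation and the shader texture:

```csharp
using System;

// Two-buffer height-field ripple sketch (names are illustrative).
class RippleField
{
    readonly int size;
    float[,] prev, curr;
    public float damping = 0.98f;

    public RippleField(int size)
    {
        this.size = size;
        prev = new float[size, size];
        curr = new float[size, size];
    }

    public float Height(int x, int y) => curr[x, y];

    // A touch or click at (x, y) displaces the surface.
    public void Poke(int x, int y, float strength) => curr[x, y] += strength;

    public void Step()
    {
        for (int y = 1; y < size - 1; y++)
            for (int x = 1; x < size - 1; x++)
            {
                // Neighbor average minus last frame's height drives the wave;
                // damping bleeds energy so ripples die down.
                float next = (curr[x - 1, y] + curr[x + 1, y] +
                              curr[x, y - 1] + curr[x, y + 1]) / 2f
                             - prev[x, y];
                prev[x, y] = next * damping;
            }
        (prev, curr) = (curr, prev); // swap buffers
    }
}
```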
Ember FX
Ember is primarily a mobile title with very simple material and lighting setups, so certain environments needed a graphical push from a heavy dose of FX. I was tasked with building effects to decorate the world of Ember, along with improving and expanding cinematics that ran entirely in-engine.
Tech Used: Custom game engine, Ogre, Particle Universe, Photoshop
Things I Worked on:
-Particle texture painting.
-In-game cinematic scripting.
I set up particle effects for multiple cinematic sequences, including the final boss battle. Many sequences were a complex series of events that needed to be choreographed properly for a smooth experience.
Further example of FX for ice-themed areas.
Further example of FX for underground nature areas.
One of my favorite effects - the glowing orbs of Radiance.
Mesh Deforming Tool
I wanted a quick way to build large, varied environments on the fly without the tedium of going back and forth between Maya and Unity to update meshes. So, I built a tool that uses splines to deform meshes in real time in the editor, speeding up environment building.
To get this to work, I first had to build the spline. Unity doesn't have a built-in spline system, so I wrote a Bezier spline class with support for traversing the curve by arc length instead of normalized time. Once that hurdle was overcome, deforming the mesh based on the directional information at any point on the spline was straightforward.
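Arc-length traversal on a cubic Bezier can be sketched as below - an assumed approach using System.Numerics rather than Unity types, with illustrative names. The curve is sampled into a table of cumulative lengths, and that table is inverted so callers can ask for "the point 1.5 units along the curve" rather than a normalized time t:

```csharp
using System;
using System.Numerics;

// Cubic Bezier with arc-length lookup (sketch; names are illustrative).
class BezierArcLength
{
    readonly Vector3 p0, p1, p2, p3;
    readonly float[] lengths;   // cumulative arc length at each sample
    const int Samples = 64;

    public BezierArcLength(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3)
    {
        this.p0 = p0; this.p1 = p1; this.p2 = p2; this.p3 = p3;
        lengths = new float[Samples + 1];
        Vector3 prev = p0;
        for (int i = 1; i <= Samples; i++)
        {
            Vector3 pt = Evaluate(i / (float)Samples);
            lengths[i] = lengths[i - 1] + (pt - prev).Length();
            prev = pt;
        }
    }

    public float TotalLength => lengths[Samples];

    // Standard cubic Bezier evaluation at normalized time t.
    public Vector3 Evaluate(float t)
    {
        float u = 1f - t;
        return u * u * u * p0 + 3f * u * u * t * p1
             + 3f * u * t * t * p2 + t * t * t * p3;
    }

    // Map a distance along the curve back to a point by searching the
    // cumulative table and interpolating between bracketing samples.
    public Vector3 EvaluateAtDistance(float distance)
    {
        distance = Math.Clamp(distance, 0f, TotalLength);
        int i = Array.BinarySearch(lengths, distance);
        if (i < 0) i = ~i;               // first sample past the distance
        if (i == 0) return p0;
        float segment = lengths[i] - lengths[i - 1];
        float frac = segment > 0f ? (distance - lengths[i - 1]) / segment : 0f;
        return Evaluate((i - 1 + frac) / Samples);
    }
}
```

With this in place, mesh deformation reduces to stepping along the spline by each vertex's distance and orienting the cross-section to the curve's direction there.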
Tech Used: C#, Unity, Math.
The spline tool can deform pretty much any mesh, as long as the resulting mesh has fewer than 65,000 verts.
Multiple meshes can be deformed together to make interesting geometric features. Each mesh, or "lane" can be independently offset from the spline or flipped if desired. The system can also pull randomly from a pool of meshes to combine into a single deformed model.
The system was a great help in fully realizing 3D environments in a short timespan without needing artist intervention.
The resulting meshes can be baked with collision, work with Unity lightmapping, and can be exported from Unity into Maya/3ds Max for additional tweaking.