Maya: Animation Event Definition Tool

I've been working on a new set of tools to define animation event data in Maya and export it into Unity for consumption by the AssetPostProcessor. This has been kind of a cool learning experience, because before now I really didn't have much to do with the animation pipeline, which has been an entirely manual process for the animators.

Previously, clips were cut up and events were defined by hand in Unity once the .fbx files had been exported from Maya. This wasn't such a bad thing, except that Unity's tools for defining events were very unintuitive. The aim of all this was to let the animators work in a tool they were familiar with, and have the AssetPostProcessor do most of the boring legwork on the Unity side.

The Maya interface. 
The Maya tool is pretty simple right now, and responsive enough that the animators don't hate me. Navigating to different clips or events moves the time slider to that range. The event list corresponds directly to the events the engineers call in-engine, so they are only available as pre-set items in a drop-down.

The data is stored in the scene as attributes on empty nodes. At export time, the data is compiled into dictionaries that are then written to a .json file named after the .fbx.
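The export step described above could be sketched roughly like this. This is a minimal illustration, not the actual tool: the function name, dictionary shapes, and file handling are all assumptions, and the real exporter would be compiling these dictionaries from the attribute nodes in the Maya scene.

```python
import json
import os

def export_event_data(fbx_path, clips, events):
    """Write compiled clip and event dictionaries to a .json file
    named after the exported .fbx (e.g. Attack.fbx -> Attack.json).

    `clips` and `events` stand in for the dictionaries compiled from
    the attributes stored on the empty nodes in the scene.
    """
    data = {"clips": clips, "events": events}
    # Swap the .fbx extension for .json so the AssetPostProcessor
    # can pair the metadata with the model on import.
    json_path = os.path.splitext(fbx_path)[0] + ".json"
    with open(json_path, "w") as f:
        json.dump(data, f, indent=4)
    return json_path
```

Naming the .json after the .fbx keeps the pairing trivial on the Unity side: the importer only has to look for a sibling file with the same base name.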

The end result is something like this:

Data for an 'Attack' animation. The first clip is the entire duration; the next three are the intro, loop and outro. In the anim events the unit fires its weapons three times: once on frame 5, then on frames 9 and 13. 'Index' is used for the muzzle position on a unit or turret, as those are the only entities in-game making use of the system. I would prefer to be using named bones for more flexibility.
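The data described above might look something like the following once compiled. The key names and the clip frame ranges are illustrative guesses, not the real schema; only the four-clip split, the three fire events on frames 5, 9 and 13, and the 'index' field come from the description.

```python
# Hypothetical layout for the 'Attack' animation data; key names
# and frame ranges are illustrative, not the actual exported schema.
attack_data = {
    "clips": {
        "Attack":       {"start": 0,  "end": 13},  # entire duration
        "Attack_Intro": {"start": 0,  "end": 4},
        "Attack_Loop":  {"start": 5,  "end": 9},
        "Attack_Outro": {"start": 10, "end": 13},
    },
    "events": [
        # 'index' selects the muzzle position on the unit or turret.
        {"frame": 5,  "event": "FireWeapon", "index": 0},
        {"frame": 9,  "event": "FireWeapon", "index": 1},
        {"frame": 13, "event": "FireWeapon", "index": 2},
    ],
}
```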
Once the animation is exported, the engineering team pulls it into the AssetPostProcessor, which generates the required clips and event metadata from the contents.

All in all, the Maya side is pretty basic at the moment, and implemented on a project that currently has pretty basic animation requirements. Even so, the idea of feeding data into the AssetPostProcessor is something that really appeals to me, and the .json format has been easy to write from Python.

Eventually I would like to be able to define custom VFX events on named bones, and mark up non-event-related data, like whether or not to automatically generate lightmap UVs on an imported model. There is a lot of potential there.