
Realtime Animation Tips & Tricks

Source: https://rockhamstercode.tumblr.com/post/178388643253/motion-matching

Motion Matching

One of the more interesting changes you can add to an animation engine, and the one with the biggest impact on quality, is motion matching. It is an algorithm that is especially useful for locomotion and other states where you can predict the trajectory of a character's movement.

Signs are pointing towards Unity having motion matching available by next year, and companies like Ubisoft and EA are already using it in many games. You should not be afraid to give it a go.

[Image: UFC3 was one of the first EA games to go all in on motion matching, and it achieved more realistically moving fighters.]

Moving towards Matching

Once you realize that motion matching has components that can be implemented separately and used without the full implementation, you can justify each step on its own and improve your character animation iteratively.

That kind of iterative development is easy to argue for in our sprint-locked game development processes.

The first step we took was to implement pose matching and apply it to seeking in animations. We used it for looping locomotion animations, and we already had it in our state machine for choosing between multiple animations; it worked well. Pose matching limited to a few joints is quite fast, and we used it for both players and NPCs.

http://rockhamstercode.tumblr.com/post/174887916558/you-should-be-matching-poses
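
As a rough illustration, here is a minimal, engine-agnostic sketch of what a limited-joint pose comparison can look like; the Vec3 and JointSample types, the joint set (pelvis and feet) and the velocity weighting are illustrative assumptions, not the actual implementation:

```cpp
#include <cstddef>

struct Vec3 { float x, y, z; };

static float SquaredDist(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// One sampled joint: character-local position and velocity.
struct JointSample {
    Vec3 position;
    Vec3 velocity;
};

// Compare only a handful of joints (e.g. pelvis and both feet).
// Limiting the joint set is what keeps this cheap enough to run for
// both players and NPCs.
float PoseDistance(const JointSample* current, const JointSample* candidate,
                   std::size_t jointCount, float velocityWeight) {
    float cost = 0.0f;
    for (std::size_t i = 0; i < jointCount; ++i) {
        cost += SquaredDist(current[i].position, candidate[i].position);
        cost += velocityWeight *
                SquaredDist(current[i].velocity, candidate[i].velocity);
    }
    return cost; // lower is a better match
}
```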

The second step was to add prediction to the physics simulation: it recorded a history of positions and used that history to predict the facing and lean angles.

http://rockhamstercode.tumblr.com/post/175020832938/predicting-is-guesswork
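
A minimal sketch of the idea, assuming a fixed-size buffer of recorded positions; the PositionHistory layout, sample counts and the simple endpoint extrapolation are illustrative, not the actual code:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

struct PositionHistory {
    static const int N = 16;
    Vec2 samples[N];   // oldest..newest, recorded at a fixed interval
    float interval;    // seconds between samples
};

// Estimate velocity from the endpoints of the history, then extrapolate
// a future position from it.
Vec2 PredictPosition(const PositionHistory& h, float secondsAhead) {
    const Vec2& oldest = h.samples[0];
    const Vec2& newest = h.samples[PositionHistory::N - 1];
    const float span = h.interval * (PositionHistory::N - 1);
    const Vec2 velocity = { (newest.x - oldest.x) / span,
                            (newest.y - oldest.y) / span };
    return { newest.x + velocity.x * secondsAhead,
             newest.y + velocity.y * secondsAhead };
}

// Derive a predicted facing angle from the direction of motion; the real
// routine also predicted lean angles from the same history.
float PredictFacing(const PositionHistory& h, float secondsAhead) {
    const Vec2 future = PredictPosition(h, secondsAhead);
    const Vec2& now = h.samples[PositionHistory::N - 1];
    return std::atan2(future.y - now.y, future.x - now.x); // radians
}
```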

Combining these elements, the blend solutions I've mentioned before, the animation-informed physics simulation, and a few states chosen based on foot phase, you can achieve a consistent and reactive solution for a player character.

[Image: Pose matching and animation-informed physics simulation gave us a very reactive and fluid player character with a planted feeling.]

First Step Motion Matching

We implemented a first-pass motion matching routine without the real content. The way we implemented this first version was to simply create a new node in Unreal, and this node had a list of animations that it would seek in and start playing at certain times.

It maintained a list of currently playing animations with blend weights, so whenever a new selection was made, the currently selected animation was blended out over a short blend time while the new selection blended in. The same animation could be playing multiple times, since with real motion matching content you will be seeking within long animation clips rather than choosing between many. We also blended the root motion fully between all playing animations.
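
A minimal sketch of that blend bookkeeping, using illustrative names (PlayingClip, MotionMatchingNode) rather than the actual Unreal node:

```cpp
#include <algorithm>
#include <vector>

struct PlayingClip {
    int   animIndex;   // which animation in the node's list
    float time;        // current playback time in seconds
    float weight;      // current blend weight, 0..1
    bool  blendingOut; // deselected clips fade out and are removed
};

struct MotionMatchingNode {
    std::vector<PlayingClip> playing;
    float blendTime = 0.2f; // short cross-blend on every new selection

    void Select(int animIndex, float startTime) {
        for (PlayingClip& c : playing)
            c.blendingOut = true;
        // The same animation may be in the list twice at different times,
        // since real motion matching seeks inside long clips.
        playing.push_back({animIndex, startTime, 0.0f, false});
    }

    void Update(float dt) {
        const float step = dt / blendTime;
        for (PlayingClip& c : playing) {
            c.time += dt;
            const float next = c.weight + (c.blendingOut ? -step : step);
            c.weight = std::min(1.0f, std::max(0.0f, next));
        }
        // Drop fully blended-out clips; root motion is blended across
        // everything that remains, weighted by these blend weights.
        playing.erase(std::remove_if(playing.begin(), playing.end(),
                          [](const PlayingClip& c) {
                              return c.blendingOut && c.weight <= 0.0f;
                          }),
                      playing.end());
    }
};
```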

The inputs were a current translation, a current facing, and then a target facing and target translation. We quickly changed that to an array of trajectory information, because when you only supply current and target information to the motion matching, it will only find parts of animations that are moving at full speed, so you lose all acceleration and deceleration. That array of trajectory information came directly from our prediction routine. We translated the trajectory information into local space relative to the current facing and translation.
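
A sketch of that local-space conversion, assuming 2D ground-plane trajectories and illustrative type names:

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

struct TrajectorySample {
    Vec2  position; // world space in, character-local space out
    float facing;   // radians; relative to current facing after conversion
};

std::vector<TrajectorySample> ToLocalSpace(
        const std::vector<TrajectorySample>& world,
        Vec2 currentPos, float currentFacing) {
    const float c = std::cos(-currentFacing);
    const float s = std::sin(-currentFacing);
    std::vector<TrajectorySample> local;
    local.reserve(world.size());
    for (const TrajectorySample& w : world) {
        const float dx = w.position.x - currentPos.x;
        const float dy = w.position.y - currentPos.y;
        // Rotate the offset into the character's frame and make the facing
        // relative, so animation data and prediction compare directly.
        local.push_back({ { dx * c - dy * s, dx * s + dy * c },
                          w.facing - currentFacing });
    }
    return local;
}
```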

Our search routine was brute force, and it really just did two things: calculate the pose weights and the future trajectory of every frame in every animation. We had already precalculated the joint pose transforms as part of our pose matching work for start locations in animations, and we set that up for the pelvis and feet joints in our motion matching node. We had not precalculated the future trajectory, so it was extrapolated by looking at the future joint pose transforms of the root. Not fast or clever, but it basically worked.
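
In outline, the brute-force search looked something like the following sketch; FrameEntry and the cost callables are stand-ins for the comparisons described above, not the actual code:

```cpp
#include <cassert>
#include <limits>
#include <vector>

struct FrameEntry {
    int   animIndex; // which animation clip
    float time;      // frame time within the clip
    // Precomputed pelvis/feet pose transforms would live here; the future
    // trajectory was extrapolated from root transforms during the search.
};

// Score every frame of every animation as pose cost plus trajectory cost
// and return the best frame to seek to. O(n) over the whole database.
template <typename PoseCostFn, typename TrajectoryCostFn>
const FrameEntry& FindBestFrame(const std::vector<FrameEntry>& database,
                                PoseCostFn poseCost,
                                TrajectoryCostFn trajectoryCost) {
    assert(!database.empty());
    const FrameEntry* best = &database.front();
    float bestCost = std::numeric_limits<float>::max();
    for (const FrameEntry& frame : database) {
        const float cost = poseCost(frame) + trajectoryCost(frame);
        if (cost < bestCost) {
            bestCost = cost;
            best = &frame;
        }
    }
    return *best;
}
```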

[Image: The goal was to implement motion matching as described by Ubisoft, as it achieved much more fluid and realistic character motion at runtime than any other solution.]

Not Good Enough

Now, this was not good enough, but it worked as a proof of concept: a node that chose animations automatically, and that's all you needed in your state machine.

Our next step came much later than that first test. We took inspiration from Ubisoft and EA, used a few of their dance cards as templates, and filled our mocap room with markers for where we would do turns and so on. We captured at a normal walking pace and at a jogging pace, and did a few different types of captures with starts and stops, S-curve motions, freestyle motion and accelerations.

As this was just an exploration for us, it took a little while after the mocap shoot to import it, since we had 'real' work to do as well. So it was further explored when the animators had time to import the data and I found some free time in my schedule.

We had already implemented prediction of deceleration as part of predicting the stop location for a trick used in a motion warp routine. For motion matching we needed the acceleration as well, so we added that part and mixed it with our extrapolation of facing.
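
A sketch of what such a prediction can look like, stepping the velocity toward the desired velocity under acceleration and deceleration limits; the names and limit handling are assumptions for illustration:

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x, y; };

// Integrate a predicted trajectory: each step, move the current velocity
// toward the desired velocity, clamped by how quickly the character can
// speed up (maxAccel) or slow down (maxDecel).
std::vector<Vec2> PredictTrajectory(Vec2 pos, Vec2 vel, Vec2 desiredVel,
                                    float maxAccel, float maxDecel,
                                    int steps, float dt) {
    std::vector<Vec2> points;
    points.reserve(steps);
    for (int i = 0; i < steps; ++i) {
        const float dvx = desiredVel.x - vel.x;
        const float dvy = desiredVel.y - vel.y;
        const float dvLen = std::sqrt(dvx * dvx + dvy * dvy);
        const float curSpeed = std::sqrt(vel.x * vel.x + vel.y * vel.y);
        const float desSpeed = std::sqrt(desiredVel.x * desiredVel.x +
                                         desiredVel.y * desiredVel.y);
        // Speeding up uses the acceleration limit; slowing down uses the
        // deceleration limit (the part we already had for stop prediction).
        const float limit = (desSpeed >= curSpeed ? maxAccel : maxDecel) * dt;
        if (dvLen > limit) {
            vel.x += dvx / dvLen * limit;
            vel.y += dvy / dvLen * limit;
        } else {
            vel = desiredVel; // close enough: snap to the desired velocity
        }
        pos.x += vel.x * dt;
        pos.y += vel.y * dt;
        points.push_back(pos);
    }
    return points;
}
```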

Since the animators didn't have time to clean up the mocap entirely, I added two things that I know are important to motion matching. First, a function that modified the mocap animations to set the facing for the root joint from the pelvis facing. Then I added a notify marker that allowed us to change the weight of the frames inside the notify window. This is a way to exclude unwanted parts of an animation from the solve, and the mocap data had some glitches here and there that I wanted to exclude.
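
A minimal sketch of how such notify-driven frame weighting can work; the WeightWindow type and the infinite-multiplier convention are assumptions:

```cpp
#include <limits>
#include <vector>

// A notify window marks a time range in an animation and scales the cost
// of every frame inside it; an infinite multiplier excludes those frames
// from the solve entirely (useful for glitchy mocap ranges).
struct WeightWindow {
    float start, end;  // seconds within the animation
    float multiplier;  // e.g. std::numeric_limits<float>::infinity()
};

float FrameCostMultiplier(const std::vector<WeightWindow>& windows,
                          float time) {
    float multiplier = 1.0f;
    for (const WeightWindow& w : windows)
        if (time >= w.start && time <= w.end)
            multiplier *= w.multiplier;
    return multiplier; // applied on top of the frame's matching cost
}
```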

We also added some basic functionality, like an interval for when searching is allowed; you don't need to search for a new pose every single frame, and not doing so can increase the stability of the character motion. Our setting was to search every 3 frames.
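
The throttle itself is trivial; a sketch, with the interval of 3 matching our setting:

```cpp
// Only run the (expensive) search every searchInterval-th frame;
// in between, keep playing the current selection.
bool ShouldSearchThisFrame(int frameCounter, int searchInterval = 3) {
    return (frameCounter % searchInterval) == 0;
}
```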

[Image: First functional motion matching in game with an NPC. The lines shooting out from the character are the 20 best choices, with the thick dark line as the best choice currently playing.]

Moving Forward Rapidly

To iterate on the solve and how we calculated the weights, we also implemented a simulation mode in the motion matching node. This simulation mode was basically a list of transforms that the node would move through while running the motion matching selection routine in the editor. Another tool we had added earlier recorded animation poses over time in the game, along with the transforms of the actor. This kind of tool is quite common in game engines, and it's essential for debugging animation problems.

We added a CSV export to that transform recording. This is something we used at BioWare so the animators could see exactly how the character was moving and match their animations to the incoming physics changes. Here it was instead used as input to the simulation, and with that static piece of data we could iterate on the solve and compare against a known set. We also created a level with a navigation path for a test NPC to follow.

Having that simulation meant that the weight calculation could be improved, and these are the improvements we made (a sketch of the combined calculation follows below).

Pose matching changed to use squared distance comparisons for translation and squared delta distance for comparing velocity, with the quaternion angle difference for joint rotation.

For the trajectory, we had made an error in the facing calculation so that it wasn't relative to the start facing. As with the pose matching, we also made this use squared distance, delta distance and the quaternion angle difference.

On top of the pose and trajectory weights we applied an extra factor, used either to favour remaining in the same animation or in the same cycle of the animation, with a window defining what counted as the same cycle. So frames that are close to the current position in the animation get a lower weight.
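
Putting these improvements together, the weight calculation can be sketched like this; the helper names, the bias value and the clamping are assumptions:

```cpp
#include <cmath>

struct Quat { float x, y, z, w; };

// Angle between two unit quaternions: 2 * acos(|a . b|).
float QuatAngleDiff(const Quat& a, const Quat& b) {
    float d = std::fabs(a.x * b.x + a.y * b.y + a.z * b.z + a.w * b.w);
    if (d > 1.0f) d = 1.0f; // guard against floating point drift
    return 2.0f * std::acos(d);
}

// Combine pose and trajectory weights (squared distances, squared delta
// distances and quaternion angle differences, computed elsewhere), then
// bias frames that continue the current animation cycle.
float FinalWeight(float poseWeight, float trajectoryWeight,
                  bool sameAnimation, float frameDelta,
                  float cycleWindow, float continueBias /* e.g. 0.8 */) {
    float weight = poseWeight + trajectoryWeight;
    if (sameAnimation && std::fabs(frameDelta) <= cycleWindow)
        weight *= continueBias; // lower weight = preferred candidate
    return weight;
}
```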

An additional required change was that our prediction ran at 60 Hz; since all our animations were authored at 30 Hz, that extra prediction accuracy was unnecessary and just added complexity to the routine, so we changed it.

[Image: Improved weighting and trajectory matching meant that the character followed the path much more accurately, achieving behaviour that is very hard to get without motion matching.]

Applying Motion Matching

We had a routine that functioned, but it wasn't quite there: performance was way off, and while the solver gave us motion and root motion that moved the character, it wasn't working well with our running motion set.

The reason it wasn't working well with the running motion set was that the set contained a lot of acceleration and deceleration during the capture, as our motion capture room was not large enough for us to maintain a steady pace.

However, there was also an additional cause: we were using all 30 trajectory samples for our matching, so we were basically trying to match the trajectory in too much detail. This is a key thing you need to realize about motion matching: it is neither necessary nor desirable to follow the predicted movement exactly; it is only important that you are making steady progress towards the goal.

Reducing the number of samples to 4 meant that we could use our motion matching node for the running set as well. Those 4 samples should be spread over the prediction time.
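
A sketch of spreading the match samples over the prediction, assuming a dense prediction (e.g. the 30 points mentioned above) reduced to 4 evenly spaced samples:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Reduce a dense prediction to a few samples spread over the whole
// prediction time, so the matching sees the overall shape of the
// trajectory rather than every detail.
template <typename Sample>
std::vector<Sample> SpreadSamples(const std::vector<Sample>& prediction,
                                  std::size_t count = 4) {
    assert(count > 0 && prediction.size() >= count);
    std::vector<Sample> out;
    out.reserve(count);
    for (std::size_t i = 1; i <= count; ++i) {
        // Indices at 1/4, 2/4, 3/4 and 4/4 of the prediction time.
        out.push_back(prediction[prediction.size() * i / count - 1]);
    }
    return out;
}
```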

As part of adding the run, we also created an entire animation blueprint supporting idle, walk and run, plus the additional gameplay montage and layered animations, so that we could have an NPC doing everything it would normally do.

[Image: The running set contained more acceleration and deceleration, as you can see hints of in this capture. Ideally you should capture your animation set at a more stable pace.]

Optimizing Motion Matching

This code was not particularly fast; we were doing a lot of throwaway calculations during the search, and we were searching every single frame of the animations.

The first part of the optimization was to copy all the pose frames, precalculate our 4 forward trajectory samples for every frame, and store that locally in the node. During this process we also made sure to convert everything to the format we would run the comparisons in, so we'd have vectors and quaternions, no intermediate structures, and we set the comparison up so that it would not recalculate any information.
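
The resulting per-frame database entry can be pictured like this sketch; the exact layout and the joint and sample counts are assumptions:

```cpp
// Per-frame database entry, fully precomputed and comparison-ready: no
// intermediate structures are touched during the search itself.
struct Vec3f { float x, y, z; };
struct Quatf { float x, y, z, w; };

struct MatchEntry {
    int   animIndex;      // which animation clip this frame belongs to
    float time;           // frame time within the clip

    // Pose: pelvis and both feet in character-local space.
    Vec3f jointPos[3];
    Vec3f jointVel[3];
    Quatf jointRot[3];

    // Trajectory: 4 samples extrapolated forward from this frame at build
    // time, instead of being recomputed during every search.
    Vec3f trajPos[4];
    float trajFacing[4];

    // Precomputed multiplier from notify windows (infinity = excluded).
    float costMultiplier;
};
```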

The second step was to exclude the frames that had been set to infinite weight by the notify windows in the animations. These simple improvements meant that, for our set of animations, we had OK dev performance on PC and OK test performance on PS4.

What was next?

The next step would have been to investigate acceleration structures; the one I was thinking of implementing was a k-d tree for nearest neighbour search. I was also interested in looking into compute shaders, but that would not have been achievable in the time we had.

We would also have done more mocap shoots and cleaned up the mocap to make it higher quality and an easy replacement for our previous locomotion solution, and we would have cleaned up the trajectories, as you can see foot sliding in the captures above. The goal was to use it for NPCs and not the player, since changing the player locomotion would have required much more work to reach the responsiveness, versatility and predictability we already had.

A small detail: we were using the player physics prediction to generate a path for the NPC. For NPCs you can instead use the navigation path with physics acceleration and deceleration applied to get a better result, as NPCs know ahead of time exactly where they are going.

For a new project we would have gone all in I’m sure.

[Image: The goal with locomotion on NPCs is to show anticipation and believable movement.]

-J
With thanks to Ubisoft, EA and Naughty Dog.

