How can I play animations strictly from C++?

I see this asked a lot here but never fully answered. I’ve asked similar questions before. Here are the ground rules:

  • NO use of animation blueprints
  • NO use of animation state machines
  • ONLY C++ calls to play, blend, and manage animation assets

We have:
Montage_Play(anim, play speed), and Montage_JumpToSection to skip ahead.

I can’t see a way to blend 2 (or more) animations together from C++; alternatively, a way to play a Blend Space (1D or 2D) from C++.

I can’t see a way to play an animation into a particular slot from C++ (e.g., Upper Body or Lower Body). Playing a Montage that is not set to slot “Full Body” or “Default” results in nothing playing at all.

I can’t see a way to play an animation to the end and stop there (i.e. not loop the animation). I could add a bunch of logic to manage this, but animation systems normally have this built in somewhere.
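For illustration, the non-looping behavior being asked for boils down to clamping the playback clock at the sequence length instead of wrapping it. A minimal engine-free sketch of that idea (all names here are hypothetical, not Unreal API):

```cpp
#include <cassert>

// Hypothetical one-shot playback clock: advances with delta time and
// holds at the sequence length instead of looping back to zero.
struct OneShotClock {
    float Length;          // total animation length in seconds
    float Position = 0.f;  // current playback position
    bool  Finished = false;

    void Advance(float DeltaSeconds) {
        if (Finished) return;
        Position += DeltaSeconds;
        if (Position >= Length) {
            Position = Length;  // hold the final pose
            Finished = true;    // a looping clock would wrap with fmod instead
        }
    }
};
```

The `Finished` flag is where an engine would fire an "animation ended" notify or swap back to idle.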

If it’s not possible, I’m fine with that as an answer. If you’re curious about why I would not want to use the editor features for these things, you can read the following; if you’re not, you can skip the rest of this paragraph. I’ve been developing games for many years now and am very comfortable with C++ implementations. IMO, the animation part of the editor is bad for game development. Having two competing and unrelated state machines (one for behavior and one for animation) battling for control over a character is an ill-conceived concept. Both develop complexity in their own unique ways (state machine vs. behavior tree or other), and trying to keep them in sync while tracking lines of causality is a madhouse. Bugs are invisible, systemic, and display emergent behavior (the perfect storm of development pain). It’s fine if you’re a student trying to learn 3D, but for complex professional work, especially dealing with complex animations, it’s a non-starter.

3 Likes

handling animations in the editor makes sense, since animations are art assets, and you wouldn’t want artists relying on programmers changing and recompiling code, just to add a new hard coded animation reference.

using multiple state machines also makes sense. you really shouldn’t have a single state machine controlling an object unless everything that object can do is mutually exclusive. if you want multiple degrees of freedom, performing multiple tasks at the same time, it makes sense to separate the state into multiple variables. so you should have a separate state machine for the camera, navigation, attacks, hitboxes, jumping, stamina, menus, character animations, physics mode, weapons, weapon animations, etc… any functionality you want to trade out independently should be a separate state.
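The point above, sketched engine-free: each independent concern gets its own state variable, and their combinations compose freely instead of exploding into one giant state machine. All names here are illustrative only:

```cpp
#include <cassert>

// Hypothetical illustration: instead of one monolithic state machine,
// each orthogonal concern varies independently.
enum class MoveState   { Idle, Walking, Jumping };
enum class WeaponState { Holstered, Aiming, Firing };
enum class MenuState   { Closed, PauseMenu };

struct Character {
    MoveState   Move   = MoveState::Idle;
    WeaponState Weapon = WeaponState::Holstered;
    MenuState   Menu   = MenuState::Closed;
};
```

With three concerns of three, three, and two states, a single merged machine would need 18 states; separate variables keep each transition table small.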

Animation and behavior go hand in hand. If you’re doing complex systems of realistic behavior/animation, you don’t want artists/designers messing with it. Just give them a place to add the assets (no recompile needed), and expose some parameters for animators and designers to tweak. Any time you save by allowing non-programmers to make changes you pay for ten-fold in development time and bug-hunting.

I would suggest having a gameplay programmer who is responsible for event graphs and can also code in c++.

anim graphs, notifies, montages, and state machines are all gameplay programmer jobs. you might be able to hard code those into your engine, but you lose a lot of the flexibility that gameplay programmers need to iterate on new functionality.

if you want to keep artists from messing up code, tell them not to edit any graphs or data without permission from a gameplay programmer. modelers, animators, and writers should probably not even touch the level editor, unless its just to import their assets. level designers, lighting, and sound designers can place actors into a level, and tweak exposed properties, but if they want a new actor, they should ask the gameplay programmer.

FX artists need to use the editor to create their content, but they don’t need to worry about event graphs, and their assets are cosmetic, so they should not affect external states. they probably don’t even need to use the same engine build, they can probably just use a standard build of the editor, without any of your game code.

interface designers also need the editor to make their content, but they should probably be programmers or work in photoshop making mockups. maybe the gameplay programmer should code the menus as well, if they aren’t already too busy doing everything else.

or you can go the opposite direction, and hire technical generalists who can make all kinds of things, and work together to spot each others mistakes. you might be trading reliability for speed, but the iterative designs you get out of that process might be more fun. as long as you have a source control versioning system to make file backups, you should be fine just winging it with a talented group you can trust.

The features I’m asking about here are available for the other competing engines like Unity and Cryengine. I’m not sure why Unreal would want to shoot for having fewer features or be less complete for advanced development than the others.

The main concern here is that an animation system separate from behavior creates more problems and more work for everyone than it saves, for the reasons already explained. Menus and effects are orthogonal to behavior and are not of concern here. A menu is not trying to change the movement of the character. An effect does not try to change the animation of the character. Those are non-competing systems. Animations are intertwined with behavior. The behavior of throwing a punch and the animation of throwing a punch work together. These two systems are competing over the character, with logic pulling the character one way and animation another, the way UE4 does it.

After a while in development you stop worrying about what looks nice and worry more about what gets the job done the quickest with the fewest problems. You start realizing you spend 10% of your time coding and 90% of your time debugging. The animation system in UE4 tries to save you time in the 10% (exactly the wrong place to put your efforts) while adding to the debugging time. After a while you also realize that it’s more important to have flexibility in your system than a slick interface. You can’t go to a designer and say “we can’t do that feature because it doesn’t play well with UE4”. In the end we’re in the entertainment industry, and producing creative, fun, original content is more important than anything else. I’ll take working harder and developing a more impactful game over doing an easy job that produces mediocre products. In my particular case, I believe doing it my way will be both easier and produce more impactful products in the long run, and that’s what really matters.

1 Like

I’d also like to know how to set and play animations from C++

Besides Montage_Play and Montage_JumpToSection, I found something called PlaySlotAnimationAsDynamicMontage in AnimInstance.h which seems promising. You can set the BlendInTime, BlendOutTime, PlayRate, and LoopCount. I’m going to try it soon and report back if it works for me.

I still need to find a way to blend two animations together strictly from code though.

Any update on this? Used to doing this in Unity and I like having total control in the code.

Partial answers:

Playing Blend Spaces from C++:

  • Set a UPROPERTY for your BlendSpace to be set in the Editor:

     UPROPERTY(BlueprintReadOnly, EditAnywhere, Category = "Anims")
     UBlendSpace1D *BlendSpace;
    
  • Play the animation and set the BlendSpace parameter (0–100, controlling how far to mix the anims). In this example, I have a 1D BlendSpace that I’m blending along the ‘X’ axis (as set in the editor):

    ASkeletalMeshActor *Skel = Cast<ASkeletalMeshActor>(MyActor);
    if (Skel)
    {
        USkeletalMeshComponent *Mesh = Skel->GetSkeletalMeshComponent();
        if (Mesh)
        {
            Mesh->PlayAnimation(BlendSpace, true);
            FVector BlendParams(50.0f, 0.0f, 0.0f);
            Mesh->GetSingleNodeInstance()->SetBlendSpaceInput(BlendParams);
        }
    }

The example uses a 1D Blend Space, but it also works for 2D or 3D blend spaces.

To Play an animation in a slot:

  • As before add the UPROPERTY with the animation sequence to play:

    UPROPERTY(BlueprintReadOnly, EditAnywhere, Category = "Anims")
    UAnimSequence *MyAnimSequence;

  • Play the Animation:

    USkeletalMeshComponent *Mesh = MyActor->FindComponentByClass<USkeletalMeshComponent>();
    if (Mesh)
    {
        UAnimInstance *AnimInst = Mesh->GetAnimInstance();
        if (AnimInst)
        {
            AnimInst->PlaySlotAnimationAsDynamicMontage(MyAnimSequence, TEXT("UpperBody"), 0.1f, 0.1f, 1.0f, 30.0f);
        }
    }

To Play a simple full body animation and stop at the end:

MyAnimTimer = AnimInstance->Montage_Play(MyMontage);
GetWorldTimerManager().SetTimer(PauseMontageTimerHandle, this, &MyActor::PauseMontageFunc, MyAnimTimer, false);

Where MyAnimTimer is a float (Montage_Play returns the montage’s play length in seconds), and PauseMontageTimerHandle is an FTimerHandle.

In MyActor::PauseMontageFunc:

AnimInst->Montage_Pause();
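The timer trick above, sketched engine-free: schedule a callback once for the montage’s play length, then ignore later ticks so it only fires one time. All names here are hypothetical stand-ins, not Unreal API:

```cpp
#include <cassert>
#include <functional>

// Hypothetical one-shot timer: fires its callback exactly once when the
// accumulated time reaches the scheduled delay (the montage length above).
struct OneShotTimer {
    float Delay = 0.f;
    float Elapsed = 0.f;
    bool  Fired = false;
    std::function<void()> Callback;

    void Set(float InDelay, std::function<void()> InCallback) {
        Delay = InDelay;
        Elapsed = 0.f;
        Fired = false;
        Callback = std::move(InCallback);
    }

    void Tick(float DeltaSeconds) {
        if (Fired || !Callback) return;
        Elapsed += DeltaSeconds;
        if (Elapsed >= Delay) {
            Fired = true;
            Callback();  // e.g. pause the montage to hold the last frame
        }
    }
};
```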

To play a simple full body animation and jump ahead to a section, use Montage_Play(anim, play speed) followed by Montage_JumpToSection.

To play an animation additively, use the PlaySlotAnimationAsDynamicMontage function as before, into a slot set up like this: Using Layered Animations in Unreal Engine | Unreal Engine 5.1 Documentation

I have posted an answer to the main thread. Take a look and let me know if it helps.

1 Like


Thanks a ton man!

Ok, adding some of the basics that might trip some people up:

  • You need an animation blueprint to be present.
  • In the animation blueprint you need a slot node going into the final pose. The default one is fine (DefaultGroup.DefaultSlot).

  • If you want your character to default to ‘Idle’ if nothing is happening, you can connect “Play Idle” to the default slot but it’s not necessary.

  • In code you can play a montage into that default slot:

    a) You need a UPROPERTY where you can specify the Montage:
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = Anim)
    UAnimMontage *MyMontage;

    b) In the Montage, make sure it plays into the slot you specified (DefaultGroup.DefaultSlot is the default, which should be fine).

c) Make sure the Montage’s Blend-In, Blend-Out, and Blend-Out Trigger Time are set to something low (like 0.01 for Blend-In and Blend-Out, and -1 for the Trigger). The default settings sometimes cause the animation to not play fully or at all (e.g., if your test animations are short).

d) Play the Montage:

if (MyMontage->IsValidLowLevel())
{
	USkeletalMeshComponent *Mesh = GetMesh();
	if (Mesh)
	{
		UAnimInstance *AnimInst = Mesh->GetAnimInstance();
		if (AnimInst)
		{
			AnimInst->Montage_Play(MyMontage);
		}
	}
}

OR

a) Declare your montage variable in your header file:

UPROPERTY()
UAnimMontage *MyCMontage;

b) In the constructor, load the montage:

static ConstructorHelpers::FObjectFinder<UAnimMontage> MyCMontageObj(TEXT("/Game/ThirdPerson/Animations/ThirdPersonWalk_Montage"));
MyCMontage = MyCMontageObj.Object;

c) Play the montage as before

2 Likes

@rantrod
This is the best answer so far for playing an animation, but I’m wondering why it is so awfully hard to do in just C++.
Even this example uses blueprints.
Each time a programmer asks for plain C++ examples, the response is that blueprints are superior.

For some jobs they are not.
I’m developing procedural animation for a different engine and would like to see just a simple, complete, and working example of how to play an animation in Unreal inside the editor (in Edit mode, not in Play mode).

2 Likes

One thing missing from these responses is how to load the animation in. I use a UPROPERTY because it’s easier for me. You can always load in code:

static ConstructorHelpers::FObjectFinder<UAnimSequence> anim(TEXT("AnimSequence'/Game/Mannequin/Animations/ThirdPersonJump_Start.ThirdPersonJump_Start'"));
Anim = anim.Object;

(change the sequence path to your own path) or for montage:

ConstructorHelpers::FObjectFinder<UAnimMontage> anim_move_montage(TEXT("AnimMontage'/Game/TBContent/Animations/forward_montage.forward_montage'"));
anim_move_montage_ = anim_move_montage.Object;

(Again, change the path to your own path). For both you’d need this in the constructor:

ConstructorHelpers::FObjectFinder<USkeletalMesh> character_model(TEXT("/Game/TBContent/Models/HeroTPP.HeroTPP"));

GetMesh()->SetSkeletalMesh(character_model.Object);

The other thing missing is building the animation blueprint in C++. I have references on how to do this (not on my current computer), but I’ve never tested it.

That said, the AnimBP as I suggest using it in the main answers is simply a dummy, and the real work is piped through your own C++ code, so I’m not sure you need to build the AnimBP in C++ even if you’re using procedural animations. Unless there’s something more sophisticated you want to add in the AnimBP, like IK, the dummy should suffice.

Maybe I didn’t understand the explanation for the BlendParams, but I got weird results until I used the metric I’d set my blend space up with (in this case, passing it angles from -90 to 90). In any case, though, this was a big help.
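That matches how a blend space axis works: it is sampled in its own units (e.g. an aim angle from -90 to 90), not a 0-100 percentage. An engine-free sketch of mapping a value on the axis to a normalized weight between the two endpoint samples (the helper name and signature are hypothetical):

```cpp
#include <algorithm>
#include <cassert>

// Hypothetical helper: map a value expressed in the axis's own units
// (e.g. degrees from AxisMin to AxisMax) to a 0..1 blend alpha.
// Values outside the axis range are clamped, mirroring how a blend
// space holds the edge sample beyond its configured bounds.
float AxisValueToAlpha(float Value, float AxisMin, float AxisMax) {
    float Clamped = std::clamp(Value, AxisMin, AxisMax);
    return (Clamped - AxisMin) / (AxisMax - AxisMin);
}
```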

Did you ever encounter any way of playing a blend space in a slot, or else blending it with the current anim? I’ve only done it using AnimSequences and Montages, not blend spaces or plain anim assets.

Thank you so far,
In a simple coding world this would be straightforward code and would work (it does not):

ASkeletalMeshActor* actor = actors[0]; //  any method to get the actor
UAnimSequence* anim = animations[0]; // any method to get the UAnimationAsset

if (actor && anim)
{
	USkeletalMeshComponent* mesh = actor->GetSkeletalMeshComponent();
	mesh->SetAnimationMode(EAnimationMode::AnimationSingleNode);
	mesh->SetAnimation(anim);
	mesh->Play(true);
}
1 Like

I apologize for barging in, but as this thread comes up in searches, I thought I should chime in with what I learned through trial and error. The example below is not complete, but it should hopefully help point you in the right direction.

There is a minimum of required tinkering with non-C++ blueprints, but in my approach I tried my best to “set it and forget it”. If there are better ways, please share!

This is what my montage looks like: just two empty slots. I handle their population through C++.

This is what the anim graph in my custom animation instance looks like:

Inside each state machine I connected a dummy empty state to the entry point, not shown in this picture, but you get the gist of it.

This is my C++ code snippet to handle fetching animation sequences on the fly and assigning them to the montage slots. Please note that I’m using some custom functions I wrote that are not shown in the code below:

for (int32 l = 0; l < oCreature->AnimMontage->SlotAnimTracks.Num(); l++)
{
	FSlotAnimationTrack slot = oCreature->AnimMontage->SlotAnimTracks[l];
	if (slot.SlotName.ToString().Contains("LowerBody"))
	{
		FAnimSegment NewSegment;
		NewSegment.AnimReference = oCreature->AnimSequence;
		NewSegment.AnimStartTime = animationListList.AnimationList[j].StartTime;
		NewSegment.AnimEndTime = oCreature->AnimSequence->SequenceLength;
		NewSegment.AnimPlayRate = animationListList.AnimationList[j].Speed;
		NewSegment.LoopingCount = 1;
		NewSegment.StartPos = animationListList.AnimationList[j].StartOffset;

		slot.AnimTrack.AnimSegments.Add(NewSegment);

		oCreature->AnimMontage->SlotAnimTracks[l] = slot;
		break;
	}
}

As I mentioned, the example above is neither complete nor fully functional out of the box, but hopefully it should help…
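The shape of the loop above (find the slot track by name, append a segment to it) can be shown with a stripped-down, engine-free sketch. All types here are hypothetical stand-ins for the FSlotAnimationTrack/FAnimSegment types in the snippet:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical stand-ins for the montage slot types used above.
struct Segment {
    std::string AnimName;
    float StartTime = 0.f;
    float EndTime = 0.f;
};
struct SlotTrack {
    std::string SlotName;
    std::vector<Segment> Segments;
};

// Find the first track whose name contains Needle and append a segment,
// mirroring the SlotAnimTracks loop in the post. Returns false if no
// matching slot exists.
bool AddSegmentToSlot(std::vector<SlotTrack>& Tracks,
                      const std::string& Needle, const Segment& Seg) {
    for (auto& Track : Tracks) {
        if (Track.SlotName.find(Needle) != std::string::npos) {
            Track.Segments.push_back(Seg);
            return true;
        }
    }
    return false;
}
```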

Did you ever figure out blend spaces in a slot, either? I’m struggling with that as well.

Cheers,

To me it looks like slots are a property of Montages or AnimGraphs, so unfortunately I don’t think there’s a way to play a BlendSpace in a specific slot from code.