What's the right way to play a montage over the network from a BTTask?

So I have a really simple behavior for an AI: run up to an enemy and do a melee attack. The plan was to use a montage to play the attack, but I need this replicated to all the clients. So I was going to have a BTTask (Blueprint) call a custom event on the Character. My issue is that I need to know when the animation completes, so the task needs a “completed” event, or the duration, to know when to finish. Events don’t have an output. I’m kind of at a loss for figuring out the “right” way to do it; everything I come up with feels hacky.

Also, when I hit the " ’ " key to see the AI debug with one client and a dedicated server, am I looking at the debug info from the server or the client? Is there a way to control that? Does the BT run on the client? How is that controlled?

Hey Troy,

You’ve asked two unrelated questions in a single thread, so I’ll answer the AI debugging part, and I’ll assign someone on the animation team to answer the animation question.

The GameplayDebugger (the " ’ " tool) replicates information from the server, and there’s no direct way of making it display data from the client. This actually sounds like a useful feature, so I’ll file a ticket for it, but don’t hold your breath waiting for it.

BTs, like the whole of UE4’s AI system, run in principle only on servers. Having said that, the AI and Navigation systems can be instantiated on clients as well. For the NavigationSystem you need to enable client-side navmesh creation by adding the following to your project’s DefaultEngine.ini:

[/Script/Engine.NavigationSystem]
bAllowClientSideNavigation=true

As for the AISystem, you’ll need to modify the engine sources, since the change that allows controlling this via ini settings is not in 4.10 (it’s coming in 4.11). You need to go to UWorld::CreateAISystem and remove the WITH_SERVER_CODE || WITH_EDITOR guard.
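For reference, the shape of the edit is roughly this (paraphrased from memory, not the exact 4.10 source; check your local copy of Engine/Source/Runtime/Engine/Private/World.cpp for the real body):

```cpp
// Engine/Source/Runtime/Engine/Private/World.cpp (sketch, not verbatim)
void UWorld::CreateAISystem()
{
#if WITH_SERVER_CODE || WITH_EDITOR   // <-- remove this guard and its matching #endif
	// ... existing code that instantiates the AI system class ...
#endif
}
```

With the preprocessor guard gone, the AI system gets created on game clients as well, which is what makes client-side instantiation possible before the 4.11 ini setting lands.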

Note that you can still run into issues if you want to spawn AIControllers on clients. Also, following a path calculated on a client can turn out to be tricky. I haven’t tried it myself, so let me know if you run into any issues.

Cheers,

–mieszko

Sorry about combining AI and animation in one question :slight_smile: I can paste the animation part into a new question if you’d like.

I need to proofread what I ask, because I butchered that question. I wasn’t looking to run AI on the clients; I’d want it all running on the server for sure. I was just wondering whether the debug info I’m looking at on client 1 is what is running on the dedicated server.

My assumption was that the BT is ticked on the server and there isn’t an instance on the client (that sounds right). But I wasn’t sure whether the debug info was showing me some ghost client BT that was ticking when it shouldn’t be, if that makes any sense.

Sounds like my assumptions are correct. Now I just have to figure out why, when I run a dedicated server, the perception system doesn’t seem to send me events for actors coming into perception range.

Other than that, the AI tech has been fun so far :)

I usually look for the worst-case scenario in UDN questions to cut down on the number of posts, so that might have been me :wink:

In short: AI in UE4 is server-only by default, and the tools we’ve built for AI debugging take that into consideration.

Other than that, the AI tech has been fun so far :)

Glad you like it :smiley:

Regarding the perception issue: if it’s about sight, then UAISense_Sight::Update is where the magic happens. If the AI that’s not getting the notifies isn’t in the SightQueryQueue, debug what’s going on when GenerateQueriesForListener is called for that AI.
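If it helps, a quick way to see whether sight queries are being generated at all on the dedicated server is a temporary log at the top of that function (sketch only; the exact signature and log category may differ in your engine version, so adjust to match your sources):

```cpp
// Temporary instrumentation at the top of UAISense_Sight::GenerateQueriesForListener
// (engine source edit, sketch - remove once you've diagnosed the issue):
UE_LOG(LogAIPerception, Warning, TEXT("GenerateQueriesForListener called; %d sight queries queued"),
	SightQueryQueue.Num());
```

If that log never fires for the AI in question on the server, the listener isn’t being registered with the sight sense in the first place, which narrows the problem down to perception component setup rather than the query processing in Update.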