Morph Targets don't update Bounds?

Hi guys,

I’m hoping there is an existing solution to this, but right now our skeletal mesh actors that use morph targets report the same bounds no matter what the morphing does to the mesh. Neither GetActorBounds nor GetComponentsBoundingBox changes even though the dimensions of the mesh may change drastically. Is there a runtime bounding type I’m somehow unaware of, or a means for artists to map the scaling of the bounds to the morph weights? Basically we need this for variety in melee weapons (at all times) and for drawing all our inventory items to render textures scaled to fill out a 2D slot size. Any suggestions on either/both needs? Thanks!

Unfortunately, morph target changes don’t contribute to the component bounds at all. It is the bone transforms that get applied to the bound size via the Physics Asset.

> Basically we need this for variety in melee weapons (at all times) and for drawing all our inventory items to render textures scaled to fill out a 2D slot size. Any suggestions on either/both needs? Thanks!

I’m not sure I follow. You use morph targets on your melee weapon meshes?

In the current engine, morph targets don’t change collision or bounds; they are purely a visual effect. You could use the skeletal mesh’s fixed bounds (set to the maximum possible bounds), but I don’t think those are editable. That’s the option where you use a fixed bound to avoid the bound calculation code, as in the sketch below.
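
A minimal sketch of that fixed-bounds option, assuming the bComponentUseFixedSkelBounds flag on USkinnedMeshComponent is available in your engine version (the helper function name here is just for illustration):

```cpp
#include "Components/SkeletalMeshComponent.h"

// Sketch: tell the component to reuse the bounds stored in the skeletal mesh
// asset instead of recalculating them from the physics asset each frame.
void UseFixedBounds(USkeletalMeshComponent* WeaponMesh)
{
	if (WeaponMesh)
	{
		WeaponMesh->bComponentUseFixedSkelBounds = true; // skip per-frame bound recalculation
		WeaponMesh->UpdateBounds();                      // refresh the cached bounds now
	}
}
```
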

If you could explain what the purpose is, maybe I can give a better answer on how to recalculate the bounds.

–Lina,

> Is there nothing at runtime like us old-school game developers are used to where a box is dynamically updated (even asynchronously is fine) from the actual rendered geometry or something like this?

We don’t have any runtime solution to make this work. You can try USkinnedMeshComponent::ComputeSkinnedPositions and calculate a new skinned bound from the result, but it doesn’t include morph targets. We do calculate bounds using the mesh resource data on import, but again that doesn’t include morph targets. You could modify ComputeSkinnedPositions to include morph targets; you can reference the SkinVertices function in SkeletalRenderCPUSkin.cpp, but I’m not sure all the data is available there.
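
To illustrate the idea, a rough sketch of rebuilding a bound from CPU-skinned vertex positions. This assumes the simple ComputeSkinnedPositions(OutPositions) member form (the signature varies across engine versions), and, per the note above, it would need to be modified before morph target deltas show up in the output:

```cpp
#include "Components/SkinnedMeshComponent.h"

// Sketch: compute a component-space box that encloses every skinned vertex.
FBox ComputeSkinnedBounds(USkinnedMeshComponent* MeshComp)
{
	TArray<FVector> SkinnedPositions;
	MeshComp->ComputeSkinnedPositions(SkinnedPositions); // CPU skinning pass (no morph targets by default)

	FBox Bounds(ForceInit);
	for (const FVector& Pos : SkinnedPositions)
	{
		Bounds += Pos; // grow the box to enclose each skinned vertex
	}
	return Bounds; // component space; transform to world space as needed
}
```
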

> Any thoughts or suggestions would be greatly appreciated here.

You can add an extra joint that doesn’t have any verts skinned to it and use that joint to scale the physics asset. You’ll have to create a body for that bone yourself, because by default I don’t think we create a body if no verts are found.

What I’d suggest is metadata that maps each morph target to its bounds; at runtime you can then calculate the new bounds based on the morph target blend weights. Note that non-uniform scale doesn’t work well with certain shapes, e.g. spheres. I’d use a box if you have a lot of 3D scaling. It’s not perfect either, but it’s the closest you can get.
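
A rough sketch of that metadata idea, assuming you author a BaseBounds box and a per-morph MorphBoundsMap for each weapon yourself (those names and the authored data are hypothetical, not engine data):

```cpp
#include "Components/SkeletalMeshComponent.h"

// Sketch: blend from the base bounds toward each morph target's authored
// "fully applied" bounds by its current weight, then union the results.
FBox ComputeMorphedBounds(const FBox& BaseBounds,
                          const TMap<FName, FBox>& MorphBoundsMap,
                          const USkeletalMeshComponent* MeshComp)
{
	FBox Result = BaseBounds;
	for (const TPair<FName, FBox>& Pair : MorphBoundsMap)
	{
		const float Weight = MeshComp->GetMorphTarget(Pair.Key); // current blend weight
		if (Weight > KINDA_SMALL_NUMBER)
		{
			// Linearly interpolate the box corners toward the fully-morphed bounds.
			const FBox& MorphBox = Pair.Value;
			const FVector Min = FMath::Lerp(BaseBounds.Min, MorphBox.Min, Weight);
			const FVector Max = FMath::Lerp(BaseBounds.Max, MorphBox.Max, Weight);
			Result += FBox(Min, Max); // union, so multiple morphs can only grow the box
		}
	}
	return Result;
}
```
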

Thanks,

–Lina,

Hi,

We think this post contains useful information which we would like to share with our public UE4 community. With your approval, we would like to make a copy of this post on the public AnswerHub which includes the discussion but strips out your username and company name. Please let us know if you are okay with this.

Thanks!

Hi Lina,

What we’re doing right now is using one melee weapon model with 2 different morphs on it to create a huge variety of unique-looking weapons. Part of these morphs is the sword (in this case) changing length and shape (becoming wider, getting spikes near the top, that kind of thing). Unfortunately, that huge change in the visible geometry is not accompanied by any change in bounding or collision, so right now it seems we have 2 (bad) choices: make the bounds large enough to fit the most extreme sword size, in which case they are very wrong for the smaller morphed swords, or make them very small, in which case they are wrong the other way (or something in between, half wrong at both extremes).

Is there nothing at runtime, like us old-school game developers are used to, where a box is dynamically updated (even asynchronously is fine) from the actual rendered geometry, or something like this? Maybe something we can add to a shader to return those min/max coordinates?

If there is no existing runtime solution, I think we would have to do something like implement our own scaling of the 3D bounds by linearly interpolating from the smallest morphs’ geometry up to the largest, and do all of this manually, in code, for every weapon? (We will have dozens of weapon types, all with morphs changing their dimensions…) Any thoughts or suggestions would be greatly appreciated here. Thanks!

That last one was what I was thinking of trying; I’ll see if we can make time to test that out with an artist. Thanks, Lina!

By all means, Ben, go for it. There’s nothing too game-specific here anyway, and hopefully it’ll help others with the same question. :)