I’m having this exact issue (and related ones) in 4.26, and even though I found a “partial” solution I’m stuck further along the way.
I think this happens because the blendspace calculates the bone blends in local (parent-bone, I think?) space. I assume that’s done to preserve bone lengths, but the chain “up” from root to pelvis has only one bone, while the chain “back down” through the legs usually has several, so errors accumulate as the skeleton keeps each bone’s length stable while its rotations are interpolated.
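To see why length-preserving local-space blending makes a planted foot sink, here’s a minimal planar two-bone sketch. All numbers (bone lengths, angles) are hypothetical, not the mannequin’s actual values; the point is only the geometry:

```python
import math

def fk(local_angles, lengths=(1.0, 1.0)):
    """Forward kinematics for a planar two-bone leg hanging from the
    pelvis at the origin. Angles are local (each relative to its parent
    bone), which is how a skeleton stores its pose."""
    x = y = 0.0
    heading = -math.pi / 2  # rest pose points straight down
    for a, l in zip(local_angles, lengths):
        heading += a
        x += l * math.cos(heading)
        y += l * math.sin(heading)
    return x, y

def lerp(a, b, t):
    return tuple((1 - t) * u + t * v for u, v in zip(a, b))

# Two sample poses with the foot planted at the same height.
pose_a = (math.radians(30), math.radians(-60))
pose_b = (math.radians(-30), math.radians(60))

foot_a = fk(pose_a)  # foot at y = -sqrt(3) ≈ -1.732
foot_b = fk(pose_b)  # same height, by symmetry

# Blend the way a blendspace does: interpolate the LOCAL angles, then
# run FK. Bone lengths are preserved, but the foot drops below both
# sample poses -- it "sinks" halfway between the samples.
foot_local_blend = fk(lerp(pose_a, pose_b, 0.5))

# Blend a root-parented "virtual bone" instead: its local space IS
# component space, so interpolating it keeps the foot where it was.
foot_virtual_blend = lerp(foot_a, foot_b, 0.5)

print(round(foot_a[1], 3))             # -1.732
print(round(foot_local_blend[1], 3))   # -2.0
print(round(foot_virtual_blend[1], 3)) # -1.732
```

With a longer chain (thigh, calf, foot, toes) the same effect compounds per bone, which is why the error is worst midway between sample points and vanishes at the samples themselves.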
My “partial” solution consisted of adding virtual bones between the root and the affected bones (thighs, knees, feet and toes in my case, using the mannequin). If you look at the blendspace with these bones, you’ll see they lose alignment with their intended targets when you’re halfway between two sample points. Since they are parented directly to the root, their local space is effectively the same as component space.
Whatever the reason, those virtual bones keep a more accurate placement during the blend, so you can use an Anim Blueprint to copy their transforms onto their target bones. I used a post-process Anim Blueprint for this so I could see the changes in the editor.
The problem is that this only works properly with the “default” skeletal mesh (the one the animations were imported or retargeted with). If you use other meshes that share the skeleton but have different proportions, the virtual bones stay where they would be on the “default” mesh, and if the proportions differ enough this actually makes things worse than the original problem.
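A tiny numeric sketch of why the baked placement breaks on other proportions (all numbers hypothetical, just to show the scale of the error):

```python
# Component-space foot height the virtual bone "remembers" from the
# default mesh (hypothetical: planted foot 96 cm below the root).
default_foot_height = -96.0

# A mesh sharing the skeleton but with 20% longer legs needs its
# planted foot proportionally lower.
leg_scale = 1.2
correct_foot_height = default_foot_height * leg_scale  # -115.2

# Copying the default-mesh virtual bone verbatim leaves the foot
# hovering above where this mesh should plant it.
error = correct_foot_height - default_foot_height
print(round(error, 1))  # -19.2 -> foot ends up 19.2 cm too high
```

So the fix trades a sub-centimetre sink on the default mesh for a potentially much larger offset on every other mesh, which is exactly the “worse than the original problem” case.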
From here on, I don’t really know what the proper solution would be. I’ve tried multiple things, like manually resetting the virtual bones to their targets in the AnimBP, but that just inverts the situation: now the virtual bones follow their targets on different meshes, but since this requires copying the bone transform after the blendspace is applied, they also reproduce the original issue and “sink” into the ground.
The only “solution” I can think of is to forgo blendspaces for this entirely and recreate them with blend nodes in the AnimBP. That way I can fix the mesh issue on the sample animations before the blend happens, but it makes blends with many animations very complex to build and maintain.