Where does the navmesh sweep the ref mesh to see if it's an allowable slope?

Having just completed my custom character movement component, I need a way to import my own navmesh. I know UE4 has a nice automatic navmesh generator, but my navmesh needs to work on walls and ceilings as well as floors. If there is already support for this then I’ll go that way, but everything I come across indicates otherwise.

So where would be the best place to start in code? Meaning, which implementation files should I start familiarizing myself with? For my navmesh it would be great if I could change the Z-is-up limitation. The UE4 navmesh appears to only generate up to 90 degrees (really it starts to falter around 85). If I could change some elements to make it generate the mesh (using its current, excellent mechanics) in navigation volumes rotated 90 or 180 degrees, that would be ideal. Adding support to import my own navmesh would be my second choice.

Thanks

Am I correct in assuming that Detour is the bulk of the nav mesh system? Where does it begin? I’m looking for the point where the navigation volume creates a nav mesh from a reference mesh. For my purposes I would like to try:

A) Modify the rules for how the nav volume auto-creates a nav mesh. It doesn’t let you create a nav mesh on, say, a wall; anything past 90 degrees doesn’t work. Ideally you could rotate the nav volume (or something similar) and it would run the same process with a rotation offset.

B) Modify the Detour stuff so that I can feed it my own nav mesh.

Any information on this subject is welcome. I’m just now being exposed to it and still haven’t a clue what order everything falls in.

If a navigation volume is rotated, will it build a navmesh with that rotation offset? It doesn’t appear to in the editor, but I wanted to check whether there is a setting somewhere or a technique for getting this to work.

I can see the AgentMaxSlope setting in RecastNavMesh.h. Where in the associated files would I find the method that evaluates surface orientation when recasting over a reference mesh?
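For anyone digging for this: if memory serves, the slope test lives in the Recast sources bundled with the engine (look for `rcMarkWalkableTriangles` in Recast.cpp, somewhere under `Engine/Source/Runtime/Navmesh`). There is no raycast; each input triangle’s face normal is computed and its up component is compared against cos(AgentMaxSlope). Below is a standalone sketch of that same logic; the names and the Z-up convention are illustrative, not the engine’s.

```cpp
#include <cassert>
#include <cmath>

static const double kPi = 3.14159265358979323846;

struct Tri { double v0[3], v1[3], v2[3]; };

// Face normal via the cross product of two edges, normalized.
static void triNormal(const Tri& t, double out[3])
{
    double e0[3] = { t.v1[0]-t.v0[0], t.v1[1]-t.v0[1], t.v1[2]-t.v0[2] };
    double e1[3] = { t.v2[0]-t.v0[0], t.v2[1]-t.v0[1], t.v2[2]-t.v0[2] };
    out[0] = e0[1]*e1[2] - e0[2]*e1[1];
    out[1] = e0[2]*e1[0] - e0[0]*e1[2];
    out[2] = e0[0]*e1[1] - e0[1]*e1[0];
    double len = std::sqrt(out[0]*out[0] + out[1]*out[1] + out[2]*out[2]);
    if (len > 0.0) { out[0] /= len; out[1] /= len; out[2] /= len; }
}

// A triangle counts as walkable when its normal's up component (Z here)
// exceeds cos(maxSlopeDeg). This single comparison is why nothing steeper
// than the agent's max slope -- walls, ceilings -- ever becomes navmesh.
bool isWalkable(const Tri& t, double maxSlopeDeg)
{
    const double walkableThr = std::cos(maxSlopeDeg * kPi / 180.0);
    double n[3];
    triNormal(t, n);
    return n[2] > walkableThr;
}
```

A flat floor triangle (normal straight up) passes at a 45-degree limit; a vertical wall triangle (normal horizontal, up component zero) never does.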

Also, when the navigation volume creates a navmesh, where is it stored? It seems like you would want to bake it out and then just load it instead of regenerating it, especially if you don’t have dynamic elements that modify the nav mesh.

If the mesh is accessible and can be baked, then it seems like you could rotate the walls/ceilings to be in line with a floor → project the nav mesh → bake it out → and then, when loading it back in, offset its rotation to be back in line with the wall or ceiling.
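The geometric half of that idea can be sketched with plain math, no engine types: build the rotation taking the wall’s normal onto the up axis (Rodrigues’ formula), apply it to the wall geometry before generation, and apply the inverse to the baked vertices afterwards. Everything below is a hypothetical sketch; none of these names are engine API, and it says nothing about whether the baked data can actually be re-imported.

```cpp
#include <cassert>
#include <cmath>

struct V3 { double x, y, z; };

static V3 cross(V3 a, V3 b)
{
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static double dot(V3 a, V3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Rodrigues' rotation: rotate v about unit axis k by 'angle' radians.
// Rotating with -angle undoes it, which is the "offset its rotation
// back" step when loading the baked mesh.
V3 rotate(V3 v, V3 k, double angle)
{
    const double c = std::cos(angle), s = std::sin(angle);
    const V3 kxv = cross(k, v);
    const double kdv = dot(k, v);
    return { v.x*c + kxv.x*s + k.x*kdv*(1.0 - c),
             v.y*c + kxv.y*s + k.y*kdv*(1.0 - c),
             v.z*c + kxv.z*s + k.z*kdv*(1.0 - c) };
}
```

For example, a wall whose normal points along +X is laid flat by a 90-degree rotation about the −Y axis (the axis is the cross product of the wall normal and up), taking the normal onto +Z.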

Where in the implementation files is the trace done to determine whether a reference-mesh polygon is a walkable surface? I’m lost in two large files and not understanding their differences. Detour sounds like some sort of crowd management, and Recast sounds like the realtime navmesh generation; is this correct? Is Recast responsible for the initial navmesh creation from a navigation volume?

I’m still looking for where the check that determines whether the reference-mesh normal is a walkable slope (and therefore navmesh should be generated) is called.

If you need navigation on a sphere, here’s some food for thought: how about using a different coordinate system for navigation? Prepare two functions that convert between cartesian and spherical coordinate systems, convert all data passed to the navmesh (collisions, path requests, everything) using cartesian-to-spherical, and apply the inverse on path points before using them. This doesn’t sound too hard and would allow using the existing navmesh generation.

The Recast/Detour solution requires a “flat” surface to work on, as it’s 2D-based at its core. There’s no way of generating a navmesh on walls or ceilings with it.

Even if you code around those limits by rotating floor normals, you would still end up with four different navmeshes without any connections between them. I strongly discourage that approach: the idea behind navigation data is that all agents pick a single nav data as their own and use it (and only it) for generating paths. There would also be an issue with accessing collision data stored in the nav octree, because it’s prepared (coordinate-system-wise) for Recast generation.

The best way seems to be either a custom navmesh (if you already have one) or a simple waypoint graph, both implemented from scratch using ANavigationData as the parent class. From the navigation point of view it’s only a matter of implementing FindPath, and possibly a generator for refreshing the navmesh / waypoint links. Unfortunately, we don’t have any examples for it.

Actually, I read what you wrote wrong, lol. That sounds like a very good idea. Thanks! It leans toward being a massive change, but I ended up having to do that with the CharacterMovementComponent and it was fine.

Okay, one more thing: I didn’t see your answer posted above, and I’m looking for a little clarity on what you’re saying there. I’ve noticed the nav system is 2D (as expected). Isn’t it flat because of the way the data is stored/traversed, not because of where the generated navmesh’s vertices are actually located on the surface? If you can overlap navigation volumes so the navmesh extends into another nav volume, then you could just extend more and more nav volumes up along the sphere’s wall and eventually onto the ceiling. Doesn’t this evaluation just continue to two-dimensionally extend the nav mesh data array, as if the sphere were laid flat?

No, it’s not like that. The combined bounds of all existing NavMeshBoundsVolumes are passed to the Recast generators to create a big enough 2D tile grid. Then, within each tile, the 3D geometry is voxelized, split into 2D layers, and processed to create 2D navmeshes. The results are tied together by inter-tile links and form one big, seamless navmesh. Neither collisions nor the navmesh are relative to the bounds volumes; everything is taken from 3D world space.
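The tiling step described above boils down to simple division: the combined bounds become one axis-aligned grid, and every world position maps to a tile by offset and tile size. A toy sketch of that arithmetic, with made-up names (in UE4 the actual tile size is a setting on the RecastNavMesh actor):

```cpp
#include <cassert>
#include <cmath>

// How many tiles the grid needs along one axis to cover [min, max].
int tileCount(double min, double max, double tileSize)
{
    return static_cast<int>(std::ceil((max - min) / tileSize));
}

// Which tile index a world coordinate falls into along one axis.
// Note this is relative to the combined bounds' minimum, not to any
// individual bounds volume -- which is why rotating/overlapping volumes
// just grows one big axis-aligned grid rather than reorienting anything.
int tileIndex(double world, double min, double tileSize)
{
    return static_cast<int>(std::floor((world - min) / tileSize));
}
```

For example, bounds spanning 0 to 1000 units with 256-unit tiles need four tiles along that axis, and a point at 500 lands in tile 1.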

Connecting and rotating volumes to form a sphere would just result in one big axis-aligned bounding box being used to generate the navmesh.

The only way of having the existing generators work with sphere worlds (at least, the only way I can think of right now) is cheating the coordinate space of the provided collisions: projecting them onto a flat surface. Of course this will cause problems with connecting the left-right and top-bottom tiles together to form a continuous space. It can probably be solved by just spamming navlinks along the edges.
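The seam problem is easy to make concrete. After unwrapping onto a plane of width W, two points that were neighbors across the projection seam end up almost W apart in plane coordinates; a wrap-aware distance flags exactly the pairs that would need one of those spammed navlinks. A hypothetical sketch (names invented for illustration):

```cpp
#include <cassert>
#include <cmath>

// Distance along the wrap axis, taking the shorter way around a plane
// of the given width (think longitude on an unwrapped sphere).
double wrappedDist(double a, double b, double width)
{
    const double d = std::fabs(a - b);
    return d < width - d ? d : width - d;
}

// True when two projected points are close only *through* the seam,
// i.e. ordinary tile adjacency won't connect them and an explicit
// navlink across the border would have to.
bool needsSeamLink(double a, double b, double width, double linkRange)
{
    const double plain = std::fabs(a - b);
    return plain > linkRange && wrappedDist(a, b, width) <= linkRange;
}
```

With a 360-unit-wide unwrap, points at 5 and 355 are 10 apart the short way around: far apart on the plane, adjacent on the sphere, so they are navlink candidates.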

A waypoint graph seems to be a much easier solution, although not as flexible as a navmesh.
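To show why the waypoint graph is the easy path: nodes placed by hand on floors, walls, and ceilings, edges wherever an agent can move between them, and a plain shortest-path search standing in for the FindPath a custom ANavigationData subclass would implement. The sketch below is pure Dijkstra with no engine types; it is illustrative only, not the engine’s interface.

```cpp
#include <cassert>
#include <functional>
#include <queue>
#include <utility>
#include <vector>

// Cost of the cheapest path from 'start' to 'goal', or -1 if unreachable.
// adj[i] lists (neighbor, edge cost) pairs for waypoint i; since waypoints
// can sit anywhere in 3D, walls and ceilings need no special handling.
double findPathCost(const std::vector<std::vector<std::pair<int, double>>>& adj,
                    int start, int goal)
{
    std::vector<double> dist(adj.size(), -1.0);  // -1 means "not settled yet"
    using QE = std::pair<double, int>;           // (cost so far, node)
    std::priority_queue<QE, std::vector<QE>, std::greater<QE>> q;
    q.push({0.0, start});
    while (!q.empty()) {
        auto [d, n] = q.top();
        q.pop();
        if (dist[n] >= 0.0) continue;  // already settled at a cheaper cost
        dist[n] = d;
        if (n == goal) return d;
        for (auto [m, w] : adj[n])
            if (dist[m] < 0.0) q.push({d + w, m});
    }
    return -1.0;
}
```

In a real ANavigationData subclass this search would run inside the FindPath implementation and return the chain of waypoint positions rather than just the cost, but the graph-walking core is no more complicated than this.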

Thanks again for the response. These are all great insights that will really help. To help visualize what you’re saying, I put together an unwrapped sphere showing roughly how I picture the 2D data. I get that this isn’t exactly how it would look, but I’m trying to make sure I understand what you see as the major obstacles.

When you say spam navlinks, you mean along the split borders of the data where the tiles no longer align two-dimensionally, so navlinks would mark them as navigation jump points to the other side, right? Would navlinks be able to completely/spherically accommodate these navigation seams? If I linked them entirely along the border so that, spherically, it was seamless, do you think it could handle that?

Attached is the visualization of how I see this sphere coord data container.

Just wondering how, or if, you solved this problem? I would love something similar myself, but I’m still facing the same issues: the nav mesh is 2D only, and no plugins have been written by now.