[Solved] VR/AI Perception Issues
I have created an AI routine based on AI Perception. When testing with a normal third-person character everything works; however, when testing with my VR setup (a Pawn with Floating Pawn Movement), AI Perception is unable to properly detect the VR pawn. The pawn's location is reported as roughly 30 units off-center and at floor level, and this never changes regardless of the VR pawn's actual position or height.
I assume this may have to do with the way "room scale" or "room center" is detected in the engine in relation to the headset position.
Does anybody have any insight or tips on this? I am at a loss for how to fix or offset it.
I was able to resolve this thanks to some advice I found in another (semi-related) answer.
I am still unsure why the original perception source for my VR pawn is so far off, but what I did was wait until my VR Pawn is spawned and then (via Blueprints) attach a sphere collision component as a child of the camera, as suggested in the answer below, so the perception stimulus follows the HMD.
This works just fine, with seemingly no consequences. It even handles "peeking" around a corner nicely as a VR character, only triggering the AI Perception when your head is visible. Thanks very much for your feedback; it encouraged me to investigate and find the answer.
answered Nov 02 '18 at 02:10 AM
Do you have a collision object attached to the camera of your VR Pawn? I think the perception uses a line trace, which needs something to hit. The default VR Pawn has no collision objects attached.
If you add a sphere collision as a child component of the camera (see screenshot), it will move with the HMD and the AI Perception should be able to detect you.
Hope this helps.
answered Oct 31 '18 at 02:21 PM