AR workflow - prototyping without launching on phone

I’m new to Unreal and I’m building an AR app (yeehaw), building on the ARSample project provided by Epic.

What’s becoming apparent is that building and launching to the iPhone every time you make a change is way too slow. You can’t iterate quickly when it takes three minutes to see the result on the phone.

How could one set up a workflow that lets you play the level on the desktop to prototype?

Ideally you’d have a level with a table or something in it and could test ARKit hit results against that as a surface. Or, if you can’t get hit results, can you at least move the camera around and see the geometry? What is possible?

You could set up a Pawn, the inputs, etc. needed to take control of it in the level during PIE; the player controller would have to be set to register click events as touch input. You could then move the Pawn around the level with WASD or whatever. I used to do it this way back in 4.17 with ARKit. As of 4.19 the approach to starting AR sessions is quite different, and there would be a problem with registering traces in an empty level.
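For the click-events-as-touch part, the relevant flag can be set in the project config rather than per-controller. A minimal sketch, assuming the `UInputSettings` section name used in recent engine versions (verify against your own DefaultInput.ini):

```ini
; Config/DefaultInput.ini
; Route mouse clicks to touch events so touch-driven AR input
; logic also fires when playing in the editor.
[/Script/Engine.InputSettings]
bUseMouseForTouch=True
```

The same option appears in the editor under Project Settings → Input → "Use Mouse for Touch", which is usually the easier way to toggle it while prototyping.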

Still, you’d have to put in quite a bit of work to get a workflow that enables testing in PIE, and it wouldn’t necessarily be a valid option in all use cases.

I was hoping that once the AR API integrations (ARCore/ARKit) are stable, some enterprising individual would add support for OpenCV and a webcam. Being able to run AR from a webcam would be perfect for iteration speed, debugging, and design. There is an old OpenCV AR plugin for Unreal, but I doubt it uses the same integration as the new phone-based APIs do.

I would love to see some iPad emulation, or the use of a VR HMD as a tracked “virtual” device.

Using the Unreal Remote 2 app to play in the editor remotely from the device is a good option.

https://forums.unrealengine.com/development-discussion/vr-ar-development/1628411-ar-playing-without-deploying-to-a-device