Markerless Augmented Reality?

Is there a way to make an app in Unreal Engine that displays a 3D augmented object without needing a marker?

Seeing as how Unity3D is capable of doing it, I'd imagine it is, but from looking online I've yet to find anyone who's done it. I don't see the benefit in creating an app that requires users to print custom markers when it's much more convenient to make the process markerless.

It depends on what you mean by markerless AR.

Typically you just use the phone's various sensors to align the game level with the camera's view of the real world. What information are you missing to accomplish your goal?

Markerless as in: the phone uses its built-in sensors, like you said, maps the ground, and can place the desired 3D object through the app. I'm sure it's possible given plugins such as Augmented Reality for UE4 - Work in Progress - Unreal Engine Forums, but following that plugin requires users to print a custom marker.

I want to remove that barrier and just have the software analyze its surroundings, nothing more, ultimately making it much more user-friendly. Personally I don't even know where to begin to make this happen, hence why I decided to ask in this forum. I would try Unity, since it has so many dedicated plugins and tutorials on how to make an AR app, but I've already started building in Unreal and prefer its quality.

You can use the phone's acceleration and gravity sensors to get 90% of this working for testing/playing around, but C++ would be needed to make it more accurate.

The gravity sensor will give you look up/down (pitch) quite accurately.

Now, the main thing Unreal lacks by default is Blueprint access to the magnetometer (compass), so getting your absolute view direction is not possible. The acceleration sensor is what you'll use for horizontal look (yaw). Its values are a bit odd, so you need to experiment with them: it will never tell you which way you are facing, but instead starts you at the player-start rotation and then rotates "approximately as much as the phone does".
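If it helps, here's roughly what that can look like in C++. This is a minimal sketch, not the plugin from the linked thread: the class name ASensorLookPawn and the YawScale factor are made up, the axis/sign mapping depends on the device and how it's held, and it integrates yaw from the RotationRate vector that APlayerController::GetInputMotionState exposes rather than the acceleration event mentioned above. Either way, expect to experiment with the values.

```cpp
// SensorLookPawn.h -- hypothetical example class, not part of any plugin.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "SensorLookPawn.generated.h"

UCLASS()
class ASensorLookPawn : public APawn
{
    GENERATED_BODY()

public:
    ASensorLookPawn()
    {
        // Make sure Tick actually runs for this pawn.
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override;

private:
    // Tuning factor for yaw -- the raw values are "a bit odd", so this
    // needs experimenting per device (assumed value, not a known constant).
    UPROPERTY(EditAnywhere, Category = "Sensors")
    float YawScale = 60.f;

    // Accumulated yaw; starts at the player-start rotation as noted above.
    float Yaw = 0.f;
};

// SensorLookPawn.cpp
#include "SensorLookPawn.h"
#include "GameFramework/PlayerController.h"

void ASensorLookPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    APlayerController* PC = Cast<APlayerController>(GetController());
    if (!PC)
    {
        return;
    }

    // The same four vectors the motion input events report in Blueprint.
    FVector Tilt, RotationRate, Gravity, Acceleration;
    PC->GetInputMotionState(Tilt, RotationRate, Gravity, Acceleration);

    // Pitch (look up/down) from the gravity vector: the angle between the
    // device plane and "down". Which components to use depends on how the
    // phone is held -- treat this mapping as a starting point to tweak.
    const float Pitch = FMath::RadiansToDegrees(
        FMath::Atan2(Gravity.Y, FMath::Sqrt(Gravity.X * Gravity.X + Gravity.Z * Gravity.Z)));

    // Yaw: no absolute heading without a compass, so integrate a rotation
    // delta each tick. Axis, sign and scale all need testing on the phone.
    Yaw += RotationRate.Z * YawScale * DeltaSeconds;

    SetActorRotation(FRotator(Pitch, Yaw, 0.f));
}
```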

I recommend researching them a bit, or just jumping in, hooking a Print String up to the input events to print out the axis values, and watching how they change. You want the values that change the most as you rotate the phone in yaw.
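If you'd rather do that probe from C++ than with a Blueprint Print String, something like the following (a hypothetical helper; GetInputMotionState is the engine call that exposes the same four vectors as the motion input events) will put the raw values on screen each frame:

```cpp
#include "Engine/Engine.h"
#include "GameFramework/PlayerController.h"

// Hypothetical helper: call from any Tick with a valid PlayerController and
// watch which components change the most as you rotate the phone in yaw.
static void PrintMotionSensorValues(APlayerController* PC)
{
    if (!PC || !GEngine)
    {
        return;
    }

    FVector Tilt, RotationRate, Gravity, Acceleration;
    PC->GetInputMotionState(Tilt, RotationRate, Gravity, Acceleration);

    // Fixed keys (0-3) so each line overwrites itself instead of flooding the screen.
    GEngine->AddOnScreenDebugMessage(0, 0.f, FColor::Green,
        FString::Printf(TEXT("Tilt: %s"), *Tilt.ToString()));
    GEngine->AddOnScreenDebugMessage(1, 0.f, FColor::Yellow,
        FString::Printf(TEXT("RotationRate: %s"), *RotationRate.ToString()));
    GEngine->AddOnScreenDebugMessage(2, 0.f, FColor::Cyan,
        FString::Printf(TEXT("Gravity: %s"), *Gravity.ToString()));
    GEngine->AddOnScreenDebugMessage(3, 0.f, FColor::White,
        FString::Printf(TEXT("Acceleration: %s"), *Acceleration.ToString()));
}
```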

Last time I did this (the first time) it took a couple of hours of pushing builds to the phone to figure out the sensor values, but it worked pretty well.

One note: you'll see every instance of this done with flying/hovering objects, because it is not accurate, but it feels accurate enough to fool people. You will never root something to a real-world surface this way (unless you end up targeting a depth-sensing or LIDAR device like the HoloLens or Google Tango).