ARKit World Alignment?

I’m trying to develop a new augmented reality app that handles interaction between multiple physical and virtual objects. I know the placement of the physical objects relative to each other, so I’m trying to align my virtual world to my real one. From what I’ve seen of ARKit, it has excellent spatial tracking, but no inherent way of aligning to known physical features other than surface planes.

I’ve also been experimenting with the Unreal4AR plugin, which in a sense is just the opposite: when a marker is located, you can move an actor (or the camera) to align to it, but the moment the marker becomes obscured or leaves the frame, the system has no means of continuing to track.

I would like to create a hybrid of these two techniques: when a marker is located, the virtual world becomes aligned, and in the absence of the marker, the system falls back on spatial tracking. I don’t think I need marker tracking with the same depth that Unreal4AR provides, so assuming I were just trying to locate the centroid of a color blob, it seems I could achieve that by starting with the AppleARKit Camera Texture in a material and applying the appropriate processing to find the location.
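For the blob-detection part, here’s a minimal CPU-side sketch of what I have in mind, assuming the camera frame has already been read back into an RGBA pixel buffer. The `Pixel` struct and `FindColorBlobCentroid` are names I made up for illustration, not engine or plugin API:

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical pixel format: 8-bit RGBA, row-major buffer of width * height pixels.
struct Pixel { uint8_t r, g, b, a; };

// Scans the buffer for pixels within `tolerance` of `target` (per channel)
// and writes the centroid of the matches in pixel coordinates.
// Returns false if no pixel matched.
bool FindColorBlobCentroid(const std::vector<Pixel>& pixels,
                           int width, int height,
                           Pixel target, int tolerance,
                           float& outX, float& outY)
{
    long long sumX = 0, sumY = 0, count = 0;
    for (int y = 0; y < height; ++y)
    {
        for (int x = 0; x < width; ++x)
        {
            const Pixel& p = pixels[y * width + x];
            if (std::abs(p.r - target.r) <= tolerance &&
                std::abs(p.g - target.g) <= tolerance &&
                std::abs(p.b - target.b) <= tolerance)
            {
                sumX += x;
                sumY += y;
                ++count;
            }
        }
    }
    if (count == 0)
        return false;
    outX = static_cast<float>(sumX) / count;
    outY = static_cast<float>(sumY) / count;
    return true;
}
```

In practice the thresholding could live in the material on the GPU, with only the reduced result (or a small render target) read back, but the centroid math itself is this simple either way.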

So, the questions: Is it possible to communicate from a material blueprint to an actor or level blueprint? Or could I use the material blueprint to process the image on the GPU and pass the final image back so that the coordinate processing could be done in a level/actor blueprint?
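Once the marker has been located and converted to a position in tracking space, the alignment step itself reduces to a simple correction. Here’s a translation-only sketch with minimal made-up types (in Unreal you’d use `FVector`/`FTransform`, and a full solution would also solve for yaw, e.g. from two markers or the marker’s detected orientation):

```cpp
// Hypothetical minimal vector type -- stand-in for FVector.
struct Vec3 { float x, y, z; };

// The offset that maps ARKit tracking space onto the known virtual world:
// where the marker *should* be, minus where tracking currently says it is.
Vec3 ComputeAlignmentOffset(Vec3 knownMarkerWorldPos, Vec3 observedMarkerTrackedPos)
{
    return { knownMarkerWorldPos.x - observedMarkerTrackedPos.x,
             knownMarkerWorldPos.y - observedMarkerTrackedPos.y,
             knownMarkerWorldPos.z - observedMarkerTrackedPos.z };
}

// Apply the stored offset to any tracked position. Between marker sightings,
// the offset stays fixed and ARKit's spatial tracking carries the alignment.
Vec3 ApplyAlignment(Vec3 trackedPos, Vec3 offset)
{
    return { trackedPos.x + offset.x,
             trackedPos.y + offset.y,
             trackedPos.z + offset.z };
}
```

The key point of the hybrid: the offset is only recomputed when the marker is visible; the rest of the time, tracked poses are simply shifted by the last known offset.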

And of course, if anyone has any other suggestions for world alignment, I’m open to ideas! Thanks!

I would also like to know if something like this is possible. I’m currently looking for a way to find a marker in an ARKit map.