The good news is that there are multiple ways to implement AR triggering, i.e. deciding when and how to display an augmentation.

The two main methods are:

  1. using a target surface to place an object
  2. using a target image to attach the augmentation to

Here’s an example of a moving environment from one of our solutions, where cars, people, waves, and birds move while the landscape stays fixed to the table top.
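To make the surface-based approach concrete, here is a minimal ARKit sketch (not the code behind our solution) that detects a horizontal plane and pins a placeholder object to it. The class name and the box geometry are purely illustrative.

```swift
import UIKit
import ARKit
import SceneKit

// Minimal sketch: detect a horizontal surface (e.g. a table top)
// and keep a piece of content fixed to it.
class SurfaceAnchorViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self

        // Ask ARKit to look for horizontal planes.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal]
        sceneView.session.run(config)
    }

    // Called when ARKit adds an anchor; attach content to plane anchors.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }

        // Placeholder content: a small box that stays fixed to the detected surface.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0)
        node.addChildNode(boxNode)
    }
}
```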

Here’s an example of an image-based target, whereby wherever I move my arm (i.e. the target) the spider stays fixed to that location - provided the target image is still in view of the mobile device camera.
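The image-based approach looks similar in code. Here is a minimal ARKit image-tracking sketch; the asset group name "AR Targets" is an assumption for illustration, not part of the original example.

```swift
import ARKit

// Minimal sketch: track a reference image (e.g. a printed marker worn on an arm)
// so an augmentation stays attached while the image remains in camera view.
func startImageTracking(on sceneView: ARSCNView) {
    // Load reference images from an asset catalog group (name assumed here).
    guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Targets", bundle: nil) else {
        print("No reference images found in the asset catalog")
        return
    }

    let config = ARImageTrackingConfiguration()
    config.trackingImages = referenceImages
    config.maximumNumberOfTrackedImages = 1
    sceneView.session.run(config)
}

// In an ARSCNViewDelegate, ARImageAnchor instances are delivered whenever the
// target image is detected; content added to their node follows the image as it moves.
```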

In reference to your question, the latter allows for slow-to-medium movement and tracks the moving object effectively.

However, if you are looking at more distant and/or faster-moving objects, then it becomes much harder. There is one possibility with a third method: object-recognition AR. Both Vuforia and Apple have opened up the possibility of detecting objects to attach targets to, via the Vuforia Object Scanner and the ARKit object scanner.

Vuforia Object Scanner - object detection in progress

ARKit scanning

As you can see, one of the drawbacks is that you need to somehow “scan” the object first. So for things like a car or an airplane, that might not be possible.

I’m glad to say that both platforms allow for real-time scanning and object detection; however, to my knowledge, neither is yet very consistent.
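For completeness, here is a rough sketch of how a previously scanned object (a .arobject file produced with Apple’s scanning sample app) can then be used as a detection target in ARKit. The asset group name "Scanned Objects" is an assumption.

```swift
import ARKit

// Minimal sketch: detect a previously scanned real-world object.
// Assumes a .arobject file has already been added to an asset group (name assumed).
func startObjectDetection(on sceneView: ARSCNView) {
    guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "Scanned Objects", bundle: nil) else {
        print("No scanned reference objects found")
        return
    }

    let config = ARWorldTrackingConfiguration()
    config.detectionObjects = referenceObjects
    sceneView.session.run(config)
}

// Detected objects arrive as ARObjectAnchor instances in the session delegate;
// attach nodes to them just as with plane or image anchors.
```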

In summary, there are already ways to do this that are openly available. However, results will vary depending on the moving object you want to track and the speed at which it is travelling.