
The model identified by Model Target cannot completely coincide with the actual


Hi All,

I want to identify Model A using a Model Target. Here is a picture of Model A.


Because Model A is too complex to use as a Model Target, I made a simplified version of it by removing the complex widgets and keeping only the model body (I call it Model B). The following is a screenshot of Model B.


I use Model B as the model target.


When I publish the experience and actually use it, the model does not exactly overlap with the actual object.
Here is a picture of the actual object.


And this is the result after recognition by the Model Target.

Sometimes the model and the actual object do not overlap at all.


This device is assembled from three smaller devices, and I can confirm that each individual small device is exactly the same scale as its model file, although the assembled device may not exactly match the outline of the model. I can accept that the model sometimes does not overlap 100% with the actual object, but why is there such a bad case where the model does not overlap with the actual object at all? Has anyone encountered a similar problem? How should I solve it?



Hi @ZL_9884170 ,

I think it will be difficult to find the reason why this shift occurs, and to identify the one correct action that fixes the problem, but with a couple of actions the behavior could at least be improved.

Normally I use rather small models (relative to the size of the geometric space), so my suggestions may not be fully relevant for your application case, but I still want to share them, on the principle that something shared is better than nothing. Possibly someone else can share their experience with a similar situation here.

You can check:

- For mobile devices, the 3D container widget properties:

Persist Map: Select the Persist Map checkbox if a project must accommodate movement or changes in the environment, or if it must visualize large augmentations several feet away from the ThingMark.

Extended Tracking: Uses features of the environment to improve tracking performance and sustain tracking even when a target is no longer in view. As the target goes out of view, other information from the environment is used to infer the target position by visually tracking the environment. A map is built around the target specifically for this purpose and assumes that both the environment and target are largely static.

Extended tracking is supported for the following target types:
• Model Target
• Image Target
• Spatial Target
• ThingMark


Note that before you scan your model target, you first need to scan the environment (similar to the scanning you do when starting a Chalk video session) so that a persistent map of the environment can be created. The quality of this map, and therefore how much it improves tracking stability, depends on the mobile device: how well it supports the ARCore/ARKit features and how accurate its sensors are. Some background can be found here. Usually pro (high-cost) devices are better than low-cost devices, and my personal experience is that iOS devices track more stably with Studio than Android devices do.

To create the environment map you also need some reference feature points in the environment. This means that, for example, grey shiny surfaces with rounded shapes and no geometric objects with sharp edges will not lead to a good persistent map.

For the Model Target, you can check the Car Mode option:

Car Mode: Select this checkbox when the physical model in your Experience is a car exterior or a similar highly reflective large object. Enabling this property optimizes the tracking performance to reduce drift in certain situations, at the cost of higher CPU load.

If you use a highly simplified model as the target, I think it is helpful to use textures or colors that match the real object. I am not sure whether this point is only relevant for the Advanced Model Target 360, but I still want to mention it.

Another point to check is what the correct view should be. When you scan the model (or the part of the model used as the Model Target) and then walk around it, there will be viewpoints where the target is no longer inside the camera view; there the device works in Extended Tracking mode and uses the persistent map, if one is defined. So it can help to define the view in such a way that, as you move around the model, the Model Target is completely inside the device view as often as possible, so that direct model tracking stays active.

Lighting can also be a problem: surfaces that are too shiny, or light flickering at 50 Hz (e.g. a fluorescent tube), could cause issues.

Possibly this Engine basics info could be helpful here as background, because Studio uses the same principle and is based on the internal Vuforia Engine libraries.

Additionally, to check the tracking, you can use:

Model Target properties: the Tracked and Tracking Lost bindings

And the Model Target widget events:

Tracking Acquired -> Triggered when tracking is initiated.

Tracking Lost -> Triggered when tracking is lost.
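As a minimal sketch of how these two events could be used (the $scope wiring in the comments is an illustrative assumption, not the exact Studio API), you could keep a small tracking-state object in Home.js, e.g. to show a "point the device back at the target" hint:

```javascript
// Minimal tracking-state helper, written as a pure function so it is easy to test.
// state: { tracking: boolean, lostCount: number }
function onTrackingEvent(state, eventType) {
  if (eventType === 'trackingacquired') {
    return { tracking: true, lostCount: state.lostCount };
  }
  if (eventType === 'trackinglost') {
    return { tracking: false, lostCount: state.lostCount + 1 };
  }
  return state; // ignore unrelated events
}

// Assumed wiring: bind the Model Target widget's "Tracking Acquired" /
// "Tracking Lost" events to these handlers in Home.js:
// $scope.trk = { tracking: false, lostCount: 0 };
// $scope.onAcquired = function () { $scope.trk = onTrackingEvent($scope.trk, 'trackingacquired'); };
// $scope.onLost     = function () { $scope.trk = onTrackingEvent($scope.trk, 'trackinglost'); };
```

Counting how often tracking is lost (lostCount) can already tell you whether the target is flickering in and out of tracking or being lost only once.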

For debugging the device position and direction vectors, you can also use the 3D container widget property Enable Tracking Events to check the device tracking behaviour. This setting provides the ability to register a function (callback) that is called programmatically each time a target tracking event is triggered. For example, if the author of the Experience wants to be notified each time a tracking event occurs, they would select this checkbox and then add the following listener to Home.js:

$scope.$on('tracking', function (evt, arg) {
  console.log('tracking event: ' + JSON.stringify(arg));
});

The following are a few examples of how this property might be useful:

• Tracking and logging a user’s movement through a space to analyze their actions

• If you want to position a panel and button in front of the user when they make a certain action

• If you want to follow a user’s gaze and indicate points of interest in an environment

• To record the location of User A, share the location with ThingWorx, and then get the location of other users in the same space
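The first bullet (logging a user's movement for analysis) could look roughly like the sketch below. The shape of the event argument (a `position` array on `arg`) is an assumption for illustration; check what your device actually delivers in the tracking event before relying on it:

```javascript
// Collects device poses from tracking events into a log for later analysis.
// Each entry keeps a timestamp and a copy of the reported position vector.
function recordPose(log, arg, now) {
  if (!arg || !Array.isArray(arg.position)) {
    return log; // ignore events without a usable position
  }
  return log.concat([{ t: now, position: arg.position.slice() }]);
}

// Assumed wiring in Home.js (with Enable Tracking Events checked on the
// 3D container widget):
// let poseLog = [];
// $scope.$on('tracking', function (evt, arg) {
//   poseLog = recordPose(poseLog, arg, Date.now());
// });
```

Once you have such a log, comparing consecutive positions can show whether the drift you see happens gradually (map drift) or as a sudden jump (a wrong re-detection of the target).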


