1-Visitor
June 16, 2021
Question

Target the position of a hand with HoloLens

  • 1 reply
  • 3441 views

Hello,

I'm currently trying to move a model in my scene by holding it (hold event) and dragging it with my hand.

Since there are events like swipeUp, swipeLeft, etc., and my finger casts a shadow on a 3D button, I assume that Vuforia is somehow tracking the position of the hand or a finger.

Is there a way of reading these values?

Thanks in advance!

1 reply

21-Topaz I
June 16, 2021

Hi @TH_9967918 ,

As far as I know, this functionality is not available in Studio yet.

Currently, Studio projects are compatible with both HoloLens 1 and HoloLens 2, so the same project can run on both devices. Hand detection and tracking are available in the HoloLens 2 API: https://docs.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/?view=mrtkunity-2021-05

But it is not implemented in Studio yet. As far as I know there are plans to implement it in the future, but it is not clear when.

BR

1-Visitor
June 17, 2021

Okay, thank you very much! So the only way of using both the Mixed Reality Toolkit and Vuforia Engine together would be by working in Unity?

21-Topaz I
June 18, 2021

Hello @TH_9967918 

Yes and no, in my opinion.

For hand tracking, you can also find some examples in the Unity Asset Store that can be used in addition to the Microsoft toolkit, where you can see some hand tracking implementations.

The question is what you want to achieve and how much work you want to invest in your project. In Unity, for the same functionality (common to Studio and Unity with Engine), you definitely need to invest a lot more work.

Let's check what we have for HL2 in Studio. We have 3D panels, and we have an adaptive 3D panel that can either move with the device gaze vector or be fixed (pinned). You can also register a userpick event that fires when your finger clicks on a 3D object (this could be a model item, an explicit widget, an implicit one without a definition, or any other 3D widget). This means you can track your finger when you click on an object. A second piece of functionality you can use is to register the tracking event to receive the current device pose: the current device position, eye direction, and eye-up vectors (with respect to the global coordinate system, here the scanned target, similar to Engine's FIRST_TARGET). So in some situations the Studio functionality may not be the best option, but it can still be sufficient to achieve your goal.
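To make the two events concrete, here is a minimal sketch of registering them. In a real Vuforia Studio project, handlers like these would live in Home.js and $scope would be provided by the framework; the tiny event bus below only stands in for it so the sketch runs on its own. The event names ('userpick', 'tracking') and the pose field names (position, gaze, up) follow community examples and are assumptions, not a verified API reference.

```javascript
// Stand-in for Studio's $scope event bus so this sketch is self-contained.
const handlers = {};
const $scope = {
  // Register a handler for a named event (mimics Studio's $scope.$on).
  $on(name, fn) { (handlers[name] = handlers[name] || []).push(fn); },
  // Fire an event to all registered handlers (only used for this demo).
  $emit(name, ...args) { (handlers[name] || []).forEach(fn => fn({}, ...args)); }
};

const picked = [];
const poses = [];

// Fires when the user's finger clicks a 3D widget (model item or other 3D widget).
$scope.$on('userpick', (event, targetName, targetType, eventData) => {
  picked.push(targetName);
});

// Fires with the current device pose: position, eye (gaze) direction and
// eye-up vectors, relative to the scanned target's coordinate system.
// Field names are assumptions.
$scope.$on('tracking', (event, args) => {
  poses.push({ position: args.position, gaze: args.gaze, up: args.up });
});

// Simulated events, standing in for what the device would deliver at runtime:
$scope.$emit('userpick', 'model-1', 'twx-dt-model', '{}');
$scope.$emit('tracking', { position: [0, 0, 0.5], gaze: [0, 0, 1], up: [0, 1, 0] });

console.log(picked[0]);     // 'model-1'
console.log(poses[0].gaze); // [0, 0, 1]
```

With handlers like these, a drag could be approximated by storing the pose received on the hold event and moving the model as subsequent tracking events arrive.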

Otherwise, in Studio you can handle models and sequences very easily, which is not so easy if you want to achieve the same with Engine in Unity. I agree that with Vuforia in Unity you have more functionality, but the APIs there are low-level compared to Studio, and you need to invest significantly more work in the implementation.

In general, the trend is that functionality from Engine is transferred to Studio, so in the future we can expect to see more of it in Studio as well.