Target the position of a hand with HoloLens

TH_9967918
3-Visitor


Hello,

 

I'm currently trying to move a model in my scene by holding it (hold event) and dragging it with my hand.

Since there are events like swipeUp, swipeLeft, etc., and my finger casts a shadow on a 3D button, I assume that Vuforia is somehow tracking the position of the hand or a finger.

 

Is there a way of reading these values?

 

Thanks in advance!

 

7 REPLIES

Hi @TH_9967918 ,

As far as I know, this functionality is not available in Studio yet.

Currently, Studio projects are compatible with both HoloLens 1 and HoloLens 2, so the same project can run on both devices. Hand detection and tracking is available in the HoloLens 2 API (see the MRTK documentation: https://docs.microsoft.com/en-us/windows/mixed-reality/mrtk-unity/?view=mrtkunity-2021-05),

but it is not implemented in Studio yet. As far as I know, there are plans to implement it in the future, but I am not sure when.

BR

Okay, thank you very much! So the only way to use both the Mixed Reality Toolkit and Vuforia Engine would be to work in Unity?

Hello @TH_9967918 

Yes and no, in my opinion.

For hand tracking, you can also find some examples in the Unity Asset Store, in addition to the Microsoft toolkit, where you can see some hand tracking implementations.

The question is what you want to achieve and how much work you want to invest in your project, because in Unity you definitely need to invest a lot more work for the same functionality (functionality that is common to Studio and Unity with Engine).

Let's check what we have for HL2 in Studio. We have 3D panels, including the adaptive 3D panel, which can either follow the device gaze vector or be fixed (pinned). You can also register a userpick event that fires when your finger clicks on a 3D object (it can be a model item, either explicit as a widget or implicit without a definition, or any other 3D widget), which means you can track your finger at the moment you click on an object. A second piece of functionality you can use is registering the tracking event to receive the current device pose: the current device position, eye direction, and eye-up vectors (with respect to the global coordinate system, here the scanned target, similar to FIRST_TARGET in Engine). So in some situations the Studio functionality may not be the best option, but it can be sufficient to achieve your goal.
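
For illustration, here is a minimal sketch of how those two events can be registered in the project's Home.js. The widget IDs ('thingMark-1') are placeholders, and the exact property and argument names may vary by Studio version, so please check the Studio Help:

// Sketch only: register a userpick handler to see which widget / model occurrence was clicked.
$scope.$on('userpick', function (event, target, parent, edata) {
  var pickInfo = edata ? JSON.parse(edata) : {}; // for model widgets, edata contains e.g. the picked occurrence
  console.log('picked widget: ' + target + ', occurrence: ' + pickInfo.occurrence);
});

// Sketch only: enable tracking events on the tracker/ThingMark widget ('thingMark-1' is a placeholder ID)
// and read the current device pose (position, gaze and up vectors, relative to the scanned target).
$scope.setWidgetProp('thingMark-1', 'enabletrackingevents', true);
$scope.$on('tracking', function (evt, arg) {
  console.log('position: ' + arg.position + ' gaze: ' + arg.gaze + ' up: ' + arg.up);
});

The userpick handler tells you which 3D object the finger "clicked", and the tracking handler gives you the headset pose, which is the closest thing to hand/finger data that Studio currently exposes.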

On the other hand, in Studio you can very easily handle models and sequences, which is not so easy if you want to achieve this with Engine in Unity. I agree that with Vuforia in Unity you have more functionality, but the APIs there are low level compared to Studio, and you need to invest significantly more work in the implementation.

In general, the trend is that functionality from Engine is transferred to Studio, so in the future we can expect to see more of it in Studio as well.

Yes, I get your point. It seems like Unity isn't really made for working with CAD. But just out of interest: do you know a proper way to import Creo models with sequences into Unity? I'm curious about the pros and cons of Unity and Studio, so I know which one to choose for my project (both with Engine, of course).

 

It's good to know that Studio will get more and more functionality, but sadly we never know when something will arrive. We recently got the 3D panel, which is pretty nice. I hope for more like that 🙂

The purpose of Vuforia Studio is to develop AR experiences with less code. You could say the focus of Vuforia Studio is mainly manufacturing, production, aerospace, and automotive, where the input is a CAD model.

Unity, on the other hand, is basically a VR platform where AR has become an add-on, and it is a fantastic tool. In Unity you can develop almost anything; for example, it can be used for e-commerce platforms, alongside the capabilities of Vuforia Studio.

Take the example of the clothing industry/stores, where I want to see how clothes look on my body: that is very much possible with Unity, but not with Vuforia Studio.

Hi @RolandRaytchev, how do I add the 3D adaptive panels to a Vuforia Studio experience? I cannot find this anywhere.

Hi @amarshall-3 ,

3D Panels and other 3D widgets like 3D Video are available for mobile and eyewear devices, but there is a difference in how they work on HoloLens versus mobile. In a new project with the latest Studio version, you can add them from the widget panel.

 

(Screenshot attachment: 2022-02-10_18-31-27.jpg)

 

Here are links to the Studio Help, e.g. 3D Panel and 3D Video. This is a convenient way to create a 3D adaptive UI.

There were older techniques (before the 3D widgets were introduced in Vuforia Studio) that required significantly more work to implement. Here I want to refer to, e.g., the following community posts: How can we make a 3D Widget on HoloLens visible in front of me everytime? and Design UI in 3D Eyewear (Hololens II).
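
As a rough illustration of that older approach (a sketch only, not the exact code from those posts), the tracking event mentioned earlier in this thread can be used to re-position a 3D widget so that it stays roughly in front of the user. The widget IDs ('thingMark-1', '3DImage-1') are placeholders, and the assumption that the event delivers the vectors as comma-separated strings or arrays is a guess for this example:

// Sketch only: keep a 3D widget (placeholder ID '3DImage-1') about 1 m in front of the device.
function toVec(v) { return (typeof v === 'string') ? v.split(',').map(Number) : v; }

$scope.setWidgetProp('thingMark-1', 'enabletrackingevents', true); // placeholder tracker ID
$scope.$on('tracking', function (evt, arg) {
  var pos  = toVec(arg.position); // current device position
  var gaze = toVec(arg.gaze);     // current view direction
  var dist = 1.0;                 // distance in front of the device, in meters
  $scope.setWidgetProp('3DImage-1', 'x', pos[0] + gaze[0] * dist);
  $scope.setWidgetProp('3DImage-1', 'y', pos[1] + gaze[1] * dist);
  $scope.setWidgetProp('3DImage-1', 'z', pos[2] + gaze[2] * dist);
  $scope.$applyAsync();
});

With the new 3D Panel widget, this kind of manual re-positioning is usually no longer necessary.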
