Azure remote rendering in Studio

AndreaT
14-Alexandrite

Hello community,

I've just come across the Azure Remote Rendering advertisement.

It looks impressive, and I started wondering whether it's even possible to use it with HoloLens 2 and Vuforia Studio.

In the Microsoft documentation they talk about implementing the SDK for Unity...

Is there any chance this could work with Studio?

It would be so good to take advantage of it, especially when using models with millions of polygons.

 

Best regards

Thanks!

4 REPLIES

Hi @AndreaT ,  

As far as I know this is not possible with the current functionality, and I have no information about whether something is planned for the near future. I checked some internal resources but could not find anything about the use of Azure Remote Rendering. The Vuforia apps do use some Azure APIs, e.g. Mixed Reality Spatial Anchors on HoloLens for the capture and view apps, so theoretically they could also use e.g. rendering APIs. I am not sure what your requirements are: do you want to render all widgets, or only a specific 3D model? Do you have models where you reach the limits of graphics performance, so that you want to use this API?

If you intend to render everything, you could possibly create your own application with the Unity API and use the Vuforia Engine features: https://library.vuforia.com/

I want to mention the following points:

  • Vuforia Studio is a tool that provides compatibility with many platforms such as iOS, HoloLens, Android, and Windows. Therefore a feature should, in most cases, be implemented for all platforms.
  • Also, most of the widgets should support all HoloLens devices: HL2, but they should also work on HL1.
  • Using JavaScript you can call external APIs, so you could possibly access Azure Remote Rendering if it provides external APIs. Unfortunately I have no knowledge of this API. Vuforia Studio can display a video stream, e.g. by URL, so you could possibly use this if the Azure rendering results in a video stream. Otherwise you could send commands via e.g. a REST API call to make it interactive (see the first sketch after this list).
  • The more advanced option is to create customized widgets, where you try to extend the functionality with code using the foreign API, but this requires very deep knowledge and programming skills.
  • By the way, I also want to mention that Studio supports GLSL or HLSL fragment and vertex shaders in the tmltext widget.
  • The model widget requires that the model's PVZ file (other supported file formats are always converted to PVZ) is loaded locally from the upload folder or dynamically from ThingWorx, meaning models are loaded at runtime into Vuforia View's memory from a file repository: a kind of dynamic model loading (see the second sketch after this list).
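To illustrate the JavaScript/REST point above, here is a minimal sketch of what a call to an external rendering service from a Studio JavaScript snippet could look like. The endpoint, the payload and the 'streamPlayer' widget ID are hypothetical placeholders (this is not an actual Azure Remote Rendering API), and it assumes the runtime's web view supports the standard fetch API and that $scope.setWidgetProp is available as in typical Studio JS code.

```js
// Hypothetical sketch: ask an external rendering service to start a session via REST
// and show the returned stream URL in a widget. Endpoint, payload and the
// 'streamPlayer' widget ID are placeholders, not a real Azure Remote Rendering API.
$scope.startRemoteRendering = function () {
  fetch('https://example.invalid/render/sessions', {      // placeholder endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ model: 'machine-line-A' })     // illustrative payload
  })
    .then(function (response) { return response.json(); })
    .then(function (session) {
      // Assume the service answers with a URL to a rendered video stream and
      // feed it to a URL-capable widget in the experience.
      $scope.setWidgetProp('streamPlayer', 'url', session.streamUrl);
    })
    .catch(function (err) { console.log('remote rendering request failed: ' + err); });
};
```

Such a function could then be bound, for example, to a button's Click event in the project's JavaScript.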
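And here is a minimal sketch of the dynamic model loading mentioned in the last point, assuming the common Studio pattern of changing a model widget's src property at runtime; the widget ID 'model-1', the file names and the ThingWorx repository path are assumptions for illustration only.

```js
// Hypothetical sketch: swap the PVZ shown by a model widget at runtime.
// 'model-1' is a placeholder widget ID; both paths are illustrative only.
$scope.loadMachineVariant = function (variant) {
  // Option 1: a PVZ uploaded with the project (resides in the Uploaded folder)
  var localSrc = 'app/resources/Uploaded/' + variant + '.pvz';

  // Option 2: a PVZ served at runtime from a ThingWorx file repository
  var twxSrc = '/Thingworx/FileRepositories/ModelRepo/' + variant + '.pvz';

  // Point the model widget at the new file; Vuforia View then loads it into memory.
  $scope.view.wdg['model-1']['src'] = localSrc; // or twxSrc
  // Alternatively: $scope.setWidgetProp('model-1', 'src', localSrc);
};
```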
AndreaT
14-Alexandrite
(To:RolandRaytchev)

Thank you @RolandRaytchev for all of this information.

We use Vuforia Studio because there are no experienced software developers in my office; I'm the only one who programs, and only at a very basic level. It's all fine as long as we stay in the maintenance scenarios, but when it comes to experiences where we want to show the customer the whole system, we can easily reach 5-7 million polygons, and that's way beyond HoloLens 2's capabilities. Hence the interest in the remote rendering feature.

I've never read anything about Unity, but if not much development knowledge is needed for basic applications I might consider it; of course, in that case the Vuforia Engine license would also have to be taken into account...

 

I also had a meeting with an expert from Microsoft regarding Azure Remote Rendering.

He confirmed that there are APIs to use and also that currently Unity is the way to go.

I asked whether those APIs were accessible via JavaScript, but he couldn't answer the question directly; he did mention that since Unity is the official environment, C# is the language that can certainly be used to call the APIs.

So there are indeed external APIs; it still has to be figured out whether they can be called via JavaScript or only from C#...

There is plenty of documentation on the Microsoft Docs portal (which I still have to read, due to a complete lack of time); here's the link: https://docs.microsoft.com/en-us/azure/remote-rendering/

For example, the Microsoft contact spoke about C#, but in the Docs I see references to C++, so I'm wondering whether he just misspoke and there are actually ways to call the APIs via JavaScript...
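For what it's worth, if the service exposes plain REST endpoints, they should be reachable from JavaScript regardless of whether the SDK samples are in C# or C++. Below is a minimal sketch of what such a call could look like; the URL, the api-version and the token handling are made-up placeholders, not taken from the Remote Rendering docs linked above.

```js
// Hypothetical sketch only: endpoint, api-version and token are placeholders.
var accessToken = '<token obtained via the Azure authentication flow>';

fetch('https://example.invalid/remote-rendering/sessions?api-version=placeholder', {
  method: 'GET',
  headers: { 'Authorization': 'Bearer ' + accessToken }
})
  .then(function (response) { return response.json(); })
  .then(function (sessions) { console.log(JSON.stringify(sessions)); })
  .catch(function (err) { console.log('request failed: ' + err); });
```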

Could you give it a read given your expertise?

 

Thanks

Best regards and thanks again for the tips!

SteveGhee
12-Amethyst
(To:AndreaT)

Just to be clear, View does not use Unity.

 

PTC's advanced engineering team has done very detailed experiments using Azure Remote Rendering, and we have promising results.

 

 

The real question is the use case: what is the need for very-large-scale rendering?

Most AR projects bring digital information into the physical world, and that digital information tends to be augmentation / additional data, i.e. ADDING to the already-complex physical world.

There are use cases that are more Mixed Reality/VR-type experiences, such as digital prototyping and design review, where there is no physical product.

This is where we have done some experimentation.

 

If this is where your use cases sit, it would be interesting to learn more.

AndreaT
14-Alexandrite
(To:SteveGhee)

Hi @SteveGhee, sorry it took me so long to answer, but we're pretty busy these days.

Indeed, my case is more on the Mixed Reality side. We're not concentrating on augmenting the machine with digital information from ThingWorx; in fact, we don't do that at all at the moment.

The company I work for produces quite big machines.

Our goal is to go to the customer with a "digital twin" in AR/MR and let them see how it would look at their site, while introducing a low level of configurability. That's why I opened this post: at the moment HoloLens 2 alone can't provide a stable and smooth experience due to the size of the machines (number of polygons).

 

Looking forward to hearing from you

Andrea
