Vuforia Studio collaboration with other users in runtime

Velkumar
17-Peridot


Hi all,

 

Is it possible for multiple users to collaborate on and view an AR experience together?

 

/VR

1 ACCEPTED SOLUTION

Accepted Solutions
sdidier
17-Peridot
(To:Velkumar)

Hello Kumar,

 

In an Experience made in Vuforia Studio and opened in Vuforia View, it is not possible to collaborate with another user.

But it is possible to open a Vuforia Chalk session to do that.

 

Best regards,

Samuel


10 REPLIES


Hello Samuel

 

I agree with Samuel that "In an Experience made in Vuforia Studio and opened in Vuforia View, it is not possible to collaborate with another user."

At least, I cannot build an Experience in Vuforia Studio for several collaborating Vuforia View users (where several of them can watch how one of them operates the AR dynamically). But at MWC 19 (Barcelona, 2019), "HoloLens's father" Alex Kipman demonstrated how he could use Vuforia Studio (on HoloLens 2) in collaborative work with PTC and Howden (Maria Wilson) on one Experience (see from 52:00 of https://www.youtube.com/watch?v=fXFc8Tjbq5E&t=3030s).

Wilson describes how Howden prepared the Experience with the help of Vuforia Studio, and then Alex starts this Experience in Vuforia View (on HoloLens). But all participants see this Experience through different eyewear: the camera on stage, operated by a cameraman. The camera uses Vuforia View too (note the label in the upper-left corner) and shows all of Alex's AR activities from the camera's point of view, not from Alex's POV. All MWC19 attendees see the Alex-and-Howden experience through a different Vuforia View than Alex's, while Alex operates in his own Vuforia View at the same time. In my humble opinion, this can serve as an example of collaboration in AR with the help of Vuforia View: all collaborators can use their own Vuforia View devices to "live" in one AR experience. Moreover, Maria joins with her own device (an iPad, perhaps, with her Vuforia View) and directs Alex's actions in the AR disassembly/assembly presentation in Alex's session.

From my point of view, this is a real example of "Vuforia Studio collaboration with other users in runtime".

Isn't it? Some time ago I sent a request to Julia Schwartz (the number-one MS HoloLens 2 demonstrator, and a person who knows absolutely everything about the MWC19 presentations) asking how they could show one Experience from two separate HoloLenses at the same time (AR collaboration), but she may be very busy and has not answered.

Anyway, Samuel, do you know which Vuforia expert could explain how PTC prepared such an example of collaboration in Vuforia View?

Thank you in advance.

Vladimir.

 

 

Image captions: Alex puts on his HoloLens with Vuforia View. Alex starts Vuforia View and sees the AR assembly from his POV (front view), but everyone else sees the same assembly from another POV (right-side view), shown by another Vuforia View on another device (the on-stage cameraman's).

Martini3119
6-Contributor
(To:Velkumar)

Hi,

 

I am not sure whether you want this as an out-of-the-box feature, but there is a way to 'connect' more than eight Vuforia View sessions using ThingWorx. To what extent would you like to collaborate? What data would you like to share with other users?

 

With kind regards,

Martini3119

Hi Martini3119,

 

Yes, you are absolutely right: the solution I am looking for is not OOTB.
But I have seen it several times at tech shows, LiveWorx, and other demos: the demonstrator works in his own Vuforia View, running his AR scenario in AR space, and the other participants watch and follow his actions in their own AR viewers (Vuforia View?) in real time. Moreover, they follow him according to their own positioning in AR space. I could not find how this can be realized in the PTC Vuforia user guides, technical papers, or trainings. That means it is not an OOTB solution; maybe it is undocumented Vuforia Studio functionality, skillful mastery of Vuforia Studio (and Vuforia View too), or PTC's AR state of the art. But I can see examples of so-called "collaborative work in Vuforia View" in several videos, such as LiveWorx or the Microsoft+PTC kick-off.
You write: "there is a way to 'connect' more than eight Vuforia View sessions using ThingWorx".
OK:
How can I get familiar with this way?
Is it possible for Android Vuforia View users, or for HoloLens only?
Do I need to download additional software?
Do I need to do something on the TWX ES server too?
Can I use the official, freely available Vuforia View, or do I need to replace it to get collaboration?
Thank you in advance,

Vladimir

(Collaborative work example: a non-VR explanation of a future development project. Mixed reality.)

Multiplayer demo: digital mock-up design review

Digital Mock-Up Design Review: an AR format of collaborative work.

Hi Vladimir,

 

What I did in my project was to create a few properties on an entity in ThingWorx. Sending data to TW from your device updates the entity's properties. Other devices that have opened the same experience can read those properties and update their own 3D space accordingly, thus creating a shared experience.

 

The work you need to do in TW is to have an entity with a few properties and related services. Depending on the data you want to send, you determine which property types you need. The system works in such a way that you send your data to a service (bound in the External Data panel of Studio), the service writes to the property, and Vuforia View reads this property and writes it to a widget and/or app parameter.

For example, if you want to change colors on your model, it works best to use a text property and send the color and the component to your service, e.g. 'red', 'modelItem-1'. Your service should be configured to write to the property, e.g. 'color', using a value: me.color = color_val; . Using a $scope.$watch in your Studio code allows you to trigger something when the incoming data changes. So far, in my experience, everything happens in real time.
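The flow described above can be sketched in plain JavaScript. This is a minimal, self-contained simulation of the pattern, not the real ThingWorx or Vuforia Studio APIs: the Thing, the service name `SetColorService`, and the session objects are all illustrative mocks standing in for the bound service, the STRING property, and the $scope.$watch handler in each View session.

```javascript
// Passive sharing through one shared Thing: the "master" writes via a
// service, every session (master and slaves) reads the property back and
// applies it to its own local scene.

// Mock ThingWorx Thing: one STRING property, written by a service.
const sharedThing = {
  color: '',                                 // holds "color,modelItemName"
  SetColorService(colorVal) { this.color = colorVal; } // body: me.color = color_val;
};

// Mock Vuforia View session: its own scene, synced from the Thing.
// In Studio this sync would be the $scope.$watch callback on the bound data.
function makeSession(name) {
  return {
    name,
    scene: { 'modelItem-1': { color: undefined } },
    sync() {
      const [color, item] = sharedThing.color.split(',');
      if (item in this.scene) this.scene[item].color = color;
    }
  };
}

const master = makeSession('master');
const slave = makeSession('slave');

// Master pushes a change; both sessions pick it up on their next sync.
sharedThing.SetColorService('red,modelItem-1');
master.sync();
slave.sync();
console.log(slave.scene['modelItem-1'].color); // → red
```

The key property of this design is that the sessions never talk to each other directly; each one only reads and writes the Thing, which is why Martini3119 calls it passive.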

Compare this to a 'multiplayer lobby', which is an actively shared experience: data is actively shared through a host that knows the population of the lobby and the corresponding data. This way of 'connecting' Vuforia View sessions is more passive, since the different sessions know nothing about a lobby, participants, or any kind of data except the values you send manually.

 

If you also want to show your peers' positions and gaze, you could send these values to TW and possibly create a value stream. I have not researched this so far.

 

How can I get familiar with this way?

Programming skills (JS) and basic knowledge of how ThingWorx works


Is it possible for Android Vuforia View users, or for HoloLens only?

I have never tested this with HoloLens, but it works 100% with Android and Apple


Do I need to download additional SW?

Nope 🙂


Do I need to do something on the TWX ES server too?

Yes, create properties and corresponding services. Link from Studio to the service in TW, the service to the property, and the property back to Studio.


Can I use the official, freely available Vuforia View, or do I need to replace it to get collaboration?

Nope, it all works with Vuforia View.

Hi Martini3119,

 

Thanks for your last message. In my company we are trying much the same approach. We use a single ThingMark for two or more Vuforia Views, where one user ("master") can use widgets to change something in the model of the experience (position, rotation, occlusion, etc.), while the others can only follow him in their Vuforia Views. For this task, we create a connection in Studio between 3D-model parameters and TWX Thing properties: we bind the model's X, Y, and Z coordinates to properties of the corresponding Thing in TWX. The "master" can then use GetProperty.../SetProperty... services to connect the Studio scene of the Experience with the current values of the TWX Thing properties. As a result, Vuforia View reacts to the experience's model position simultaneously with the Thing, and all position changes propagate from TWX back to Vuforia View. The "slaves" have the same experience as the "master" except for control: visualization only. This can serve as an example of "collaborative" work, but I cannot call it "real collaboration" in the MWC19 sense, because during the Microsoft Vuforia View presentation, first, both Julia and Alex demonstrated Vuforia View without visible targets (maybe they used a spatial target?), and second, the participants saw all the actions through the video stream transmitted by the stage camera operator, not through Vuforia View itself. Concerning the "multiplayer lobby" case: I think you are right; maybe we need to establish a connection between the TWX server and the visualization method (a value stream). I cannot find official docs for solving such tasks, neither from PTC nor from Microsoft (regarding HoloLens). But I did see this "multiplayer lobby" based on a Vuforia Studio Experience: I have already seen it all "live", and it worked then!
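The master/slave position binding described above can be sketched the same way: the master writes the model's X/Y/Z to three NUMBER properties on the Thing (SetProperty-style), and each slave reads them back (GetProperty-style) onto its own, view-only model widget. The Thing and widget objects here are mocks for illustration, not actual TWX or Studio objects.

```javascript
// Mock Thing: three NUMBER properties holding the shared model position.
const positionThing = { x: 0, y: 0, z: 0 };

// Master side: push the local model widget's coordinates to the Thing
// (in Studio, the bound SetProperty-style service would do this).
function pushPosition(modelWidget) {
  positionThing.x = modelWidget.x;
  positionThing.y = modelWidget.y;
  positionThing.z = modelWidget.z;
}

// Slave side: pull the Thing's values onto the local model widget
// (in Studio, a GetProperty-style read bound to the widget).
function pullPosition(modelWidget) {
  modelWidget.x = positionThing.x;
  modelWidget.y = positionThing.y;
  modelWidget.z = positionThing.z;
}

const masterModel = { x: 0.1, y: 0.0, z: -0.5 };
const slaveModel = { x: 0, y: 0, z: 0 };

pushPosition(masterModel); // master moves the model and syncs
pullPosition(slaveModel);  // slave follows on its next update
console.log(slaveModel);   // → { x: 0.1, y: 0, z: -0.5 }
```

Because the slaves only ever pull, they get exactly the "visualization only" behavior Vladimir describes: they follow the master but cannot push changes back.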

 

Regards,

Vladimir K.

Hi Vladimir,

 

I have the same setup, using 'host' and 'client' viewers. My use case was to present products to a prospect/customer: the sales agent (host) tells their story and controls the content the customer (client) sees on their screen, either remote or local. I added a simple boolean flag with which I enable the client to interact with the model only when the host allows it. I run the host and client views from the exact same experience, which enables the client to write to the same properties as the host, hence creating real-time collaboration. Added note: my experience uses spatial targets, and I did not synchronize the position of the model. This would be possible by sending the 'rotation' value and/or a position vec3 in a value stream, or by sending it only once the model has not been translated or rotated for 'x' seconds (manually inducing lag to reduce the data stream).
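The "manually induced lag" idea above is essentially a debounce: publish the model's pose only after it has stopped moving for a quiet period, instead of streaming every frame. A minimal sketch, assuming a hypothetical `sendPose` callback that would push the pose to the ThingWorx service:

```javascript
// Debounced pose publisher: each movement resets the quiet timer, so the
// pose is sent only once the model has been idle for quietMs milliseconds.
function makePoseSync(sendPose, quietMs) {
  let timer = null;
  return function onPoseChanged(pose) {  // call on every translate/rotate
    clearTimeout(timer);                 // movement cancels the pending send
    timer = setTimeout(() => sendPose(pose), quietMs);
  };
}

// Usage: wire onPoseChanged to the host model widget's move events.
const sent = [];
const onPoseChanged = makePoseSync(pose => sent.push(pose), 50);
onPoseChanged({ x: 0, y: 0, z: 0, ry: 10 });
onPoseChanged({ x: 0, y: 0, z: 0, ry: 45 }); // rapid second move cancels the first
setTimeout(() => console.log(sent), 200);    // → [ { x: 0, y: 0, z: 0, ry: 45 } ]
```

Only the final, settled pose reaches TW, which trades a little latency on the clients for a much smaller data stream, exactly the compromise described above.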

 

Microsoft has a 'special camera' using HoloLens technology. It is able to enter into any form of connection with other HoloLens devices that are in a multiplayer lobby, or by using Azure's Spatial Anchors service. Using Vuforia View the same way, this would be possible. I have seen Microsoft's presentations with PTC and Unreal. Both use some form of shared experience, where MWC19 uses the same Azure Spatial Anchors service (this requires both users to be together in the same location). If you required the same functionality, I assume Azure would need to be connected to ThingWorx. They say the functionality is integrated into Vuforia Studio and View, but I have never seen any of it in the software, so this is either a custom setup, or they have done some work that was never documented or shared with us, or both.

 

If I am correct, ThingWorx/Vuforia Studio does not allow 3rd-party integrations when using Vuforia Studio Starter, so I cannot confirm (or deny) that Azure Spatial Anchors can be used with Vuforia Studio Starter. I suspect the spatial anchor is saved in a similar way to the app keys in TW (using a destination URL and an app key to access TW data). Basic information about Azure Spatial Anchors can be found here: Tutorial: Share anchors across sessions and devices - Azure Spatial Anchors | Microsoft Docs. Ignore the Unity part.

 

With kind regards,

Martini3119
