Collaborative AR/VR data visualization environment
This platform allows one or more users to share an interactive data visualization experience. Users can join from a Windows desktop, a Microsoft HoloLens 2, or any SteamVR-compatible device. Annotations can be added to and removed from the visualization, and its axes can be re-mapped. The device coordinate systems of co-located XR users can be synchronized to align their virtual content, and desktop users receive visual feedback showing AR users' heads and hands.
Unity | C# | Vuforia | MRTK | SteamVR | Photon | Virtual/Augmented Reality
Details
Cross-device
- Windows desktop
- Microsoft HoloLens 2 (AR)
- HTC Vive Pro (VR)
- SteamVR devices
Integration of the MRTK, SteamVR, and the Unity input system provides cross-device compatibility. C# preprocessor directives control which code path each device loads, and the platform compiles for both the Universal Windows Platform and standalone desktop builds.
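As a sketch of how such conditional compilation might look (the defines are Unity's built-in platform symbols; the helper method names are hypothetical):

```csharp
using UnityEngine;

public class InputBootstrap : MonoBehaviour
{
    void Awake()
    {
#if UNITY_WSA
        // Universal Windows Platform build (HoloLens 2): MRTK hand input.
        InitializeMrtkInput();                 // hypothetical helper
#elif UNITY_STANDALONE_WIN
        // Standalone desktop build: SteamVR controllers or mouse/keyboard.
        InitializeDesktopOrSteamVrInput();     // hypothetical helper
#endif
    }

    void InitializeMrtkInput() { /* MRTK setup elided */ }
    void InitializeDesktopOrSteamVrInput() { /* SteamVR / desktop setup elided */ }
}
```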
Collaboration
- Clients are networked using Photon engine
- The first device in a session becomes the server; later devices join as clients
- All virtual objects synchronize state and position by sharing serialized arrays
- Object transforms are linearly interpolated between key-frames to improve efficiency
- Clients joining an existing session receive the full current state on login and are immediately synchronized
- Events and listeners are used to control interactions, class communication, and object synchronization
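The key-frame interpolation above can be sketched as follows (field names and the tuning value are illustrative; the Photon message handling that delivers key-frames is elided):

```csharp
using UnityEngine;

// Smoothly moves a networked object toward the most recently received
// key-frame instead of snapping, so updates can be sent at a low rate.
public class NetworkedTransform : MonoBehaviour
{
    Vector3 targetPosition;
    Quaternion targetRotation;
    [SerializeField] float lerpSpeed = 10f;   // illustrative tuning value

    // Called by the networking layer when a new key-frame arrives.
    public void OnKeyframe(Vector3 position, Quaternion rotation)
    {
        targetPosition = position;
        targetRotation = rotation;
    }

    void Update()
    {
        float t = Time.deltaTime * lerpSpeed;
        transform.position = Vector3.Lerp(transform.position, targetPosition, t);
        transform.rotation = Quaternion.Slerp(transform.rotation, targetRotation, t);
    }
}
```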
Synchronized coordinate systems
- Image targets are tracked by the HoloLens to generate a synchronized coordinate system
- SteamVR devices can use lighthouse positions (automatic) or controller interactions to align their coordinate system
- All object transforms and manipulations are translated from that device’s coordinate system to the shared coordinate system
- HoloLens users can interact with SteamVR inputs through this synchronization technique, enabling novel use of the Logitech VR Ink from an otherwise unsupported device
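The conversion can be sketched as expressing every pose relative to a shared anchor (e.g. the tracked image target); the class and method names below are illustrative:

```csharp
using UnityEngine;

// Converts points between a device's local tracking space and the shared
// coordinate system defined by a common anchor (e.g. the image target).
public static class SharedSpace
{
    // This device's world-space point -> shared-space point.
    public static Vector3 ToShared(Transform anchor, Vector3 worldPoint)
        => anchor.InverseTransformPoint(worldPoint);

    // Shared-space point -> this device's world space.
    public static Vector3 FromShared(Transform anchor, Vector3 sharedPoint)
        => anchor.TransformPoint(sharedPoint);
}
```

Because every client applies the same conversion against its own anchor, a transform broadcast in shared space lands at the same physical location for all co-located users.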
Annotation
- Custom annotations were developed for the system: details-on-demand, highlight cube/sphere volumes, text/speech entry, centrality planes, and line markings
- Annotations were written as serializable classes and can be exported to JSON files for easy saving and loading
- Speech input is transcribed using Microsoft's speech-to-text API
- Annotations interface with the visualization, the raw CSV file it represents, and the Unity scene graph
- Several shaders were written for these annotations; one of note, used by the highlight volumes, is implemented as a fragment shader on PC
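A sketch of the serializable-annotation approach using Unity's built-in `JsonUtility` (the class and field names are illustrative, not the project's actual schema):

```csharp
using System;
using UnityEngine;

// Illustrative annotation type; [Serializable] lets JsonUtility handle it.
[Serializable]
public class HighlightVolumeAnnotation
{
    public string id;
    public Vector3 center;   // in the shared coordinate system
    public float radius;
    public string label;     // optional text/speech entry
}

public static class AnnotationIO
{
    public static string Export(HighlightVolumeAnnotation a)
        => JsonUtility.ToJson(a, prettyPrint: true);

    public static HighlightVolumeAnnotation Import(string json)
        => JsonUtility.FromJson<HighlightVolumeAnnotation>(json);
}
```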
Interactions
- Hand- and controller-based ray-casting techniques were implemented
- Custom shaders provide visual feedback for all interactions
- Each user's head position, hand or controller positions, and ray-casts are shown to all collaborators
- The Logitech VR Ink can be used for mid-air drawing and ray-casting
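The ray-casting interaction can be sketched with Unity's physics API (the component name and layer setup are illustrative):

```csharp
using UnityEngine;

// Casts a ray from a hand or controller and reports the first data point hit.
public class PointerRaycaster : MonoBehaviour
{
    [SerializeField] float maxDistance = 10f;      // illustrative reach
    [SerializeField] LayerMask dataPointLayer;     // illustrative layer setup

    void Update()
    {
        var ray = new Ray(transform.position, transform.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance, dataPointLayer))
        {
            // Highlight / select the hovered data point (elided).
            Debug.Log($"Hovering {hit.collider.name}");
        }
    }
}
```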
Visualization
- Users can interact with a 3D scatterplot
- The scatterplot was generated from modified IATK code
- The graph color mappings and axes can be altered using button controls
- Several world- and user-aware button interfaces were developed to allow rapid placement of the visualization and other user tools
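Re-mapping an axis amounts to normalizing a chosen data column into plot coordinates; a minimal sketch of that step, independent of IATK's actual API:

```csharp
using System.Linq;
using UnityEngine;

public static class AxisMapper
{
    // Maps a data column onto one spatial axis of the scatterplot by
    // normalizing its values into [0, plotSize].
    public static float[] Normalize(float[] column, float plotSize)
    {
        float min = column.Min();
        float max = column.Max();
        float range = Mathf.Max(max - min, 1e-6f);  // guard against constant columns
        return column.Select(v => (v - min) / range * plotSize).ToArray();
    }
}
```

Swapping the column bound to an axis and re-running this normalization is what the button controls would trigger.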