Pol-XR

Viewing Structures in Major Ice Sheets in Virtual Reality

About

Pol-XR is an Extended Reality (XR) application developed for the Oculus Quest 2. The application combines the following data types: ice-penetrating radar imagery; topographic surface and bedrock digital elevation models (DEMs); flight paths from aerogeophysical surveys; and data textures. The ice-penetrating radar data were collected by the ROSETTA-Ice Project and Operation IceBridge, both of which gathered thousands of line-kilometers of geophysical survey measurements in the polar regions. Pol-XR was inspired by the desire to build on the antARctica and Greenland applications previously developed by the team. Pol-XR was developed exclusively for the Oculus Quest 2 to take advantage of its improved hardware, wider field of view, and built-in handheld controller interactions. This allowed higher-resolution textures, additional interactions, different file types, and better latency control. The application provides a template for streamlining the development of future 3D ice-sheet data projects and a foundation for developing novel interactions and functionality.
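
As a rough illustration, the Python sketch below shows how these data layers might be grouped per scene. The class, field names, and file names are assumptions for illustration only, not Pol-XR's actual asset structure.

```python
from dataclasses import dataclass, field

@dataclass
class SceneManifest:
    """Hypothetical per-scene grouping of the data types Pol-XR combines."""
    name: str                                         # e.g. "Ross Ice Shelf"
    radargrams: list = field(default_factory=list)    # echogram images draped on meshes
    surface_dem: str = ""                             # surface-elevation raster
    bedrock_dem: str = ""                             # bed-elevation raster
    flight_paths: list = field(default_factory=list)  # survey trajectories
    textures: list = field(default_factory=list)      # additional draped data textures

# Illustrative file names only; the real assets and formats may differ.
petermann = SceneManifest(
    name="Petermann Glacier",
    radargrams=["segment_01_echogram.png"],
    surface_dem="petermann_surface.tif",
    bedrock_dem="petermann_bed.tif",
    flight_paths=["segment_01_trajectory.csv"],
)
```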

Development

This application was developed by Joel Salzman, Qazi Ashikin, Ben Yang, and Shengue Guo, with support from Alexandra Boghosian, Isabel Cordero, Kirsty Tinto, and Steven Feiner. The application represents ice-penetrating radar in its geospatial context without smoothing trajectories or flattening echograms. It uses a redesigned pipeline that converts processed radar data into meshes textured with the echogram image. The echogram image is no longer constrained by dots per inch (dpi), page size, y-axis limits, or time, constraints that hindered the radargram workflow in earlier iterations of the application. The improved radargram resolution enabled a restructuring of the application's entire development process and foundational infrastructure. Latency and lag within the headset can now be reduced by lowering the number of triangles in the mesh, without decimating the resolution of the echograms.
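
A minimal sketch of that mesh idea, assuming per-trace horizontal coordinates, a per-trace surface elevation, and a fixed depth window (the function and its inputs are illustrative, not the project's actual code): each radar trace contributes a vertical pair of vertices along the unsmoothed trajectory, and UV coordinates drape the full-resolution echogram over the resulting ribbon.

```python
import numpy as np

def radargram_ribbon(x, y, surface_z, depth_m):
    """Build a vertical ribbon mesh along a (possibly curved) flight path.

    x, y      : per-trace horizontal coordinates (n,) -- trajectory kept as-is
    surface_z : per-trace surface elevation (n,)
    depth_m   : vertical extent of the echogram below the surface
    Returns (vertices, triangles, uvs) in the usual indexed-mesh layout.
    """
    x, y, surface_z = map(np.asarray, (x, y, surface_z))
    n = len(x)
    top = np.column_stack([x, y, surface_z])               # one vertex at the surface
    bottom = np.column_stack([x, y, surface_z - depth_m])  # one at the echogram's bottom
    vertices = np.vstack([top, bottom])                    # indices: top_i = i, bottom_i = n + i

    # Two triangles per quad between consecutive traces.
    tris = []
    for i in range(n - 1):
        tris.append([i, n + i, i + 1])
        tris.append([i + 1, n + i, n + i + 1])
    triangles = np.array(tris)

    # U follows the trace index, V spans top-to-bottom, so the echogram texture
    # maps onto the ribbon at full resolution (no dpi or page-size constraint).
    u = np.linspace(0.0, 1.0, n)
    uvs = np.vstack([np.column_stack([u, np.ones(n)]),
                     np.column_stack([u, np.zeros(n)])])
    return vertices, triangles, uvs
```

Under this scheme, triangle reduction amounts to thinning the trace columns (for example, keeping every k-th trace) while the texture keeps its native resolution, which is the latency lever described above.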

Development of this application is still underway.

Limitations

Pol-XR encountered latency issues similar to previous iterations of the application due to the sheer size and resolution of the radargrams, as well as the number of them in the scene. This was mitigated in the Petermann Glacier Scene by packaging the radargrams to be "loaded at will" by selecting their respective segments of the flight path; even so, the more radargrams rendered, the more the headset struggles with latency. One overall solution is still to run the application tethered to a computer. The graphical hardware requirements for the Ross Ice Shelf Scene are much higher than for the Petermann Glacier Scene. Bounding boxes for the new mesh-based radargrams are rendered as small as possible, but occasionally still affect video memory usage and latency.
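
The "loaded at will" behavior is essentially lazy loading keyed by flight-path segment. The sketch below shows the general pattern in Python; the class, the loader callback, and the unload-on-deselect policy are assumptions for illustration, not Pol-XR's actual asset-management code.

```python
class RadargramCache:
    """Load radargram meshes only when their flight-path segment is selected,
    and drop them on deselect so video memory tracks what is on screen."""

    def __init__(self, load_fn):
        self._load_fn = load_fn   # e.g. reads the mesh and texture for a segment ID
        self._loaded = {}         # segment ID -> loaded mesh resource

    def select(self, segment_id):
        # Build the mesh on first selection; reuse it while it stays selected.
        if segment_id not in self._loaded:
            self._loaded[segment_id] = self._load_fn(segment_id)
        return self._loaded[segment_id]

    def deselect(self, segment_id):
        # Release the resource so the headset only pays for visible radargrams.
        self._loaded.pop(segment_id, None)
```

In the Unity implementation, select and deselect would presumably correspond to instantiating and destroying the textured radargram object for the chosen segment.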

Voice commands for the Petermann Glacier Scene are currently non-functional; the team instead focused on resolving issues within the updated workflow that it considered blocking.

Networking between devices does not exist in the current iteration. Collaborative experiences are still attainable by deploying the application tethered to a PC, which lets others see what the headset-wearer sees, live on a computer monitor or TV through Unity. This brings the headset-wearer comfort in both the physical and immersive environments and allows real-time dialogue between the headset-wearer and nearby collaborators about the data in the scene.

Currently, Pol-XR has only been deployed on the Oculus Quest 2.

Conclusions

Successfully improving the Ross Ice Shelf Scene workflow has opened the door to robust data integration in future versions of this application. Latency issues have been addressed by implementing a "load on select" feature for radargrams along the flight path in the scene. The two scenes in tandem show what is possible to achieve and set a high bar for what comes next. The immersive experience of radargram exploration is a novel capability for the glaciology and geophysics community. Researchers find the interactions simple and the experience fascinating.

When using the application, users prefer the experience with external guidance. Our solution is to deploy the application in tethered mode, with real-time video of the scene displayed from Unity on a monitor. Knowing that someone else can see what they see, and is aware of their surroundings, helps the headset-wearer feel comfortable and safe in the space. The monitor also lets the user discuss what they are seeing with other researchers, developers, and collaborators without removing the headset and trying to describe the feature from memory.

Future iterations of this application may include, but are not limited to:

  • Multi-Platform Functionality (AR & VR)
  • Multi-Platform Networking for Collaborative Experience
  • Radargram Snapping
  • Improved Measurement Tools
  • More Geospatial Data Integrated
  • Untethered Headset Deployment with 60 fps Remote Networking Feedback
  • User Study Development 
  • Radar Reflector Picking
  • and more...