Sunday, 22 January 2017
4E (Washington State Convention Center)
Virtual reality can improve understanding of geospatial data, and head-mounted displays (HMDs) such as the HTC Vive or the Oculus Rift can be used to view recorded radar scans, radar volumes, and high-resolution model output. To facilitate shared discussion and collaborative analysis of meteorological phenomena in a medium closer to their naturally three-dimensional form, a multi-user mode is designed to enable the selection and annotation of specific areas of interest in the data. Tornado debris signatures within shallow supercells on October 31, 2015, in Houston, Texas, for example, can be highlighted in 3D space while one viewer walks through the scene or is repositioned by another viewer. During collaborative weather briefings, viewers can communicate with one another, and meteorological variables can be selected and loaded within the application via spatially tracked hand controllers.
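The abstract does not specify its data pipeline, so the following is only a minimal sketch of one plausible preprocessing step: reading a recorded radar scan with Py-ART and resampling it onto a regular Cartesian grid that a VR engine could load as a single-channel 3D texture. The input filename, grid dimensions, normalization range, and output path are illustrative placeholders, not details from the presented work.

```python
"""
Hypothetical preprocessing sketch: convert a recorded radar scan into a
regular 3D reflectivity volume suitable for upload as a 3D texture in a
VR application. All paths and grid parameters are assumptions.
"""
import numpy as np
import pyart

# Read a NEXRAD Level II archive file (placeholder filename).
radar = pyart.io.read("KHGX20151031_120000_V06")

# Resample the polar sweeps onto a regular Cartesian grid:
# 31 vertical levels x 401 x 401 horizontal points, covering
# 0-15 km in height and +/-150 km around the radar.
grid = pyart.map.grid_from_radars(
    (radar,),
    grid_shape=(31, 401, 401),
    grid_limits=(
        (0.0, 15_000.0),
        (-150_000.0, 150_000.0),
        (-150_000.0, 150_000.0),
    ),
    fields=["reflectivity"],
)

# Extract reflectivity (dBZ), fill missing gates, and normalize to 0-1
# (assumed -32 to 64 dBZ range) for a single-channel volume texture.
refl = grid.fields["reflectivity"]["data"].filled(-32.0)
refl = np.clip((refl + 32.0) / 96.0, 0.0, 1.0).astype(np.float32)

# Save as a raw float32 volume in (z, y, x) order for the VR application.
refl.tofile("houston_20151031_refl_31x401x401_f32.raw")
print("Wrote volume with shape", refl.shape)
```

A collaborative viewer would then stream per-frame controller poses and annotation events alongside a reference to a shared volume such as this one, rather than sending the volume itself to every participant.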