Sunday, 28 January 2024
Hall E (The Baltimore Convention Center)
Storms are complex atmospheric phenomena whose constantly changing and evolving nature makes their structure and dynamics difficult to track and study with traditional methods. To address this, 3D storm segmentation algorithms are used to identify and track different parts of a storm and to distinguish regions with varying characteristics such as cloud type, temperature, and wind speed. This research project aims to revolutionize the study and understanding of severe weather events by combining 3D storm segmentation algorithms with immersive virtual reality (VR) environments. By creating highly detailed, interactive models of storms with these technologies, one can explore storm structure and dynamics in ways previously impossible. Storm data are simulated and segmented to generate a dataset for computer models, which is rendered in three dimensions using Unity or Unreal Engine 5 and displayed in specialized VR headsets such as the Meta Quest 2 or HTC Vive. The VR environment enables users to immerse themselves in the storm and observe its behavior from various angles, providing a more detailed and accurate understanding of severe weather events. LiDAR point clouds are used to generate the clouds in Unreal Engine and to manipulate and examine them in VR. The objective is to view a bigger picture of the clouds and to label, slice, and segment the cloud data in VR, enabling the identification of different storm parts and the tracking of their movement over time. This information will facilitate the study of storm structure and dynamics and support improved forecasting models. Overall, this research has the potential to transform how we study and understand severe weather events; by leveraging cutting-edge technology, researchers can gain new insights into storm behavior, leading to better prediction and preparation.
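As a hypothetical illustration of the segmentation step described above (not the project's actual pipeline), one common approach to identifying distinct storm parts in a simulated 3D field is to threshold the field and label its connected components. The sketch below assumes NumPy and SciPy are available; the synthetic reflectivity volume, the 35 dBZ threshold, and the two embedded storm cores are invented for demonstration.

```python
import numpy as np
from scipy import ndimage

# Synthetic 32x32x16 "reflectivity" volume (dBZ): background noise
# plus two embedded high-reflectivity storm cores. All values here
# are illustrative assumptions, not data from the project.
rng = np.random.default_rng(0)
field = rng.normal(5.0, 3.0, size=(32, 32, 16))
field[4:10, 4:10, 2:8] += 40.0     # hypothetical storm core 1
field[20:27, 18:25, 5:12] += 45.0  # hypothetical storm core 2

# Segment: threshold the volume, then label 3D connected components,
# so each storm cell gets its own integer label.
mask = field > 35.0
labels, n_cells = ndimage.label(mask)
print(n_cells)  # number of distinct storm cells found

# Per-cell centroids could then be matched across time steps to
# track each cell's movement, as the abstract describes.
centroids = ndimage.center_of_mass(mask, labels, range(1, n_cells + 1))
print(centroids)
```

Each labeled region could then be exported as a mesh or point cloud for rendering in Unity or Unreal Engine 5, with labels driving per-part coloring and slicing in the VR environment.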

