Thursday, 9 May 2024
Regency Ballroom (Hyatt Regency Long Beach)
Supervised machine learning requires a good set of labels to determine key features in the input data. Chaotic systems, however, pose a real challenge for supervised ML models, as chaotic attractors are dense and ergodic, such that two nearly identical inputs can evolve into completely different states on the attractor at a later time. Recent studies have shown that tropical cyclone (TC) intensity appears to possess low-dimensional chaos at the potential intensity (PI) limit, which prevents TC intensity errors from being reduced indefinitely beyond some forecast lead time. Using several ML architectures, including convolutional neural networks (CNN), recurrent neural networks (RNN), gated recurrent units (GRU), and long short-term memory (LSTM) networks trained on the same CM1 model output, we show in this study that none of these ML models can predict TC intensity variability beyond a 1-day lead time. In fact, the training of these ML models does not even converge beyond a 3-day lead time, indicating that ML could not learn anything during training at the PI limit. One could force the training of any of these ML models to converge by changing some hyperparameters such as the kernel size, learning rate, or number of filters, but the resulting predictions on a test set show no skill relative to a simple climatology or persistence forecast. These results not only confirm the possible chaos of TC intensity dynamics at the PI limit, but also have broader implications for applying ML techniques to chaotic regimes/systems in the atmosphere.
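
For illustration only, the sketch below shows one way a persistence-relative skill comparison of the kind described above could be set up; it is not the authors' code. The intensity series is a synthetic random-walk stand-in for the CM1 output, and the window length, lead time, network size, and training settings are assumed, illustrative values.

```python
# Minimal sketch (assumptions, not the study's implementation): an LSTM trained to
# predict TC intensity at a fixed lead time, evaluated against a persistence baseline.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)

# Hypothetical hourly intensity series (e.g., maximum surface wind, m/s);
# a random walk stands in for the CM1-derived intensity at the PI limit.
intensity = 60.0 + np.cumsum(rng.normal(scale=0.5, size=5000))

window = 24   # hours of history fed to the model (assumed)
lead = 24     # forecast lead time in hours, i.e., the 1-day horizon noted above

# Build (input window, target) pairs for supervised training.
X, y = [], []
for t in range(len(intensity) - window - lead):
    X.append(intensity[t:t + window])
    y.append(intensity[t + window + lead - 1])
X = np.asarray(X)[..., None]   # shape: (samples, window, 1)
y = np.asarray(y)

split = int(0.8 * len(X))
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

# Small LSTM regressor; layer sizes and optimizer settings are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(window, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3), loss="mse")
model.fit(X_train, y_train, epochs=5, batch_size=64, verbose=0)

# Skill relative to persistence: the persistence forecast is simply the last
# observed intensity in each input window.
pred = model.predict(X_test, verbose=0).ravel()
persistence = X_test[:, -1, 0]
mse_model = np.mean((pred - y_test) ** 2)
mse_persist = np.mean((persistence - y_test) ** 2)
skill = 1.0 - mse_model / mse_persist   # > 0 means the model beats persistence
print(f"MSE (LSTM) = {mse_model:.2f}, "
      f"MSE (persistence) = {mse_persist:.2f}, skill = {skill:.2f}")
```

In the setting described in the abstract, such a skill score would be expected to show no improvement over the persistence (or climatology) reference at lead times beyond about one day, regardless of how the hyperparameters are tuned to force training convergence.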

