368025 Thermodynamical Outlook on Machine Learning

Monday, 13 January 2020
Hall B1 (Boston Convention and Exhibition Center)
M. Jeremie Lafitte (Levitas), Metivdata, Safed, Israel

Celebrating the centenary meeting of the American Meteorological Society, we hail a new technological era that parallels the one which ended in 1920: the second industrial revolution. In this day and age, Andrew Ng, a leading and influential expositor and researcher of machine learning, contends that artificial intelligence is the new electricity, the very substance of our own contemporary revolution. We discuss here how the thermodynamic core of the first industrial revolution may shed light on the possibilities offered by the present-day metamorphosis of numerical prediction through machine learning.

At the heart of nineteenth-century mechanization was Sadi Carnot, who realized that in any practical conversion of fuel into heat and then into mechanical work, part of the stored energy would be lost or dissipated. This "lost" energy, unavailable for work, is measured by the entropy defined three decades later by Rudolf Clausius. Eight years before his own demise, Carnot proposed the basic tenets of the second law of thermodynamics without fully understanding the first. The year was 1824, and the conservation of energy was not yet fully acknowledged. Nevertheless, Carnot was able to postulate that though a heat engine can convert heat to work in a cyclic process, the conversion will never be complete. In 1865 Clausius formulated the two laws as follows: 1. The energy of the universe remains constant. 2. The entropy of the universe tends to a maximum. (Both laws, together with Carnot's bound, are restated in modern notation after this abstract.)

The latter law expresses an inherent inefficiency, further conveying the irreversibility of certain processes in a universe with a somber destiny. Such a seemingly dark enunciation took place in an era of prodigious technological optimism. It actually gave an arrow to time, whereas every other physical law works indifferently with time flowing forward or backward. Entropy is a measure of disorder: the entropy of a plate is higher when it lies broken in pieces on the floor than when it rests in one piece on a shelf. Prediction is about recovering intrinsic order. More than a century later, Doyne Farmer joked with colleagues [1] about a "second law of organization" [2]. Together with chaos appears some form of pattern. Yes, water molecules form a well-defined lattice in an ice cube, whereas they float unpredictably when heated into a gas. Yet clouds made of this very gas have definite shapes presenting aspects of self-similarity, an unmistakable kind of order.

The universe started in a very rare, highly ordered state and is running down into increasingly common states of disorder. Though this explains why we will not rearrange the molecules of water and send a steam train backward, why would we not send a time series' future data back into the past by expending a few megawatt-hours of computing power? Can we draw an analogy between machine learning and heat engines? Would prediction be the mechanical work, and chaos the entropy? Is forecasting a time series like receiving information from what is bound to be appended, and if so, where and how exactly would it be stored inside the past data? (A minimal forecasting sketch in the spirit of [1] follows this abstract.)

Considering the universe as having an open topology and a positive cosmological constant, as recent observations suggest, it will asymptotically approach a state of maximum entropy in which no further work will be possible. This "heat death" will correspond to a regime in which temperature differences (or other processes) may no longer be exploited to perform work.
Nevertheless, before that distant point in time, energy will still maintain enough unevenness in its distribution, and individual entities will still present ample predictability, at least through the way they are organized together. Something other than the strict arrow of time toward greater disorder is at play [3]. The second law of thermodynamics certainly allows the emergence of complex order, and it clearly plays an essential role; nonetheless, other actors also exist, mostly unknown or unidentified.
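For reference, the two laws quoted above and Carnot's bound admit a compact restatement in standard textbook notation (not part of the original nineteenth-century formulations):

\[
dU = \delta Q - \delta W \quad \text{(first law: energy is conserved)},
\]
\[
dS \ge \frac{\delta Q}{T} \quad \text{(second law: the entropy of an isolated system never decreases)},
\]
\[
\eta = \frac{W}{Q_h} \le 1 - \frac{T_c}{T_h} \quad \text{(Carnot: the conversion of heat into work is never complete)}.
\]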
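As a toy illustration of the forecasting question raised above, here is a minimal sketch, in Python, of delay-coordinate prediction in the spirit of Farmer and Sidorowich [1]: embed a scalar chaotic series in delay vectors, then predict each next value from the successors of its nearest past neighbours. The logistic map, the embedding dimension, and the neighbour count are illustrative assumptions, not choices taken from [1].

import numpy as np

def logistic_series(n, r=3.9, x0=0.4):
    # Chaotic logistic map x -> r * x * (1 - x); a stand-in for any scalar series.
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def embed(series, dim):
    # Rows are delay vectors [x_t, x_{t+1}, ..., x_{t+dim-1}].
    n = len(series) - dim + 1
    return np.column_stack([series[i:i + n] for i in range(dim)])

def forecast(train, dim=3, k=5, steps=20):
    # Iterated one-step prediction: average the successors of the k delay
    # vectors in the history that lie closest to the current delay vector.
    history = list(train)
    for _ in range(steps):
        past = np.asarray(history[:-1])
        vecs = embed(past, dim)            # states whose successor is known
        succ = np.asarray(history[dim:])   # successor of row j is succ[j]
        query = np.asarray(history[-dim:])
        nearest = np.argsort(np.linalg.norm(vecs - query, axis=1))[:k]
        history.append(float(succ[nearest].mean()))
    return np.array(history[-steps:])

series = logistic_series(2000)
predicted = forecast(series[:1500], steps=20)
actual = series[1500:1520]
print(np.column_stack([predicted, actual]))  # forecast vs. true continuation

With enough history, the nearest neighbours shadow the current trajectory for a short while, which is precisely the order within chaos alluded to above; the errors nonetheless grow quickly, as expected for a chaotic map.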

[1] Farmer J. D. and Sidorowich J. J., "Predicting Chaotic Time Series", Physical Review Letters, Vol. 59 (1987), pp. 845–848.

[2] Farmer J. D. et al., "The Second Law of Organization", Chap. 22 in Brockman J., The Third Culture: Beyond the Scientific Revolution, Simon & Schuster (1995).

[3] Farmer J. D., "Cool is not enough", Nature, Vol. 436 (2005), pp. 627–628.
