Physics origins of the most important statistical ideas of recent times

science-memo.blogspot.com
Figure: Maxwell's handwritings, state diagram (Wikipedia)

Preamble

Modern statistics is now moving into an emerging field called data science, which amalgamates many different fields, from high-performance computing to control engineering. However, researchers in machine learning and statistics sometimes omit, naïvely and probably unknowingly, the fact that some of the most important ideas in data science actually originated in physics discoveries and were specifically developed by physicists. In this short exposition we review these physics origins in the areas defined by Gelman and Vehtari (2021). An additional section is added on other possible areas that are currently the focus of active research in data science.

Bootstrapping and simulation-based inference: Gibbs's ensemble theory and Metropolis's simulations

Bootstrapping is a novel idea for estimating a quantity, with its uncertainty, from a given set of samples. It was popularised mostly by Efron, whose contribution in making this tool available to all researchers doing quantitative analysis is immense. However, the origins of bootstrapping can be traced back to the idea of ensembles in statistical physics, introduced by J. Gibbs. Ensembles in physics allow us to do just what bootstrapping does: estimate a quantity of interest by sub-sampling, which in statistical physics appears as sampling a set of different microstates. Using this idea, Metropolis devised an inference scheme in 1953 to compute ensemble averages for liquids using computers (minimal sketches of both ideas are given before the high-performance computing section below). Note that the use of the Monte Carlo approach for purely mathematical ends, i.e., solving integrals, appears much earlier, with von Neumann's efforts.

Causality: Hamiltonian systems to thermodynamic potentials

Figure: Maxwell relations as causal diagrams.

Even though the historical roots of causal analysis in the early 20th century are attributed to Wright (1923) for his definition of path analysis, causality was a core tenet of Newtonian mechanics, in distinguishing the left- and right-hand sides of the equations of motion written as differential equations; the set of differential equations that follows from Hamiltonian mechanics actually forms a graph, i.e., relationships between generalised coordinates, momenta and positions. This connection was never acknowledged in the early statistical literature; probably the causal constructions of classical physics were not well known in that community, or did not find their way into data-driven modelling. Similarly, the causal construction of thermodynamic potentials appears as a directed graph, as in the Born wheel. It is presented as a mnemonic, but it is actually causally constructed via Legendre transformations (a worked example is given below). Of course, causality, philosophically speaking, has been discussed since Ancient Greece, but here we restrict the discussion solely to quantitative theories after Newton.

Overparametrised models and regularisation: Poincaré classifications and astrophysical dynamics

Current deep learning systems are classified as massively overparametrised systems. However, lower-dimensional versions of this phenomenon were well studied in Poincaré's classification of classical dynamics, namely the measurement problem of an overdetermined system of differential equations; such inverse problems are well known in astrophysics and theoretical mechanics (a toy regularised inverse problem is sketched below).
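First, to make the ensemble-to-bootstrap analogy from the bootstrapping section concrete, here is a minimal sketch in Python; the synthetic data set and the estimated quantity (a simple mean) are illustrative choices, not taken from any of the works cited here.

```python
# Minimal bootstrap sketch: estimate the uncertainty of a sample mean
# by resampling the data with replacement, the statistical analogue of
# averaging over an ensemble of "microstates" of the data set.
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=200)  # synthetic observed sample

n_boot = 5000
boot_means = np.empty(n_boot)
for b in range(n_boot):
    resample = rng.choice(data, size=data.size, replace=True)
    boot_means[b] = resample.mean()

# Percentile interval for the mean from the bootstrap distribution.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {data.mean():.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```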
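Next, a minimal sketch of the Metropolis acceptance rule used to compute ensemble averages; the quadratic energy function below is a toy stand-in, not the liquid-state model of the original 1953 paper.

```python
# Minimal Metropolis sketch: sample from an unnormalised density
# exp(-E(x)) by proposing random moves and accepting each one with
# probability min(1, exp(-(E_new - E_old))).
import numpy as np

def energy(x):
    return 0.5 * x**2  # toy energy: target is a standard normal

rng = np.random.default_rng(0)
x, samples = 0.0, []
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)               # symmetric random walk
    if rng.random() < np.exp(energy(x) - energy(proposal)):
        x = proposal                                   # accept the move
    samples.append(x)                                  # otherwise keep old state

samples = np.array(samples[5_000:])                    # drop burn-in
print(f"ensemble average of x^2 ~ {np.mean(samples**2):.3f} (exact: 1)")
```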
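For the causality section, here is one edge of the Born wheel worked out explicitly, showing how a Legendre transformation rewires a thermodynamic potential and yields a Maxwell relation; this is standard textbook thermodynamics, not specific to any reference above.

```latex
% One edge of the Born wheel: from the internal energy U(S,V) to the
% Helmholtz free energy F(T,V) via a Legendre transformation in S.
\[
  dU = T\,dS - p\,dV,
  \qquad
  F \equiv U - TS
  \;\Longrightarrow\;
  dF = -S\,dT - p\,dV .
\]
% Equality of mixed second derivatives of F(T,V) then yields a Maxwell
% relation, i.e., one directed edge of the causal diagram:
\[
  \left(\frac{\partial S}{\partial V}\right)_{T}
  = \left(\frac{\partial p}{\partial T}\right)_{V}.
\]
```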
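Finally, a minimal sketch of how regularisation selects a solution for an ill-posed linear inverse problem of the kind alluded to in the overparametrisation section; the system dimensions and penalty strength are arbitrary illustrative choices.

```python
# Minimal sketch of regularising an ill-posed linear inverse problem:
# with more parameters than observations the least-squares solution is
# not unique, and a ridge (Tikhonov) penalty picks a well-behaved one.
import numpy as np

rng = np.random.default_rng(1)
n_obs, n_params = 20, 100                 # 100 unknowns, only 20 equations
A = rng.normal(size=(n_obs, n_params))
x_true = np.zeros(n_params)
x_true[:5] = 1.0                          # only a few parameters matter
y = A @ x_true + 0.01 * rng.normal(size=n_obs)

lam = 0.1                                 # regularisation strength
# Ridge solution: x = (A^T A + lam * I)^{-1} A^T y
x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(n_params), A.T @ y)

print(f"residual norm: {np.linalg.norm(A @ x_ridge - y):.3f}")
print(f"solution norm: {np.linalg.norm(x_ridge):.3f}")
```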
High-performance computing: Big data to GPUs

Similarly, the use of supercomputers, or, as we now call it, high-performance computing with big-data-generating processes, can actually be traced back to the Manhattan Project and ENIAC, which aimed at solving scattering equations, and to almost 50 years of development in this direction before the 2000s.

Conclusion

The impressive development of the new emergent field of data science, as a larger perspective of statistics into computer science, has strong origins in the core physics literature and research. These connections are not sufficiently cited or acknowledged. Our aim in this short exposition is to bring these aspects to the attention of data science practitioners and researchers alike.

Further reading

Some of the mentioned works and a related reading list of papers and books:

What Are the Most Important Statistical Ideas of the Past 50 Years?, Gelman & Vehtari (2021)
A Leisurely Look at the Bootstrap, the Jackknife, and Cross-Validation, Bradley Efron & Gail Gong (1983)
Elementary Principles in Statistical Mechanics, Gibbs (1902)
Equation of State Calculations by Fast Computing Machines, Metropolis et al. (1953)
Generalized Statistical Mechanics: Connection with Thermodynamics, Curado & Tsallis (1992)
Poincaré Sections of Hamiltonian Systems (1996)
Statistical Mechanics of Ensemble Learning, Anders Krogh & Peter Sollich (1997)