Bayes is Plural.

The strict application of the rules of probability (the Bayesian approach) underlies artificial intelligence (AI) and the empirical (data-based) sciences because it guarantees optimal beliefs given the available information. The rules of probability have been known since the late 18th century and are conceptually intuitive: preserve the prior belief that remains consistent with the data (product rule), and predict with the contribution of all hypotheses (sum rule). Although nothing better has been proposed in practical terms since then, their application was historically limited by the computational cost of evaluating the entire hypothesis space. At the turn of the 21st century, these obstacles were partially overcome thanks to modern computational and algorithmic advances. Although Bayesian approaches are now experiencing accelerating growth worldwide, historical inertia means that their development remains incipient, especially in peripheral countries.
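The two rules can be made concrete with a toy example (the numbers and the coin-bias setup are hypothetical, chosen only for illustration): the product rule updates the prior with the data, and the sum rule predicts by averaging over every hypothesis rather than selecting one. A minimal sketch:

```python
import numpy as np

# Hypothesis space: candidate biases of a coin (toy example).
hypotheses = np.linspace(0.0, 1.0, 101)
prior = np.full_like(hypotheses, 1.0 / len(hypotheses))  # uniform prior

# Product rule: posterior ∝ prior × likelihood, after observing one head.
likelihood = hypotheses          # P(head | bias)
posterior = prior * likelihood
posterior /= posterior.sum()     # normalize

# Sum rule: predict the next outcome with the contribution of ALL hypotheses.
p_next_head = np.sum(posterior * hypotheses)
print(round(p_next_head, 3))     # → 0.67
```

Note that the prediction (about 2/3) does not come from any single "best" hypothesis: it is the posterior-weighted average over the whole hypothesis space.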
The relevance of these methods contrasts with the current reality: the Bayesian approach is still marginal even in the leading universities of Plurinational America, and practically nonexistent in empirical sciences that lack specific training in mathematics and programming. The following figure shows the number of scientific articles linked to the Bayesian approach published each year with at least one author affiliation in Plurinational America, according to Scopus.
[Figure: Bayesian papers per year with at least one author affiliation in Plurinational America (source: Scopus)]
At the end of the 20th century, with the massification of personal computers, the Bayesian approach began to take center stage in artificial intelligence, until a deep convolutional neural network trained on GPUs was presented in 2012. Once again, computational complexity acted as a limit on the strict application of the rules of probability, which require evaluating the entire hypothesis space. Still, the rise of deep neural networks did not produce a new system of reasoning under uncertainty that outperforms the rules of probability, so the Bayesian approach remains at the heart of artificial intelligence and of all empirical science. In this context, the 35th Annual Conference on Neural Information Processing Systems (NeurIPS 2021) hosted a competition called "Approximate Inference in Bayesian Deep Learning", which confirmed that deep ensembles are good approximations of Bayesian neural networks [1,2].
Finding good computational approximations to the strict application of the rules of probability is necessary but not sufficient. To explain the world and make decisions under uncertainty, we must propose causal theories that predict the effect of interventions on reality [3,4]. Causal theories are hypotheses, and as such they can be evaluated through the strict application of the rules of probability [5], P(Causal Theory | Data), which naturally penalizes unnecessary model complexity. Today, artificial intelligence has the opportunity to use causal models developed across the empirical sciences, and the empirical sciences have the opportunity to create and evaluate models tailored to each specific problem [6]. In this context, our goal is to promote Bayesian Intelligence in Plurinational America and among the peoples of the global south.
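The automatic penalty for unnecessary complexity can be sketched with a classic toy comparison (hypothetical data: 5 heads in 10 coin flips). A model with no free parameters ("the coin is fair") is compared against a more flexible model ("the coin has an unknown bias, uniform prior") through their marginal likelihoods, P(Data | Model):

```python
from math import factorial

# Toy data (hypothetical): 5 heads and 5 tails in 10 flips.
heads, tails = 5, 5
n = heads + tails

# Model A: fair coin (no free parameters).
evidence_fair = 0.5 ** n

# Model B: unknown bias with a uniform prior over the bias θ.
# Its marginal likelihood integrates over ALL biases:
#   ∫₀¹ θ^heads (1-θ)^tails dθ = heads! · tails! / (n+1)!
evidence_biased = factorial(heads) * factorial(tails) / factorial(n + 1)

# With equal prior odds, posterior odds equal the ratio of evidences.
bayes_factor = evidence_fair / evidence_biased
print(round(bayes_factor, 2))  # → 2.71, the simpler model is favored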
For us to talk about Bayesian Intelligence, it is necessary that the cost function used to update beliefs guarantees learning from interaction with reality. This same point was already expressed in 1956 by John L. Kelly in his article "A new interpretation of information rate" [7], approved by his colleague Claude Shannon: "The cost function approach [..] can actually be used to analyze nearly any branch of human endeavor. The utility theory of Von Neumann shows us one way to obtain such a cost function. The point here is that an arbitrary combination of a statistical transducer (i.e., a channel) and a cost function does not necessarily constitute a communication system. What can be done, however, is to take some real-life situation which seems to possess the essential features of a communication problem, and to analyze it without the introduction of an arbitrary cost function. The situation which will be chosen here is one in which a gambler uses knowledge of the received symbols of a communication channel in order to make profitable bets on the transmitted symbols."
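Kelly's gambler can be sketched numerically (with hypothetical numbers): betting a fraction f of current wealth at even odds on a coin that wins with probability p, wealth evolves multiplicatively, so the quantity to maximize is the expected log-growth per bet, which peaks at the Kelly fraction f* = 2p − 1 rather than at an all-in bet:

```python
import numpy as np

# Hypothetical setting: even-odds bets, win probability p = 0.6.
p = 0.6

def log_growth(f):
    # Expected log-wealth growth per bet when staking a fraction f.
    return p * np.log(1 + f) + (1 - p) * np.log(1 - f)

# Scan betting fractions and find the one with the best long-run growth.
fractions = np.linspace(0.0, 0.99, 1000)
best = fractions[np.argmax(log_growth(fractions))]
print(round(best, 2))  # → 0.2, the Kelly fraction 2p - 1
```

Betting more than f* increases the expected wealth of a single bet but destroys the long-run growth of the multiplicative sequence, which is exactly the sense in which this cost function "guarantees learning from interaction with reality".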
Not coincidentally, the cost function of Probability Theory (Bayes' theorem) and the cost function of Evolution Theory (replicator dynamics) are isomorphic [8]. In particular, both select variants (alternative hypotheses or life forms) through multiplicative processes: through sequences of predictions in probability, and through sequences of reproduction and survival rates in evolution. The multiplicative nature of the selection process is the origin of Bayesian and Evolutionary Intelligence. Because the impact of negative events is stronger than the impact of positive events (a single zero in the sequence produces an irreversible extinction), the variants that emerge are those that reduce fluctuations through individual diversification, cooperation, cooperative specialization, and cooperative heterogeneity [7,9]. This is evidenced in our own lives, which depend on at least four levels of cooperation with specialization, without which we could not survive [10]: the cell with its mitochondria, the multicellular organism, society, and the ecosystem. The same occurs with culture: elementary hypotheses are grouped to form variables (or hypothesis spaces), variables are related to each other to form causal models, and systems of causal models form causal theories.
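The isomorphism [8] can be checked directly in a few lines (frequencies and fitness values are hypothetical): a Bayesian update over hypotheses and one discrete replicator step over population variants are the same computation when fitness plays the role of likelihood:

```python
import numpy as np

# Hypothetical frequencies of three hypotheses / three variants.
prior = np.array([0.5, 0.3, 0.2])
# Hypothetical prediction quality / fitness of each one.
likelihood = np.array([0.9, 0.5, 0.1])

# Bayes' theorem: posterior ∝ prior × likelihood.
posterior = prior * likelihood / np.sum(prior * likelihood)

# Discrete replicator dynamics: x_i' = x_i · f_i / (mean fitness).
mean_fitness = np.sum(prior * likelihood)
replicator = prior * likelihood / mean_fitness

print(np.allclose(posterior, replicator))  # → True
```

Note also the role of the zero: a variant with fitness (or likelihood) exactly 0 is removed from the population forever, which is the irreversible extinction mentioned above.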
The word Plurinational represents, in South America, the coexistence of our local cultural diversities. This vision is especially suitable for a community that works with Bayesian methods, which, unlike ad-hoc procedures that select a single hypothesis (e.g., by maximum likelihood), take advantage of the practical value of considering the plurality of existing hypotheses. Just as selecting a single hypothesis has known negative consequences in probability (overfitting) [11], the massive loss of cultural diversity caused by the imposition of a single type of society during colonial-modernity is having increasingly evident biocenotic and environmental consequences [12,13]. Just as the Bayesian approach adapts to reality by virtue of working with mutually contradictory hypotheses, a plurinational society adapts to life through the coexistence of our diversities.
References:
[1] Wilson AG, Izmailov P. Bayesian deep learning and a probabilistic perspective of generalization. Advances in neural information processing systems. 2020.
[2] Izmailov P, Vikram S, Hoffman MD, Wilson AG. What are Bayesian neural network posteriors really like? International conference on machine learning. 2021.
[3] Pearl J. Causality. Cambridge University Press. 2009.
[4] Peters J, Janzing D, Schölkopf B. Elements of causal inference: foundations and learning algorithms. The MIT Press. 2017.
[5] Winn J. Causality with gates. In: Artificial Intelligence and Statistics. Proceedings of Machine Learning Research. 2012.
[6] Bishop CM. Model-based machine learning. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences. 2013.
[7] Kelly JL. A new interpretation of information rate. The Bell System Technical Journal. 1956.
[8] Czégel D, Giaffar H, Tenenbaum JB, Szathmáry E. Bayes and Darwin: How replicator populations implement Bayesian computations. Bioessays. 2022.
[9] Peters O. The ergodicity problem in economics. Nature Physics. 2019.
[10] Szathmáry E, Maynard Smith J. The major evolutionary transitions. Nature. 1995.
[11] Bishop CM. Pattern recognition and machine learning. Springer. 2006.
[12] Ostrom E. Governing the commons: The evolution of institutions for collective action. Cambridge university press. 1990.
[13] Segato RL. La crítica de la colonialidad en ocho ensayos y una antropología por demanda. Prometeo. 2013.

