Fondation Cournot

PUBLICATIONS

The Foundation supports publications that advance the theory and practice of probability. A selection of booklets from the Cournot Centre’s Prisme series is presented below. The Centre’s collective volume Augustin Cournot: Modelling Economics pays tribute to Cournot’s intuitions in game theory and to his pioneering work in modelling social phenomena.

How to Flee on a Straight Line: Tracking Self-avoiding Random Walks
Laure Dumaz

Prisme N°31, August 2016 (2.9 MiB)

This text focuses on the path of a particular random walk: that, for example, of a fugitive trying to escape her pursuers while confined to the real number line. The fugitive’s aim is to find the continuous path that leaves behind as little information as possible. The problem is addressed within the framework of the theory of random walks. Different kinds of random walks are presented, starting with the well-known Brownian motion. The scaling limit of the self-avoiding random walk is then explained, before examining the self-repelling process, which provides an optimal strategy.
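
As a purely illustrative aside (not taken from the booklet), the following Python sketch contrasts a simple symmetric random walk on the integers with a toy self-repelling walk whose steps are biased away from the more-visited neighbouring site; the exponential weighting and the bias parameter beta are arbitrary choices made here for illustration, not the process studied by Dumaz.

    import math
    import random
    from collections import defaultdict

    def simple_walk(n_steps, seed=0):
        """Symmetric simple random walk on the integers
        (the discrete-time analogue of Brownian motion)."""
        rng = random.Random(seed)
        x, path = 0, [0]
        for _ in range(n_steps):
            x += rng.choice((-1, 1))
            path.append(x)
        return path

    def self_repelling_walk(n_steps, beta=1.0, seed=0):
        """Toy self-repelling walk: each step prefers the neighbouring
        site that has been visited less often (bias strength beta)."""
        rng = random.Random(seed)
        visits = defaultdict(int)
        x, path = 0, [0]
        visits[0] = 1
        for _ in range(n_steps):
            # Shift by the smaller visit count so the weights stay well scaled.
            m = min(visits[x - 1], visits[x + 1])
            w_left = math.exp(-beta * (visits[x - 1] - m))
            w_right = math.exp(-beta * (visits[x + 1] - m))
            x += -1 if rng.random() < w_left / (w_left + w_right) else 1
            visits[x] += 1
            path.append(x)
        return path

    for walk in (simple_walk, self_repelling_walk):
        path = walk(10_000)
        print(walk.__name__, "range of sites visited:", max(path) - min(path))

Running the sketch typically shows the self-repelling walk covering a noticeably wider range of sites than the simple walk: by avoiding its own past, it leaves less local information behind, which is the intuition behind the optimal escape strategy discussed in the text.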

The Evolving Connection Between Probability and Statistics.
Do Statisticians Need a Probability Theory?

Noureddine El Karoui

Prisme N°30, December 2014 (0.45 MiB)

This text is currently being translated into English.

Told from the perspective of the daily life and teaching of an academic statistician, this text aims to show how the field of statistics has evolved and continues to evolve, especially in relation to probability theory. The text uses two examples to illustrate this evolution. In the first case, we look at housing data and ask whether it is possible to predict a house’s future sale price from its characteristics. In the second case, we examine the possibility of building a spam filter for an e-mail account. The text explores, at a high level, various classical and progressively more modern statistical techniques that could be used to analyse these data, examining the role of probability theory in the development and use of these ideas, and thus illustrating the evolving connection between probability theory and statistics.
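
As a purely illustrative companion to the first example (not taken from the text), here is a minimal Python sketch that fits an ordinary least-squares regression to synthetic housing data and predicts the price of a hypothetical house; every feature, coefficient and noise level is invented.

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic, entirely hypothetical housing data:
    # surface area (m^2), number of rooms, age (years).
    n = 500
    X = np.column_stack([
        rng.uniform(30, 200, n),        # surface area
        rng.integers(1, 7, n),          # number of rooms
        rng.uniform(0, 80, n),          # age
    ])
    true_coef = np.array([3_000.0, 15_000.0, -800.0])           # invented "true" effects
    price = 50_000 + X @ true_coef + rng.normal(0, 20_000, n)   # noisy sale prices

    # Ordinary least squares, the classical probability-backed workhorse.
    X1 = np.column_stack([np.ones(n), X])                       # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, price, rcond=None)

    # Predict the price of a hypothetical 100 m^2, 4-room, 10-year-old house.
    x_new = np.array([1.0, 100.0, 4.0, 10.0])
    print("estimated coefficients:", np.round(beta, 1))
    print("predicted price:", round(float(x_new @ beta)))

Probability theory enters as soon as one asks how trustworthy such a prediction is, for example through the sampling distribution of the estimated coefficients or the width of a prediction interval.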

How Quantum Can a Computer Be?
Elham Kashefi

Prisme N°29, September 2014 (1.2 MiB)

This text briefly reviews the history of quantum computing to provide a backdrop to the new emerging field of quantum technology, which is raising new challenges. In particular, quantum computing has an acute verification and validation problem: on the one hand, since classical computations cannot scale up to the computational power of quantum mechanics, verifying the correctness of a quantum-mediated computation is challenging; on the other hand, the underlying quantum structure resists classical certification analysis. The text concludes with recent progress on how to evaluate today’s quantum computing devices so that we can effectively exploit tomorrow’s.

From Cournot to Public Policy Evaluation: Paradoxes and Controversies of Quantification
Alain Desrosières

Prisme, April 2006 (258.2 KiB)

The French mathematician, economist and thinker Augustin Cournot inaugurated the philosophical treatment of the new probabilistic and quantitative modes of reasoning that emerged in the first half of the 19th century. The text reviews the legacy and implementation of Cournot’s intuitions concerning the distinction between so-called objective and subjective probabilities, and the interpretation of the categories constructed by statisticians according to “equivalence conventions”. Suggestive clues emerge for the empirical study of current statistical practices, in particular, those transactions that take place in the “contact zone”, where quantified assertions recorded in more or less formal models replace unquantified assertions formulated in natural language. Examples of these exchanges are illustrated in the cases of risk management, macroeconomic policy and public service performance evaluation.

In conclusion, the paper highlights how the ambivalence of Cournot’s thought is echoed in the controversies raised in some recent sociology of science, polarized between diverse forms of “realism” and “constructivism”. Questions suggested by Cournot are the starting point for an exploration of the sense in which quantification can be said to create objectivity.

Is Everything Stochastic?
Glenn Shafer

Prisme, December 2010 (747.1 KiB)

Kolmogorov said no, Popper said yes. My sympathies lie with Kolmogorov, the old-fashioned empiricist.

In the on-line setting, where we see previous outcomes before making the next probability forecast, we can give probabilities that have objective value because they pass statistical tests. This accounts for the success of many adaptive methods, and it is related to Leonid Levin’s notion of a universal prior probability distribution. It tells us that yes, everything is stochastic, but in a sense that is empirically empty because it is not falsifiable.

When we understand that success of adaptive methods does not depend on the world being stochastic in a falsifiable sense, we may want to be more parsimonious in causal modeling and more open to non-standard methods of probability judgement.

This text is based on the presentation, Is Everything Stochastic?, given by Glenn Shafer on 13 October 2010 at the Cournot Centre’s seminar “The Probabilism Sessions”.

Tracking the Random Race
Michel Armatte

Prisme, November 2012 (873.4 KiB)

The historical trajectory of randomness in scientific practices has not been smooth. What have been the different stages in its ascent, and how has it been interpreted? The classical view of 19th century probability, followed by the emergence of objective chance and the many different roles attributed to it from the 1830s on, led to the development of the theory of processes in the 20th century. The mathematics of chance has been marked out by the milestones of randomness as it has gradually penetrated the disciplines that owe it so much: physics, biology, economics and finance.

This text was inspired by the presentation, Three Sources of Probability Calculations in the 18th Century, given by Michel Armatte on 28 October 2009 at the Cournot Seminar.

Should You Take a Risk When You Do Not Know for Sure? From Judging to Acting since Condorcet
Pierre-Charles Pradier

Prisme, March 2010 (513.4 KiB)

Condorcet proposed a principle of reasonable probability: actions entailing a prohibitive risk with a non-negligible probability should not be taken. This principle guides the development of knowledge as much as it guides the action itself. The mathematics developed by Laplace allowed this principle to be applied effectively in mathematical statistics (point estimates combined with a high confidence level) and in the management of insurance companies (calculating the loading rate that ensures the solvency of the company). During the same period, Tetens was developing related ideas, though with less mathematical efficacy. These ideas from the 18th century still apply today, both in (the interpretation of) certain modern decision models and in the informational and legal requirements that should be enforced to ensure that financial decisions are rational.

This text is based on the presentation, The Probabilization of Risk, given by Pierre-Charles Pradier on 30 September 2009 at the Cournot Centre’s seminar “The Probabilism Sessions”.
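
As a rough, hypothetical illustration of the loading-rate calculation mentioned in the abstract above (not taken from the text), the following Python sketch computes the safety loading an insurer would need under a normal, central-limit approximation of aggregate claims; the portfolio figures and the 0.1% ruin probability are invented.

    from math import sqrt
    from statistics import NormalDist

    def loading_rate(mean_claim, sd_claim, n_policies, ruin_prob=0.001):
        """Safety loading eta such that charging (1 + eta) * mean_claim per policy
        covers total claims with probability 1 - ruin_prob, under a normal
        (central-limit) approximation of the aggregate claim amount."""
        z = NormalDist().inv_cdf(1 - ruin_prob)
        return z * sd_claim / (mean_claim * sqrt(n_policies))

    # Entirely hypothetical portfolio: 10,000 policies, mean claim 1,000, std dev 3,000.
    eta = loading_rate(mean_claim=1_000, sd_claim=3_000, n_policies=10_000)
    print(f"required loading rate: {eta:.1%}")   # about 9% here; shrinks as the pool grows

The calculation is Laplacian in spirit: for a fixed confidence level, the required loading shrinks in proportion to one over the square root of the number of pooled policies.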

Sifting Noise: The Role of Probability in Imaging
Josselin Garnier

Prisme, November 2012 (686.6 KiB)

Imaging techniques using waves to probe unknown media have long existed. Classically, these techniques can be divided into a phase of data gathering and a phase of data processing. During the data-gathering phase, waves are emitted by a source or source array, propagate through the medium being studied, and are then recorded by a receiver array. The processing phase consists in extracting information about the medium from the data recorded by the receivers.

Recently, new ideas have emerged, driven by observations made during time-reversal experiments. Based on these observations, new imaging methods have been developed using cross correlations of the signals recorded by sensor arrays. Mathematical analysis has shown that the cross correlation of the signals recorded by two passive sensors contains essentially as much information about the medium as the signal that would have been recorded if one of the sensors were active (an emitter) and the other passive (a receiver). The important point demonstrated by this analysis is that uncontrolled sources of ambient noise can be used, instead of controlled sources, to compute cross correlations and use them for imaging.

This possibility has attracted the attention of researchers in mathematics, particularly in probability, for profound theoretical reasons: the idea of useful noise overturns the customary distinction between signal and noise. It has also attracted seismologists, for obvious practical reasons: sources (earthquakes) are sparse and impossible to control. The aim of this paper is to describe how the idea of exploiting ambient noise to address problems of imaging took shape.
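
As a toy numerical illustration of the central idea (not taken from the paper), the following Python sketch cross-correlates the recordings of two passive sensors that both pick up the same uncontrolled noise source with different travel times; the peak of the cross correlation recovers the travel-time difference between the sensors, the kind of information an active emitter/receiver pair would otherwise provide. The sampling rate, delays and noise levels are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)

    fs = 1_000.0                 # sampling rate (Hz), hypothetical
    t_a, t_b = 0.050, 0.120      # travel times from the noise source to sensors A and B (s)
    n = 20_000                   # number of samples recorded

    source = rng.normal(size=n)  # uncontrolled ambient-noise source
    rec_a = np.roll(source, int(t_a * fs)) + 0.1 * rng.normal(size=n)  # passive sensor A
    rec_b = np.roll(source, int(t_b * fs)) + 0.1 * rng.normal(size=n)  # passive sensor B
    # (np.roll wraps around; with delays this short the wrap-around is negligible.)

    # Cross correlation of the two passive recordings.
    corr = np.correlate(rec_b, rec_a, mode="full")
    lags = np.arange(-n + 1, n) / fs
    estimated = lags[np.argmax(corr)]

    print(f"true delay between sensors:      {t_b - t_a:.3f} s")
    print(f"delay recovered from noise only: {estimated:.3f} s")

The sketch only shows the elementary step of recovering a travel time from noise; imaging methods build on many such correlations computed across a whole sensor array.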

The Impossible Evaluation of Risk
André Orléan

Prisme, April 2010 (647.9 KiB)

The current financial crisis stems from a massive underestimation of mortgage risks, particularly of the subprime kind. This essay seeks to understand the origins of such an error. Economists most often advance the perverse incentive structure as the cause. This is a valid point, but it provides only a partial explanation. This text explores another hypothesis: the difficulty inherent in predicting the future when agents face uncertainty of a Knightian or Keynesian type. It seeks to show that economic uncertainty is of this type. Probability calculus cannot be applied to it. For that reason, economic uncertainty evades the only available method of prediction: statistical inference. Consequently, in a Knightian world, there is no such thing as an objective evaluation of risk. This point is illustrated by examining the case of the 2000 US presidential election.

The Theory of Stochastic Processes and their Role in the Financial Markets
Nicole El Karoui

Prisme, February 2010 (560.6 KiB)

Probabilists are often interested in the history of their discipline, and more rarely in the fundamental questions that they could ask about the facts they model. Works like Augustin Cournot: Modelling Economics, and especially the chapter by Glenn Shafer, throw light on some of my experience in the domain of probability over the last 40 years, which began in 1968, at the end of the first year of my Ph.D. They have prompted me to present my own point of view here.

I had the good fortune to participate in an extraordinary moment in the development of probability, more precisely the theory of stochastic processes. This was an unforgettable period for me. At the time, I had the feeling that I was witnessing science — probability theory — in the making. Subsequently, about 20 years ago, I switched over (rather by chance, it must be said) to the side of probability users, focusing my research on one particular sector of finance. In the present text, I shall try to explain what interested me in this approach and in this aspect of finance on which I am still working today. To begin with, my account will be that of a pure probability theorist, and then that of an applied probabilist.

This text is based on the presentation, The Autonomization of Probability as a Science: The Experience of a Probabilist, given by Nicole El Karoui on 18 September 2008 at the Cournot Centre’s seminar “The Probabilism Sessions”.

Towards a Probabilistic Theory of Life
Thomas Heams

Prisme, September 2008 (551.8 KiB)

Biology has long been dominated by a deterministic approach. The existence of a genetic code, even of a “genetic programme”, has often led to descriptions of biological processes resembling finely regulated, precise events written in advance in our DNA. This approach has been very helpful in understanding the broad outlines of the processes at work within each cell. However, a large number of experimental arguments are challenging the deterministic approach in biology.

One of the surprises of recent years has been the discovery that gene expression is fundamentally random: the problem now is to describe and understand it. Here I present the molecular and topological causes that at least partly explain it. I shall show that it is a widespread, controllable phenomenon that can be transmitted from one gene to another and even from one cell generation to the next. It remains to be determined whether this random gene expression is “background noise” or a biological parameter. I shall argue for the second hypothesis by seeking to explain how this elementary disorder can give rise to order. In doing so, I hope to play a part in bringing probability theory to the heart of the study of life. Lastly, I shall discuss the possibility of moving beyond the apparent antagonism between determinism and probabilism in biology.

This text is based on the presentation, The Random Expression of Genes: Probability or Biological Parameter?, given by Thomas Heams on 20 March 2008 at the Cournot Centre’s seminar “The Probabilism Sessions”.
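
To give a concrete, if highly simplified, picture of what randomness in gene expression can look like (this sketch is not from the text), here is a minimal Python simulation of a standard birth-death model of transcription using the Gillespie algorithm; the rate constants are invented for illustration.

    import random

    def gillespie_mrna(k_tx=2.0, k_deg=0.1, t_max=500.0, seed=0):
        """Gillespie simulation of a minimal stochastic gene-expression model:
        mRNA is transcribed at constant rate k_tx and each molecule is degraded
        at rate k_deg. Returns the piecewise-constant copy-number trajectory."""
        rng = random.Random(seed)
        t, m = 0.0, 0
        times, counts = [0.0], [0]
        while t < t_max:
            total_rate = k_tx + k_deg * m
            t += rng.expovariate(total_rate)      # waiting time to the next event
            if rng.random() < k_tx / total_rate:
                m += 1                            # a transcription event
            else:
                m -= 1                            # a degradation event
            times.append(t)
            counts.append(m)
        return times, counts

    # Two "identical cells" (same rates, different random seeds) end up with
    # different mRNA counts, fluctuating around k_tx / k_deg = 20.
    for seed in (1, 2):
        _, counts = gillespie_mrna(seed=seed)
        print(f"cell {seed}: final mRNA copy number = {counts[-1]}")

Even with identical rate constants, the two simulated “cells” hold different mRNA copy numbers at any given moment, which is the kind of cell-to-cell variability the text takes as its starting point.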

What Probabilities Measure
Mikaël Cozic and Bernard Walliser

forthcoming


Collective Volume

Augustin Cournot: Modelling Economics
Jean-Philippe Touffut (ed.)

Contributors: Robert J. Aumann, Alain Desrosières, Jean Magnan de Bornier, Thierry Martin, Glenn Shafer, Robert M. Solow and Bernard Walliser

Edited by Jean-Philippe Touffut
Edward Elgar 2007

This rich and fascinating collection of essays helps enormously to establish the reputation of Augustin Cournot as a diverse and powerful thinker, whose numerous contributions range far beyond his widely acknowledged model of oligopoly. Cournot is revealed not merely as a mathematician, but one who was engaged in philosophical debates concerning epistemology and the nature of science. Anyone with the preconception that the development of modern economics was confined to the Anglophone world – from Smith through Marshall to the Nobel Laureates of today – will be amazed by the details of Cournot’s contribution revealed here.
– Geoffrey M. Hodgson, University of Hertfordshire

“If Augustin Cournot had still been alive, he could have won the Nobel Memorial Prize in Economics on at least three different occasions”, exclaimed Nobel Laureate Robert Aumann during the 2005 Cournot Centre conference. From his earliest publications, Cournot broke with the tradition of his predecessors by applying mathematical modelling to the social sphere. He was thus the first to affirm the mathematization of social phenomena as an essential principle. The fecundity of Cournot’s works stems not only from this departure, but also from a richness that nourished the social sciences of the twentieth century.

In this collection, the contributors – including two Nobel Laureates in economics – highlight Cournot’s profound innovativeness and continued relevance in the areas of industrial economics, mathematical economics, market competition, game theory, and the epistemology of probability and statistics. Each of the seven authors reminds us of the force and modernity of Cournot’s thought as a mathematician, historian of the sciences, philosopher and, not least, economist. Combining an epistemological perspective with a theoretical one, this book will be of great interest to researchers and students in the fields of economics, the history of economic thought, and epistemology.

The first versions of these texts were presented at the Cournot Centre’s conference, Augustin Cournot, Economic Models and Rationality, held 1-2 December 2005.
