Category Archives: Papers

Syria and the Statistics of War | Medium


“Trust none of what you hear, some of what you read, half of what you see,” goes an old trader adage. As a trader and quant/mathematical statistician, I have been taught to take data seriously, trust nobody’s numbers, and avoid people naive enough to engage in policy based on lurid but questionable pictures of destruction: the fake picture of a dying child is something nobody can question without appearing to be an asshole. As a citizen, I require that the designation “murderer” be determined in a court of law, not by Saudi-funded outlets — once someone is called a murderer or butcher, all bets are off. I cannot believe governments and bureaucrats could be so stupid. But they are.

Full article: Syria and the Statistics of War

On the Super-Additivity and Estimation Biases of Quantile Contributions

Nassim N Taleb, Raphael Douady
(Submitted on 8 May 2014 (v1), last revised 12 Nov 2014 (this version, v3))
Sample measures of top centile contributions to the total (concentration) are downward biased, unstable estimators, extremely sensitive to sample size and concave in accounting for large deviations. This makes them particularly unfit in domains with power law tails, especially for low values of the exponent. These estimators can vary over time and increase with the population size, as shown in this article, thus providing the illusion of structural changes in concentration. They are also inconsistent under aggregation and mixing distributions, as the weighted average of concentration measures for A and B will tend to be lower than that from A ∪ B. In addition, it can be shown that under such fat tails, increases in the total sum need to be accompanied by increased sample size of the concentration measurement. We examine the estimation superadditivity and bias under homogeneous and mixed distributions.

pdf download
via [1405.1791] On the Super-Additivity and Estimation Biases of Quantile Contributions.
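The downward bias and sample-size dependence described in the abstract can be seen in a minimal simulation sketch (not from the paper): for a Pareto distribution with tail exponent α, the true asymptotic share of the total held by the top fraction p is p^(1−1/α), yet the plug-in estimator averaged over repeated samples sits below that value and creeps upward as the sample grows. The exponent α = 1.5, the sample sizes, and the trial counts below are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def top_share(x, p=0.01):
    """Plug-in estimator: share of the total sum held by the top p fraction."""
    k = max(1, int(len(x) * p))
    s = np.sort(x)
    return s[-k:].sum() / s.sum()

alpha = 1.5                           # fat tail, finite mean (illustrative choice)
true_share = 0.01 ** (1 - 1 / alpha)  # asymptotic top-1% share, ~0.215

# Average the estimator over repeated Pareto samples at two sample sizes.
estimates = {}
for n in (1_000, 100_000):
    trials = [top_share(rng.pareto(alpha, n) + 1.0) for _ in range(300)]
    estimates[n] = float(np.mean(trials))

print(estimates, true_share)
```

Both averages come out below the asymptotic value, and the small-sample average is lower still — the "illusion of structural changes in concentration" when sample sizes differ across comparisons.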

Response to review by Trevor Charles re: Precautionary Principle | NECSI

A few days ago, Trevor Charles posted a review of our paper entitled “The Precautionary Principle (with Application to the Genetic Modification of Organisms)”. Here we provide a response.

Thank you for the review of our paper. We will provide a point-by-point response to your comments below. Since you have focused on biological questions, it is important for us to emphasize that we did not perform a “statistical analysis” (which is inherently evidentiary, data-based, and anchored in biological experiments). Instead, we are engaged in a rigorous analysis of risk as it is derived from mathematical probability theory. Many of the citations you are asking for fall within the “carpenter fallacy” that we present in the text: discussions about carpentry are not relevant to, and distract from, identifying the risks associated with gambling, even though the construction of a roulette wheel involves carpentry. Mathematical probability-related arguments do not require biological citations. At the same time, we have striven to explain how the biological context maps onto the risk analysis so that the connection between the two is more apparent to those who are focused on biology. For this reason we are providing the responses below.

As a general comment, it would be very helpful for biologists who are contemplating or engaging in engineering strategies to read about the failures of systems engineering discussed in the text (Section VIII). This should lead to a better understanding of why the issue is not biology per se, but the nature of engineering complex systems in cases that carry high potential harm — as has been found, for example, in the modernization of the Air Traffic Control system. Reading that discussion should establish a better context for a conversation about the risks of biological engineering.

via Response to review by Trevor Charles re: Precautionary Principle | NECSI.

Mapping Payoffs (pdf)

This is a mathematical-legal attempt at formally mapping payoffs and assessing their memberships in precisely defined classes. By legal we mean as expressed explicitly in a codified term sheet, legal contract, or formal legal code, which naturally converge to the mathematical definitions. The aim is to show the impossibility of verbalistic discussions of risk and exposures and the corresponding biases, and to show how the gap in stochastic properties between the verbalistic and the mathematical increases under fat tails. Many biases in the psychology-decision science literature (such as the overestimation of tail events, or the long shot bias in fat-tailed domains) are shown to simply result from misdefinitions or sloppy verbalism.

via MappingPayoffs.pdf.
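The verbalistic/mathematical gap can be sketched with a toy example (mine, not the paper's): two hypothetical exposures calibrated so that the verbalistic statement “the loss exceeds K” has the same probability under both, yet the payoff-relevant quantity — the expected loss in that region — differs severalfold when one of the tails is fat. The threshold, the tail exponent α = 1.2, and the calibration are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
K = 10.0        # loss threshold defining the "event" (illustrative)
n = 1_000_000

# Calibrate a thin-tailed exponential to the SAME event probability as a
# Pareto with tail exponent alpha: P(X > K) = 10**(-alpha) ~ 0.063.
alpha = 1.2                          # fat tail, infinite variance
scale = K / (alpha * np.log(10.0))   # exponential scale matching P(X > K)

thin = rng.exponential(scale, size=n)   # thin-tailed exposure
fat = rng.pareto(alpha, size=n) + 1.0   # fat-tailed (Pareto) exposure

results = {}
for name, x in (("thin", thin), ("fat", fat)):
    p_event = float(np.mean(x > K))                      # verbalistic: "it happens"
    exposure = float(np.mean(np.where(x > K, x, 0.0)))   # payoff: E[X · 1{X > K}]
    results[name] = (p_event, exposure)

print(results)
```

Both exposures make the verbal claim equally often, but the fat-tailed one carries a far larger expected tail payoff — talking about the probability of an event, without its magnitude, misdescribes the exposure.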