Gambling and information theory
Statistical inference might be thought of as gambling theory applied to the world around us. The myriad applications of logarithmic information measures tell us precisely how to take the best guess in the face of partial information [Jaynes, E.T. (2003) "Probability Theory: The Logic of Science" (Cambridge U. Press, New York), http://bayes.wustl.edu/]. In that sense, information theory might be considered a formal expression of the theory of gambling. It is no surprise, therefore, that information theory has applications to games of chance.
Kelly betting or proportional betting is an application of information theory to investing and (with some ethical and legal reservations) gambling. It is named after its originator, John Larry Kelly, Jr.
Part of Kelly's insight was to have the gambler maximize the expectation of the logarithm of his capital, rather than the expected profit from each bet. This is important because in the latter case one would be led to gamble everything on a favorable bet and, after a single loss, would have no capital with which to place subsequent bets. Kelly realized that it is the logarithm of the gambler's capital which is additive in sequential bets, and "to which the law of large numbers applies."
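A small sketch can make this concrete. Consider a hypothetical even-money bet won with probability 0.6, repeated 100 times with a typical outcome of 60 wins and 40 losses (the numbers are chosen for illustration and are not from Kelly's paper):

```python
def grow(fraction, wins, losses):
    """Final capital (starting from 1) after `wins` even-money wins and
    `losses` losses, betting a fixed `fraction` of current capital each
    time; the order of wins and losses does not affect the product."""
    return (1 + fraction) ** wins * (1 - fraction) ** losses

# A typical run for a bet won with probability 0.6: 60 wins in 100 bets.
all_in = grow(1.0, wins=60, losses=40)  # bet everything: first loss -> 0
kelly = grow(0.2, wins=60, losses=40)   # Kelly fraction f* = 2p - 1 = 0.2
half = grow(0.5, wins=60, losses=40)    # over-betting a favorable game
```

Betting everything is wiped out by the first loss; the Kelly fraction multiplies capital (roughly 7.5× here), while over-betting at 50% shrinks it despite the favorable odds.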
The doubling rate in gambling on a horse race is

$$W(b, p) = \sum_{i=1}^{m} p_i \log_2 (b_i o_i)$$

where there are m horses, the probability of the i-th horse winning being p_i, the proportion of wealth bet on that horse being b_i, and the odds (payoff) being o_i (e.g., o_i = 2 if the i-th horse winning pays double the amount bet). This quantity is maximized by proportional (Kelly) gambling:

$$b_i = p_i,$$

for which

$$\max_b W(b, p) = \sum_{i=1}^{m} p_i \log_2 o_i - H(p),$$

where H(p) is the information entropy of the race.
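A quick numerical check of the doubling rate, using three hypothetical horses (the probabilities and odds below are made up for illustration):

```python
import math

def doubling_rate(b, p, o):
    """W(b, p) = sum_i p_i * log2(b_i * o_i)."""
    return sum(pi * math.log2(bi * oi) for pi, bi, oi in zip(p, b, o))

p = [0.5, 0.3, 0.2]   # hypothetical win probabilities
o = [2.0, 4.0, 8.0]   # hypothetical "o-for-1" payoffs

w_kelly = doubling_rate(p, p, o)                 # proportional bet b = p
w_other = doubling_rate([0.4, 0.4, 0.2], p, o)   # some other allocation

# Kelly's maximum equals sum_i p_i * log2(o_i) - H(p):
entropy = -sum(pi * math.log2(pi) for pi in p)
w_max = sum(pi * math.log2(oi) for pi, oi in zip(p, o)) - entropy
```

Proportional betting beats the alternative allocation and matches the closed-form maximum.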
An important but simple relation exists between the amount of side information a gambler obtains and the expected exponential growth of his capital (Kelly):

$$\mathbb{E}[\log_2 K_t] = \log_2 K_0 + \sum_{i=1}^{t} I_i$$

for an optimal betting strategy, where K_0 is the initial capital, K_t is the capital after the t-th bet, and I_i is the amount of side information obtained concerning the i-th bet (in particular, the mutual information relative to the outcome of each betable event). This equation applies in the absence of transaction costs and minimum bets. When those constraints apply (as they invariably do in real life), another important gambling concept comes into play: the gambler (or unscrupulous investor) must face a certain probability of ultimate ruin, known as the gambler's ruin scenario. Note that even food, clothing, and shelter can be considered fixed transaction costs and thus contribute to the gambler's probability of ultimate ruin.
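The relation can be illustrated with a toy binary example (my own construction, not from Kelly's paper): two outcomes at fair 2-for-1 odds, where the doubling rate without side information is zero, plus a side channel that names the winner with probability q. Betting the conditional probabilities raises the doubling rate to exactly the mutual information of that channel:

```python
import math

def h2(q):
    """Binary entropy in bits."""
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

q = 0.9  # the side channel names the winner with probability 0.9

# Optimal conditional bet b = (q, 1-q) at fair 2-for-1 odds:
growth = q * math.log2(q * 2) + (1 - q) * math.log2((1 - q) * 2)

# Mutual information between the signal and the outcome, in bits:
mutual_info = 1 - h2(q)
```

With this signal the gambler's capital is expected to double roughly every 1/growth ≈ 1.9 bets; without it, at fair odds, capital does not grow at all.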
This equation was the first application of Shannon's theory of information outside its prevailing paradigm of data communications (Pierce).
Applications for self-information
The logarithmic probability measure self-information or surprisal [Tribus, Myron (1961) "Thermodynamics and Thermostatics: An Introduction to Energy, Information and States of Matter, with Engineering Applications" (D. Van Nostrand Company, New York) ASIN B000ARSH5S], whose average is information entropy/uncertainty and whose average difference is KL-divergence, has applications to odds-analysis all by itself. Its two primary strengths are that surprisals: (i) reduce minuscule probabilities to numbers of manageable size, and (ii) add whenever probabilities multiply.
For example, one might say that "the number of states equals two to the number of bits", i.e. #states = 2^#bits. Here the quantity measured in bits is the logarithmic information measure mentioned above. Hence there are N bits of surprisal in landing all heads on one's first toss of N coins.
The additive nature of surprisals, and one's ability to get a feel for their meaning with a handful of coins, can help one put improbable events (like winning the lottery, or having an accident) into context. For example if one out of 17 million tickets is a winner, then the surprisal of winning from a single random selection is about 24 bits. Tossing 24 coins a few times might give you a feel for the surprisal of getting all heads on the first try.
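These back-of-envelope figures are easy to reproduce; the 17-million-ticket lottery is the hypothetical example above:

```python
import math

def surprisal_bits(p):
    """Self-information -log2(p), in bits."""
    return -math.log2(p)

coins = surprisal_bits(0.5 ** 25)          # 25 coins, all heads
lottery = surprisal_bits(1 / 17_000_000)   # one winner in 17 million
```

The all-heads toss of 25 coins is exactly 25 bits of surprisal, and the lottery win comes out just over 24 bits.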
The additive nature of this measure also comes in handy when weighing alternatives. For example, imagine that the surprisal of harm from a vaccination is 20 bits. If the surprisal of catching a disease without it is 16 bits, and the surprisal of harm from the disease if you do catch it is 2 bits, then the surprisal of harm from NOT getting the vaccination is only 16 + 2 = 18 bits. Whether or not you decide to get the vaccination (other factors, such as its monetary cost, are not included in this discussion), you can in that way at least take responsibility for a decision informed by the fact that not getting the vaccination involves two bits of additional risk, i.e. a fourfold greater probability of harm.
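The same arithmetic in code, using the hypothetical surprisal figures above (probabilities multiply, so surprisals add):

```python
import math

p_catch = 2.0 ** -16           # catching the disease: 16 bits of surprisal
p_harm_if_caught = 2.0 ** -2   # harm given the disease: 2 bits
p_harm_no_vax = p_catch * p_harm_if_caught   # probabilities multiply

bits_no_vax = -math.log2(p_harm_no_vax)   # surprisals add: 16 + 2 = 18
bits_vax = 20.0                           # hypothetical figure from above
extra_risk_factor = 2.0 ** (bits_vax - bits_no_vax)   # 2 bits -> 4x risk
```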
More generally, one can relate probability p to bits of surprisal sbits as probability = 1/2^sbits. As suggested above, this is mainly useful with small probabilities. However, Jaynes pointed out that with true-false assertions one can also define bits of evidence ebits as the surprisal against minus the surprisal for. This evidence in bits relates simply to the odds ratio = p/(1−p) = 2^ebits, and has advantages similar to those of self-information itself.
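A minimal sketch of the two conversions (the function names are my own):

```python
import math

def surprisal_bits(p):
    """sbits: probability = 1 / 2**sbits."""
    return -math.log2(p)

def evidence_bits(p):
    """ebits: surprisal against minus surprisal for = log2 of the odds."""
    return surprisal_bits(1 - p) - surprisal_bits(p)

def prob_from_ebits(e):
    """Invert: odds = 2**ebits, so p = odds / (1 + odds)."""
    odds = 2.0 ** e
    return odds / (1.0 + odds)

e = evidence_bits(0.8)     # odds 4:1 in favour -> 2 ebits
p = prob_from_ebits(2.0)   # 2 ebits -> probability 0.8
```

Note how evidence is symmetric about zero: ebits are positive when the assertion is more likely true than false, negative otherwise, and zero at even odds.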
* J. L. Kelly, Jr., "[http://www.arbtrading.com/reports/kelly.pdf A New Interpretation of Information Rate]", Bell System Technical Journal, Vol. 35, July 1956, pp. 917–926.
See also: Principle of indifference