INTRODUCTION
The playing of games has long been a natural human leisure activity. References in art and literature go back for several thousand years, and archaeologists have uncovered many ancient objects which are most readily interpreted as gaming boards and pieces. The earliest games of all were probably races and other casual trials of strength, but games involving chance also appear to have a very long history. Figure 1.1 may well show such a game. Its rules have not survived, but other evidence supports the playing of dice games at this period.
And if the playing of games is a natural instinct of all humans, the analysis of games is just as natural an instinct of mathematicians. Who should win? What is the best move? What are the odds of a certain chance event? How long is a game likely to take? When we are presented with a puzzle, are there standard techniques that will help us to find a solution? Does a particular puzzle have a solution at all? These are natural questions of mathematical interest, and we shall direct our attention to all of them.
To bring some order into our discussions, it is convenient to divide games into four classes:
a. games of pure chance;
b. games of mixed chance and skill;
c. games of pure skill;
d. automatic games.
There is a little overlap between these classes (for example, the children's game 'beggar your neighbour', which we shall treat as an automatic game, can also be regarded as a game of pure chance), but they provide a natural division of the mathematical ideas.
Our coverage of games of pure chance is in fact fairly brief, because the essentials will already be familiar to readers who have made even the most elementary study of the theory of probability. Nevertheless, the games cited in textbooks are often artificially simple, and there is room for an examination of real games as well. Chapters 2 and 3 therefore look at card and dice games respectively, and demonstrate some results which may be surprising. If, when designing a board for snakes and ladders, you want to place a snake so as to minimize a player's chance of climbing a particular ladder, where do you put it? Make a guess now, and then read Chapter 3; you will be in a very small minority if your guess proves to be right. These chapters also examine the effectiveness of various methods of randomization: shuffling cards, tossing coins, throwing dice, and generating allegedly 'random' numbers by computer.
Chapter 4 starts the discussion of games which depend both on chance and on skill. It considers the spread of results at ball games: golf (Figure 1.2), association football, and cricket. In theory, these are games of pure skill; in practice, they appear to contain a significant element of chance. The success of the player's stroke in Figure 1.2 will depend not only on how accurately he hits the ball but also on how it negotiates any irregularities in the terrain. Some apparent chance influences on each of these games are examined, and it is seen to what extent they account for the observed spread of results.
Chapter 5 looks at ways of estimating the skill of a player. It considers both games such as golf, where each player returns an independent score, and chess, where a result merely indicates which of two players is the stronger. As an aside, it demonstrates situations in which the cyclic results 'A beats B, B beats C, and C beats A' may actually represent the normal expectation.
Chapter 6 looks at the determination of a player's optimal strategy in a game where one player knows something that the other does not.
This is the simplest case of the 'theory of games' of von Neumann. The value of bluffing in games such as poker is demonstrated, though no guarantee is given that the reader will become a millionaire as a result. The chapter also suggests some practical ways in which the players' chances in unbalanced games may be equalized.
Games of pure skill are considered in Chapters 7-10. Chapter 7 looks at puzzles, and demonstrates techniques both for solving them and for diagnosing those which are insoluble. Among the many puzzles considered are the 'fifteen' sliding block puzzle, the N queens' puzzle both on a flat board and on a cylinder, Rubik's cube, peg solitaire, and the 'twelve coins' problem.
Chapter 8 examines 'impartial' games, in which the same moves are available to each player. It starts with the well-known game of nim, and shows how to diagnose and exploit a winning position. It then looks at some games which can be shown on examination to be clearly equivalent to nim, and it develops the remarkable theorem of Sprague and Grundy, according to which every impartial game whose rules guarantee termination is equivalent to nim.
Chapter 9 considers the relation between games and numbers. Much of the chapter is devoted to a version of nim in which each counter is owned by one player or the other; it shows how every pile of counters in such a game can be identified with a number, and how every number can be identified with a pile. This is the simplest case of the theory of 'numbers and games' which has recently been developed by Conway.
Chapter 10 completes the section on games of skill. It examines the concept of a 'hard' game; it looks at games in which it can be proved that a particular player can always force a win even though there may be no realistic way of discovering how; and it discusses the paradox that a game of pure skill is playable only between players who are reasonably incompetent (Figure 1.3).
Finally, Chapter 11 looks at automatic games. These may seem mathematically trivial, but in fact they touch the deepest ground of all. It is shown that there is no general procedure for deciding whether an automatic game terminates, since a paradox would result if there were; and it is shown how this paradox throws light on the celebrated demonstration, by Kurt Gödel, that there are mathematical propositions which can be neither proved nor disproved.
Most of these topics are independent of each other, and readers with particular interests may freely select and skip. To avoid repetition, Chapters 4 and 5 refer to material in Chapters 2 and 3, but Chapter 6 stands on its own, and those whose primary interests are in games of pure skill can start anywhere from Chapter 7 onwards. Nevertheless, the analysis of games frequently brings pleasure in unexpected areas, and I hope that even those who have taken up the book with specific sections in mind will enjoy browsing through the remainder.
As regards the level of our mathematical treatment, little need be said. This is a book of results. Where a proof can easily be given in the normal course of exposition, it has been; where a proof is difficult or tedious, it has usually been omitted. However, there are proofs whose elegance, once comprehended, more than compensates for any initial difficulty; striking examples occur in Euler's analysis of the queens on a cylinder, Conway's of the solitaire army, and Hutchings's of the game now known as 'sylver coinage'. These analyses have been included in full, even though they are a little above the general level of the book. If you are looking only for light reading, you can skip them, but I hope that you will not; they are among my favourite pieces of mathematics, and I shall be surprised if they do not become among yours as well.
CHAPTER 2: THE LUCK OF THE DEAL
This chapter looks at some of the probabilities governing play with cards, and examines the effectiveness of practical shuffling.
Counting made easy
Most probabilities relating to card games can be determined by counting. We count the total number of possible hands, and the number having some desired property. The ratio of these two numbers gives the probability that a hand chosen at random does indeed have the desired property.
The counting can often be simplified by making use of a well-known formula: if we have n things, we can select a subset of r of them in n!/{r!(n - r)!} different ways, where n! stands for the repeated product n × (n - 1) × ... × 1. This is easily proved. We can arrange r things in r! different ways, since we can choose any of them to be first, any of the remaining (r - 1) to be second, any of the (r - 2) still remaining to be third, and so on. Similarly, we can arrange r things out of n in n × (n - 1) × ... × (n - r + 1) different ways, since we can choose any of them to be first, any of the remaining (n - 1) to be second, any of the (n - 2) still remaining to be third, and so on down to the (n - r + 1)th; and this product is clearly n!/(n - r)!. But this gives us every possible arrangement of r things out of n, and we must divide by r! to get the number of selections of r things that are actually different.
The formula n!/{r!(n - r)!} is usually denoted by C(n, r). The derivation above applies only if 1 ≤ r ≤ (n - 1), but the formula can be extended to cover the whole range 0 ≤ r ≤ n by defining 0! to be 1. Its values form the well-known 'Pascal's Triangle'. For n ≤ 10, they are shown in Table 2.1.
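For readers who like to experiment, the formula is easy to compute directly (a minimal sketch; recent versions of Python also supply it ready-made as math.comb):

```python
from math import factorial

def choose(n, r):
    """Number of ways to select r things from n: n!/(r!(n - r)!)."""
    if r < 0 or r > n:
        return 0
    return factorial(n) // (factorial(r) * factorial(n - r))

# Row n = 4 of Pascal's Triangle
print([choose(4, r) for r in range(5)])  # [1, 4, 6, 4, 1]
```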
To see how this formula works, let us look at some distributional problems at bridge and whist. In these games, the pack consists of four 13-card suits (spades, hearts, diamonds, and clubs), each player receives thirteen cards in the deal, and opposite players play as partners. For example, if we have a hand containing four spades, what is the probability that our partner has at least four spades also?
First, let us count the total number of different hands that partner may hold. We hold 13 cards ourselves, so partner's hand must be taken from the remaining 39; but it must contain 13 cards, and we have just seen that the number of different selections of 13 cards that can be made from 39 is C(39, 13). So this is the number of different hands that partner may hold. It does not appear in Table 2.1, but it can be calculated as (39 × 38 × ... × 27)/(13 × 12 × ... × 1), and it amounts to 8 122 425 444. Such numbers are usually rounded off to a sensible number of decimals (8.12 × 10⁹, for example), but it is convenient to work out this first example in full.
Now let us count the number of hands in which partner holds precisely four spades. Nine spades are available to him, so he has C(9, 4) = 126 possible spade holdings. Similarly, 30 non-spades are available to him, and his hand must contain nine non-spades, so he has C(30, 9) = 14 307 150 possible non-spade holdings. Each spade holding can be married with each of the non-spade holdings, which gives 1 802 700 900 possible hands containing precisely four spades.
So, if partner's hand has been dealt at random from the 39 cards available to him, the probability that he has exactly four spades is given by dividing the number of hands containing exactly four spades (1 802 700 900) by the total number of possible hands (8 122 425 444). This gives 0.222 to three decimal places.
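As a check, these counts take only a few lines of Python (using the standard library's math.comb for C(n, r)):

```python
from math import comb

total = comb(39, 13)                     # all possible hands for partner
exactly_four = comb(9, 4) * comb(30, 9)  # spade holdings × non-spade holdings
print(total)                             # 8122425444
print(exactly_four)                      # 1802700900
print(round(exactly_four / total, 3))    # 0.222
```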
A similar calculation can be performed for each number of spades that partner can hold, and summation of the results gives the figures shown in the first column of Table 2.2. In particular, we see that if we hold four spades ourselves, the probability that partner holds at least four more is approximately 0.34, or only a little better than one third. The remainder of Table 2.2 has been calculated in the same way, and shows the probability that partner holds at least a certain number of spades, given that we ourselves hold five or more.
Table 2.3 shows the other side of the coin. It assumes that our side holds a certain number of cards in a suit, and shows how the remaining cards are likely to be distributed between our opponents. Note that the most even distribution is not necessarily the most probable. For example, if we hold seven cards of a suit ourselves, the remaining six are more likely to be distributed 4–2 than 3–3, because '4–2' actually covers the two cases 4–2 and 2–4.
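The 4-2 versus 3-3 comparison can be verified in the same way: the six missing cards lie among the 26 held by the opponents, 13 each, so one opponent's hand determines the split (a sketch in Python):

```python
from math import comb

hands = comb(26, 13)                          # possible hands for one opponent
p_3_3 = comb(6, 3) * comb(20, 10) / hands
p_4_2 = 2 * comb(6, 4) * comb(20, 9) / hands  # covers both 4-2 and 2-4
print(round(p_3_3, 3))  # 0.355
print(round(p_4_2, 3))  # 0.484
```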
Tables 2.2 and 2.3 do not constitute a complete recipe for success at bridge, because the bidding and play may well give reason to suspect abnormal distributions, but they provide a good foundation.
4-3-3-3 and all that
A similar technique can be used to determine the probabilities of the possible suit distributions in a hand.
For example, let us work out the probability of the distribution 4-3-3-3 (four cards in one suit and three in each of the others). Suppose for a moment that the four-card suit is spades. There are now C(13, 4) possible spade holdings and C(13, 3) possible holdings in each of the other three suits, and any combination of these can be joined together to give a 4-3-3-3 hand with four spades. Additionally, the four-card suit may be chosen in four ways, so the total number of 4-3-3-3 hands is 4 × C(13, 4) × C(13, 3) × C(13, 3) × C(13, 3). But the total number of 13-card hands that can be dealt from a 52-card pack is C(52, 13), so the probability of a 4-3-3-3 hand is 4 × C(13, 4) × C(13, 3) × C(13, 3) × C(13, 3)/C(52, 13). This works out to 0.105 to three decimal places.
Equivalent calculations can be performed for other distributions. In the case of a distribution which has only two equal suits, such as 4-4-3-2, there are twelve ways in which the distribution can be achieved (4S-4H-3D-2C, 4S-4H-2D-3C, 4S-3H-4D-2C, 4S-3H-2D-4C, and so on), while in the case of a distribution with no equal suits, such as 5-4-3-1, there are 24 ways. The multiplying factor 4 must therefore be replaced by 12 or 24 as appropriate. This leads to the same effect that we saw in Table 2.3: the most even distribution (4-3-3-3) is by no means the most probable. Indeed, it is only the fifth most probable, coming behind 4-4-3-2, 5-3-3-2, 5-4-3-1, and 5-4-2-2.
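The whole calculation can be packaged in a few lines of Python, with the multiplying factor (4, 12, or 24) computed from the pattern itself; the figures below reproduce the ranking just described (a sketch):

```python
from math import comb, factorial
from collections import Counter

def dist_prob(pattern):
    """Probability that a random 13-card hand has the given suit pattern."""
    ways = factorial(4)                  # assignments of counts to suits...
    for m in Counter(pattern).values():
        ways //= factorial(m)            # ...divided out for repeated counts
    hands = ways
    for c in pattern:
        hands *= comb(13, c)
    return hands / comb(52, 13)

# 4-3-3-3 comes only fifth in order of probability
for p in [(4, 4, 3, 2), (5, 3, 3, 2), (5, 4, 3, 1), (5, 4, 2, 2), (4, 3, 3, 3)]:
    print(p, round(dist_prob(p), 4))
```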
The probabilities of all the possible distributions are shown in Table 2.4. This table allows some interesting conclusions to be drawn. For example, over 20 per cent of all hands contain a suit of at least six cards, and 4 per cent contain a suit of at least seven; over 35 per cent contain a very short suit (singleton or void), and 5 per cent contain an actual void. These probabilities are perhaps rather higher than might have been guessed before the calculation was performed. Note also, as a curiosity, that the probabilities of 7-5-1-0 and 8-3-2-0 are exactly equal. This may be verified by writing out their factorial expressions in full and comparing them.
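The curiosity about 7-5-1-0 and 8-3-2-0 can be confirmed without writing out the factorials by hand: both patterns have four distinct counts, so both take the multiplying factor 24, and the products of binomial coefficients turn out to be identical (a quick check in Python):

```python
from math import comb

a = comb(13, 7) * comb(13, 5) * comb(13, 1) * comb(13, 0)
b = comb(13, 8) * comb(13, 3) * comb(13, 2) * comb(13, 0)
print(a, b, a == b)  # 28710396 28710396 True
```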
Table 2.4 shows that the probability of a 13-0-0-0 distribution is approximately 6.3 × 10⁻¹². The probability that all four hands have this distribution can be calculated similarly, and proves to be approximately 4.5 × 10⁻²⁸. Now if an event has a very small probability p, it is necessary to perform approximately 0.7/p trials in order to obtain an even chance of its occurrence. A typical evening's bridge comprises perhaps twenty deals, so a once-a-week player must play for over one hundred million years to have an even chance of receiving a thirteen-card suit. If ten million players are active once a week, a hand containing a thirteen-card suit may be expected about once every fifteen years, but it is still extremely unlikely that a genuine deal will produce four such hands.
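These figures can be checked directly; the '0.7' in the rule is ln 2 ≈ 0.693 (a sketch, counting twenty hands a week per player):

```python
from math import comb, log

p = 4 / comb(52, 13)               # probability of a 13-0-0-0 hand
hands_needed = log(2) / p          # the '0.7/p' rule for an even chance
years_alone = hands_needed / (20 * 52)
years_pool = (1 / p) / (10_000_000 * 20 * 52)  # expected gap, ten million players
print(f"{p:.1e}")                  # 6.3e-12
print(round(years_alone / 1e6))    # over a hundred million years
print(round(years_pool))           # about 15 years
```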
Shuffle the pack and deal again
So far, we have assumed the cards to have been dealt at random, each of the possible distributions being equally likely irrespective of the previous history of the pack. The way in which previous history is destroyed in practice is by shuffling, so it is appropriate to have a brief look at this process.
Let us restrict the pack to six cards for a moment, and let us consider the shuffle shown in Figure 2.1. The card which was in position 1 has moved to position 4, that which was in position 4 has moved to position 6, and that which was in position 6 has moved to position 1. So the cards in positions 1, 4, and 6 have cycled among themselves. We call this movement a three-cycle, and we denote it by (1,4,6). Similarly, the cards in positions 2 and 5 have interchanged places, which can be represented by the two-cycle (2,5), and the card in position 3 has stayed put, which can be represented by the one-cycle (3). So the complete shuffle is represented by these three cycles. We call this representation by disjoint cycles, the word 'disjoint' signifying that no two cycles involve a common card. A similar representation can be obtained for any shuffle of a pack of any size. If the pack contains n cards, a particular shuffle may be represented by anything from a single n-cycle to n separate one-cycles.
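The decomposition is mechanical, and a short Python function can find the disjoint cycles of any shuffle, given the new position of each card (a sketch; positions are numbered from 1, as in the text):

```python
def disjoint_cycles(perm):
    """Decompose a shuffle into disjoint cycles.

    perm[i - 1] is the new position of the card that was in position i.
    """
    seen = set()
    cycles = []
    for start in range(1, len(perm) + 1):
        if start in seen:
            continue
        cycle = [start]
        seen.add(start)
        pos = perm[start - 1]
        while pos != start:           # follow the card until it returns
            cycle.append(pos)
            seen.add(pos)
            pos = perm[pos - 1]
        cycles.append(tuple(cycle))
    return cycles

# The six-card shuffle of Figure 2.1: 1→4, 2→5, 3→3, 4→6, 5→2, 6→1
print(disjoint_cycles([4, 5, 3, 6, 2, 1]))  # [(1, 4, 6), (2, 5), (3,)]
```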
Excerpted from The Mathematics of Games by John D. Beasley. Copyright © 2014 Dover Publications, Inc. Excerpted by permission of Dover Publications, Inc.