Two-Person Game Theory
By Anatol Rapoport
Dover Publications, Inc.
Copyright © 1966 Anatol Rapoport
All rights reserved.
ISBN: 978-0-486-28109-4
CHAPTER 1
Games
Game theory is to games of strategy what probability theory is to games of chance. And just as probability theory far transcends its role as the logical basis of rational gambling, so does game theory transcend its original guise as the logical basis of parlor games.
Yet the history of game theory is not a replica of the history of probability theory. The birth of the latter is reckoned from some problems posed by the intellectually inclined gambler Chevalier de Méré to the philosopher-mathematician Blaise Pascal. Probability theory received its original impetus from attempts to solve concrete problems.
This was not the case with game theory. No one came to John von Neumann with questions about how to play Chess or Poker. Nor do we have any evidence that von Neumann, who laid the foundations of game theory practically single-handed, was an outstanding expert in any game of strategy. Nor is there any more reason to expect such skill from a game theoretician than to expect instrumental virtuosity from an orchestra conductor or mechanical ingenuity from a physicist. Game theory is concerned not with any particular game but with all of them, not with technical but with theoretical matters. "What is the best way to play Chess?" is not a game-theoretical question. On the other hand, "Is there a best way to play Chess?" is a game-theoretical question.
It may seem at first that the question is a foolish one. One might be convinced that of course there is a best way to play Chess since there is a hierarchy of Chess players. Some players are obviously better than others; therefore they must use better ways of playing, and so there must be a best way to play Chess.
In response to this line of reasoning, a game theoretician would reply as follows. Yes, there is a best way to play Chess, but the supporting arguments just given for this conclusion are not valid. It is not true in general that if we have a number of quantities clearly comparable with respect to magnitude, then one of them must be the largest. Consider, for example, the sequence of natural numbers, 1, 2, 3, etc. Each is larger than the preceding; yet there is no greatest natural number. Nor must the magnitudes increase without bound so that an ordered set of them will have no largest member. Consider, for example, the set of all real numbers between zero and one, exclusive of zero and one. This is an ordered set, because of any two such numbers one is definitely the larger. Yet the set as defined has neither a smallest nor a largest number. To be sure, it is true that although the magnitudes of the numbers from zero to one are not increasing without bound, the quantity of such numbers is infinite; and it is true that if a set ordered by magnitude has only a finite number of members, there must be a smallest and a largest. Therefore, the question of whether there is a best way to play Chess is related to the question of whether the number of ways to play a game of Chess is finite or infinite. This number happens to be finite. But whether it is an ordered set in the sense that it is always possible to decide which is the better way of playing between any two, depends on the principle according to which the ordering is made. Finally, it does not follow that if there is a best way of playing a game, the player who knows it will always win. For the player who plays against him may also know a "best way." What happens when both have this knowledge is also an open question. In some games the victory must go to the one or to the other; in other games, if both know a "best way to play," the outcome must always be a draw. In still other games, not necessarily games of chance as these are usually understood, the outcome will nevertheless depend on chance.
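The open-interval example can be put in a single line. For any number x strictly between zero and one, the midpoint of x and one is again strictly between zero and one and is larger than x, so no member of the set can be the largest:

    \[
      0 < x < 1 \;\Longrightarrow\; x < \frac{x + 1}{2} < 1 .
    \]

A symmetric argument, using x/2 in place of (x + 1)/2, shows that the set has no smallest member either.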
All the matters just touched upon pertain to the theory of games. They are not matters related to any specific game; they are matters related to games in general. The way the questions have been put suggests that games can be classified according to the way the questions are answered.
Game theory is, accordingly, very largely concerned with the classification of games, and in this it has much in common with other sciences which at a certain stage of their development were concerned mainly with classification.
For example, biology was for many centuries mainly a classification science (a taxonomy). Biologists sought a "proper" way to classify plants and animals. It would seem at first that what the "proper" principles of classification are depends crucially on what the classifier is interested in. For instance, someone coming in frequent contact with animals in a primitive life environment might classify animals into large and small, or into dangerous and harmless, or into edible and inedible. There comes a time, however, when observation and description of nature becomes more or less separated from immediate utilitarian interests. Accordingly, biologists soon recognized that although mice and lizards were both small animals while horses and crocodiles were both large animals, nevertheless mice were more closely related to horses than to lizards while crocodiles were more closely related to lizards than to horses.
Formal Decision Theory
The principle according to which game theory classifies games is best understood if game theory is viewed as the branch of mathematics concerned with the formal aspect of rational decision. The emphasis is on the word "formal," which in this context means "devoid of content." As has been said, game theory has been hitherto developed as a branch of mathematics. Mathematics treats of formal relations devoid of content. For example, arithmetic is not concerned with apples, sheep, or dollars, but only with relations among numbers be they of apples, sheep, dollars, or divorces. Geometry is not concerned with land tracts or shapes of objects but only with spatial relationships. Similarly a mathematical theory of rational decision is concerned not with the problem of making wise decisions but with the logical structure of problems which arise in connection with the necessity of making decisions.
A decision problem is the problem of choosing among a set of alternative actions. Clearly such a problem is meaningful only if whoever must make the choice has some idea of what the consequences of his choice may be. In the simplest case, each choice of action leads to a single specific consequence, so that the choice of action is equivalent to a choice of a consequence among a set of consequences. Hence the first condition which must be fulfilled if a decision problem is to have meaning is that choices have known consequences.
Next, choice is meaningful only if the chooser has preferences. Thus the second condition which must be fulfilled, if the decision problem is to have meaning, is the existence of preferences of the chooser. Again we may consider the simplest case in which all the possible consequences can be clearly rank ordered, calling the most preferred consequence first choice, the next most preferred second choice, etc. We have already assumed that in the simplest case, each choice of action has exactly one consequence, known to the chooser. Therefore the decision problem in this case reduces to the problem of assigning preference rank orders to the consequences, noting which choice leads to the most preferred outcome, and choosing it.
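This simplest case can be written out as a small computational sketch. The actions, consequences, and ranks below are invented purely for illustration; the whole procedure amounts to looking up each action's known consequence and taking the action whose consequence carries the best rank:

    # A minimal sketch of a decision problem under certainty: each action
    # leads to exactly one known consequence, and the consequences are
    # rank-ordered by preference (1 = first choice). All names and ranks
    # here are hypothetical.
    consequence = {
        "walk": "arrive late",
        "drive": "arrive on time",
        "stay home": "miss the meeting",
    }
    preference_rank = {
        "arrive on time": 1,    # first choice
        "arrive late": 2,       # second choice
        "miss the meeting": 3,  # third choice
    }
    # Choose the action whose (unique) consequence is most preferred.
    best_action = min(consequence, key=lambda a: preference_rank[consequence[a]])
    print(best_action)  # -> drive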
Most decision problems are not so simple. As a rule, an action may lead to a number of different outcomes, and which one will actually obtain in a given instance is not known to the chooser. The structure of the decision problem depends critically on the factors that determine which of the possible outcomes of an action will actually occur. If the actual outcome is determined purely by chance, we have a decision problem under risk or uncertainty. Probability theory is sometimes a valuable tool in such decision problems.
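In the special case usually called risk, the probabilities of the outcomes are taken as known, and one common (though by no means the only) criterion is to choose the action with the largest expected payoff. The following sketch illustrates that criterion with invented probabilities and payoffs; none of these figures is prescribed by the text above:

    # A sketch of a decision under risk: each action leads to several
    # possible payoffs, each occurring with a known probability. The
    # expected-payoff criterion is used here purely as an illustration;
    # the probabilities and payoffs are hypothetical.
    actions = {
        "carry umbrella": [(0.3, 1), (0.7, 1)],    # (probability, payoff)
        "leave umbrella": [(0.3, -4), (0.7, 3)],   # rain occurs with probability 0.3
    }

    def expected_payoff(lottery):
        return sum(p * v for p, v in lottery)

    best = max(actions, key=lambda a: expected_payoff(actions[a]))
    print(best, expected_payoff(actions[best]))    # -> carry umbrella 1.0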
In some cases the outcome of one's choice of action will be determined not by chance but by someone else's choice of action. These situations fall properly within the scope of game theory. The classification of games is guided by the sort of decision problems that arise in the course of a game. Moreover, any situation having the abstract (formal) features of a decision problem of the same sort that appear in a game can also be called a game.
In short, what distinguishes games from nongames from the point of view of game theory is not the seriousness or lack of seriousness of a situation, nor the attitudes of the participants, nor the nature of the acts and of the outcomes, but whether certain choices of actions and certain outcomes can be unambiguously defined, whether the consequences of joint choices can be precisely specified, and whether the choosers have distinct preferences among the outcomes.
It may seem surprising that nothing has yet been said about the rules which define a game. Rules are important only to the extent that they allow the outcomes resulting from the choices of the participants to be unambiguously specified. Once these choices have been listed and the outcomes resulting from the participants' choices have been ordered according to the preference of each participant, the rules according to which the game is played are no longer of any consequence. Any other game with possibly quite different rules but leading to the same relations among the choices and the outcomes is considered equivalent to the game in question. In short, game theory is concerned with rules only to the extent that the rules help define the choice situation and the outcomes associated with the choices. Otherwise the rules of games play no part in game theory. How this comes about will, we hope, become clear in the next chapter.
The Essential Features of a Game
Game theory is concerned with situations which have the following features.
1. There must be at least two players.
2. The game begins by one or more of the players making a choice among a number of specified alternatives. In ordinary parlance such a choice is called a move. In game theory "move" refers rather to the situation in which the choice is made, for example, the specification of who is to make the choice and what alternatives are open to him. Thus the game of Chess begins with a range of twenty alternatives open to the player called White. [These twenty alternatives are advances of one or two squares for each of the eight pawns (sixteen alternatives) and two squares open to each of two knights (four alternatives).]
3. After the choice associated with the first move is made, a certain situation results. This situation determines who is to make the next choice and also what alternatives are open to him. For example, in Chess, the players alternate in making their choices regardless of the situation. In many card games, however, the player to make the next choice is always the player who took the last trick. In all games, however, it is clear in every situation which player is to make the next choice and what alternatives are open. In Chess, after White has moved, Black has the same twenty alternatives regardless of how White has moved. But this is not the case on the next move (White's). For example, if White advanced his King's pawn two squares on his first move, and Black responded by advancing his King's pawn two squares, then White cannot advance the King's pawn on his next move. Had Black responded in any other way, White would have that alternative open.
4. The choices made by the players may or may not become known. In Chess all the choices are known to both players. But in a variant of Chess called Kriegspiel none of the choices made by one player are made known to the other (although these choices can sometimes be inferred from other information). This circumstance makes this game quite different from Chess. Games in which all the choices of all the players are known to everyone as soon as they are made are called games of perfect information. Chess, Checkers, Go, Backgammon, and Tic-Tac-Toe are all games of perfect information. Most card games are not games of perfect information. It is instructive to see why this is so. In most card games, the cards are dealt face down, so that each player can see only his own hand. Imagine that Chance is a fictitious player in such a card game. The first move is Chance's. She is to "choose" between all the possible arrangements of the deck. The result of this move is unknown to the other players. Thereafter, the results of each move may be known to everyone (e.g., if the cards are played face up), but Chance's choice remains unknown at least for some time. (It may be inferred later.) Therefore such a card game is not a game of perfect information. The importance of singling out games of perfect information is that in such games there is always a "best way to play" which can be specified without mentioning chance, while in other games this is not necessarily the case.
5. If a game is described in terms of successive choices (moves), there is a termination rule. Each choice made by a player determines a certain situation. For example, following each move, the arrangement of the pieces on the chessboard defines a situation. Certain situations in Chess are called "checkmate." When such a situation occurs, the game is ended. Many card games end when all cards in the deck have been played; a game of Tic-Tac-Toe ends when one player has three naughts (or crosses) in a straight line, etc. While in common parlance one speaks of the end of a game, in game theory one speaks of the end of a play of the game. The word "game" is reserved for the totality of rules which define it. We shall for the most part adhere to the usage of game theory, but on occasions, when there is little danger of confusion, we shall revert to common usage in order to avoid awkward expressions. Thus we shall say, "Three games of Chess were played" instead of "Three plays of Chess were played."
6. Every play of a game ends in a certain situation (i.e., one of the situations which define the end of a play). Each of these situations determines a payoff to each bona fide player. A bona fide player is one who (1) makes choices and (2) receives payoffs. Thus, although we called Chance a player in a card game (because she "chose" the arrangement of the cards), we cannot call her a bona fide player, because she gets no payoffs. In a game, as defined in game theory, there must be at least two bona fide players, in the sense that each of them makes choices and receives payoffs.
Solitaire is not a game from the point of view of game theory. The player of Solitaire is, to be sure, a bona fide player, because he both makes choices (in the more sophisticated forms of Solitaire) and receives payoffs (at least wins or loses). However, the other player, Chance, only makes choices (arranging the cards), but receives no payoffs. Chance is only a dummy player. Nor can the House (if solitaire is played in a gambling establishment) be called a bona fide player, because although the House receives payoffs, it makes no choices. Playing the slot machine is not a game either. Curiously, the slot machine can be considered a bona fide player. It makes choices (to be sure at random, but this does not matter) and receives payoffs. But the person who plays the slot machine only receives payoffs. He makes no choices, because the only thing he can do in each play of the game is insert a coin and pull the lever. Therefore he is not a bona fide player. He is only a dummy player.
If the above six criteria are satisfied, we can speak of a game. A particular game is defined when the choices open to the players in each situation, the situations defining the end of a play, and the payoffs associated with each play-terminating situation have been specified.
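To make the definition concrete, the following sketch records the defining data of one very small two-person game, Matching Pennies, in the spirit of the list above. The particular data layout is only one of many possible ones and is not taken from the book:

    # An illustrative record of what defines a game: the bona fide players,
    # the choices open to each, and the payoff to each player in every
    # play-terminating situation. Matching Pennies is used as the example,
    # and this layout is an assumption of the sketch.
    players = ("Matcher", "Mismatcher")

    choices = {
        "Matcher": ("heads", "tails"),
        "Mismatcher": ("heads", "tails"),
    }

    # Every pair of choices ends a play and determines the payoffs,
    # listed as (Matcher's payoff, Mismatcher's payoff).
    payoffs = {
        ("heads", "heads"): (1, -1),
        ("heads", "tails"): (-1, 1),
        ("tails", "heads"): (-1, 1),
        ("tails", "tails"): (1, -1),
    }

    print(payoffs[("heads", "tails")])  # -> (-1, 1): Matcher loses, Mismatcher wins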
CHAPTER 2
Utilities
So far nothing has been said about the nature of the payoffs. In game theory it is simply assumed that numbers, positive or negative, can be specified as payoffs for each of the terminating situations. In card games it is quite natural to view the money gains and losses as the payoffs. Some games like Chess or Tic-Tac-Toe are not usually played for money. However, the outcomes of these games are clearly defined as "win," "draw," or "lose."
It is assumed in game theory that all outcomes can be interpreted as numbers. For example, if the only distinguishable outcomes are win, lose, and draw, 1 can stand for win, -1 for lose, 0 for draw. How these numbers are assigned is not the game-theoretician's concern. It is conceivable, for example, that a Chess player would rather risk losing a game than settle for a draw. It is even conceivable that a man playing Checkers with a child would rather lose than win. In that case a larger payoff must be assigned to his loss than to his win.
If payoffs are in money it is quite likely that the psychological "worths" of the amounts of money do not correspond to their numerical values. To win $20 in a poker game may be "worth," to a given Poker player, more than twice as much or less than twice as much as to win $10. Since the magnitudes of the payoffs play a part in defining a particular game, the game remains undefined if we do not know what payoff magnitudes are assigned by the players to the outcomes, even if the latter are specified in terms of monetary payoffs. However, this problem is bypassed by the game theoretician, who assumes that the payoffs are given. Once the payoffs are given, the game is defined. Once the game is defined, game-theoretical analysis can be brought to bear on it.
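The remark about the "worth" of money can be illustrated numerically. Under a concave utility curve (a square root is used below purely as an example; no particular curve is implied by the text), winning $20 is indeed worth less than twice as much as winning $10:

    import math

    # A hypothetical utility curve for money; the square root is only an
    # example of a curve under which doubling the money does not double
    # the "worth."
    def utility(dollars):
        return math.sqrt(dollars)

    print(utility(20))      # about 4.47
    print(2 * utility(10))  # about 6.32 -- so $20 is worth less than twice $10 here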
(Continues...)