Re: 15 out of 16 times (math, not laundry)
Pretend the casino is run out of a church. "Parishioners" arrive and enter a confessional to place their bets. The "priest" cannot see who is placing each bet. Each "parishioner" plays until he or she is broke. "Parishioners" arrive at a steady rate and will do so indefinitely.
Let me just make sure I understand what you mean. I believe you are saying:

Conjecture A:

A.1 As parishioners play and leave, the division of wealth approaches the `odds' of the game. Thus if the odds are .51 house (of God), .49 parishioner, then eventually the house will end up with 51 cents out of every dollar `played', just as it would if the church were playing against one very wealthy parishioner (i.e., the `world').

A.2 Since there are a large number of parishioners, enough games can always be played to make the distribution match the odds.

If this is _not_ what you mean to say, then I apologize for missing your point; read no further. Just send me explanations to clear up my misunderstanding. If Conjecture A is an accurate statement of your belief, then please step across this line.

----------

Let me walk through your model, one parishioner at a time. Please read this with an open mind; it could be true.
Each "parishioner" plays until he or she is broke.
Let's say the odds of the game are .51 to .49. Each parishioner has $100. Each parishioner plays until broke.

At some point in play, the distribution of wealth with respect to _that player_ may be arbitrarily close to c=$51, p=$49. What, though, is the distribution at the _end_ of that game? Since each game only ends when p=$0, the distribution is c=$100, p=$0. On to the next parishioner.

After the 9th, but before the 10th parishioner, the distribution must be c=$900, p[10]=$100. It can't be worse than that for the church, or we wouldn't have moved on to the 10th parishioner. It can't be better for the player, because each has only $100 to wager. After the nth, c=$100n, p[n+1]=$100.

Conjecture A predicts that as n, the number of players, goes to infinity, c, the fraction of money won by the church, approaches C, the probability the church will win a single trial. But in fact, the model shows that as n approaches infinity, c goes to 1.

Where could one disagree with this interpretation of the model?

a. Maybe the church has 10 confessionals, or 1000, or 10,000. Serializing the players might be a `paper' advantage to the church that doesn't occur in reality.

b. Players can have any amount of money, not just $100.

c. What if the church goes broke?

(a) Imagine that the church has at most k confessionals, and thus can play no more than k simultaneous games. Fill all k. All other players are waiting in line for an open spot. The next parishioner can't play until an existing player goes broke. The distribution of wealth during play by the (k-1+10)th player is exactly as before, except now it is +/-$100(k-1).

(b) has no impact. As above, at the end of each game the fraction of money won by the church with respect to that player is 1 (assuming it's the player and not the church that `went out').

(c) If the church goes broke, all bets are off, literally but not figuratively. The distribution of wealth is c=0, P=1 (P for all players as opposed to p for a single player). This also does not match the expectation of .51.
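[For concreteness, here is a small Python sketch of the serialized model just described. The parameters are assumptions added purely for illustration: $1 even-money bets, a .51 per-bet edge for the church, 100 parishioners of $100 each, and a church bankroll deep enough that case (c) never occurs. It is a sketch of the argument, not a definitive treatment.]

import random

random.seed(1)

CHURCH_EDGE = 0.51   # assumed probability the church wins any single $1 bet
STAKE = 100          # each parishioner's bankroll
PLAYERS = 100        # number of parishioners to simulate

total_staked = 0
church_keeps = 0
got_ahead = 0        # parishioners who were ever ahead of their original $100

for _ in range(PLAYERS):
    bankroll = STAKE
    peak = STAKE
    total_staked += STAKE
    while bankroll > 0:
        bankroll += -1 if random.random() < CHURCH_EDGE else 1
        peak = max(peak, bankroll)
    if peak > STAKE:
        got_ahead += 1
    church_keeps += STAKE    # the session can only end with the player at $0

print(f"parishioners who were ahead at some point: {got_ahead} of {PLAYERS}")
print(f"church's share of all money staked:        {church_keeps / total_staked:.3f}")
# Typically about 96 of the 100 parishioners are ahead of their stake at some
# point (the chance of ever being up a dollar is .49/.51), yet every session
# ends at p=$0, so the church's share of the money staked is 1.000, not .51.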
The chance of the "church" to win or lose is the same on every bet, regardless of who places it.
That is true. But the only way the player can realize his mathematical expectations is if he is allowed to continue playing even after he is out of money (i.e., so he can climb back out of the hole).

Ok, the first player goes out, but the infinity of players after him can make up for that, right? Wrong, because on his way to winning back the first player's money, if the second player goes broke, _his_ game is over. Now it's up to the third guy, ad infinitum (literally)... Just because the series is infinite doesn't mean the sum is.

No set of players, all of whom go broke, breaks the church. Therefore, for the series to end it must be instigated by a set of players that includes at least one who doesn't go broke (i.e., the church goes broke instead). In fact, a single player who doesn't go broke ends the series without any help from other players.

Thus, to stem the tide of pious donations (i.e., the church's winnings), a single player with enough money to `outlast' the church is required.

Hope you found this interesting but not insulting,

Scott Collins       | "That's not fair!" -- Sarah
                    | "You say that so often. I wonder what your basis
408.862.0540        |  for comparison is." -- Goblin King
....................|................................................
BUSINESS. fax:974.6094 R254(IL5-2N) collins@newton.apple.com
Apple Computer, Inc.  5 Infinite Loop, MS 305-2D  Cupertino, CA 95014
.....................................................................
PERSONAL. 408.257.1746 1024:669687 catalyst@netcom.com
Pretend the casino is run out of a church. "Parishioners" arrive and enter a confessional to place their bets. The "priest" cannot see who is placing each bet. Each "parishioner" plays until he or she is broke. "Parishioners" arrive at a steady rate and will do so indefinitely.
Let me just make sure I understand what you mean. I believe you are saying:
Conjecture A:
A.1 As parishioners play and leave, the division of wealth approaches the `odds' of the game. Thus if the odds are .51 house (of God), .49 parishioner, then eventually the house will end up with 51 cents out of every dollar `played', just as it would if the church were playing against one very wealthy parishioner (i.e., the `world').
A.2 Since there are a large number of parishioners, enough games can always be played to make the distribution match the odds.
If this is _not_ what you mean to say, then I apologize for missing your point; read no further. Just send me explanations to clear up my misunderstanding. If Conjecture A is an accurate statement of your belief, then please step across this line.
I agree with both conjectures.
----------
Let me walk through your model, one parishioner at a time. Please read this with an open mind; it could be true.
Each "parishioner" plays until he or she is broke.
Let's say the odds of the game are .51 to .49. Each parishioner has $100. Each parishioner plays until broke.
At some point in play, the distribution of wealth with respect to _that player_ may be arbitrarily close to c=$51, p=$49. What, though, is the distribution at the _end_ of that game? Since each game only ends when p=$0, the distribution is c=$100, p=$0. On to the next parishioner.
After the 9th, but before the 10th parishioner, the distribution must be c=$900, p[10]=$100. It can't be worse than that for the church, or we wouldn't have moved on to the 10th parishioner. It can't be better for the player, because each has only $100 to wager. After the nth, c=$100n, p[n+1]=$100.
Conjecture A predicts that as n, the number of players, goes to infinity, c, the fraction of money won by the church, approaches C, the probability the church will win a single trial. But in fact, the model shows that as n approaches infinity, c goes to 1.
There is a slight difference between what Conjecture A predicts and this statement. Conjecture A predicts that as b, the number of bets, goes to infinity, the fraction of bets won will approach C, the probability that the church will win a single trial.
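[A small sketch of that distinction, again under assumed parameters ($1 bets, a .51 per-bet edge for the church, $100 parishioners who play until broke): it tracks the fraction of individual bets the church wins, which is the quantity Conjecture A is really about.]

import random

random.seed(2)

CHURCH_EDGE = 0.51   # assumed per-bet probability that the church wins $1
STAKE = 100
PLAYERS = 100

bets = 0
bets_won_by_church = 0

for _ in range(PLAYERS):
    bankroll = STAKE
    while bankroll > 0:
        bets += 1
        if random.random() < CHURCH_EDGE:
            bankroll -= 1
            bets_won_by_church += 1
        else:
            bankroll += 1

print(f"bets played:                 {bets}")
print(f"fraction of bets church won: {bets_won_by_church / bets:.4f}")
# The per-bet fraction settles near .51, which is what Conjecture A predicts,
# while every dollar staked still ends up with the church: a session cannot
# end until the church has won 100 more bets than it has lost to that player.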
Where could one disagree with this interpretation of the model?
You should think about what you mean by "fraction of money". I think there is a seductive error here. In one sense we mean the amount of money placed on bets; in another we mean the actual banknotes in play. These are two different things. Whether or not banknotes are recycled by the parishioners will not affect the church's winnings.
[...Deleted parts which I think are answered above...]
The chance of the "church" to win or lose is the same on every bet, regardless of who places it.
That is true. But the only way the player can realize his mathematical expectations is if he is allowed to continue playing even after he is out of money (i.e., so he can climb back out of the hole).
Each parishioner has a high probability of losing their savings and a low probability of winning everything owned by the church. It is possible for any single parishioner to win everything, but it is unlikely.
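[A back-of-the-envelope for "possible, but unlikely", using the standard gambler's-ruin formula under the assumed $1 even-money bets with a .49 per-bet chance for the parishioner: the chance that one $100 parishioner takes everything from a church holding H dollars before going broke.]

def p_player_breaks_church(stake, church_bankroll, p_win=0.49):
    """Chance a player starting with `stake` reaches stake + church_bankroll
    (ruining the church) before hitting $0, with $1 even-money bets that the
    player wins with probability p_win (standard gambler's-ruin formula)."""
    r = (1 - p_win) / p_win     # q/p, greater than 1 when the church has the edge
    target = stake + church_bankroll
    return (1 - r ** stake) / (1 - r ** target)

for H in (100, 1_000, 10_000):
    print(f"church bankroll H = {H:>6}: {p_player_breaks_church(100, H):.3e}")
# Roughly 1.8e-02 for H = 100, about 4e-18 for H = 1,000, and on the order of
# 1e-174 for H = 10,000: any single parishioner *can* win everything, but the
# chance shrinks geometrically with the church's bankroll.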
Ok, the first player goes out, but the infinity of players after him can make up for that, right? Wrong, because on his way to winning back the first player's money, if the second player goes broke, _his_ game is over. Now it's up to the third guy, ad infinitum (literally)... Just because the series is infinite doesn't mean the sum is.
No set of players, all of whom go broke, breaks the church. Therefore, for the series to end it must be instigated by a set of players that includes at least one who doesn't go broke (i.e., the church goes broke instead). In fact, a single player who doesn't go broke ends the series without any help from other players.
Thus, to stem the tide of pious donations (i.e., the church's winnings), a single player with enough money to `outlast' the church is required.
The player needs to be lucky. Let's say the church's assets are H dollars. In order for it to lose everything, it has to have a series of bets whose sum is a negative value less than -H. This series has a beginning - the point at which the church's assets dropped below H and moved down to 0. If parishioners play until they win or are broke, the player who took the church below H will be the same player who wins everything. (I am assuming fixed size bets, but the conclusions can be generalized.) This player wins because he or she was fortunate enough to place the first bet in the series. Having more capital means that more bets can be placed. That increases the probability of placing the first bet in the winning series, but does not affect the odds of the church losing everything.
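[A rough Monte Carlo check of that last point, under assumptions added purely for illustration: $1 even-money bets the church wins with probability .51, a church that starts with only $20 so that ruin is observable, $10 parishioners arriving in an endless stream and playing until broke, and a ceiling at which a trial is scored as the church surviving. A sketch, not a proof.]

import random

random.seed(3)

CHURCH_EDGE = 0.51   # assumed per-bet chance the church wins $1
H = 20               # church's starting bankroll (small, so ruin is observable)
PLAYER_STAKE = 10    # each parishioner's bankroll
CEILING = 200        # declare the church safe once it holds this much
TRIALS = 1000

ruined = 0
for _ in range(TRIALS):
    church = H
    while 0 < church < CEILING:
        player = PLAYER_STAKE            # the next parishioner sits down
        while player > 0 and 0 < church < CEILING:
            if random.random() < CHURCH_EDGE:
                church += 1
                player -= 1
            else:
                church -= 1
                player += 1
    if church == 0:
        ruined += 1

print(f"church ruined vs. a stream of small players: {ruined / TRIALS:.3f}")
print(f"church ruined vs. one unlimited opponent:    {(0.49 / 0.51) ** H:.3f}")
# Both come out near 0.45. Bet by bet the church faces the same 51/49 walk no
# matter how the opposing money is split, so dividing the capital among many
# small players (or concentrating it in one deep pocket) does not change the
# church's chance of going broke; it only changes who ends up with the winnings.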
Hope you found this interesting but not insulting,
I found it interesting. Your message was written clearly.

I've seen this question and similar ones come up again and again in discussions of gambling, trading, and insurance. It would be nice if having a large body of capital would allow one to "make money off the noise", but it isn't so.

It has been observed that small traders in the futures markets tend to lose money to large traders. One way this has been explained is that the large traders outlast the small traders with their larger capital and that is how they make money. I think a more likely explanation is that the large traders tend to make good trades.

Peter
participants (2)
- collins@newton.apple.com
- ph@netcom.com