Spoiler: It's almost always 3-4x the value of a royal flush. So you needed $12-16k if you were playing a $1-per-coin game with a 1% edge at a pretty good clip.
And what do you earn with perfect play in that situation? The princely sum of around $30 an hour.
"$1 per coin game" -- is this a game where you put in $1 to play and get paid either $2 or $0 with 50-50 probability (zero expectation)?
And what does a 1% edge mean? Does it mean the probabilities are such that the expected payout is 1c per coin flip?
Even though it's +EV for the player, you'd need some bankroll to ride out the variance, since you could lose at X casinos in a row. Ages ago these bonuses were really +EV and you could usually just autoplay them with small bets, so the bankroll requirements weren't harsh. Later the wagering requirements on the bonuses grew, often making the small-bet grind unprofitable, but you could still find profitable situations with correct bet sizing. Those needed a much bigger bankroll, though, since a bonus was usually more +EV the bigger the bets: you'd often play many casinos for just a few minutes with big bets, losing your deposit and bonus, but sometimes winning big and covering the losses with profit left over.
Every video poker game in Nevada is required to be truly random. And every game shows the payout for every possible poker hand on the machine. A bit of math allows you to calculate both the correct strategy for any five cards dealt (which you memorize, just like proper blackjack strategy) and the theoretical return of the game with perfect play.
As an example, 9/6 Jacks or Better (a game that pays nine coins for each coin played for a full house and six coins for each coin played for a flush) has a theoretical payout of 99.54% with perfect play. This puts it in the range of blackjack. And, like blackjack, you will eventually go broke because it's still not over 100%.
Unlike blackjack, you can't count cards. But what you can do is seek out returns in other ways. In the 1990s and 2000s, some casinos would compete on cashback comps. Add 0.33% or 0.5% cashback to the game I just described, and you're close to (or barely over) 100% payback. Find a game with a baseline payout of over 100% (full-pay Deuces Wild is 100.76%, as a [rare] example), and you're deeper into the profitability zone.
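The cashback arithmetic is simple enough to sketch. This uses the 99.54% base return and the cashback rates from this comment; everything else is illustrative:

```python
# Sketch: how cashback shifts the expected return of a video poker game.
# The 99.54% base return (9/6 Jacks or Better, perfect play) and the
# cashback rates are the figures mentioned in the comment above.
base_return = 0.9954

for cashback in (0.0033, 0.005):
    total = base_return + cashback
    # Expected profit per $10,000 of coin-in (total amount wagered)
    edge_per_10k = (total - 1) * 10_000
    print(f"{cashback:.2%} cashback -> {total:.4%} return, "
          f"${edge_per_10k:+.2f} per $10k wagered")
```

Note that the edge applies to coin-in, not bankroll: you can cycle the same bankroll through the machine many times, which is why the grind works at all.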
Small returns, unless you're playing higher denominations with a giant bankroll. Most people who do this make it a bit of a lifestyle -- pushing tens or hundreds of thousands of dollars through the machines gets you noticed by the casino, leading to free rooms, free meals, invitations to parties, etc.
Others look (or looked -- it's rarer now) for poorly planned promotions where a scarce hand pays off grandly and changes the math. Most of the life-changing wins in this space came from those sorts of situations.
This argument was used--by SBF and others--to justify truly absurd risk taking. I don't think it's an exaggeration to suggest that this misunderstanding may have been one of the primary drivers of Alameda's (and hence FTX's) downfall. For a group with as many smart people as EA and as many people obsessed with existential risks as EA not to have started screaming en masse when SBF suggested he would take a 51-49 bet on doubling utility or deleting all known life out of existence[1] is insane.
The mathematical misunderstanding is one part of it. Kelly betting dominates any other betting strategy in the sense that, as the number of bets increases, the probability that the Kelly bettor has more money than someone following any other strategy approaches 1. You don't need a logarithmic utility function: if I bet Kelly and you follow some other strategy, eventually I will almost surely end up with more money and more utility than you.
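The dominance claim is easy to check with a quick Monte Carlo sketch. All numbers here are illustrative: a repeated 60/40 even-money bet, where the Kelly fraction is 2p - 1 = 0.2, compared against an over-bettor staking half their bankroll each time:

```python
import random

# Illustrative simulation of Kelly dominance: on a repeated 60/40
# even-money bet, compare the Kelly bettor (f = 0.2) against an
# over-bettor (f = 0.5) over many long runs.
random.seed(1)
p, kelly_f, other_f = 0.6, 0.2, 0.5
n_bets, n_trials = 1000, 200

kelly_wins = 0
for _ in range(n_trials):
    k = o = 1.0  # both start with the same bankroll
    for _ in range(n_bets):
        win = random.random() < p
        k *= (1 + kelly_f) if win else (1 - kelly_f)
        o *= (1 + other_f) if win else (1 - other_f)
    kelly_wins += k > o

print(f"Kelly ahead in {kelly_wins}/{n_trials} trials")
```

With these parameters the over-bettor's expected log-growth per bet is actually negative (0.6·ln 1.5 + 0.4·ln 0.5 ≈ -0.034), so the Kelly bettor finishes ahead in essentially every trial.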
I suspect another part of it is a misunderstanding by SBF (and perhaps others) of Jane Street's trading strategy. Jane Street encouraged their traders to be "risk neutral", which can be expressed as maximizing expected utility with a linear utility function. They wanted their traders to be willing to take big risks. But any individual trader is only working with a tiny fraction of Jane Street's capital, so even if they're risking all the money they've been given to work with on a bet that's still a small bet relative to the entire company. SBF seems to have taken that same risk neutral idea and applied it to the entirety of Alameda/FTX's available capital (and indeed expressed a willingness to apply it to the combined utility of the entire world), with predictably disastrous results.
[1] https://elmwealth.com/a-missing-piece-of-the-sbf-puzzle/
I don't actually buy that argument and think it's insane, but it would not remotely surprise me if SBF believed it, and if you do, then you don't really observe the Kelly criterion. You take the ruin for the larger team of other yous that collectively wins. If the density of quantum branches in which he funded colonization of the galaxy is greater than the density in which he is serving life in prison, it was worth it.
I had forgotten about this line of argument, but I came across it in a post on the EA forums arguing that you should choose what to do with your life this way. Basically if you believe that in this branch you have far above average ability (compared to yourself in other branches) to do good in the world then you should devote your life to altruism; conversely, if in this branch you have below average ability (again compared to yourself in other branches) then it's ok to spend your time playing video games instead.
[1] https://trends.google.com/trends/explore?date=today%205-y&ge...
One way for a process to not be ergodic in the mean is when there's some sort of barrier, as sibling comments allude to.
Another is if the overall mean value is picked randomly each time the process starts, but is different each time the process runs. So for example personal monthly expenditures are not ergodic in the mean, because some people are born into circumstances that make them wealthy, and they will on average spend more each month than people not born into such good circumstances.
The ensemble average will tend towards people's average spending, while the temporal average will tend towards each individual's spending.
A million players each placing a single bet will, on average, lose the house edge.
A single player placing a million bets will almost surely go bust along the way, ending with $0.
The fact that the aggregate and the single individual experience different outcomes despite placing the same million bets is what makes this non-ergodic.
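The ensemble-vs-time contrast can be simulated directly. This is a minimal sketch assuming an even-money bet with a 1% house edge (win probability 0.495) and a player starting with a $100 bankroll; the specific numbers are illustrative:

```python
import random

# Ensemble average vs. time average for a game with a 1% house edge.
random.seed(2)
p_win, stake = 0.495, 1.0

# Ensemble: a million players place one $1 bet each.
n_players = 1_000_000
total = sum(stake if random.random() < p_win else -stake
            for _ in range(n_players))
print(f"ensemble average result per bet: {total / n_players:+.4f}")  # ~ -0.01

# Time: one player with $100 keeps betting $1; ruin is absorbing.
bankroll, bets = 100.0, 0
while bankroll >= stake and bets < 1_000_000:
    bankroll += stake if random.random() < p_win else -stake
    bets += 1
if bankroll < stake:
    print(f"single player: busted after {bets:,} bets")
else:
    print(f"single player survived with ${bankroll:,.0f}")
```

The ensemble average converges to the -1% edge per bet, while the individual's trajectory ends at the absorbing barrier: the two averages disagree, which is the non-ergodicity.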
Nassim Taleb also talks about this quite a lot: https://youtu.be/91IOwS0gf3g
TL;DR: while a single investment may be ergodic, portfolio management (the math behind weighting successive and concurrent investments/bets) is not, as it has a strong dependence on all prior states.
Ergodicity is less about memorylessness and more about the constraints on transitions into this or that state. A system is ergodic if "anything that can be an outcome, eventually will happen".
The article mentions fractional Kelly is a hedge. But what fraction is optimal to use? That is also unknowable.
Finance folks, correct me if I’m wrong, but the Kelly Criterion is rarely used in financial models but is more a rule of thumb that says roughly if you have x $ and probability p, in a perfect world you should only bet y amount. But in reality y cannot be determined accurately because p is always changing or hard to measure.
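For a single binary bet there is a closed form, not just a rule of thumb: with win probability p and net odds b (you win b per unit staked, or lose the stake), the growth-optimal fraction is f* = p - (1 - p)/b. A minimal sketch (the function name is mine, for illustration):

```python
# Kelly fraction for a single binary bet: stake f* = p - (1 - p) / b
# of the bankroll, where p is the win probability and b the net odds.
def kelly_fraction(p: float, b: float) -> float:
    """Fraction of bankroll to stake; 0 if the bet has no edge."""
    return max(0.0, p - (1 - p) / b)

# Even-money bet (b = 1) with a 55% win probability:
print(f"{kelly_fraction(0.55, 1.0):.2f}")  # 0.10 -> bet 10% of bankroll
# Same bet, but p is really 0.51 -- the sensitivity worried about above:
print(f"{kelly_fraction(0.51, 1.0):.2f}")  # 0.02
```

The formula is exact; the practical problem is, as the comment says, that p is estimated, not known.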
The Kelly criterion is a method for optimizing the growth of capital (it maximizes the expected logarithm of wealth). Not using it doesn't change its correctness.
But yes, you need to know the advantage/edge you have. It's like pricing methods: for European options under Black-Scholes you need to know the volatility, and there is no way to know it, so you estimate. This is where all the adjusting for bias and ML comes in.
I don’t think it is used in this way. The recommended bet swings too much with small changes in p.
No, not generally. Since it's a quadratic function we're optimising, it's surprisingly flat at the top. Sure, there are some bets where the edge is tiny and 0.01 percent is a large proportion of that, but that doesn't invalidate the Kelly criterion – by what other criterion would you determine the appropriate bet size?
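The flat top is easy to see numerically. A sketch assuming an even-money bet with p = 0.55, so the Kelly optimum is f* = 0.10; the classic result that half Kelly keeps roughly three quarters of the growth rate falls out of it:

```python
import math

# Expected log-growth per bet as a function of the fraction f staked,
# for an even-money bet with win probability p = 0.55 (Kelly f* = 0.10).
def growth(f: float, p: float = 0.55) -> float:
    return p * math.log(1 + f) + (1 - p) * math.log(1 - f)

for f in (0.05, 0.10, 0.15, 0.20):
    print(f"f = {f:.2f}: growth per bet = {growth(f):.5f}")
```

Half Kelly (f = 0.05) and 1.5x Kelly give up only a modest slice of the optimal growth rate, while 2x Kelly drives it to roughly zero -- which is why fractional Kelly is a cheap hedge against a misestimated p.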
> is more a rule of thumb that says roughly if you have x $ and probability p, in a perfect world you should only bet y amount.
It applies far more broadly than to binary bets. It tells you how to allocate your spending optimally across any number of opportunities, based on joint probability of outcomes.
Both of your misconceptions are common, and they are addressed in the article linked in the submission: https://entropicthoughts.com/the-misunderstood-kelly-criteri...
https://github.com/obrhubr/kelly-criterion-blackjack/blob/ma...
I think it shows that blackjack is not even theoretically winnable over time if you have to pay for information on the count in the form of minimum bets. The ideal case is that you bet $0.49 for every $1,000 in your investment pool when the count is extraordinarily high.
Even if you hack the casino's cameras so you know the count without having to be at the table, your reward is a growth rate that is very low per hand.
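A rough sketch of just how low: the $0.49-per-$1,000 fraction is from the comment above, while the 1% edge on an even-money hand is an assumption of mine for illustration (and blackjack payouts aren't strictly even-money):

```python
import math

# Log-growth per hand when staking f = 0.00049 of bankroll on an
# even-money hand with an assumed 1% edge (p = 0.505, illustrative).
p, f = 0.505, 0.00049
g = p * math.log(1 + f) + (1 - p) * math.log(1 - f)
print(f"log-growth per hand: {g:.2e}")
hands_to_double = math.log(2) / g
print(f"hands to double the bankroll: {hands_to_double:,.0f}")
```

The growth per hand is on the order of f times the edge -- a few millionths -- so doubling the bankroll takes on the order of a hundred thousand such hands.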
I wrote this game a few years ago, and it didn't use to have this bug. I can't take the time to figure it out now, but clearly something went wrong along the way!
[1]: Fortunately it's easily reproducible: put down the minimum for the first 12 turns and then look at the weirdness that is turn 13! https://xkqr.org/ship-investor/ship-investor.html?seed=883