In Part I of his introduction to the Kelly criterion, Ganchrow takes on the topic of expected value vs. expected growth.

A question I'm often asked is how exactly expected value differs from expected growth. The difference is somewhat subtle but understanding it is essential to risk management in general and the Kelly criterion in particular.

The question frequently arises in the context of the idea that betting one's entire bankroll implies -100% bankroll growth. (That's 100% bankroll shrinkage -- a bankroll that shrinks to \$0.) What's more, if you bet your entire bankroll in one go, 100% bankroll shrinkage is implied regardless of both the probability of the bet winning (as long as it wins less than 100% of time) and the odds paid out on the bet (as long as the odds are less than infinity).

Think about that for a moment, because it's an important point: if you wager your entire bankroll every time you bet, then you should expect your bankroll eventually to shrink to zero.

Well, at least it should be important, but the truth is it doesn't really get us any closer to understanding what exactly bankroll growth is and how it differs from expected value. We’ll get back to this later.

Let's start with a brief review of expected value.

The notion of expectation is central to probability and statistics and may be thought of as an average with an extra syllable. If you were to flip a coin 10 times then you could expect it would land on heads 5 times and you could expect it would land on tails 5 times. In reality of course the coin’s not always going to land on heads exactly 5 times out of 10 (in fact it would only do so about 24.6% of the time), but if you were to repeat the experiment (flipping a coin ten times) many, many times over then on average it would land on heads 5 times each trial.
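As a quick sanity check on that 24.6% figure (an illustrative sketch, not from the original article), the probability of exactly 5 heads in 10 flips follows directly from the binomial distribution:

```python
from math import comb

# P(exactly k heads in n fair flips) = C(n, k) / 2^n
def prob_exact_heads(n: int, k: int) -> float:
    return comb(n, k) / 2 ** n

print(f"{prob_exact_heads(10, 5):.1%}")  # about 24.6%
```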

The same thought process is also applicable to sports. If the Yankees can be expected to win a particular game 60% of the time, then this would mean that if the exact same game were repeated under the exact same conditions across many, many parallel universes, we would expect the Yankees to win 60% of those encounters.

So let’s say you bet \$1 straight up that the Yankees are going to win that game. Now that’s quite obviously a good bet. But just how 'good' is it?

That’s where expectations come in with sports betting. If you made the same bet in each of those parallel universes you’d win \$1 60% of the time, and lose \$1 40% of the time. Now let’s say that there are actually 1,000,000 of these universes. Exactly how much money would you make? Well, in 600,000 of those universes you’d make \$1 for a total of \$600,000 dollars, and in the remaining 400,000 of those universes you’d lose \$1 in each game for a total of \$400,000 dollars. So you'd receive \$600,000 and would pay out \$400,000 meaning that your total profit would be \$200,000. Winning \$200,000 across 1,000,000 means on average you would have won \$200,000 / 1,000,000 games = \$.20 per game.

Now of course 1,000,000 is just a made-up number in this context. There aren’t really 999,999 other universes where we could make such a bet. This bet can only be made once. But that doesn’t actually matter in the world of statistics. Whether you can make this bet only once or many times over, the expectation per game is precisely the same, namely \$0.20 (or 20% of the \$1 stake).
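To make the parallel-universe arithmetic concrete, here is a minimal sketch (my own illustration, under the article's assumptions of a \$1 even-money bet won 60% of the time):

```python
# EV computed two ways: directly from the probabilities, and by
# averaging profit across 1,000,000 hypothetical "universes".
p_win, stake = 0.60, 1.00

ev_direct = p_win * stake - (1 - p_win) * stake

universes = 1_000_000
wins = int(p_win * universes)      # 600,000 universes: win $1
losses = universes - wins          # 400,000 universes: lose $1
ev_average = (wins * stake - losses * stake) / universes

print(f"{ev_direct:.2f} {ev_average:.2f}")  # each ≈ $0.20 per game
```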

So to summarize, the expected value of a bet is the amount we would receive on average if we were to repeat the exact same bet a very large number of times. As such, the expected value of a bet is a metric by which one might judge the relative attractiveness of that bet. If one bet has an expected value of 5% (meaning that for every \$10,000 we bet we would expect to win \$500) and another has an expected value of 10% (meaning that for every \$10,000 we bet we would expect to win \$1,000), then we would tend to prefer the latter bet to the former.

But there's a bit of a difficulty here -- namely, expected value alone ignores the relative likelihoods of the outcomes. For example a \$10,000 bet on a 0.0000000000000000000000000000000000001% likelihood event paying out at +110,000,000,000,000,000,000,000,000,000,000,000,000,000 odds corresponds to an expected value of 10% (+\$1,000). But who among us would be willing to essentially throw away \$10,000 on such a long shot? To put it in perspective, you'd be about 1,870 times more likely to win the New Jersey State Lottery five times in a row than you would be to win this particular bet. Does it really matter that if by some fluke of nature you actually did win, you'd have an unfathomably huge amount of money? If you're like most people, the answer is probably not.

So now here's the difficulty ... there's no way whatsoever to account for this very real phenomenon of preferences by appealing to the theory of expected value alone.

(Enter stage right, expected bankroll growth.)

One major problem with the proposed bet is that for most people, \$10,000 represents a rather large chunk of one’s bankroll to be throwing away on a bet that’s nearly certain to lose. But while a \$10,000 bet is probably too large a quantity to risk on this bet, there’s still a sufficiently small dollar amount that most people would be willing to risk to make this bet. Granted, for most people that dollar amount would be somewhere in the neighborhood of a tiny fraction of a penny, but it nevertheless would still be a positive dollar amount.

The fundamental issue with bets such as these is that, despite being positive EV, placing them is an excellent way to go broke. The apparent contradiction is easily reconciled. If you were to repeat this bet once in each of a gigantically huge number of parallel universes, in nearly all of the universes you’d lose your bet, but in a tiny, tiny, tiny, tiny, tiny fraction of those universes you’d win the bet, and that one win would make up for all the losses plus an additional 10% of the amount risked.

The fact is that most people just aren’t willing to live through billions of trillions of bets just to have a vanishingly minuscule probability of winning a huge-odds bet once. So while the bet may have positive expected value, the expected outcome is for your bankroll to shrink by \$10,000 each time the bet’s made. If your bankroll were \$1,000,000 and you made the bet 100 times, you could expect to be broke after the 100th bet (even though your expected value would be 10% × \$1,000,000 = +\$100,000).

So let’s look at some more practical numbers. Assume you’re considering a bet that wins with 50% probability and pays out at odds of +200. Further assume your total bankroll is \$100,000 and that you want to place 1% of your bankroll on this wager.

Question: Where do you expect your bankroll to be after 2 wagers?

Answer: There are 4 possible outcomes after placing two wagers:

• Win both bets
• Win 1st bet, lose 2nd bet
• Lose 1st bet, win 2nd bet
• Lose both bets

Now because winning and losing the bet are both equally likely, all 4 outcomes occur with equal probability, namely 25%. Recall that you’d be betting 1% of your bankroll on each bet and would be paid off at odds of +200. Therefore, your ending bankroll under each of the 4 outcomes would be:

• B = \$100,000 × (1 + 2×1%) × (1 + 2×1%) = \$104,040
• B = \$100,000 × (1 + 2×1%) × (1 - 1%) = \$100,980
• B = \$100,000 × (1 - 1%) × (1 + 2×1%) = \$100,980
• B = \$100,000 × (1 - 1%) × (1 - 1%) = \$98,010

(The derivation of these equations is simple. Every time you win, your bankroll grows to 102% of its previous value, and every time you lose, it shrinks to 99% of its previous value.)

The expected value from betting in this manner would be 25%×\$104,040 + 25%×\$100,980 + 25%×\$100,980 + 25%×\$98,010 = \$101,002.50. To calculate expected growth, we would first need to recognize that given our 50% win probability, our expected outcome would be to win a bet and to lose a bet (# of wins = 50% × 2 bets, # of losses = 50% × 2 bets). Therefore our expected growth would be that associated with that outcome (with expected growth, the relative ordering of wins/losses is irrelevant), namely \$100,980.

Therefore, the expected value from the two bets is \$1,002.50 or 1.0025%, and the expected growth is \$980 or 0.9800%. Notice that expected value is higher than expected growth -- this is what you’re always going to see. Expected value will always be higher than expected growth (except for probabilities of 0% or 100%, where they’ll be equal) because a few relatively large, relatively uncommon outcomes will increase EV. Another way to think about this is by realizing that the worst-case scenario is losing your bankroll one time over, while the best-case scenario is winning your bankroll an unlimited number of times over -- in other words, while your maximum possible profit is unlimited, your maximum possible loss is limited to your bankroll.
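The two-wager arithmetic can be verified with a short script (a sketch under the article's assumptions: \$100,000 bankroll, 1% stakes, +200 odds):

```python
from itertools import product

# Each bet: a win multiplies the bankroll by 1.02 (+200 at a 1% stake),
# a loss multiplies it by 0.99. Enumerate all four equally likely
# win/loss sequences for two bets.
bankroll = 100_000.0
win_mult, lose_mult = 1.02, 0.99

outcomes = [bankroll * a * b for a, b in product([win_mult, lose_mult], repeat=2)]
expected_value = sum(outcomes) / len(outcomes)      # $101,002.50
expected_outcome = bankroll * win_mult * lose_mult  # one win, one loss: $100,980

print(round(expected_value, 2), round(expected_outcome, 2))
```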

So in this instance our expected outcome would be a bankroll of:

B* = \$100,000 × (1 + 2×1%)^(2×50%) × (1 - 1%)^(2×50%) = \$100,980,

implying expected bankroll growth of

E(G) = \$100,980/\$100,000 - 1 = 0.9800%

It should be readily apparent that our expected outcome after n bets would be a bankroll of:

B* = \$100,000 × (1 + 2×1%)^(n×50%) × (1 - 1%)^(n×50%) = \$100,000 × (100.48881%)^n,

implying expected bankroll growth of

E(G) = (100.48881%)^n - 1.

By extension, our expected outcome after just 1 bet would be:

B* = \$100,000 × (1 + 2×1%)^50% × (1 - 1%)^50% = \$100,488.81

And our expected bankroll growth would be

E(G) = (1 + 2×1%)^50% × (1 - 1%)^50% - 1 = 0.48881%
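As a numerical check of this one-bet growth figure (an illustrative sketch, not part of the original):

```python
# One-bet expected growth for the +200 / 50% / 1%-of-bankroll example:
# E(G) = (1 + 2*1%)^50% * (1 - 1%)^50% - 1
g = (1 + 0.02) ** 0.5 * (1 - 0.01) ** 0.5 - 1
print(f"{g:.5%}")                    # ≈ 0.48881%
print(round(100_000 * (1 + g), 2))   # expected outcome ≈ $100,488.81
```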

(This last result bears a little discussion. We can talk about expected growth after only 1 bet in the same manner as we can talk about expected value after just one bet. In the same way as we’d never see a real result equal to our expected value, we’d never actually see growth after one bet equal to expected growth. This should cause absolutely no concern.)

So let’s generalize our results with expected outcomes and growth. Given a starting bankroll of B0, decimal odds of O, a win probability of p, and a bet size of X (as a percentage of starting bankroll, B0), the bankroll associated with the expected outcome from placing the bet would be:

B* = B0 × (1 + (O-1)×X)^p × (1 - X)^(1-p)

And expected growth would be:

E(G) = (1 + (O-1)×X)^p × (1 - X)^(1-p) - 1

Expected value, you’ll recall, would be:

EV = p×(O-1)×X - (1-p)×X = (pO - 1)×X
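These formulas translate directly into code. The function names below are my own (a sketch, with o as decimal odds and x the fraction of bankroll staked):

```python
def expected_value(p: float, o: float, x: float) -> float:
    """EV of staking fraction x at decimal odds o with win probability p."""
    return (p * o - 1) * x

def expected_growth(p: float, o: float, x: float) -> float:
    """Expected bankroll growth: (1 + (o-1)x)^p * (1-x)^(1-p) - 1."""
    return (1 + (o - 1) * x) ** p * (1 - x) ** (1 - p) - 1

# Sanity check against the earlier +200 / 50% / 1% example:
print(f"{expected_growth(0.50, 3.0, 0.01):.5%}")  # ≈ 0.48881%
```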

Q: So let’s look at a concrete example: What are the expected value and bankroll growth associated with a bet equal to 1% of bankroll paying out at -110 and winning with probability 54%?

A:
EV = 1% × (54% × 1.909091 - 1) = 0.03091% of bankroll
E(G) = (1 + 0.909091 × 1%)^54% × (1 - 1%)^46% - 1 = 0.02638% bankroll growth.

Q: Now let’s consider the same terms, but in the case of a player betting 25% of bankroll. What would expected value and growth be in this case?

A:
EV = 25% × (54% × 1.909091 - 1) = 0.7727% of bankroll
E(G) = (1 + 0.909091 × 25%)^54% × (1 - 25%)^46% - 1 = -2.1510% bankroll growth, i.e., 2.1510% bankroll shrinkage
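Plugging both stake sizes into the general formulas confirms the contrast (a quick check, using decimal odds of 1.909091 for -110):

```python
p, o = 0.54, 1.909091  # 54% win probability at -110 (decimal odds)

for x in (0.01, 0.25):
    ev = (p * o - 1) * x
    eg = (1 + (o - 1) * x) ** p * (1 - x) ** (1 - p) - 1
    print(f"stake {x:.0%}: EV = {ev:+.4%}, E(G) = {eg:+.4%}")
# stake 1%:  EV ≈ +0.0309%, E(G) ≈ +0.0264%
# stake 25%: EV ≈ +0.7727%, E(G) ≈ -2.1510%
```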

So think about these results for a moment. We have a positive-expectation bet and hence, quite naturally, the more we bet on it the more we expect to make. However, if we were to wager too much on this bet then we’d expect our bankroll to shrink by 2.1510% per wager (were we to place this positive-expectation bet 32 times, for example, we’d expect our bankroll to depreciate by roughly half).

So this should help elucidate the huge odds bet above. No matter how positive EV a bet might be, if you bet too much on it then you expect your bankroll to shrink. This is the concept to which people are referring when they talk about "money management." Even if you could pick NFL spreads at 75% (which you can’t), were you to bet too much, you'd expect to head towards bankruptcy.

So as a limiting case let’s look at one more example, the example of betting one’s entire bankroll mentioned at the start of this article: win probability = p, bet size = 100% of bankroll.

EV = 100% × (pO - 1) = pO - 1 (EV > 0 for p > 1/O)
E(G) = (1 + (O - 1))^p × 0^(1-p) - 1 = -100% (for p < 1 and O < ∞)
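A quick check of this limiting case (a sketch; the 75% handicapper at even odds is a hypothetical choice of numbers):

```python
# Bet the entire bankroll (x = 1) at even odds (o = 2) with p = 0.75:
# EV is strongly positive, yet the (1 - x)^(1-p) factor is 0^0.25 = 0,
# so expected growth is -100% regardless of how good the bet is.
p, o, x = 0.75, 2.0, 1.0
ev = (p * o - 1) * x
eg = (1 + (o - 1) * x) ** p * (1 - x) ** (1 - p) - 1
print(ev, eg)  # 0.5 -1.0
```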

So what does this tell us? Well for one thing it tells us that even if you were the 'best handicapper ever,' were you to risk your entire bankroll on every bet, you would expect to go broke. More generally, it illustrates the concept that looking solely at expected value as a metric for the attractiveness of a given bet is not the proper way to maintain long term growth.

In the next part of this article we’ll discuss how one might use the concept of expected growth to determine bet size. This is the essence of the Kelly criterion.