Luck plays a part in almost every boardgame. In fact, I tend to dislike games that have no luck at all. Some people like games of pure strategy, but I find that the most experienced player usually wins. The outcome is often a foregone conclusion so the game lacks any excitement.

The problem with luck is that it can be very difficult to account for in games, but even luck follows certain laws: the laws of probability. Knowing these laws can give you an advantage when deciding whether to take a risk on a roll of the dice. Odds of success can vary enormously.

I have a background as a Maths teacher, so I thought we could have a look at a few basic principles (without becoming too mathematical!). Probability is one of the most counterintuitive and surprising areas within mathematics, and it was always one of my favourite topics to teach. Welcome to Probability for Gamers 101.

### Law 1: Luck always balances out

The first, and most important, thing to realise about luck (as a gamer) is that it always balances out. If I flip a coin, I have no idea which side it will land on. It’s equally likely to land heads or tails (we say each possible outcome has a 50-50 chance), so who knows?

However, if I flip a coin 100 times, I can tell, with much greater certainty, how many heads and tails I’ll get. I would expect to get about 50 of each. I might get a few more heads than tails (or the other way round), but not many. Getting 60 heads and 40 tails, for example, would be very unlikely.
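If you'd like to check that claim for yourself, here is a quick Python sketch (the function names are just for illustration) that simulates many 100-flip sessions and counts how often the split comes out as lopsided as 60–40 or worse:

```python
import random

def count_heads(flips: int) -> int:
    """Flip a fair coin `flips` times and count the heads."""
    return sum(random.random() < 0.5 for _ in range(flips))

def extreme_fraction(sessions: int = 100_000) -> float:
    """Fraction of 100-flip sessions that come out 60-40 or more lopsided."""
    extreme = sum(1 for _ in range(sessions)
                  if not 40 < count_heads(100) < 60)
    return extreme / sessions
```

The true two-sided probability works out at about 5.7%, so the simulation should return a value near 0.057: a 60–40 split (or worse, in either direction) happens in roughly one session in seventeen.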

I might get lucky and get several heads in a row at the start, but any swing of luck one way gets drowned out by the sheer number of flips *as long as you flip the coin enough times*: the proportion of heads settles ever closer to 50%.

That last bit is really key. The more you flip the coin, or roll the dice, or draw cards, the more the luck balances out. So in a game like Stone Age, where you’re rolling lots and lots of dice throughout the game, there is a lot less luck involved than you might think.

The more you roll dice, the less luck there is.

It’s easy to think that the more you roll dice, the more luck there is in the game, but actually the opposite is true! The more you roll dice, the *less* luck there is in the game, because the luck balances out.
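A similar Python sketch (again with illustrative names) makes the point with dice: the average of many rolls of a fair die hugs the expected value of 3.5 far more tightly than the average of a few rolls does.

```python
import random

def average_roll(n_rolls: int) -> float:
    """Mean of n_rolls of a fair six-sided die (the expected value is 3.5)."""
    return sum(random.randint(1, 6) for _ in range(n_rolls)) / n_rolls

def typical_swing(n_rolls: int, sessions: int = 2_000) -> float:
    """Average distance of a session's mean roll from 3.5, over many sessions."""
    return sum(abs(average_roll(n_rolls) - 3.5)
               for _ in range(sessions)) / sessions

# Ten rolls per game swing far more than a thousand rolls per game:
# typical_swing(10) comes out roughly ten times typical_swing(1000).
```

That factor of ten is no accident: the spread of an average shrinks in proportion to the square root of the number of rolls, so a hundred times more rolls means a tenth of the luck.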

### Law 2: Luck doesn’t discriminate

Have you ever heard anyone declare, “Dice hate me!”? It always surprises me how many people feel that they are inherently unlucky (or lucky, but I find this less common).

It’s not possible for some people to be luckier than others. At least, not in the long run (see Law 1). Someone could obviously get lucky for one play of Catan, but if they play enough games of it, they will encounter just as much bad luck as good luck (roughly).

We notice negative consequences more than positive ones.

There are a couple of reasons why someone might feel unlucky compared with an apparently lucky competitor. The first is often referred to as Murphy’s Law, though psychologists call it negativity bias: put simply, we notice negative consequences more than positive ones.

From an evolutionary point of view, being acutely aware of actions that result in bad outcomes (e.g. swimming in crocodile-infested waters or eating poisonous mushrooms) leads to a longer life expectancy. We’re hard-wired to notice situations where everything went wrong more than situations where it all went our way.

The second reason we might feel someone else is luckier than us, though, is that they may be better than us at calculating risk…

### Law 3: All risks were not created equal

Any game that involves luck and has you making decisions is naturally going to involve risk. Good games usually provide ways to mitigate the risk, but knowing which risks are worth taking and which aren’t provides a significant strategic advantage.

I’m going to illustrate this with an example from Poker. Poker is a very strategic game: if you play enough of it (and play it well!), Law 1 allows you to virtually eliminate the risk in the long term. A key requirement, though, is knowing when to bet and when not to.

I’m going to ignore the possibility of bluffing for this example. Imagine there are only two of you left. You can each see some of your opponent’s cards, but not all of them. Being an experienced Poker player, you have worked out that your odds of winning are 1 in 3, i.e. you would expect to have a winning hand in that situation a third of the time. Should you bet?

The intrinsic odds don’t matter.

As it stands, you don’t have enough information to make that call. The intrinsic odds don’t matter. What matters is how the odds compare with the size of your bet and the size of the pot (how much you stand to win).

If you have to bet $1 to see the cards and there is $2 in the pot, then you shouldn’t bet. You are risking $1 to gain $2, but you will only win a third of the time. In the long run, if you kept making the same bet in the same situation, for every $3 you spent, you would win $2 (three bets cost you $3, but you only win one of them). Not a very good return!

However, if you have to bet $1 and there is $4 in the pot, then you should bet! For every $3 you spend in the long run in that situation, you would win $4. While there are obviously other considerations in play in a real game of Poker, all competent Poker players will mentally perform this kind of calculation before considering anything else.

It applies to any situation which involves risk though. Work out the odds of success and imagine repeating the same decision hundreds of times. Compare what you stand to gain with the frequency that you would win (from the odds) and then you can decide if it’s worth it.
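The arithmetic above can be wrapped in a tiny Python helper. This is a sketch of the simplified model used in the Poker example (you always pay your bet, and you collect the pot a fixed fraction of the time), not a full poker calculation:

```python
def expected_value(bet: float, pot: float, win_probability: float) -> float:
    """Average profit per decision in the simplified model:
    you always pay `bet`, and collect `pot` a `win_probability`
    fraction of the time."""
    return win_probability * pot - bet

# $1 into a $2 pot at 1-in-3 odds: expected_value(1, 2, 1/3) is -1/3
# (you lose about 33 cents per bet in the long run -- don't bet).
# $1 into a $4 pot at 1-in-3 odds: expected_value(1, 4, 1/3) is +1/3
# (you gain about 33 cents per bet in the long run -- bet!).
```

A positive result means the decision pays off if you imagine repeating it hundreds of times; a negative result means the odds aren’t worth the price.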

Ah, but how do you actually calculate odds? Well, this is where we probably need some proper Maths. Not too much, but we’ll have a go in Luck – Part 2.
