The largest portion of the standard FPL week is spent preparing for upcoming gameweeks. Within this, there are many ways we can aid our preparation: in-person discussions; social media interactions; conducting our own analysis; listening to podcasts; reading articles; watching videos; re-watching the games; and much more.
Unfortunately, when preparing for an upcoming gameweek, we are vulnerable to an array of psychological processes that may lead us to make ineffective decisions and mistakes that we ultimately regret. The goal of this first section is to provide you with the relevant knowledge to identify these problems before they occur, and the relevant tools to deal with them before they develop. This should allow you to prepare for each upcoming gameweek in the optimal fashion and, consequently, to make the most effective decisions possible.
Digesting FPL Content
The first chapter of this book tackles the issues that may arise when digesting FPL content produced by other FPL managers. Indeed, as ‘serious’ managers we spend a lot of time reading articles and threads from other managers, listening to our favourite podcasts, or watching our favourite video creators. This allows us to buy players and adopt approaches that we might not have considered alone. Great managers will acknowledge the abundance of quality content available to them, but the very best managers will know how to integrate it with their own knowledge and decision-making.
Within this chapter, I will introduce you to one of the main themes of this book: cognitive biases. As FPL managers, it is vitally important to be masters of our own minds. We must find ways to improve the effectiveness of our decision-making and, in doing so, propel ourselves up the rankings and beat our mini-league rivals. One way of doing this is by being aware of, and learning to control, our cognitive biases.
Cognitive bias is the umbrella term used to describe our tendency to perceive information based on our own experiences and prior beliefs, which can result in the distortion of information, unreasonable or inaccurate interpretation, and flawed decision-making. In other words, it is the process by which objective information is interpreted subjectively and, often, inaccurately. The discovery of cognitive biases by Tversky and Kahneman (1974) gave rise to a large body of literature exploring the topic.
Prior to this, it was believed that humans were always driven by the rules of logic and probability, producing rational thinking (Lieder, Griffiths, Huys, & Goodman, 2018). In fact, the very existence of cognitive biases suggests that we are subject to irrational and biased cognitive processes that can result in sub-optimal reasoning and decision-making. In other words, cognitive biases show us that we are not perfect!
Within the overarching term ‘cognitive bias’, there are multiple sub-biases. There are hundreds of cognitive biases explored in current psychological research, however, in this book, we will cover 28 of those most applicable to FPL. Before we begin, I would like to draw your attention to the most ironic cognitive bias of all – the bias blind spot. The bias blind spot is the cognitive bias of recognising the impact of biases on the judgement of others, while failing to see the impact of biases on one’s own judgement – we are blind to our own biases! Pronin, Lin and Ross (2002) refer to this as an ‘asymmetry in perceived susceptibility to biases’. That is, we think we are shielded from, and less susceptible to, the very biases that we highlight in other individuals.
The bias blind spot has been demonstrated in a number of studies. For example, Pronin et al. (2002) conducted a series of experiments in which they asked Stanford University students to rate their susceptibility to biases relative to other groups. The authors found that the students rated themselves significantly less susceptible to cognitive biases than: a) the average American; b) the average fellow classmate; c) the average airport traveller (i.e., a stranger). However, the authors did not stop there, as they demonstrated that not only were the participants unable to identify their own biases, they were also largely unaware that they were suffering from the blind spot.
The authors demonstrated this by asking participants to rate themselves on six personality dimensions (three positive and three negative) in relation to other Stanford University students. For example, they were asked to rate how considerate they were of others, ranging from ‘much less than the average Stanford student’ to ‘much more than the average Stanford student’.
After rating themselves in relation to their fellow students, the participants were given a piece of paper with the following description and asked to read it:
Studies have shown that on the whole, people show a ‘better than average’ effect when assessing themselves relative to other members within their group. That is, 70-80% of individuals consistently rate themselves ‘better than average’ on qualities that they perceive as positive and, conversely, evaluate themselves as having ‘less than average’ amounts of characteristics they believe are negative.
Participants were then asked to return to their answers and indicate whether an objective resource (with no bias) would give the same response as them. In other words, were their responses objectively accurate? Of the 79 participants who claimed a ‘better-than-average’ status, only 19 (24%) indicated that their responses had been biased. Sixty-three percent claimed that they were entirely accurate, while 13% claimed that they had actually been too modest! Therefore, even after the participants had their attention explicitly drawn to the fact that they may have been biased, fewer than a quarter were able to spot this in themselves.
This research is important because it demonstrates that, as individuals, we struggle to identify biases in our own thoughts and behaviour. Being aware of how difficult it is to reflect introspectively (i.e., to examine our own thoughts and behaviours) will improve our chances of catching our own biases and refraining from making decisions in a sub-optimal fashion.
It is also worth noting that as you work your way through this book, you may be unable to spot these biases in your own thoughts and behaviours. If you think ‘I don’t do that’, or ‘I am not prone to making that mistake’, you may be demonstrating the bias blind spot.