Our goal was to examine how cognitive biases (specifically the anchoring bias) affect software engineers' judgement and decision making, and how such biases might be mitigated.
The anchoring bias is powerful and has been widely documented. It arises when initial information influences a judgement, even when that information is misleading or entirely irrelevant. This strong effect can be very problematic when making decisions or judgements. Even extreme anchors, e.g., suggesting the length of a blue whale is 900m (an unreasonably high anchor) or 0.2m (an unreasonably low anchor), influence people's judgements about the length of whales. Jørgensen has been active in demonstrating that software engineering professionals are not immune to this bias (see his new book on time predictions, available as a free download).
Therefore we decided to experimentally investigate whether de-biasing interventions, such as a 2-3 hour workshop, can reduce, or even eliminate, the anchoring bias. Given the many concerns that have been expressed about under-powered studies and about reliably identifying small effects in noisy data, we made the following decisions.
- Use professional software engineers (there is an ongoing debate about the value of student participants, e.g., Falessi et al. argue in their favour, which contrasts with the strong call for realism from Sjøberg et al.; we side with realism).
- Use a large sample (n = 410).
- Use robust statistics.
In brief, we used a 2×2 experimental design with high and low anchors combined with a de-biasing workshop and control. Participants were randomly exposed to a high or low anchor and then asked to estimate their own productivity on the last software project they had completed (EstProd). Some had previously undertaken our de-biasing workshop while others received no intervention, i.e., the control group.
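To make the design concrete, here is a minimal sketch of how the anchoring effect in such a 2×2 design (anchor: high/low × intervention: workshop/control) could be summarised with a robust statistic. The data below are invented purely for illustration, and the choice of a 20% trimmed mean is one common robust option, not necessarily the exact analysis used in the study.

```python
import numpy as np
from scipy.stats import trim_mean

rng = np.random.default_rng(42)

# Hypothetical EstProd (self-estimated productivity) values for the four
# cells of the 2x2 design; the means and spreads are invented for
# illustration only.
cells = {
    ("high", "control"):  rng.normal(70, 15, 100),
    ("high", "workshop"): rng.normal(55, 15, 100),
    ("low",  "control"):  rng.normal(30, 15, 100),
    ("low",  "workshop"): rng.normal(35, 15, 100),
}

# A 20% trimmed mean discards the most extreme 20% of values in each tail,
# reducing the influence of outlying estimates on the cell summary.
robust_means = {k: trim_mean(v, 0.2) for k, v in cells.items()}

# The anchoring effect is the gap between the high- and low-anchor cells,
# computed separately within each intervention group.
effect_control = robust_means[("high", "control")] - robust_means[("low", "control")]
effect_workshop = robust_means[("high", "workshop")] - robust_means[("low", "workshop")]
print(f"anchoring effect (control):  {effect_control:.1f}")
print(f"anchoring effect (workshop): {effect_workshop:.1f}")
```

A smaller high-minus-low gap in the workshop group than in the control group is what a bias-reducing (but not bias-eliminating) intervention would look like in this summary.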
The interaction plot below shows a large difference between the high- and low-anchor conditions. It also shows that the workshop reduces the effect of the high anchor (the slope of the solid line is less steep) but has far less effect on the low anchor. However, it does not eliminate the bias.
We conclude that:
- Professionals can easily be misled into making highly distorted judgements.
- This matters because despite all our tools and automation, software engineering remains a profession that requires judgement and flair.
- So try to avoid anchors.
- But it is possible to reduce bias.
- We believe there are many other opportunities for refining and improving de-biasing interventions.
- We only considered one type of bias.
- We used a relatively simple de-biasing intervention based on a 2-3 hour workshop.
- We don’t know how long the de-biasing effect will last.
Halkjelsvik, T. and Jørgensen, M., "From origami to software development: A review of studies on judgment-based predictions of performance time", Psychological Bulletin, 138(2), pp. 238–271, 2012.
Falessi, D. et al., "Empirical software engineering experts on the use of students and professionals in experiments", Empirical Software Engineering, 23(1), pp. 452–489, 2018.
Sjøberg, D. et al., "Conducting realistic experiments in software engineering", IEEE International Symposium on Empirical Software Engineering, pp. 17–26, 2002.