The Threat Actor You Can't Detect: Cognitive Bias
Sunday, June 23, 2019, 10:34 PM, from Slashdot
Long-time Slashdot reader chicksdaddy shares news of a recent report from cybersecurity company Forcepoint's X-Lab, examining how cybersecurity decision-making is affected by six common biases:
For instance, Forcepoint found that older generations are typically characterized by information security professionals as 'riskier users based on their supposed lack of familiarity with new technologies.' However, studies have found the opposite to be true: younger people are far more likely to engage in risky behavior like sharing their passwords to streaming services. The presumption that older workers pose more of a risk than younger workers is an example of so-called 'aggregate bias,' in which subjects make inferences about an individual based on a population trend. Biases like this misinform security professionals, directing their attention to individual users based on supposed group membership. As a result, analysts focus on the wrong individuals as sources of security issues.
Availability bias may skew cybersecurity analysts' decision-making toward hot topics in the news, which crowd out other information analysts know but encounter less frequently, leading them to make less well-rounded decisions. People encounter 'confirmation bias' most frequently during research: by neglecting the bigger picture, researchers make assumptions and then tailor their research specifically to confirm them. When looking for issues, analysts can often find themselves seeking confirmation of what they already believe to be the cause rather than searching for all possible causes.
The fundamental attribution error also plays a significant role in misleading security analysts, Forcepoint found. It manifests when information security analysts or software developers blame users for being inept instead of considering that their technology may be faulty or that internal factors contributed to a security lapse.
The report also cites what it calls the framing effect. 'Security problems are often aggressively worded, and use negative framing strategies to emphasize the potential for loss.'