9 May 2026
Are digital scams out of control because academia and industry can’t or won’t work together?
Human society has a long history of cheats. Cheaters act to improve their lot as individuals; in evolutionary terms, they look to gain control over resources that could ultimately lead to more reproductive success. I would wager that the trait of cheating is an intrinsic part of human society and has been with us since we first stepped onto the African plain. Scams, fraud, and other cyberattacks are just modern versions of the cheats of old.
Research into the evolutionary mechanisms underlying group cooperation may offer insights and measures to prevent cheating. Coupled with technological advances in digital scam detection, this could be a much-needed winning duo. The question is: can technologists, behavioral scientists, and evolutionary psychologists work together to stop cheaters from beating the system?
Scams and fraud are a major problem that needs fixing
Scams are nothing new. In the digital age, however, we have seen an unprecedented increase in digital scams, reflecting the ubiquitous use of the internet. Data from the Global Anti-Scam Alliance shows that 54% of adults worldwide have experienced scams, with 23% losing money. The amount lost is staggering: $1.03 trillion worldwide in 2024. Beyond the financial cost, scams take a personal toll, including heightened stress, loss of confidence, and family tension. The effects are felt not only at the individual or kin level but also at the group level. In the workplace, scams cost UK businesses around 7.4% of their annual revenue. Employees affected by cybercrime also experience stress, loss of confidence, and, in some cases, depression.
To understand how behavioral science can help to mitigate scams and fraud, we need to understand how these nefarious activities work. Deception, manipulation, and fraud have always been components of scams. The assistance of AI, however, has taken scamming to new heights of scope, plausibility, and evasiveness, while utilizing automation to add speed and scalability.
AI may be a new entrant to the cybercrime landscape, but it is being used for the same ends as scams of old. One of the most effective ways AI assists scammers is through deepfake video or voice. Deepfakes are used to impersonate trusted persons and manipulate victims into believing they are talking with someone they know. A recent example involved a deepfake video call that tricked an employee at UK engineering firm Arup into sending $25 million to attackers.
Manipulating human behavior and trust gives cybercriminals power over their victims. No matter how the scam is executed, in real life or using AI-assisted technology, the same behavioral ruses are used: leverage trust, then apply pressure points to manipulate individuals into performing a task that benefits the scammer.
Fortunately, we humans have evolved behaviors that benefit both us as individuals and the groups we belong to. Behavioral science and evolutionary psychology offer hope to the world of cybersecurity. Scientists and technologists are thus presented with one of the most confounding and complex problems in human history: how to effectively stop scams and fraud. Multidisciplinary approaches are hailed as the answer to complex, human-centric cybersecurity challenges, but do industry and academia make good bedfellows?
The importance of a multidisciplinary approach to cybersecurity
Until the mid-twentieth century, scientific research was largely monodisciplinary; collaborative research is now generally encouraged in academia. The application of behavioral science to cybersecurity is more recent still.
Security awareness training and phishing simulation platforms are already based on research into human psychology. Companies that provide security awareness training typically promote ‘behavior-based’ training to change risky behavior. So why even think about enhancing training models by leveraging behaviors that promote group cooperation? The psychology of security awareness is based on educating (mainly) employees to identify scam signals, such as a sense of urgency in a message, the fear of missing out (FOMO), and tell-tale signs of impersonation. The training is highly individualized: much of it is tailored to each employee based on their training outcomes.
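To make the signal-spotting idea concrete, here is a toy, rule-based sketch of scoring a message against a few of these cues. The keyword lists, weights, and threshold are my own illustrative choices, not any vendor's actual model; real platforms use far richer behavioral and linguistic features.

```python
# Toy illustration of the scam cues that awareness training teaches
# people to spot: urgency, FOMO, and impersonation. All cue lists and
# the threshold are invented for illustration only.
URGENCY_CUES = ["urgent", "immediately", "act now", "expires today"]
FOMO_CUES = ["limited offer", "last chance", "only a few left"]
IMPERSONATION_CUES = ["this is your ceo", "i am from your bank"]

def scam_signal_score(message: str) -> int:
    """Count how many known scam cues appear in a message."""
    text = message.lower()
    all_cues = URGENCY_CUES + FOMO_CUES + IMPERSONATION_CUES
    return sum(1 for cue in all_cues if cue in text)

def looks_suspicious(message: str, threshold: int = 2) -> bool:
    """Flag a message once it trips more than `threshold - 1` cues."""
    return scam_signal_score(message) >= threshold
```

A message like “URGENT: this is your CEO, act now — the offer expires today” trips several cues at once, whereas routine mail trips none; the point of training is to make that same pattern-matching a reflex in people.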
Cybersecurity expert Bruce Schneier wrote a paper titled "The Psychology of Security," in which he carefully laid out how humans perceive and respond to risk. He concludes, “Perhaps by understanding how our brains process risk, and the heuristics and biases we use to think about security, we can learn how to override our natural tendencies and make better security trade-offs.” The paper was published in 2008, yet here we are, over 15 years later, and scammers are more successful than ever.
Other research into security behavior has drawn on the ‘Big Five’ personality model. More recent work, such as Demjaha’s 2023 doctoral thesis, “Co-design and modeling of security policy for cultural and behavioral aspects of security in organizations,” has identified human-centered research gaps. The thesis discusses ‘herd behaviors’ and other group factors, and highlights the role of culture in “understanding complex socio-technical systems and identifying causes of problems involving people and processes.” A core conclusion is that building security awareness among employees takes time and energy and can impact their work.
The problem in applying psychology to security risk management may lie in research centered on the individual, which constrains the scope of inquiry. We are not islands. Security behavior research must embrace group behavior: how employees and others operate within a group, rather than just as individuals.
The evolutionary tenets underlying the behaviors we display at work and in our communities may provide deeper, more actionable insights into how to manage and control scams. Understanding the drivers behind behaviors like cheat deterrence, tit for tat, altruistic punishment, and conformism may offer the way forward in developing more systematic, naturalistic, and effective measures to detect and prevent scams.
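The ‘tit for tat’ mechanism mentioned above can be made concrete with a short simulation of the iterated prisoner’s dilemma. This is a minimal sketch using the standard illustrative payoff ordering (T=5 > R=3 > P=1 > S=0), not a model from any particular study: cooperate first, then mirror whatever the partner did last, and cheats quickly stop profiting.

```python
# Minimal iterated prisoner's dilemma: why "tit for tat" sustains
# reciprocity. "C" = cooperate, "D" = defect. Payoff values are the
# conventional illustrative ones, chosen for clarity.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(opponent_moves):
    """Cooperate first; thereafter mirror the opponent's last move."""
    return opponent_moves[-1] if opponent_moves else "C"

def always_defect(opponent_moves):
    """An unrepentant cheat."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Run `rounds` of the game; each strategy sees the OTHER's history."""
    score_a = score_b = 0
    moves_a, moves_b = [], []
    for _ in range(rounds):
        a = strategy_a(moves_b)
        b = strategy_b(moves_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        moves_a.append(a)
        moves_b.append(b)
    return score_a, score_b
```

Two reciprocators earn (30, 30) over ten rounds; a defector exploits tit for tat exactly once before being matched move for move, and two defectors grind out a meagre (10, 10). Reciprocity pays, but only once cheats are answered.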
The evolution of group cooperation and its links to scams
You would be unusual if you had not felt the weight of scam attempts over the last few years. But we humans are attempting to cooperate, coming together to warn about scammers and using technology to counter attackers. On social media, especially community-focused platforms like ‘Nextdoor’ in the UK, you will see members of a community warning one another about online (and offline) scammers. Cheat detection is shared with the community, and honesty is hailed as a prosocial behavior. Group cooperation is evident, with members actively identifying cheaters and helping fellow group members stay safe.
Research into the mechanisms and behaviors underlying group cooperation may yield critical insights into how to mitigate scams. Group cooperation has been researched and debated for decades; Trivers’s seminal 1971 paper, ‘The Evolution of Reciprocal Altruism,’ laid the foundation stones for the discipline. One of the hotly debated mechanisms of group cooperation is cultural group selection (CGS): the evolution of cultural traits that benefit a group and enable it to outcompete groups without those traits. Related mechanisms include kin selection, in which cooperation between related individuals (and associated traits such as nepotism) improves their overall fitness, and reciprocity, the prosocial exchange of favors that sustains cooperation. Of course, human groups are highly complex, comprising varied roles that interact in hierarchies and across interrelated nodes, going well beyond pairwise interactions. Research by Tomasello et al. develops the idea that group cooperation is based on ‘mutualistic collaboration’. Social norms are described in that work as ‘mutual expectations’ to behave in an accepted manner, or, as Tomasello puts it, “to do your part (or else!).”
Social norms encourage people in a group to perform actions that fit the group's needs. In my own research into the application of proverbs to develop social norms, proverbs relating to conformism were frequently found in the paremiological minimum (the most commonly known proverbs in a society). When in Rome, do as the Romans do.
With social norms comes punishment. Altruistic punishment flies in the face of the ‘selfish gene’ idea popularized by Dawkins in his book of the same name: it involves an individual punishing noncooperators at a cost to themselves in order to benefit the group.
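A toy public goods game makes that logic concrete. The numbers below (endowment, multiplier, fine, and punishment cost) are illustrative choices of mine, not values from any particular experiment; the point is only the shape of the incentives.

```python
# One-shot public goods game, with optional altruistic punishment:
# contributors may pay a personal `cost` to impose a `fine` on each
# free-rider. All parameter values are illustrative.
def public_goods_payoffs(contributions, multiplier=1.6, endowment=10,
                         fine=4, cost=1, punish=False):
    """Return each player's payoff given what they put into the pot."""
    pot = sum(contributions) * multiplier
    share = pot / len(contributions)          # pot is split equally
    payoffs = [endowment - c + share for c in contributions]
    if punish:
        punishers = [i for i, c in enumerate(contributions) if c > 0]
        freeriders = [i for i, c in enumerate(contributions) if c == 0]
        for i in punishers:
            payoffs[i] -= cost * len(freeriders)   # punishing is costly
        for i in freeriders:
            payoffs[i] -= fine * len(punishers)    # cheats are fined
    return payoffs
```

With three contributors and one free-rider, the free-rider earns the most when no one punishes; switch punishment on and the free-rider drops below the contributors, even though every punisher paid for the privilege. That personal cost, borne for the group's benefit, is precisely what makes the behavior ‘altruistic’ and evolutionarily puzzling.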
The mechanisms of group cooperation are all good to know, but humans have evolved behavior to handle natural environments, not digital ones. The combination of human behavior and trust manipulation within a digital domain has created a new environment that humans must navigate. Scammers are taking full advantage of this fact. The research into the behaviors that underpin group cooperation must be used by technologists as part of holistic measures to prevent scams.
Developing programs for group-based security awareness training
Cybercriminals prey on human reactions such as FOMO and manipulate trust. Security awareness training uses these attack tactics to help individuals identify scams. However, the increase in scams indicates that there is still some way to go in preventing such attacks. There has been much exploration of individuals' cybersecurity behaviors, but far less of how group behaviors affect cybersecurity. Within groups, such as local communities or workplace settings, behaviors like cheat detection and deterrence, and altruistic punishment, could be leveraged to reduce the success rate of scams. Security awareness programs already invoke the notion of a ‘security culture’ to describe how a group can change its behavior when confronted with cybersecurity threats. However, research into security culture should expand to include current knowledge of the behaviors involved in the evolution of group cooperation. The cybersecurity community must collaborate with behavioral science, especially researchers studying the evolution of group behavior, to help end this war of attrition.
Conclusion
Anyone with a digital life knows that digital scams are commonplace. Every digital step we take exposes us to the whims of a fraudster. Digital scams also spill into our day-to-day lives, with romance and impersonation fraud rife. As someone who straddles the cybersecurity and evolutionary psychology worlds, I believe that a multidisciplinary approach to scam prevention is needed. The statistics show that technology alone, even AI-assisted, still fails to stop these personalized cyberattacks from harming us. Individualistic security awareness training programs are somewhat successful, but they could be enhanced by incorporating research on the evolution of group cooperation. Understanding and deploying the research of evolutionary psychologists and other behavioral scientists may hold the key to slamming the door on scammers. Companies that develop security awareness tools already incorporate some aspects of behavioral science. However, a deeper dive into the evolutionary mechanisms underpinning human behavior may provide more appropriate and effective guidance for anti-scam measures. Moving beyond individualistic psychology to incorporate group cooperation will help ensure that all avenues are explored in tackling the epic problem of modern scams.
