. 2022 Dec 23;77(1):45–55. doi: 10.1007/s41449-022-00346-2

Human-centred cyber secure software engineering

Cybersicherheit durch menschzentrierte Softwareentwicklung

Karen Renaud
PMCID: PMC9788869  PMID: 36589596

Abstract

Software runs our modern day lives: our shopping, our transport and our medical devices. Hence, no citizen can escape the consequences of poor software engineering. A closely-aligned concern, which also touches every aspect of our lives, is cyber security. Software has to be developed with cybersecurity threats in mind, in order to design resistance and resilience into the software, given that such threats are often rooted in malicious human behaviour. Both the software engineering and cyber security disciplines need to acknowledge and accommodate humans, rather than expect perfect performance. This is a position paper, delineating the extent of the challenge posed by this reality, and suggesting ways of accommodating the influence of human nature on secure software engineering.

Practical Relevance: Socio-technical systems are made up of people, processes and technology. All can fail or be suboptimal. Software itself, being designed, developed and used by humans, is likely to malfunction. This could be caused by human error, or by malice. This paper highlights this reality, taking a closer look at all of the possible sources of malfunctioning technology. By doing so, I hope to infuse the management of socio-technical systems with an understanding and acknowledgement of this reality.

Keywords: Software Engineering, Cybersecurity, Human Factors

Introduction

Consider that every aspect of our lives is influenced by invisibly functioning software. For example, when we shop, software runs the tills; it ensures that the temperature remains constant, and so-called “smart” devices track our movement through the store to identify the most effective product placement strategies. Our hospitals use Internet-connected devices to carry out tests, and our airplanes and cars have embedded computers. Underlying all of this is software, and such software is developed by software engineers.

Software engineering is a challenging, complex and error-prone activity. Sometimes, flawed and/or insecure software is produced (Clark 2021; Collins 2009). This has been the case ever since the first software code was written (Martin 2022). Why does this happen? IBM software engineer Fred Brooks wrote a book titled “The Mythical Man-Month” to help those working in software engineering to understand the human problems of this craft. He was one of the first software engineers to draw attention to the importance of human limitations and management, and to the consequences of ignoring these.

When software has errors or vulnerabilities, it can lead to widespread harm, and can even lead to loss of life. The widespread 2017 WannaCry attack brought many of the UK’s National Health Service (NHS) trusts to a near standstill, diverting funds to recovering from the attack and delaying surgeries and other treatments. The 2021 Colonial Pipeline attack disrupted the flow of critical petroleum products across the East Coast of the United States, and led to increased gasoline prices. The consequences of these attacks are non-trivial.

This paper seeks to explore the extent of the problem, and to provide suggestions for ameliorating the impact of human fallibility on the software engineering process. The following research questions will be addressed:

RQ1

How does human fallibility impact the secure software development process?

RQ2

How can the impact of human fallibility be ameliorated during the secure software engineering process?

The next section will first introduce the core concepts of “software engineering” and “cyber security”. Sect. 3 will review related research about secure software engineering. Sect. 4 then addresses RQ1, Sect. 5 considers RQ2, and Sect. 6 concludes.

Definitions

Software engineering is defined by Humphrey (1988) as “the disciplined application of engineering, scientific, and mathematical principles and methods to the economical production of quality software” (p. 82). The definition makes it clear that the process is challenging, with many opportunities for errors to introduce bugs into the software system.

Zhang et al. (2005) point to the need for a focus on usefulness, utility, and usability during the software engineering process. The usefulness of a computer system relates to the extent to which it assists users in achieving their desired goals (Nielsen 1993). To deliver usefulness, the system has to have the correct functionality built into it (utility) (Grudin 1992), which allows users to achieve their goals with effectiveness, efficiency and satisfaction (usability) (ISO, in Bevan 2001).

Secure software engineering is even more challenging, given that the developer has to anticipate attack vectors and foil the efforts of globally distributed and innovative cyber criminals. Even if the software has been rigorously tested and is error free and usable, that does not mean that it cannot be compromised by cyber criminals in the future. Previously unknown, so-called “zero-day”, vulnerabilities are continuously discovered and exploited by hackers. If software starts to behave erratically, it could be due to errors, but could also reflect an intrusion by cyber criminals exploiting a vulnerability (Peisert et al. 2021; Zetter 2014). The widespread 2017 WannaCry attack and the earlier Stuxnet attack both exploited this kind of newly-discovered vulnerability (Mohurle and Patil 2017; Langner 2011).

Storey et al. (2020) argue that there is a need for research that aims to understand the human and social aspects of software development practice. Without this understanding, and targeted interventions that build on this, flawed software systems will result, systems will fail, and cyber criminals will compromise software systems.

Related research

To consider the current state of play, a search of the research literature was carried out using the keywords: secure AND “software engineering”. Papers returned by Google Scholar and Scopus were filtered to ensure that only those addressing the research questions were retained for analysis (66 papers). Thematic analysis was used to extract themes and to group papers into these themes. The following themes emerged:

  1. Software Design and Implementation:
    1. The need for better techniques for formulating desirable security properties (Devanbu and Stubblebine 2000). This might include the use of security patterns (Van Niekerk and Futcher 2015).
    2. Developers needing to be trained to engage in secure software engineering and to detect vulnerabilities in their software (Braz et al. 2021; Essafi et al. 2006; Walden and Shumba 2006; Jayalath et al. 2020; Kanniah and Mahrin 2016; Arora et al. 2021; Stamat and Humphries 2009; Hein and Saiedian 2009).
    3. The need for security to be part of the software engineering process, from design through to implementation (Mouratidis et al. 2005; Mellado et al. 2007; McGraw 2004; Devanbu and Stubblebine 2000; Moyón et al. 2020; Kreitz 2019).
    4. The need for empirical research in secure software engineering (Cruzes and ben Othmane 2017).
  2. Software Deployment
    1. The need to pay attention to secure configuration of software systems (Sayagh et al. 2018), which the authors contend does not receive as much attention as it should.
  3. Software Maintenance and Evolution:
    1. A particular challenge is long-term maintenance of software systems, which becomes difficult because of “imprecise, incomplete and arbitrary documentation” [p. 320] (Villarroel et al. 2005). Devanbu and Stubblebine (2000) call for the development of “automated, robust, flexible infra-structures for post-deployment system administration” [p. 225].
    2. The need to ensure that systems remain secure as they evolve over time (Felderer et al. 2014).
  4. Mentioning user issues:
    1. Software Engineers:
      i. The cognitive demand on software engineers developing secure software is overwhelming (Apvrille and Pourzandi 2005; Giorgini et al. 2005; Mellado et al. 2007; Miller et al. 2013).
      ii. Most research focuses on the technical aspects of software engineering, neglecting human factors (Storey et al. 2020) e.g., Khan et al. (2022).
      iii. The importance of collaboration and team dynamics (Arora et al. 2021), and of communication between stakeholders during the development process (Kanniah and Mahrin 2016), is highlighted.
    2. End Users: Need to consider the impact of security measures on end users (Flechais et al. 2003).

It is clear from this review that not much attention has been paid to the impact of the human factor in secure software engineering beyond pointing out that the entire process is challenging. Certainly, there is no explicit review on the impact of the human factor in the software engineering process.

We now proceed to address the two research questions.

How our humanity influences secure software engineering

Whenever humans are involved in any process, errors are certain to occur (Reason 1990; Adams and Sasse 1999) and sometimes bad actors will behave maliciously and compromise the system’s functioning (De Cremer 2009). The established fields of medicine and civil engineering have acknowledged this reality since around the 1960s (Berry 2022; Gawande 2010; Xie and Qu 2018; Whittle and Ritchie 2000). Human tendencies to make mistakes and to act maliciously can also compromise the software engineering process. Yet, for most of the history of software engineering, this human “factor” has been neglected (Sutcliffe 1997; Storey et al. 2020).

This section addresses the first research question: how does human fallibility impact the secure software development process?

To answer this question, we now consider the full range of different stakeholders (developer, managers, deployers, end users), and the impact of human factors in each different case.

Software developers

The software developer is at the core of the software engineering process, and the impact of their humanity has not received very much attention from researchers. To frame our discussion, we considered two lenses: personality and human needs.

In considering the first lens, we discovered that a number of studies had investigated the types of personalities of people who go into a career of software engineering. However, Cruz et al. (2015) found that “the evidence is weak and in many cases inconclusive” [p. 108]. Hence, instead of looking at personality, let us consider how a software engineering career satisfies the human needs of software developers: mastery, autonomy and relatedness (Ajzen 1991), and how their humanity influences the satisfaction of these. It is worth noting that the same approach could be used for the other stakeholders. However, the primary focus here is on software developers.

Mastery/Competence

Software engineering is a career that requires continuous learning, is creative and immensely satisfying when the project has been completed and the product prepared for deployment. Software developers, as keen problem solvers, enjoy putting pieces of code together to produce a working piece of software (Wynekoop and Walz 2000; Groeneveld et al. 2020). Brooks (1975) says: “The programmer, like the poet, works only slightly removed from pure thought-stuff” (p. 7).

Russo et al. (2022) find that software engineers have a higher need for cognition than the general public. The software engineer might be the adult who used to love building Lego as a child (Bialski 2017). Certainly, Stoilescu and Egodawatte (2010) explain that successful professionals enjoy “playing” with computers. As such, the software developer career is almost custom built to satisfy these individuals’ need for mastery.

Yet, there are also challenges. Brooks (1975) explains that the need for perfect performance is one of the most difficult things for humans to adapt to. Computer code does not tolerate any imperfections, so programmers have to get used to going over their code repeatedly in order to arrive at the required level of perfection.

Errors in software code are colloquially referred to as “bugs”. The term is popularly associated with Grace Hopper, a United States Navy rear admiral. Her team found a moth in a computer, which was interfering with its functioning, and promptly pasted it into the log book, where it remains for posterity. Ever since, programmers have referred to software errors as ‘bugs’ and their elimination as ‘debugging’. Brian Kernighan, one of the early giants of the software engineering field, said: “Debugging is twice as hard as writing the code in the first place”. Debugging is the price programmers pay for experiencing the pleasure of the creative programming process. Programmers who possess the ability to persevere and detect these bugs (Stolee et al. 2011) will achieve the level of mastery that is required, and are likely to remain in software engineering.

The testing phase is undertaken by others, and is trying for the programmer. He/she has spent countless hours developing the software and debugging it. Such personal investment makes him/her feel a sense of ownership: their code is “their” thought-child (as described by Brooks). That being so, it is hard for them not to feel defensive when a tester uncovers errors. It points to the programmer’s failure and can be perceived as an attack. Often, programmers are under tremendous pressure from their managers to produce the software to unrealistic deadlines. In these cases, testing may be seen as a luxury. If it is done, many errors are likely to be uncovered. This is an unwanted outcome for any software developer.

The (recent) mandate to code securely challenges their sense of mastery. Arvind Krishna, CEO of IBM, said: “Cybersecurity is the issue of the decade. I think that is the single biggest issue we all are going to face”. While he was talking about the role of the chief information officer, this applies equally to the software engineer. The need to develop secure software is relatively recent, and many who have been in the industry for some years will not have been trained to do this.

Even if software developers master the complex software development process, the ability to develop secure software might well not come as easily. Developing secure software requires developers to anticipate vulnerabilities, which requires something of a different mindset (Harvey et al. 2016). Anu et al. (2020) suggest that, in many cases, the vulnerabilities exploited by hackers can be traced to the same handful of programmer errors. It might be that traditional software engineering training has not yet incorporated sufficient training in preventing vulnerabilities, as highlighted in the previous section. In this case, their perceived mastery is illusory: an “unknown unknown” in Johari-window terms (Shenton 2007).
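To make the recurring-error point concrete, consider one of the classic programmer errors behind many exploited vulnerabilities: constructing SQL queries by string concatenation. The sketch below is purely illustrative (the table and inputs are invented, not drawn from Anu et al.), contrasting the vulnerable pattern with the parameterised alternative:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable pattern: attacker-controlled input is spliced into the SQL
    # text, so an input such as "x' OR '1'='1" changes the query's meaning.
    query = "SELECT id, name FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn, username):
    # Parameterised query: the driver treats the input strictly as data.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice'), (2, 'bob')")
    # The malicious input leaks every row via the unsafe variant ...
    print(find_user_unsafe(conn, "x' OR '1'='1"))  # both rows returned
    # ... but matches nothing when parameterised.
    print(find_user_safe(conn, "x' OR '1'='1"))    # empty list
```

The secure variant requires no extra effort from the developer, which is why this class of error is usually attributed to training and awareness rather than difficulty.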

In relatively rare cases, software engineers might behave maliciously, for a variety of reasons (Warkentin and Willison 2009). They might insert so-called “malware” into software, which can be exploited at will (Wu 2020). On the other hand, sometimes software developers can be pressured by their managers into inserting “back doors” to ease subsequent access (Osterweil 2016). If they have not had ethical training, or feel unable to resist for fear of losing their jobs, the resulting software is inherently insecure and will permit intrusions.

Software is routinely tested to ensure that it delivers the required functionality, and is usable by its target user group. It is also often tested for known top security risk vulnerabilities, so-called penetration testing. However, an oft-neglected kind of testing aims to reveal deliberately-introduced malware (Agrawal et al. 2010).

Hence, software engineers seem to satisfy their mastery needs in their careers, but sometimes find such satisfaction frustrated by the intensely demanding expectation of near perfection in their discipline, and the need to code defensively to prevent breaches by cyber criminals.

Autonomy

With respect to autonomy, it is harder for software engineers to satisfy this need. The programmer does not set his/her own objectives, goals or the technologies and tools to be used. The creativity they so revel in is actually severely constrained. Programmers are similar to actors and actresses—they are the visible agents, but others set the scene and finely choreograph their actions. Brooks (1975) points out how painful this dependence is for the creative programmer, driving towards mastery of their field, especially when requirements are poorly defined or unrealistic.

The other challenge is continuously changing requirements. Such changes are unpleasant and not eagerly anticipated (Harker et al. 1993). The software engineer has to learn to tolerate uncertainty, and to have their need for autonomy left unsatisfied (Kalliamvakou et al. 2017). This is difficult for most humans, and many people become burnt out as a consequence of long-term uncertainty (Kuhn et al. 2009). Such burnout can lead to widespread negative outcomes for the software developer and the organisation (Nesher Shoshan and Sonnentag 2020).

Relatedness

Relatedness refers to people’s need to feel connected to others. In essence, people need to spend quality time with other people, especially their loved ones and friends (McLeod 2007). Yet, the software engineering career is characterised by overwork and burnout (Afzal 2016). This prevents developers from spending time with others, and from being able to relax when they do.

There is a myth that suggests that software developers are anti-social and uncommunicative—and this argument could be used to argue that it is acceptable for software developers to work excessive hours alone staring at a screen. That this is indeed a myth is highlighted by Chattopadhyay et al. (2021), who found that developers would use vlogs to challenge the misconceptions and stereotypes around their identities. Indeed, as Rodeghero et al. (2021) show, developers have a keen sense of their need to be socially connected to their colleagues and programmers with greater openness perform better in their jobs (Salleh et al. 2014; Rehman et al. 2012). If they do not get this, they will experience feelings of vulnerability (Hawkley and Cacioppo 2010), which will impact social cohesion and information sharing between employees (Searle and Renaud 2023).

Summary

It is clear that software developers’ jobs are unlikely to satisfy their human needs. In this situation, they will not perform optimally, and if they become burnt out, many negative consequences will ensue, which will affect their ability to produce bug-free and secure code. Managers can alleviate the issues mentioned in this section if they are aware of them.

Managers

Inadequate management and a lack of resources can be the cause of software development failures (Oz 1994; Aeon et al. 2021; Linberg 1999). It is also the case that many of those who manage software engineering teams do not have much technical expertise themselves (Dzuiba 2010; Kalliamvakou et al. 2017). Hence, they fail to understand the complexity of the software development process and produce unrealistic schedules (Linberg 1999).

Kalliamvakou et al. (2017) provides an extensive list of attributes of great managers of software developers. These include building a team culture, fostering communication and being available to team members. In terms of effective management, Wang and Lai (2001) point to the crucial nature of requirements management, one of the key responsibilities of managers. Wang and Lai explain that whereas changing requirements are a fact of life, the changes have to be managed (Bhatti et al. 2010), not merely passed down to developers.

Assal and Chiasson (2019) find that security-related software development issues often stem from a lack of organisational or process support for incorporating security into software development tasks. This is something managers ought to provide, but it seems that they do not.

Given the need for software developers to know how to develop secure software, it is also the responsibility of the manager to ensure that they get the necessary training to develop these skills. Making developers aware of their lack of knowledge in this area is a clear management responsibility, because an appreciation of this lack of expertise is the first step towards a willingness to have their code tested to uncover vulnerabilities (Howard and Lipner 2006). This allows an “unknown unknown” to become a “known unknown”.

Afzal (2016) argues that many developers do not leave their jobs, but rather leave bad managers. This points to the crucial role of management in the software engineering process.

Deployers

Given the opportunity for error during every stage of the software development process, there is a very high likelihood of hidden bugs remaining in software after deployment. Those who deploy software should have a rigorous maintenance process in place. As users report issues, these can then be addressed.

However, some errors are intermittent and hard to reproduce. When software produces anomalous outcomes, the first reaction is often denial, or to blame the operator (Clark 2021). Indeed, the latest manifestation of this kind of denial occurred in the Post Office case in the UK (Renaud et al. 2021a). Over the course of two decades, the Horizon software rolled out by the UK’s Post Office generated phantom transactions, and the Post Office unquestioningly jumped to the conclusion that the reported shortfalls were due to fraud (Wallis 2021). Over 700 innocent subpostmasters were prosecuted, some were incarcerated, and many lost their livelihoods. The Post Office steadfastly refused to consider the possibility that its software might be malfunctioning, which was indeed the case.

There is often naïvety amongst those who deploy software without understanding the mechanisms of software engineering. Many believe that software always works correctly (Crown Prosecution Service 2017; UCL Laws 2021), perhaps because the alternative is unpalatable and would complicate their lives and jobs. Even so, the human factors influencing the development of software should be acknowledged, so that a measure of realism is injected into the deployment process.

End users

In 1999, Adams and Sasse wrote a seminal paper titled “Users are not the Enemy”. This paper played a crucial role in introducing, to industry and academia, the need to acknowledge the role of humans in cyber security. These authors are psychologists, and both software engineers and cyber security researchers have benefited from collaborations with psychologists since Adams and Sasse highlighted the need to accommodate the human in cybersecurity.

Even so, many still refer to the human as the “weakest link” two decades later. Zimmermann and Renaud (2019) demonstrate that this attitude is not merely anecdotal, but is entrenched—almost an unquestioned stereotype. In fact, while humans do indeed make errors, and sometimes behave insecurely, Fig. 1 shows the wide range of sources of issues that can cause an adverse software-related incident.

Fig. 1.

Fig. 1

Summary of Human Factors affecting Software Engineering (the upper row reflects unintentional sources, while the lower reflects intentionally malicious ones)

Zusammenfassung menschlicher Faktoren, die sich auf die Softwareentwicklung auswirken (obere Reihe spiegelt unbeabsichtigt wider, während die untere absichtlich böswillig ist)

Humans sometimes make errors that subvert security measures. Moustafa et al. (2021) enumerate a number of these, which include poor password management, oversharing, being deceived by social engineers, and not updating software on their devices. Much of this is due to a lack of awareness, but also because users do not have the tools that could help to eliminate these behaviours, most of which are purely coping strategies. For example, password managers alleviate the password memory load, and consequently improve password strength. However, they have not really diffused through the population (Alkaldi and Renaud 2022), and few organisations provide them to their employees. There are also tools to help people to spot phishing messages more effectively, but these, too, are not widely used by organisations. A focus on supporting rather than blaming users is likely to reduce insecure behaviours (Renaud et al. 2021b).

An overview of the literature dealing with the impact of human nature on the secure software engineering process is provided in Table 1 in the Appendix, and in Fig. 1.

Table 1.

Secure Software Development Issues

Probleme bei der sicheren Softwareentwicklung

(Issues span the life-cycle phases Plan, Analyse, Design, Develop, Test, Deploy and Maintain)

Programmers: not recruiting enough coders (Lehtinen et al. 2014); not understanding the deployment context and constraints (Dyson and Longshaw 2004); staff inexperience (Shahzad et al. 2011); not understanding how the software interacts with existing systems (Bosch 2010); lack of cooperation (Lehtinen et al. 2014); insecure coding (Du and Mathur 1998); tradeoffs between reuse and redevelopment (Basili and Perricone 1984); inadequate test cases (Meenakshi et al. 2014); not understanding users (Borenstein 1991); no audit trail or error logging (Ko et al. 2007)

Project Managers: flawed planning (Pinto 2013); underestimating the amount of time needed for planning (Brooks 1975); weak task backlog (Lehtinen et al. 2014); poor monitoring (Brooks 1975); underestimating effort (Gray et al. 1999); lack of resources (Lehtinen et al. 2014); shortening testing time (Westland 2002); optimism bias and over-promising (Pinto 2013)

Deployers: immature requirements (Shahzad et al. 2011); creeping user requirements (Harker et al. 1993); pressure for deployment (Pinto 2013; Jones 1993); poor training (Wallis 2021); blaming users for errors (Clark 2021)

Security: not integrating security throughout the life cycle (Khan et al. 2021; Mouratidis et al. 2005); incomplete security requirements (Sultan et al. 2008); not designing for security (Sultan et al. 2008); implementation errors (Sultan et al. 2008); trying to bolt security on after development (Pearlson and Huang 2022); inadequate testing (Sultan et al. 2008); architectural security flaws (Abeyrathna et al. 2020); not staying on top of new threats (Bernsmed et al. 2022; Devanbu and Stubblebine 2000)

Implications for practice & research

Fig. 2 depicts all the roles and considerations that come into play in the human-centred cyber secure software engineering process.

Fig. 2.

Fig. 2

Human-Centred Cyber Secure Software Engineering Triangle

Menschenzentrierte Cybersicherheit Software-Engineering-Dreieck

Now, we address the second research question: “How can the impact of human fallibility be ameliorated during the secure software engineering process?”

Practical implications

There are some practical implications if we are to engender the production of secure software systems:

  1. Managers should:
    1. ensure that software engineers are supported in their need to develop mastery as much as possible, since this appears to be the need that this discipline is particularly suited to satisfy,
    2. train software engineers not merely to secure systems against known threats, but also to anticipate vulnerabilities that could be exploited, and then to actively prevent these,
    3. support engineers in crafting systems that accommodate human tendencies to make mistakes, giving them time to allocate to usability- and security-related activities,
    4. not pressure developers to the extent that they are unable to accommodate human fallibility and malice.
  2. Deployers should:
    1. test software regularly to ensure that newly emergent vulnerabilities are addressed,
    2. be realistic in accepting that software is seldom perfect, and consider this possibility before blaming end users when anomalous outcomes occur.

Research implications

There are many avenues for research in this area. Three urgently need attention:

  1. Software developers need to be supported more effectively in developing secure software, as highlighted in the research (Sect. 3), and in order to satisfy their personal mastery needs. This is a challenging area because the goalposts change all the time as new exploits emerge, and developers are under pressure to complete the product so that it can be deployed as soon as possible. This reality should be built into training programmes, and also into delivery milestones. Developing tools to support managers would also be helpful. Both training and tools would be fruitful avenues for future research.

  2. There is a naivety amongst the general public about the complexity of software development and the challenges of keeping software running correctly and without being compromised. In order to inject a measure of realism into expectations of software, there is a need to develop a public relations campaign and some short courses for those in other walks of life to help them to develop an appreciation of the complexities of software engineering. This requires a rigorous research endeavour from communications researchers.

  3. There is a need to change the industry mindset away from “user as problem”. This has occurred in other more mature fields such as safety (Dekker 2018) and medicine (Berry 2022). Those doing research in the software engineering domain have a role to play in adapting the lessons from these fields for the software engineering discipline.

Conclusions

The software engineering field is huge and enjoys attention from many researchers. The cyber security field, too, is extensive. Here, I have sought to provide an overview of the human-related aspects of the secure software engineering domain. The biggest lessons I would like readers to take away are the following: (1) wherever humans are involved in any endeavour, it benefits us to learn from those who understand human nature best: psychologists. We should not expect perfect performance from any human, and should accommodate this in our expectations of human stakeholders in the software engineering domain. (2) No one benefits if a profession is shrouded in mystery. During the pandemic, everyone became familiar with terms such as coronavirus, spike protein and cytokine storm. This new familiarity helped public health officials to communicate with the general public. The software engineering profession should start helping the public to understand the complexity of its work too. If it does, everyone will benefit.

Funding

Open access funding provided by University of Strathclyde.

Appendix



Articles from Zeitschrift Fur Arbeitswissenschaft are provided here courtesy of Nature Publishing Group