1. Why Designing for Usability and Security is Hard
1.1 Empowering the System Builders
The effect of Information Technology on our lives can be seen all around us. The increasing ubiquity of technology has also led academic researchers to re-think our interaction with it. In a 2009 Communications of the ACM article [], several leading Human-Computer Interaction (HCI) researchers noted that our relationship with computers has changed so radically since the field's inception that even the term HCI needs a rethink. In the past, we could reasonably assume that IT involved desktop computers and users in commercial organisations. Nowadays, systems are as ubiquitous as the people who use them, and these people are increasingly connected with other people and other systems. In such a connected world, the users of technology have countless opportunities to interact, intentionally or unintentionally, with a myriad of systems.
One question which has yet to be answered is how much Information Technology has empowered the work of those who build it. Media reports about the growth of the high-technology industry go almost hand-in-hand with reports about the impact of threats to it. For example, a report commissioned by the UK government estimated the cost of cyber crime to the UK economy at £27 billion per annum []. While the methodologies used to devise this figure are debatable, the increased burden of expectation on system designers is not. As consumers, we expect systems to be attuned to the physical and social contexts within which they are used. As a corollary, we would also like our systems to be as secure as they are usable but, as we have discovered, threats to, and vulnerabilities within, this complex network of people and technology make this a challenging task for system builders.
1.2 Ubiquitous Technology
Systems can be made vulnerable through a variety of factors, ranging from the accidental introduction of incorrect code, through to an overly complex user interface which may be misused or circumvented. Those who might take advantage of these vulnerabilities have capabilities and motivations which may be unknown to the designers who inadvertently introduced them, together with different abstractions for what the vulnerabilities are and how they can be exploited. So, while our expectations for technology innovation continue to be exceeded, the quality of these systems' security and usability often falls short.
There is no obvious reason why designing secure and usable systems should be so difficult, especially when guidance on applying Security and Usability Engineering best practice is no longer restricted to the scholarly literature. Nielsen claimed that cost was the principal reason why Usability Engineering techniques are not used in practice [].
1.3 Integrating Processes
Problems arise when considering how to use these approaches as part of an integrated process. Accepted wisdom in Software Engineering states that requirements analysis and specification activities should precede other stages in a project's lifecycle, while guidance on Security Engineering [] suggests that these early stages should be devoted to high-level analysis of the system to be secured. Invariably, the decision of which concern to put first is delegated to the methodology followed by a designer. The designer has many approaches to choose from, some of which include treatment for security or usability concerns. To date, however, no approach treats both security and usability collectively, beyond treating them both as generic qualities contending with functionality.
When weighing up the approaches available, and the effort needed to apply them in their developmental contexts, designers may even choose to ignore them altogether. Designers may believe that their knowledge of user goals and expectations negates the need for applying usability design techniques, or that their understanding of a system's risks and mitigating controls negates the need for security analysis. In such cases, developers may believe Security and Usability Engineering approaches are useful, but not that the pay-off justifies their cost.
1.4 Growing Interests in Usable Security
There is mounting evidence that the design of usable and secure systems is worthy of specific attention. The US Department of Homeland Security ranked usable security as one of the top cyber-security research topics for governments and the private sector []. Despite this interest, there remains a lack of guidance available to designers about how to design usable and secure systems at a sufficiently early stage in the design process. Fortunately, the Security Requirements Engineering and HCI communities have proposed a number of individual techniques which could form the basis of integrated design approaches. In theory, specifying and designing secure and usable systems involves carefully selecting the right techniques from each community. In practice, each technique is founded on different, potentially conflicting, conceptual models. The level of tool-support for these techniques also varies considerably, and there has been little work on integrating these tools and the conceptual models which underpin them.
The knowledge gleaned from integrating design techniques and tools also leads to research contributions beyond the design of usable and secure systems. While there are academic fora devoted to integrating security and software engineering activities, e.g. [], there has been little work describing how usability design techniques can be usefully applied to designing secure systems. It is, therefore, possible that the results of integrating design techniques and tools may lead to design innovation in this area.
1.5 IRIS and CAIRIS as Exemplars for Usability, Security, and Requirements Engineering Process and Tool Integration
The book explores how existing techniques and tools might be integrated and improved to support the design of usable and secure systems. It shows how concepts from Usability, Security, and Software Engineering can be harmonised, discusses the characteristics of the tool-support needed to design such systems, and considers how User-Centered Design techniques can be improved to support their design.
In achieving these goals, the book presents IRIS (Integrating Requirements and Information Security): a process framework guiding the selection of usability, security, and software design techniques for design processes. To complement IRIS, the book also presents CAIRIS (Computer Aided Integration of Requirements and Information Security): a software platform for supporting these processes. I formally introduce both IRIS and CAIRIS in Part 2 of this book.
In Software Engineering, the term system design encompasses a broad range of activities; these range from scoping an early vision of what a system needs to do, through to developing models of software components and interfaces. We, therefore, primarily limit our focus on design to the early stages of a system's development for two reasons. First, the term design often refers to a plan or an outline of work upon which a structure is built []; agreeing and specifying the nature of this plan is both necessary and best carried out as early as possible. Second, each discipline contributing techniques to the design of usable and secure systems argues for its own approaches preceding all others. Consequently, there is value in exploring how early design techniques interoperate.