1.1 Introduction
The approach to software development in the 1950s and 1960s has been described as the Mongolian Hordes Approach by Brooks []. The method (or lack of method) was applied to projects that were running late, and it involved adding a large number of inexperienced programmers to the project, in the expectation that this would allow the project schedule to be recovered. However, this approach was deeply flawed, as it led to inexperienced programmers with inadequate knowledge of the project attempting to solve problems, and they inevitably required significant time from the other project team members.
This resulted in the project being delivered even later, as well as in subsequent quality problems (i.e. the approach of throwing people at a problem does not work). The philosophy of software development in the 1950s and 1960s was characterized by:
The completed code will always be full of defects.
The coding should be finished quickly, so that these defects can then be corrected.
Design as you code approach.
This philosophy accepted defeat in software development: it suggested that, irrespective of how sound the engineering approach was, the completed software would always contain many defects, and that it therefore made sense to code as quickly as possible and then to identify and correct the defects that were present.
In the late 1960s, it was clear that the existing approaches to software development were deeply flawed and that there was an urgent need for change. The NATO Science Committee organized two famous conferences to discuss critical issues in software development []. The first conference was held at Garmisch, Germany, in 1968, and it was followed by a second conference in Rome in 1969. Over fifty people from eleven countries attended the Garmisch conference, including Edsger Dijkstra, who did important theoretical work on formal specification and verification. The NATO conferences highlighted problems that existed in the software sector in the late 1960s, and the term software crisis was coined to refer to these. There were problems with budget and schedule overruns, as well as the quality and reliability of the delivered software.
The conference led to the birth of software engineering as a discipline in its own right and to the realization that programming is quite distinct from science and mathematics. Programmers are like engineers in that they build software products, and they therefore need education in traditional engineering as well as in the latest technologies. The education of a classical engineer includes product design and mathematics. However, computer science education often places an emphasis on the latest technologies, rather than on the important engineering foundations of designing and building high-quality products that are safe for the public to use.
Programmers therefore need to learn the key engineering skills to enable them to build products that are safe for the public to use. This includes a solid foundation on design and on the mathematics required for building safe software products. Mathematics plays a key role in classical engineering, and in some situations, it may also assist software engineers in the delivery of high-quality software products. Several mathematical approaches to assist software engineers are described in [].
There are parallels between the software crisis of the late 1960s and serious problems with bridge construction in the nineteenth century. Several bridges collapsed or were delivered late or over budget because the people involved in their design and construction lacked the required engineering knowledge. The resulting bridges were poorly designed and constructed, which led to collapses, loss of life, and danger to the public.
This led to legislation requiring engineers to be licensed by the Professional Engineering Association prior to practicing as engineers. This organization specified a core body of knowledge that the engineer is required to possess, and the licensing body verifies that the engineer has the required qualifications and experience. This helps to ensure that only personnel competent to design and build products actually do so. Engineers have a professional responsibility to ensure that the products are properly built and are safe for the public to use.
The Standish Group has conducted research into the outcomes of software projects (Fig. 1.1). The comparison between the 1995 and 2009 surveys suggests that there have been some improvements, with a greater percentage of projects being delivered successfully and a reduction in the percentage of projects being cancelled.
Fig. 1.1 Standish report: results of the 1995 and 2009 surveys
Fred Brooks argues that software is inherently complex and that there is no silver bullet that will resolve all of the problems associated with software development such as schedule or budget overruns []. Poor software quality can lead to defects in the software that may adversely impact the customer and even lead to loss of life. It is therefore essential that software development organizations place sufficient emphasis on quality throughout the software development lifecycle.
The Y2K problem was caused by a two-digit representation of dates, and it required major rework to enable legacy software to function for the new millennium. Clearly, well-designed programs would have hidden the representation of the date, so that only minimal changes would have been needed for year 2000 compliance. Instead, companies spent vast sums of money to rectify the problem.
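To make the information-hiding point concrete, the following minimal sketch (in Python, with illustrative class and method names that are not drawn from any particular legacy system) shows a date abstraction whose internal year representation can be changed from two digits to four digits inside a single class, while client code, which depends only on the public interface, remains untouched.

class HiddenDate:
    """Illustrative date abstraction: clients never see how the year is stored."""

    _PIVOT = 70  # assumed pivot: two-digit years >= 70 are read as 19xx, otherwise as 20xx

    def __init__(self, day: int, month: int, year: int):
        # The year is always stored internally with four digits, even when a
        # caller supplies a legacy two-digit value.
        if year < 100:
            year += 1900 if year >= self._PIVOT else 2000
        self._day, self._month, self._year = day, month, year

    def year(self) -> int:
        # Clients depend only on this accessor, not on the stored format.
        return self._year

    def __str__(self) -> str:
        return f"{self._day:02d}/{self._month:02d}/{self._year:04d}"

# Client code uses only the public interface, so changing the internal
# representation (e.g. for year 2000 compliance) requires no client changes.
print(HiddenDate(31, 12, 99))     # 31/12/1999
print(HiddenDate(1, 1, 2000))     # 01/01/2000

Without such information hiding, two-digit year fields are scattered throughout the codebase, which is why the year 2000 rework proved so expensive.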
The quality of software produced by some companies is impressive, due in part to their use of mature software process models. These models focus on improving the effectiveness of the management, engineering and organization practices related to software engineering, and on introducing best practice in software engineering. The disciplined use of mature software processes by software engineers enables high-quality software to be produced consistently.