Rapid Development System.

Page Updated: Mon, 23 Sep 2019 23:40 AEST (UTC+1000)

Overview

Our Software Development System is based largely on Steve McConnell's Rapid Development.

Steve McConnell's system, as detailed in his 1996 text Rapid Development: Taming Wild Software Schedules, endures as a timeless classic.

Rapid Development is research based. McConnell gathers pertinent data from numerous studies of software development. From this research, a survey of existing systems, and an overview of best practice, McConnell straightens out the field.

One major difference between McConnell's Rapid Development System and other systems, even those created after 1996, is that he details a grab bag of methods from which one might choose, depending on the unique characteristics of the project. Other systems, by contrast, tend to recommend a single set of methods for all projects, without sensitivity to those characteristics. For example, McConnell details 12 lifecycle methods (some are used to illustrate bad practice): Waterfall; Code-and-fix; Spiral; Waterfall with overlapping phases; Waterfall with subprojects; Waterfall with risk reduction; Evolutionary Prototyping; Staged Delivery; Design-to-Schedule; Evolutionary Delivery; Design-to-Tools; Commercial Off-the-Shelf software.

Broadly, Rapid Development entails establishing schedule priorities among: faster development; reduced risk of schedule slip; and visible progress. Different projects require different priorities.

Further, different projects will have competing aims on the "Trade-Off Triangle": Cost; Product (its features); and Schedule. We establish which two are most important for the project. Developers must have latitude with at least one of these aims. For example, if there is a requirement for a tight schedule and a low budget, then the developer needs to be free, in consultation with the client, to cut unnecessary features from the specification. If, on the other hand, low cost and a feature-rich product are important, then a longer, or less predictable, schedule will be necessary.

Once priorities and aims have been established, management fundamentals need to be applied. These include, among other things, knowing how to estimate the size, effort, and schedule of the software project; how to structure a development team; and how to choose a software lifecycle.

Throughout, risks must be managed and classic mistakes avoided.

Example Insights from Rapid Development

McConnell's Rapid Development is filled with many insights.

Estimation

For example, with regard to Estimation:

Estimation can only be given within ranges that become more precise as the project proceeds.

McConnell 1996, p168, referencing Boehm 1995.

Why? You can't give an estimate of "it" until you've defined what "it" is, and the definition of "it" is continually refined throughout the software development lifecycle (McConnell 1996, p167).
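The narrowing of estimate ranges can be sketched numerically. The multipliers below approximate Boehm's "cone of uncertainty" as it is commonly cited; treat both the milestone names and the factors as illustrative assumptions, not authoritative figures:

```python
# Illustrative sketch of the "cone of uncertainty": an estimate is a range
# that narrows as the project is progressively defined. The multipliers
# are approximate values commonly attributed to Boehm (1981).
CONE = [
    ("Initial concept",             0.25, 4.00),
    ("Approved product definition", 0.50, 2.00),
    ("Requirements complete",       0.67, 1.50),
    ("Product design complete",     0.80, 1.25),
    ("Detailed design complete",    0.90, 1.10),
]

def estimate_range(nominal_months, low_factor, high_factor):
    """Return the (low, high) schedule range for a nominal estimate."""
    return nominal_months * low_factor, nominal_months * high_factor

nominal = 12  # hypothetical nominal estimate, in months
for milestone, lo, hi in CONE:
    low, high = estimate_range(nominal, lo, hi)
    print(f"{milestone}: {low:.1f} to {high:.1f} months")
```

Applied to a nominal 12-month estimate, the range shrinks from 3 to 48 months at the initial concept down to roughly 11 to 13 months once detailed design is complete.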

There are all sorts of estimation methods; the best rely on accurate data from previous projects (not mere memory).

Once the project is sufficiently defined, estimating each of the individual tasks and then summing those estimates becomes the most accurate project estimate available. (Symons 1991, referenced in McConnell 1996, p179)
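A minimal sketch of that bottom-up approach, assuming each task is estimated as a best-case/worst-case range (task names and figures here are hypothetical):

```python
# Bottom-up estimation sketch: once the project is defined well enough to
# list individual tasks, estimate each task as a (best, worst) range in
# person-days and sum the ranges. All figures are hypothetical.
tasks = {
    "Data model":        (3, 5),
    "Import module":     (5, 9),
    "Reporting screens": (8, 14),
    "Integration tests": (4, 7),
}

best = sum(lo for lo, hi in tasks.values())
worst = sum(hi for lo, hi in tasks.values())
print(f"Project estimate: {best} to {worst} person-days")  # 20 to 35
```

Note that the result is itself a range, consistent with the point above that estimates can only honestly be given as ranges.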

Lifecycle Method

Regardless of which lifecycle method you choose, the phases identified in Classic Waterfall are essential to software development (McConnell 1996, p143).

Despite the unsuitability of Classic Waterfall for most, if not all, projects, its phases are not dispensable. Different lifecycle methods just combine them in different ways. To succeed you can't avoid these phases.

How long until Specification Sign off?

From data within McConnell's Rapid Development we can derive an approximation of how long each development phase should take, when a project is going well:

[Diagrams: time of each lifecycle phase for small projects, and for large projects.]

Understanding the diagram:

Small projects are about 2,500 lines of code, Large Projects about 500,000 lines.

Data for the diagram:

These figures are an amalgam of Steve McConnell's figures on p122 of Rapid Development (which don't include the requirements phase) together with a requirements phase based on, "Requirements specification takes between 10 percent and 30 percent of the elapsed time on a project." (Boehm 1981, quoted by McConnell 1996, p124). The "Spec Signed Off" milestone is Softmake's insertion.

Although it is not emphasized in McConnell's System, Softmake is a firm believer in treating the "upstream" development phases (Requirements, Architectural Design, Detailed Design) as phases whose chief aim is the milestone of a specification signed off by the client.

With this approach and McConnell's data, we can furnish the developer with a stunning piece of information:

When done properly, getting to specification sign-off will take 40% to 60% of total project time.
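As a quick sanity check, the 40-60% figure can be applied to a hypothetical schedule (the 30-week total below is an assumption for illustration only):

```python
# Applying the 40-60% figure: given a total project estimate, compute the
# earliest and latest points at which a properly run project should reach
# specification sign-off. The 30-week project is hypothetical.
def spec_signoff_window(total_weeks, low=0.40, high=0.60):
    """Return the (earliest, latest) week of specification sign-off."""
    return total_weeks * low, total_weeks * high

earliest, latest = spec_signoff_window(30)
print(f"Expect spec sign-off between week {earliest:.0f} and week {latest:.0f}")
```

For a 30-week project, a client pressing for a signed-off specification before week 12 is pressing toward the classic mistake of skimping on upstream activities.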

If a developer were allowed to know only one thing about development done right, it would have to be this point. It is the chief tool needed to guard against those invariable solicitations, from clients and project managers (without software development training), toward the classic mistakes of skimping on upstream activities, code-like-fuck programming, and overly optimistic schedules.

Quality Assurance

Consider further example insights from McConnell's Rapid Development method, with regard to quality assurance:

[Diagram: comparing quality assurance techniques by defects detected. Inspections, 60-90%; Code reading, twice as many per hour as execution testing; Walkthroughs, 30-70%; Execution testing, up to 60%.]

Understanding the diagram:

"Inspections" are a formal technical review where: work is individually reviewed prior to the meeting by "reviewer" developers; during the meeting the author describes the work, the errors are indicated by reviewers, errors are recorded; after meeting an error report is written by the moderator and error correction actions assigned.

"Code Reading" is a technical review where the author hands out code to other developers. They read the code and report back to the author errors.

"Walkthroughs" are "any meeting at which two or more developers review technical work with the purpose of improving quality." (McConnell 1996, p73)

Data for the diagram (from McConnell 1996):

  • "Inspections find from 60 to 90 percent of the defects in a program." p74
  • "Code reading detected about twice as many defects per hour of effort as [execution] testing." p74
  • "Walkthroughs can find between 30 and 70 percent of the errors." p74
  • Defect detection rate of execution testing is "...often less than 60 percent." p73

In short, while execution testing can't be dispensed with, technical reviews are the more powerful quality assurance technique.
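The techniques also combine. The sketch below uses a common back-of-the-envelope model that is not from McConnell, and its assumption (that each technique catches defects independently) is optimistic; it nonetheless illustrates why layering even two modest techniques beats either alone:

```python
# Back-of-the-envelope model (assumption: techniques detect defects
# independently): the combined detection rate of several QA techniques
# is 1 minus the product of their individual miss rates.
def combined_detection(rates):
    """Combined defect-detection rate for independent techniques."""
    missed = 1.0
    for r in rates:
        missed *= (1.0 - r)
    return 1.0 - missed

# Pessimistic ends of the ranges above: walkthroughs 30%, execution testing 60%
rate = combined_detection([0.30, 0.60])
print(f"Combined detection: {rate:.0%}")  # 72%
```

Even taking the low end of each range, a walkthrough followed by execution testing models out to roughly 72% detection, better than either technique by itself.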

Classic Mistakes

McConnell enumerates a handy list of "classic mistakes".

To achieve rapid development you need to avoid making any big mistakes. Some ineffective development practices have been chosen so often, by so many people, with such predictable, bad results that they deserve to be called "Classic Mistakes".

McConnell 1996, p39.

From the 36 or so that McConnell identifies, one could argue the following are the top 10:

  1. Code-like-fuck programming. (Just jump in and start coding a solution.)
  2. Skimping on upstream activities: requirements analysis, architecture, & design.
  3. Overly optimistic schedules.
  4. Lack of user input.
  5. Feature Creep.
  6. Noisy, crowded offices.
  7. Omitting necessary tasks from estimates.
  8. Skimping on Quality Assurance.
  9. Requirements Gold Plating. (Requirements deemed essential when they are not.)
  10. Planning to catch up later.

References