Some days, you wish you had telepathy. You just know that your development staff is holding back in some way, but you don't know how to get them to communicate. Is the project in trouble, but they're afraid to tell you?
Since your software development staff won't tell you what they're really thinking, I asked them to confide in me instead. I posed a single question to professional programmers and testers: If you could get your CIO to understand just one thing about software requirements, what would it be? From the answers, I collated the five things at the top of their minds. If you grok these concepts, you will win the respect and support of your programming department (prizes you'll also earn for understanding the term "grok"), and you'll optimize the chance of success for your next software project.
I must warn you that this will shock some developers. Most of them don't expect you to have a clue.
1. The inconvenient checkbox: Understand the role of requirements
Many development projects are handicapped from the start. The requirements are vague and subject to interpretation, require intimate knowledge of the business to interpret correctly, and aren't prioritized. "It's a classic garbage-in, garbage-out situation," says James Pulley, director of professional services at PowerTest. "Poor requirements are provided to a development organization that either does not question them or receives hostile glares from the business community when clarification is sought. Items requiring insight or interpretation are interpreted one way by the requirement writer, a second way by development, possibly a third way by QA."
It's critical to establish some sort of process for documenting software requirements--what one developer plaintively described as avoiding an ongoing guessing game. Yet, say many developers, managers often forget why they gathered the software requirements in the first place. QA tester Darrel Damon complains that some organizations act as though requirements are "an inconvenient checkbox that has to be gone through for legal purposes." In one project on which he worked, for example, management paid little attention to maintaining and updating the requirements. "It was as if the requirements checkbox had been checked, so we could move past requirements," he says. "When requirements did change, there was no formal notification to the rest of the team."
This was more than a political issue. During testing, Damon entered a defect report when he found a specific discrepancy between the application and the requirement. The business analyst said it wasn't a defect; the application was right and the requirement document was wrong. "I argued that it was a defect regardless, because defects are not just in the app. I stood my ground and refused to close the defect. I finally got a meeting with the business owner; the requirement was right and the app was wrong. [The analyst's] comment about it not being a defect because it was 'only a documentation issue in requirements' spoke volumes about the value placed on requirements."
But that's the easy item. Most developers say their CIO understands the importance of requirements. It's what happens after that where things get ... interesting.
2. Don't throw it over the wall: The right people should define the requirements
You want software requirements that help the development staff create an application that gives the user joy. To achieve that goal, you need to get the right people into the room. Many corporations depend on a business analyst to elicit information from the user, to document it in a company-approved way, and then to throw the paperwork over the cube wall to developers who rarely (if ever) interact with the people who will be using the software.
Instead, says developer Dave Nicolette, "CIOs should use business analysts in a role more in keeping with the job title: to analyze business processes and identify opportunities for improvement. ... On software development projects, business analysts should not get between the customers and the developers. When they do, they only cause confusion."
The most important person in the requirements-gathering process is the user. Daniel Corbit, senior software engineer at CONNX Solutions, says, "Software requirements are not dictated from the top; they are gathered from the bottom. If they do not model the real-life business process of how a company does its work, then they are doomed to fail in execution. ... The key part of the equation is to carefully interview everyone who uses the tool and everyone who will be impacted by the tool to find out what it needs to do."
Put the person who will use the software in the room with the person who will create the software--which you may note is also a foundation of agile software development. Find out what the user needs to accomplish. This doesn't necessarily mean that you need to define what each screen looks like. Carlton Nettleton, a software developer and agile coach, points out that "meeting a series of requirements is not the same thing as meeting our customer's goals. Tell me what the customer's goals are; even better, bring the customer in and he/she can tell me in person. We can then figure out what types of requirements are needed. Then, when we are ready to execute, let us plan around our customer's goals, not around the requirements."
However, this isn't a one-time visit. The stakeholders have to get and stay involved. According to Ellen Gottesdiener, principal consultant at EBG Consulting and author of The Software Requirements Memory Jogger and Requirements by Collaboration, it's important for the CIO to ensure that technical and business communities collaborate on requirements, early and often. The classic approach is to throw a Marketing Requirements Document over the wall. Instead, she says, "A bridge must be built to collaborate between business and IT stakeholders. Get personally involved in this effort. Find out what works in your organization to actively and productively engage customers in requirements development."
Gottesdiener offers a real-world example. A project started, ran and failed three times. The organization tried internally once; then it outsourced it; finally it tried again internally with another IT manager. From the start, Gottesdiener explained, the business sponsor was disengaged, and participation by the users and business experts was sparse. But the solution was necessary, for both business and political reasons, so the organization decided to try again. This time, however, it conducted a retrospective to examine the project in a deep and significant way; both product and process were explored. Gottesdiener says that this time around, "They see the role management has played in colluding, not sharing information, infighting, lack of transparency. They examine the frustrations of not getting users to participate in requirements development and validation. The whole story gets put, literally, on the wall in an open manner. They decide as a team what they would need to do to be successful. It involves starting with full-time customer resource for a fixed time frame, requirements workshops, reviews, prototypes and ongoing retrospectives for each milestone. And of course, management helping them make all this happen with the right people and money, at the right time." According to Gottesdiener, the fourth time was the charm: The group delivered on time, on budget--a first for the department.
When you gather the project stakeholders, be sure to include the testing organization in the process. Performance testing expert Jim Pensyl believes strongly that testers should be involved at project kickoff. He says, "Only the testing organization can tell you if the requirements are testable. Why not then use [their] deliverables for the source of truth in status and estimation refinements?" Developer David Gelperin agrees: "Experienced testers and technical writers should be active participants in the cross-functional teams tasked with requirements development."
This is an important time to listen to the development team's feedback. Says Jared Richardson, author of Ship it! A Practical Guide to Successful Software Projects, "If you don't include us on the time line generation, don't expect us to meet the time line. If the general contractor on a building project doesn't ask the brick mason or the electrician how much time they need, how do they expect to generate a realistic schedule? They can't. If you don't include developers and testers when you're generating your time lines, don't think you'll hit them."
Even after you get the right people to talk about what the software needs to do, you still have another hurdle to overcome: getting those requirements recorded with the right amount of detail. Where do you draw the line between micromanagement and detailed instructions?
3. Superficially complete: Define requirements with "enough" detail
Although developers uniformly insist that they want the right amount of detail in the software specifications, they often disagree on how much is "enough." One set of developers wants the minimum: a rough idea of the business problem to be solved. Anything more than that, they feel, is meddling with developer creativity, and a waste of time. One developer gave the example of "an automated system to help us do our taxes" as a fine specification.
At the other extreme is the explicit requirements document. Peter Nairn, a software tester in the United Kingdom, says he worked on one successful project that had superlative requirements documentation. In addition to a rigorous definition, Nairn says, the requirements were prioritized within the requirements document itself. "There were three levels of priority: Must, Shall and Should. Must meant the system could not go live without it; Shall meant that the system would be degraded by not having it and there would be financial penalties for not meeting the requirement; and Should meant that these were nice to haves and would enhance the system." The requirement spec thus read something like this: "The system Must [1.1] have the ability to do xxx and Shall [2.3] be able to yyy and Should [3.3] be able to zzz." The numbers in square brackets referenced an appendix to the document with definitions. Explains Nairn, a little wistfully, "The whole thing made traceability easy, prioritization easy, phasing of implementation easy--and costing a nightmare! Once the requirements were agreed, phases agreed, project costed and price agreed, everyone knew what was going to be done."
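Nairn's scheme maps naturally onto a simple data structure. Here's a minimal sketch (the priority names and the xxx/yyy/zzz placeholders come from his description; the Python representation, field names and IDs are my own illustration, not his actual tooling):

```python
from dataclasses import dataclass
from enum import Enum

class Priority(Enum):
    MUST = 1    # system cannot go live without it
    SHALL = 2   # absence degrades the system; financial penalties apply
    SHOULD = 3  # nice to have; enhances the system

@dataclass
class Requirement:
    ref: str            # e.g. "1.1", pointing into the definitions appendix
    priority: Priority
    text: str

# A requirements register in Nairn's style: traceable, prioritized, phase-able
register = [
    Requirement("1.1", Priority.MUST, "have the ability to do xxx"),
    Requirement("2.3", Priority.SHALL, "be able to yyy"),
    Requirement("3.3", Priority.SHOULD, "be able to zzz"),
]

# Phasing becomes a simple filter: build the Musts first
phase_one = [r for r in register if r.priority is Priority.MUST]
```

Once the register exists, the traceability and prioritization Nairn praises fall out of ordinary queries over it, which is exactly why his team found phasing easy.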
That doesn't mean that every software requirements document should be 200 pages long. The appropriate level of detail varies by job description, by application domain, and certainly by corporate culture. Even then, you should expect contradictions.
Your quality assurance department usually wants a lot of detail. Jim Hazen, a professional QA tester in Colorado, wants requirements defined well enough for him to write test cases. That means, he explains, "They have a level of information beyond the typical, 'We want an application to do X, Y and Z.' They should have information that states more of what the requirement is to do (the What) and the way it is to do it (the How)."
The software requirements are supposed to enable developers to improve application consistency (especially in large projects) and to reduce the guessing game of, "What does the user want?" Damon worked on one project that used use cases as its requirements repository. However, he says, they weren't classic use cases. "All they contained were example screen shots and comments about the content of the screens. Business rules were sometimes included, sometimes not. And absolutely no standards were applied." For example, a simple-sounding requirement for fields to be "numeric only" could be interpreted in several ways. One developer displayed an error message in a dialog box; another cleared the field if alpha characters were typed; yet another disabled all keys except numbers when the user was on that field. All three met the "requirement," and all were included in the same application. "Talk about end-user confusion factor!" remarked Damon.
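The cure for this kind of ambiguity is a requirement that specifies observable behavior precisely enough that every developer, and every tester, lands on the same implementation. A hypothetical sketch (the requirement ID, message text and validation rule are invented for illustration, not from Damon's project):

```python
def validate_numeric_field(raw: str) -> tuple[bool, str]:
    """Requirement R-17 (hypothetical): a numeric-only field accepts
    the digits 0-9 only; on any other input the field is rejected and
    the message below is shown, leaving the field contents intact."""
    if raw.isdigit():
        return True, ""
    return False, "This field accepts numbers only."

# Every screen calls the same validator, so the behavior is consistent:
ok, msg = validate_numeric_field("123")   # accepted
ok, msg = validate_numeric_field("12a")   # rejected, with the standard message
```

When the requirement pins down the behavior (and a shared validator enforces it), the three contradictory interpretations Damon describes simply can't coexist in one application.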
Beware, however, of requirements documents with multiple purposes. It's easy for these to become corporate documents rather than product specifications--and the two roles may conflict. Typically, a contract with outside software providers has customers sign off on software requirements before implementation begins. As one consultant from Latvia pointed out, because the stakeholders (among them the CIO) need this to happen as soon as possible, the signed requirements are written to be suitable for customers and contracts, not for developers and testers. Later in the process, he says, developers and testers find the requirements unspecific and ambiguous, lacking any deep knowledge of the business. The developers discover that the requirements aren't technically achievable and are barely testable. Sometimes, once the developers are under way, they'll learn the requirements are simply wrong. And, he says, "While that's the signed document (and the requirements phase is marked as 100 percent complete in a master schedule), it is not going to be changed anymore, no matter how poor it is."
One developer I met online, Malcolm, cynically observed that, "What happens is that the specification looks superficially complete, but in fact contains hidden inconsistencies, glossing over, omissions or impossible conditions. The document, if it is binding, then becomes an impediment to the project, plus a source of irritation."
One solution may be to rely more on business goals than exacting technical details. Gottesdiener recommends that companies break out of the "system shall ..." paradigm for specifying requirements. She says, "We've got to stop relying on long, tedious textual requirements documents. Insist on smarter documentation." Instead, rely on models built in collaboration with business folks, such as requirements workshops. According to Gottesdiener, "Customers 'playing around' with their needs with cheap, fast and low-fidelity prototypes facilitates the meta-pattern underlying requirements development: evolution. Evolution is chaos, with feedback."
One developer suspects the preference for more or less detail may depend on the nature of the environment. "If you're in a situation where there is a high level of trust, a loose expression of requirements allows the team (including the customer or nearest surrogate) to zero in on the real requirements through some amount of iteration." In his view, the need for excruciating detail is characteristic of an environment in which the best political defense is to show that the code did what the requirements said--even if the best commercial value lay elsewhere. "To do that, the requirements must be explicitly definitive." If you're unsure whether your development department has achieved that measure of trust, this agile developer suggests that "requirements should begin with some kind of preamble giving intention. Sometimes, two different implementations might completely satisfy the raw requirements, but one might be ever so much more idiomatic to the underlying business motivation and thus more maintainable and extensible."
So, how do you know how much detail is "enough" for a software specification? There probably isn't a single right answer. The savvy manager will recognize or create an appropriate corporate culture--which may mean asking developers and testers, during job interviews, "How detailed do you prefer application requirements to be?" Because if a CIO thinks the requirements documentation should be one way and the development team wants it another way, friction is inevitable.
It's bad enough to deal with developers who can be counted on to quibble over the amount of detail put into the software requirements. But now we get to their major concern: the perception that CIOs are poor at dealing with changing requirements.
4. Working from ignorance: Recognize that requirements change
While developers disagree vehemently about the optimum granularity of software requirements documentation, they are in complete agreement about another aspect of the process: The requirements will change. According to developers, however, managers and CIOs apparently prefer to imagine that the software will adhere to the requirements document even when it's wrong or misleading. The clueful CIO will understand this key fact, as well as the importance of building a change management process into the development lifecycle so that changes can be controlled and dealt with.
The first issue in changing requirements is the notion of estimating project effort. Brian Marick, an independent consultant on agile methods, points out that every project starts from ignorance: at the outset, you have the least data available with which to make sane decisions. "If you knew now what you'll know tomorrow, the decision would be better," Marick points out. This isn't an excuse for procrastination, per se, but a philosophy of waiting where appropriate. That often means removing detail, such as deciding only on the broad product direction, and specifying details one piece at a time. However, he cautions, "This approach requires careful attention to feedback from the real world, so that you in fact take into account tomorrow's information when making tomorrow's decision, and constantly improving your ability to react to change and recover from mistakes. It does no good to know what the right decision is tomorrow when you do not have the resources to implement it."
Scott Ambler, practice leader for agile development within the IBM Methods group, agrees that getting a "solid" estimate up front is a naïve desire. "That one decision motivates a whole bunch of really bad practices, such as big requirements up front (BRUF), big design up front (BDUF), and judging the project team on whether they meet the budget instead of whether they achieved great ROI."
Poor or underspecified requirements aren't necessarily a sign of incompetence. An experienced software architect and lead developer at a CMMI Level 5 company suggests that requirements are inadequate for conveying everything necessary up front. "It's partly because the users are not always able to express all their actual needs; partly because requirements do not describe current work practices, but rather describe expectations for the future system; and partly because requirements have to address many business-related aspects, such as vendor selection, cost estimation and contractual obligations," he says.
But whether it's a contractual requirement to dot all I's and cross all T's, or simply the way you learned to scope out every project, your programming staff wants you to recognize that change is the norm rather than the exception. Agile developer Dave Nicolette says, "We need to turn away from the traditional approach of trying to nail down all the detailed requirements up front, and then imposing so-called 'change control' processes designed not to control change, but to discourage it. Instead, CIOs should actively and firmly support methods that embrace change and deal with it gracefully, as agile and lean methods are meant to do."
Since requirements change over time, Hazen points out, the process needs to be managed appropriately. There will be reviews and sign-offs again. The project time line must be revisited to determine if deadlines remain realistic. At some point, you do have to freeze the changes and move forward. Says Hazen, "Scope creep is a killer for projects, especially ones that are time constrained, which the vast majority are."
There is no cutoff point where requirements stop changing, believes developer Stefan Steurs, but many CIOs assume a point exists when everything is perfect and coding may commence. When developers, testers and users get involved in reviews, development, testing, prototyping, piloting and other activities, they feed the discovery process with new elements, some of which can be very disruptive. Adds Steurs, "This means you need decent change management. You want to know throughout the development of the product if the change to the requirements is converging or whether it remains disruptive. ... The CIO has to know what is going on, and it's time for the next [management] level down to start being honest instead of saying that the clothes of the emperor are very, very nice."
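Steurs's convergence question can even be tracked with a crude metric. A sketch, assuming you log change requests at each milestone review (the counts, window size and function are all illustrative assumptions, not anything Steurs prescribes):

```python
# Change requests logged at each milestone review (invented numbers):
changes_per_milestone = [42, 27, 15, 6]

def is_converging(counts: list[int], window: int = 2) -> bool:
    """Treat the requirements stream as converging if each of the last
    `window` milestones logged fewer changes than the milestone before it."""
    recent = counts[-(window + 1):]
    return all(a > b for a, b in zip(recent, recent[1:]))

print(is_converging(changes_per_milestone))  # churn is tapering off: True
```

A trend like this gives the CIO something concrete to ask for at each milestone, rather than relying on the smiling, nodding status reports Steurs warns about.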
Developers passionately wish that CIOs and other managers would build change management into the development process and make it safe. Otherwise, you put more than the project at risk. Instead, some managers whitewash the situation or their software methodologies. Luiz, who works at the Brazil office of one of the biggest consulting companies in the world, says that his firm uses waterfall techniques but casts them in more modern-sounding terms. "We estimate the size of the system using [function point analysis] at the very beginning of each project and use this estimate to sign the contract. So usually in the first two weeks we know exactly how much this system is going to cost and how long it will take to develop it. Of course, we are wrong most of the time and usually we are on the wrong side of 'wrong,' which means we underestimated the complexity of the system, and it's going to cost more and take longer to develop than we initially thought. To protect us, we assume that every requirement change is a potential way to grab more of the customer's money by charging extra and overestimated 'code-monkey hours' for even trivial changes. Sometimes this works, meaning that we could profit a lot and the customer hates us just a little bit after all this. Many more times, the customer just hates us, we hate our jobs, and our bosses hate us for [their] not being able to go to Aspen this year."
You can be part of the problem, too. Geoffrey Slinker, a software developer for more than 20 years, says he is most irked by the tendency for a software feature mentioned or proposed by the CIO, or any CXO, to instantly become a required feature. "The problem doesn't lie in the proposed feature," he says. "Often, the feature is a good idea. The problem lies in the disruption that is caused. The proposed feature becomes a high priority just because a CXO made the proposal. ... Even if the statement is an off-the-cuff comment during a demonstration towards the end of the development cycle, the statement can be interpreted as an action item and cause a chain reaction of meetings, changes and re-prioritizations." Remember that your voice carries, Slinker cautions; don't let your position disrupt the prioritization of software requirements.
Getting the software requirements right is only the first step, though. After the developers and testers have started work, it's the CIO's job to ensure that the project stays on track, and that the result adheres to the original promise. And oh boy, does that open up a whole new round of developer foot-stomping.
5. Carpet yanking: Pay attention to the people on the front line
Most developers aren't asking you to know all the details of a given project. Some believe if a CIO is worrying about specific line items in a software requirements spec, she's micromanaging, and she isn't paying attention to the right tasks. But the developers do want the CIO to pay attention to what they're doing, what they're telling you, and--perhaps most importantly--what isn't being said.
Get out of the office. Talk to people. Manage by walking around. Find out whether the software requirements are being instantiated in the real world. Steurs wants his CIO to ask questions and listen carefully. Says Steurs, "The CIO has to realize that if there is no bad news, there is something very wrong. Smiling people nodding 'Yes' in meetings is not a sign of great intelligence at work."
But don't pretend to listen if you aren't going to take action. Richardson says, "Don't ignore our feedback if you ask for it. That's not empowering. It's pretending to include us before yanking the carpet out from under our feet."
Your testing organization is also an early-warning system. According to Pensyl, development staff may propagate positive reports during R&D, but if the test group is vocal at this point in the project, complaining of incomplete, missing and ambiguous requirements, you should take it as a clue that the team is working with a poor foundation. Pensyl says, "Project truths transition from being based upon factual evidence at the beginning of the project, to truths and decisions being based upon perceptions and reaction as the project progresses. The only untruth at the beginning of the project is and was the marketing promise date. All else at the beginning was truth based upon fact at that point. A promise date should never be issued without proper estimating. All project teams should be given equal credibility for their estimates."
If you want the software requirements process to improve, says Pulley, attach a reward to doing so. He suggests that CIOs find a way to tie a very large bonus percentage to the quality of the delivered application, six months after the release. "People are dollar motivated," he points out. "If you incent people to deliver by a given date, with few penalties for quality, then the application shall be delivered by that date. Make the pool substantially large. Place a small percentage of the bonus pool on the delivery date. Allow product support to draw from the bonus pool. If the project meets the date, but delivers a poor quality product, then the bonus pool will disappear. If, on the other hand, the application date slips, but the resulting delivery is rock solid and requires only a small amount of support beyond training, then the bonus pool should reward greatly."
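Pulley describes the structure of the scheme but not the numbers. A toy calculation (all percentages and amounts are invented assumptions) shows how the incentives play out:

```python
def bonus_payout(pool: int, met_date: bool, support_cost: int) -> int:
    """A toy version of Pulley's scheme: a small slice for hitting the
    delivery date; the large remainder is held for six months and is
    drained by post-release support costs before it pays out."""
    date_slice = pool // 10 if met_date else 0          # 10% for the date
    quality_slice = max(0, pool * 9 // 10 - support_cost)  # 90% minus support
    return date_slice + quality_slice

# On time but buggy: support costs eat the quality slice.
print(bonus_payout(100_000, met_date=True, support_cost=95_000))   # 10000
# Late but rock solid: most of the pool survives.
print(bonus_payout(100_000, met_date=False, support_cost=5_000))   # 85000
```

The design choice is the point: weighting the pool heavily toward post-release quality makes shipping a fragile product on time the worst-paying outcome of all.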
Perhaps you can't achieve telepathy with your development staff. That may be beyond our current level of technology. But if you put these five techniques into practice, your developers may be fooled into believing that you can, indeed, read their minds.