Project Management Book

Chapter 12 (continued)


We are all aiming for no errors in live running - they can have expensive consequences.

Finding errors in system test and user acceptance test isn't quite as expensive, but it is still expensive: the more errors there are, the longer testing will take and the more it will cost. Ideally there would be no errors to be found in system test or user acceptance test.

If programmers can find errors when testing units of code it's cheaper but certainly isn't cheap - ideally there would be no errors to be found in unit testing.

Finding errors by inspecting requirements and design documents is very cost effective but certainly isn't free. Ideally there would be no errors in these documents.

The only ultimate aim a quality programme can have is to end up with perfection: no error is ever made at any stage of any project. Now of course we will never reach that promised land. But each time we do a project we should try and get a bit closer, and project by project try to drive the errors out earlier in the lifecycle - the earlier they're found the cheaper they'll be to fix.

This implies we need a plan: how will we find errors early, how will we know we're succeeding, how will we reduce the causes of errors?


Quality planning

In the Project Definition Document (PDD) or stage agreement, under the heading 'quality plan' we could say: 'in this project we will not check anything, we will not do any testing, the system will go live completely untried and the errors will be found by our customers.' That is a perfectly valid quality plan. It's not a very good one, but it is a very clear statement of what we intend to do, or rather not to do, quality-wise.

But we hope you'd be saying something more like this: 'we will check the requirements and UFD documents like this... we will analyse error causes like this... we'll measure quality like this... and these are the team's quality related targets...'.

In a small project deciding what should be checked by whom is not too difficult. In a large project deciding who should check what and building quality checking tasks into the plan can be complicated, which is why some project managers have a specialist - we'll call him the quality leader - inside the team (we stress inside the team) who helps decide what level of checking is appropriate, builds quality tasks into the plan and makes sure the team execute those quality tasks. More on this role later.

Would you agree that the sooner we can find an error the cheaper it will be to fix? Then perhaps we should measure the time gap between when errors are introduced and when they are found: if on average it's a short gap, financially reward the project team; if it's a long gap (and all the errors are found near or after the end of the project), financially penalise the team.
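If you wanted to see how that measurement might work in practice, here is a minimal sketch in Python - the error records, dates and phase names are purely illustrative assumptions, not data from any real project:

    # Minimal sketch of measuring the gap between error introduction and detection.
    # The records below are illustrative assumptions, not real project data.
    from datetime import date

    # Each record: (phase in which the error was introduced, date introduced, date found)
    errors = [
        ("requirements", date(2024, 1, 10), date(2024, 1, 24)),  # found in inspection
        ("design",       date(2024, 2, 5),  date(2024, 5, 20)),  # found in system test
        ("coding",       date(2024, 3, 15), date(2024, 7, 1)),   # found in live running
    ]

    gaps = [(found - introduced).days for _, introduced, found in errors]
    print(f"Average introduction-to-detection gap: {sum(gaps) / len(gaps):.0f} days")

The shorter that average gap, the earlier the team are catching their errors - and the cheaper those errors are to fix.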


Quality in all projects

Within a company it could be that some projects deliver excellent quality while others deliver terrible quality. The chances are that those producing good quality are checking up front, the others aren't.

How might we get all projects to include appropriate quality checking activities in their plan? Or should we leave it to project managers and hope they plan quality tasks in - and let the cowboys and the inexperienced learn the hard way? One way would be to have a rule that at the start of each stage the project manager must review his plan with project support, as we suggested in that chapter, and project support must agree that appropriate quality tasks have been built into the plan. (We stress appropriate. In a simple project done by experts very little checking will be needed. In a complex project staffed by novices a lot of very thorough checking will be required.) Again, this is not about showing project support some boilerplate document called 'Quality Plan', it means project support reviewing the task by task work plans to assess whether in their judgement quality tasks are properly planned in. And a further rule might be that the sponsor will not give the go ahead unless project support are satisfied. In this arena policing is sometimes required to bring the laggards on board.


Error cost in business case

How about this: at the start of a project the project manager must estimate, perhaps based upon past projects, how many errors might be found in live running. He must multiply that by an average cost per error (i.e. roughly what it costs the organisation - business costs and IT costs - each time a bug is found in a live system) and include the resulting amount of money as a cost item in the project's business case.

Imagine you're the sponsor and the project manager presents his cost breakdown to you and says: "...and based upon past projects we expect 50 errors in live running at an average cost to fix of £10,000, so the bug fixing cost will be around £500K." What would your reaction be? You may well ask the project manager what he could do to reduce that cost of failure. Suddenly the sponsor is interested in quality. The project manager then describes how he could make that more like 20 errors and £200K by investing up front in getting it right. And the project manager will need the sponsor's support for this because much of that up front cost will fall upon the sponsor's people - the users, as we shall see.
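The arithmetic behind that conversation is trivial to set down. Here is a back-of-envelope sketch in Python using the illustrative figures above (50 errors, £10,000 per error) - they are examples from the text, not real project data:

    # Back-of-envelope cost-of-failure estimate for the business case.
    # Figures are the illustrative ones from the text, not measurements.
    expected_live_errors = 50         # estimated from past projects
    average_cost_per_error = 10_000   # business cost plus IT cost per live bug, in pounds

    cost_of_failure = expected_live_errors * average_cost_per_error
    print(f"Budgeted bug fixing cost: £{cost_of_failure:,}")          # £500,000

    # Investing up front in quality might bring that down to, say, 20 live errors
    improved = 20 * average_cost_per_error
    print(f"Potential saving from up-front quality: £{cost_of_failure - improved:,}")  # £300,000

Putting that cost of failure line in the business case is what turns quality from an abstract virtue into money the sponsor can see.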


Team quality briefing

It is a good idea to have a kick off meeting at the beginning of each project stage. Invite the sponsor along to say a few well-chosen words, including these: "Team - user people and IT people - I don't really know what program bugs are but I do know that each one you deliver to me will cost me something like £10,000 to sort out, so I don't want too many of them please."

The team know the sponsor is interested in quality, not just any old thing delivered on the right date. The ground is now fertile for the project manager to run through the quality plan: "This is how we're going to try and meet the sponsor's requirement: we are going to check our work like this... during the project we are going to measure and analyse errors like this... we are going to analyse and address error causes like this... and, team members, these are your individual quality objectives upon which your next appraisal will partly depend." The team know the sponsor wants good quality and also that it is in their personal interest to deliver good quality. We will discuss team members' targets later in this chapter.


Inspections

An inspection is a meeting to identify errors and omissions in a piece of work. One cannot sit down and inspect a 1000 page document in one meeting. Twenty pages is about as much as can be done in one inspection. So that's 5 inspection meetings if you're producing a 100 page document and 50 inspections if you're producing a 1000 page document. Already you're thinking it's too expensive - hold on.
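The arithmetic is simple enough to sketch in Python, with 20 pages per inspection as the rule of thumb used above:

    # Rough arithmetic for how many inspection meetings a document will need,
    # using the 20-pages-per-inspection rule of thumb from the text.
    import math

    PAGES_PER_INSPECTION = 20

    def inspections_needed(pages: int) -> int:
        return math.ceil(pages / PAGES_PER_INSPECTION)

    print(inspections_needed(100))    # 5 inspection meetings
    print(inspections_needed(1000))   # 50 inspection meetings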

We do not wait until all 1000 pages are written before inspecting. Each section is inspected as soon as it is produced. Because the team are essentially checking each other's work, inspections are also a great aid to understanding what's going on elsewhere in the project.


The inspection is attended by the author, a moderator to chair the meeting and a number of inspectors - we'll see in a moment who they might be.

Suppose you have written a 20 page section of a user functions design document. You send it to the moderator and the inspectors. Each inspector will have, say, a 2 hour task in their plan to check the section, on their own, for errors and omissions.

There is then a meeting - another task in each attendee's work plan. The moderator asks those present: "how long did you spend preparing for this inspection?" If they all say "no time at all" the inspection is cancelled. If you were the moderator and someone said they had spent 3 hours preparing but you knew they were fibbing and they had spent no time at all, what would you do? You might compliment them and ask them if they would quickly run through all the errors they found. They won't try that again. The moderator has to make sure inspections are done properly - that we're not just going through the motions.

Sometimes moderators will invite someone to read the material out loud. Other times it might be a case of: "Paragraph one, any comments? Paragraph two...?" Whichever approach is taken we all know where we are. If someone pipes up and says: "the part number is missing from this web page layout" and we all agree that it should be there, fine. But if a long debate threatens to break out the moderator guillotines it. This is a meeting to record errors and possible errors, not a meeting to have long discussions about how things should be corrected.

The author makes corrections after the meeting and, strictly speaking, the moderator should check that they have. Again, this correction task will be in the author's work plan as we saw in the planning chapter. Occasionally there are so many errors that major rework is needed and the moderator decides a re-inspection is required. One hopes this does not happen too often but clearly the overall plan has to allow for a certain percentage of re-inspections.

The moderator records the total cost of the inspection - preparation hours plus meeting hours - and the number of errors found. Only real errors are counted. Generally, if the error would have resulted in a bug in the final system it is counted as an error. Spelling mistakes in internal documents that will not have any subsequent effect on the project might be noted for correction but not counted in the statistics. How might we use this data in the future? If we discovered from this simple measurement that it typically costs about 2 hours work to find and correct errors by inspecting the UFD and we also know it takes about 10 hours work to find and correct similar errors in system test - in other words it's five times cheaper to find and fix errors via UFD inspection than via system testing - that should drive us almost subconsciously to invest more in UFD inspections next time. But if you never measure this you'll never know which is the cheapest way of finding errors and you won't even have the evidence to persuade yourself to invest at the right time, let alone the evidence to persuade anyone else.
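As a rough illustration of the kind of comparison that measurement makes possible, here is a minimal sketch in Python - the hours and error counts are invented purely for illustration:

    # Sketch of comparing cost per error found via inspection versus system test.
    # The hours and error counts below are invented purely for illustration.

    def cost_per_error(hours_spent: float, real_errors_found: int) -> float:
        return hours_spent / real_errors_found

    # One UFD inspection: 5 inspectors x 2 hours preparation, 6 attendees x 2 hours
    # meeting, plus 2 hours of author correction time, finding 12 real errors.
    ufd_inspection = cost_per_error(5 * 2 + 6 * 2 + 2, 12)   # 2.0 hours per error

    # System test: 200 hours of testing and fixing effort finding 20 comparable errors.
    system_test = cost_per_error(200, 20)                    # 10.0 hours per error

    print(f"UFD inspection: {ufd_inspection:.1f} hours per error found and fixed")
    print(f"System test:    {system_test:.1f} hours per error found and fixed")

Even crude numbers like these, collected consistently, are enough to show where error finding is cheapest.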

In principle any project deliverable can be inspected but the biggest bang for the buck will usually be had by inspecting requirements and UFD documents.




