Thursday, December 11, 2008

"I know it doesn't work but it's done" - a story about the definition of done

Picture courtesy of orinrobertjohn@flickr
Some time ago I was talking to engineers responsible for part of a software system, and I asked them when they would be ready for production; in other words, when their features would be done. They told me they were ready now. To my surprise, when I tried to test their software I discovered that about 50% of the cases did not work at all. So I asked them why they had told me they were "done" when they hadn't even implemented half of the planned features. They answered: "We know it doesn't work, but it's done - we implemented something, so it's done. Now we have to work on quality, i.e. implement the rest of the features."

When I heard this I think I might have looked like the lady in the picture. I couldn't believe anyone could think like this - if we implement one use case out of one hundred, can we consider the project done? The rest is "quality"? I don't think so.

In this post I'll try to explain once again what the definition of done is and why it's so important to share the same definition, at least among all the people involved in the development of a single project.

Let's define "Done"

I would say that there is no single, good and universal definition of done. You can find discussions about it on the Internet, but everyone has their own variation. So do I - in my view, the most important points are the following (I will use the term "user story" to cover every variation of request, use case, user story, etc.):
  • user story has to be implemented, today (99.9% is not accepted)

  • user story has to be tested and no known bugs should exist

  • user story is ready to go into production, today

  • user story has to be ready to be presented to customer, today

Some explanation

User story has to be implemented - means that the code has to be committed to the version control system (CVS, SVN, etc.) and the documentation has to be available on the wiki or in the VCS. In other words, the output of the work done (whatever that work was) must be available for anyone in the company to download and check. "I have it on my box - I will publish it soon" is not acceptable. Work must be committed and available for others.

User story has to be tested and no known bugs should exist - means that if you know about any bugs in the user story you're going to deliver, it's not done. If a bug exists in only one part of the user story and you really need to deliver the working stuff, consider splitting the user story into two smaller ones. You must not knowingly deliver bugs to your customer - I'm talking about bugs you're aware of.

User story is ready to go into production - means that it can be deployed at any time, from the moment you declare it done. Even better, deploy and test the working software in the production system - if it works there, it's really done.

User story has to be ready to be presented to customer - means that within 30 minutes (at most) you are able to demonstrate the working software to your customer. Of course, this requires you to have a list of acceptance tests ready and to know how to demo your software. This last point is very important. Remember it when defining your user stories - you have to know HOW TO DEMO the user story, and this will probably help you define acceptance tests (e.g. the user adds a new item to the database using an HTML form, then goes to the search panel and is able to find the newly created item by its name, ...).
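The demo scenario above can double as an automated acceptance test. Here is a minimal sketch in Python; the `ItemStore` class is a hypothetical stand-in for the real HTML form and database, used only to illustrate the add-then-find-by-name check:

```python
# Acceptance-test sketch for the demo scenario:
# "user adds a new item, then finds it again by name in the search panel".
# ItemStore is a hypothetical stand-in for the real form/database layer.

class ItemStore:
    def __init__(self):
        self._items = []

    def add_item(self, name, description=""):
        # Simulates submitting the "new item" HTML form.
        self._items.append({"name": name, "description": description})

    def search_by_name(self, name):
        # Simulates the search panel: exact match on the item name.
        return [item for item in self._items if item["name"] == name]


def test_added_item_is_findable_by_name():
    store = ItemStore()
    store.add_item("Blue widget", "A sample item")
    results = store.search_by_name("Blue widget")
    assert len(results) == 1
    assert results[0]["description"] == "A sample item"


if __name__ == "__main__":
    test_added_item_is_findable_by_name()
    print("acceptance test passed")
```

If a test like this exists for every user story, "ready to demo within 30 minutes" stops being a promise and becomes something you can verify at any time.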

Wrap up

As I mentioned above, a good and universal definition of done probably does not exist, but many resources at least agree on the basic principles. My definition of done is simple, but I consider it quite powerful.

If you are interested in diving deeper into the subject, I would recommend these two links from the ScrumAlliance:
What do you think about my definition of done? If you have your own, I would gladly read about it. Please share your opinions here.

Originally published on AgileSoftwareDevelopment.com
