DONE DONE Criteria
I am currently working with one of the most enthusiastic teams I have had the chance to work with in my career. Since we are all remote, we often miss out on those water cooler discussions, or as I like to call them, “Chai pe charcha”. These are the moments where you sometimes get a chance to align on team rituals. One example is feature definition, which I, as a QA, would like us to agree on as a team. Another is to come up with a team “DONE DONE Criteria”.
The DONE DONE criteria also resonate with me as being able to answer the question, “Are you travel-ready?” It is a journey where one station is going live to production, but it doesn’t stop there. Then comes your biggest responsibility: customer satisfaction, over-the-air updates, how you differ from competitors in the market, and so on it goes.
Over the years, I have learned that asking a developer whether a feature is done is a lot like asking a politician for a straight answer.
I also keep hearing stand-up updates like “the story is done, only the tests are pending” or “only the pipeline is failing, otherwise it is dev done”. So we struggle to define when we call a story done. Is it dev done, desk check done, QA done, signed off by product, or deployed to production?
So what is the Definition of Done, and how do we implement it in our current assignments?
Is it going to production live?
Is it release toggle clean up?
Is it test automation done?
Are all NFRs done?
Is all traceability, alerting and logging done?
Is compliance approval done?
I have heard people say that a story is done when it meets its acceptance criteria. But are the scenarios above mentioned in the acceptance criteria of our stories?
It helps to have an agreement both within the team and with external stakeholders. Once we are aligned, we can drive what we expect from each role. It also helps us run effective feature/story kickoffs and desk checks.
Steps to define a DOD: Getting all your stakeholders on the same page while defining the Definition of Done is not an easy task. We have different ideas within the team and across cross-functional teams. The trickier part is aligning and getting buy-in from everyone. Without an agreement, we can’t ship the product in the leanest manner, we risk confusing stakeholders with different expectations, and internal accountability becomes unclear.
- Create a team level DOD
- Create a product level DOD
- Treat the DOD checklist only as a guideline, not a tick-box list
- Have a realistic DOD based on the team’s present needs, aligned with the business
- DOD ownership
- Clear roles and responsibilities for each role in the DOD
DOD checklist: This is just a guideline; it can vary from team to team and account to account
- Test scenarios added and signed off by the Product Owner
- Unit tests/widget tests/integration tests done
- Test coverage above 95%, otherwise fail the CI pipeline
- Code review is done if not pairing
- Functional tests done
- All NFRs done, based on the story/feature
- Story/epic signed off by Product Owner
- Auditor and compliance approved
- Wiki is updated with new workflows and state diagram
- Path to Production/Release strategy defined
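To make the coverage item in the checklist above concrete, here is a minimal sketch of a gate script that a CI step could run to fail the pipeline when coverage drops below 95%. The `gate` function, the threshold constant, and the command-line shape are my own assumptions for illustration, not part of any specific CI tool:

```python
import sys

# Threshold taken from the checklist above; adjust per team agreement.
COVERAGE_THRESHOLD = 95.0

def gate(coverage_percent: float, threshold: float = COVERAGE_THRESHOLD) -> bool:
    """Return True if the measured coverage meets the agreed gate."""
    return coverage_percent >= threshold

if __name__ == "__main__":
    # Expect the measured coverage percentage as the first argument,
    # e.g.: python coverage_gate.py 96.3
    percent = float(sys.argv[1]) if len(sys.argv) > 1 else 0.0
    if not gate(percent):
        print(f"Coverage {percent:.1f}% is below the {COVERAGE_THRESHOLD:.0f}% gate")
        sys.exit(1)  # non-zero exit fails the CI pipeline step
    print(f"Coverage {percent:.1f}% meets the gate")
```

In practice, many coverage tools support this directly (for example, coverage.py’s `coverage report --fail-under=95`), so a custom script like this is only needed when the tool lacks a built-in gate.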
One thing to always remember: a DOD is team-specific and story-specific, and it needs periodic revision over time based on feedback.
In the end, a DOD is all about team confidence. Why should I care about a DOD? The one and only reason is to share an honest vision of the quality of the product across the board.