At work we're preparing for an upcoming customer review of our project. It ought to be a fairly standard thing, but after a program reorganization this year and last year, we lost a solid century of systems engineering experience to attrition and were organized under managers with a solid zero seconds of systems engineering experience. So: not ideal, but not impossible. There is opportunity in change.
Anyway, suffice it to say that it didn't work out like that. Selah.
One thing that came up in a dry run for the review presentation is that some of the values we were using to explain our status on the project (basically just what percentage of our work was completed) no longer made much sense. Over the course of the summer, they started to drift a little—which is understandable if your leaders don't know what the systems engineering status numbers mean. And that's just in the general sense—every program's definition of what done means is insane in its own way. With experience you learn to just deal with that, and try to stay on the top side of your board as the waves hit.
As the review approached, panic set in, and the definition of done for required task statuses started to drift week to week, then day to day, faster than the team could digest the changes. The culmination was a series of review slides with new terms and numbers that didn't add up. General confusion. Human sacrifice. Dogs and cats living together. Mass hysteria. I still don't understand how we're going to straighten it out.
Back up a step. Last year I started working on a side project to help the main project. I knew, from experience, that as the final review approached, there wouldn't be time to constantly calculate status, never mind do the harder, more abstract work of deciding what status means. That has to be worked out before you start the calculation. So I ended up writing a few thousand lines of code that could query our databases and tell us where we were, from the top level of doneness to the doneness of each individual thing we needed to do. It was the first real software project I had executed in my life—maybe my second or third favorite professional accomplishment.
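The roll-up itself is the easy part, and a minimal sketch makes the point. This assumes a flat list of tasks with a boolean done flag; the real tool queried project databases and ran to a few thousand lines, but the shape is the same:

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    done: bool  # per the team's agreed definition of done

def percent_complete(tasks: list[Task]) -> float:
    """Roll individual task status up into one top-level number."""
    if not tasks:
        return 0.0
    return 100.0 * sum(t.done for t in tasks) / len(tasks)

tasks = [
    Task("trace requirement 1", True),
    Task("verify requirement 1", False),
    Task("trace requirement 2", True),
]
print(f"{percent_complete(tasks):.1f}% complete")  # prints "66.7% complete"
```

The hard, abstract work lives entirely in how each task's `done` flag gets set, which is exactly why that has to be worked out before the calculation starts.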
First was arguing with teammates about definitions of done. And I mean arguing in a positive way: presenting my case about how things should be defined, being right about some things and wrong about others, each side convincing the other until a steady state was reached. I don't like to be told I'm wrong (I hate it), but it's a satisfying feeling to relax, open up to the possibility, and then believe it when it's true. And with code to lock in the definitions, the definitions can be enforced. I can't believe that's not obvious. But some people prefer to run the calculation by hand ("by hand" meaning by Excel, and not necessarily the same way every time).
But the downside of working in a stodgy industry is that code is magic at best and totally made up at worst. The new regime hated the idea of code. There was a separate division that handled tools; we couldn't waste our time on that ourselves. Real quote: "I thought your code was just making up numbers."
It's all a little bit of drama, but the point is this: lay out your definitions in an algorithmic form and get the team to buy off on them. If the customer doesn't buy off on them and wants their own definitions, that's fine; their definitions are just an interface to yours. And, for the love of whatever you find holy, automate the things that don't need an interpersonal relationship. Make your own tools to do it, because you learn the nuances of the problem by making tools: definitions that appeared simple on their face reveal their complicated selves when you have to work through all of the conditions. Besides, it's more fun to create than to consume.
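Here is a sketch of what "definitions in an algorithmic form" can look like, with the customer's categories handled as a translation layer on top. Every field name here is hypothetical, not any real schema:

```python
# Hypothetical example: the team's definition of "done" for a requirement,
# written as an explicit predicate so it is applied the same way every run.
def is_done(req: dict) -> bool:
    # Illustrative conditions only; the point is that they are spelled out
    # in one place instead of re-decided in a spreadsheet each week.
    return (req["written"]
            and req["reviewed"]
            and req["verification_method"] is not None)

# If the customer wants their own definitions, treat that as an interface:
# translate our status into their categories rather than changing ours.
def customer_status(req: dict) -> str:
    if is_done(req):
        return "complete"
    if req["written"]:
        return "in work"
    return "not started"

req = {"written": True, "reviewed": False, "verification_method": None}
print(customer_status(req))  # prints "in work"
```

The predicate is where the arguments happen, once, and the translation function is where the customer's vocabulary lives, so neither side's changes silently corrupt the other's numbers.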