How does one measure the progress of a technology team? How do we tell whether we are delivering good, valuable software better than we were last sprint or last year?
There are plenty of KPIs and OKRs going around in the world of tech, but I wanted to share my experience of coming up with an overarching goal for a group of around 90 people.
Joining an established team with history, we wanted to come up with something that would contribute to quality (customers said production stability was the main problem) and also make us go faster (as budgets were inevitably shrinking, with plenty left to do). The group already had established communities of practice, which synchronized analysts, developers and testers; they worked in self-organizing teams with their own goals and continuous improvement goals. But how do we align an entire group of 10 teams so that their efforts maximize returns?
One of the most successful initiatives, one that brought so many people together, was a goal to shorten the release cycle: from absurd quarterly releases to monthly ones, and then to on-demand releases, shipped when ready.
The goal was readily accepted by everyone; it turns out it is really easy to convince anyone that releasing more often is a good thing. The benefits differ between groups, but there is something valuable for every stakeholder:
- Clients get more frequent updates and a shorter lead time for changes
- Developers enjoy fewer last-minute changes, as the next release is around the corner
- Managers get more opportunities to release and don’t have to scramble to fit demand delivery into stretched release dates
- DevOps folk get more stable environments and less firefighting
- Testers will inevitably have to focus on automating as much as possible, but will in turn shed manual, repetitive tasks
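To make the first benefit concrete, "lead time for changes" is simply the time from commit to production release. The sketch below is a hypothetical illustration (the function and the dates are mine, not the team's tooling) of why the release cadence puts a hard floor under this metric:

```python
from datetime import datetime

def lead_time_days(commit_ts: str, release_ts: str) -> int:
    """Days between a commit and the release that shipped it."""
    fmt = "%Y-%m-%d"
    return (datetime.strptime(release_ts, fmt) - datetime.strptime(commit_ts, fmt)).days

# A change committed just after a quarterly cut-off waits a whole quarter;
# with monthly releases the same change waits a month at most.
quarterly = lead_time_days("2020-01-02", "2020-04-01")  # 90 days
monthly = lead_time_days("2020-01-02", "2020-02-01")    # 30 days
print(quarterly, monthly)
```

No amount of team-level improvement can push lead time below the gap between releases, which is why the cadence itself was the right lever to pull.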
Once the goal kicks in, you realize there is a ton of work to make it happen, and most of it is automation. Doubt kicks in too, and in some organizations there may be external pressure not to release often (on the grounds that it generates overhead or creates risk). In fact, the opposite is true: releasing every 3 months carries the serious risk of a big batch of change going live all at once. Along the way we watched our defect leakage to production and our incident count plummet, and our production uptime thrive.
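Defect leakage, the metric we watched plummet, is usually defined as the share of all defects that escape to production rather than being caught before release. A minimal sketch, with made-up numbers purely for illustration:

```python
def defect_leakage(found_in_prod: int, found_before_release: int) -> float:
    """Fraction of all defects that escaped to production."""
    total = found_in_prod + found_before_release
    return found_in_prod / total if total else 0.0

# Hypothetical figures: big quarterly batches vs. small frequent releases
print(f"{defect_leakage(12, 48):.0%}")  # 20% leaked
print(f"{defect_leakage(3, 57):.0%}")   # 5% leaked
```

Smaller, more frequent releases shrink the blast radius of each deployment, which is exactly why the metric improves as the cycle shortens.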
Each community of practice and each team chipped in, and it was really easy for them to do so. The majority of improvements can contribute to shortening the release cycle: anything that automates testing (manual regression testing especially), simplifies the release process, splits the monolith into smaller, more manageable components, streamlines your demand and backlog so it can be easily translated into specific release trains, and so on.
The group overachieved the goal, starting to release monthly after just 6 months of improvements rather than the projected 12. They then asked 'what if we release more often?' and even 'what if we aim for continuous delivery?'. This led to even more impressive KPI results, and customer feedback improved tremendously.
That simple goal led to a fantastic self-improvement journey, transforming a legacy-app team into a modern software delivery house.