August 31, 2016
In my related article on Hybrid Agile, I pointed out the misuse of the term hybrid agile and teased apart what people actually mean when they use it. Most people who are using or pursuing hybrid agile are actually seeking a mix of traditional/waterfall/predictive and adaptive/agile approaches. Examples of this include Water-Scrum-Fall and ScrummerFall. As noted in my article, these would best be called blended approaches of predictive and adaptive. They are not strictly hybrid agile.
In fact, based on PMI’s 2018 Pulse of the Profession, 23% of organizations are using a hybrid of agile and predictive. Not surprisingly, 47% still use predictive or waterfall approaches which means that 70% of organizations are still firmly grounded in predictive approaches.
In my previous article, we explored some of the reasons people pursue blended approaches, most of which trace back to waterfall thinking and culture. They are hoping to get something better than predictive, but unfortunately they are not getting the benefits they would get from agile-only approaches.
In this article, I would like to explore which methodology to use for which projects, identify some of the pain points associated with blended approaches, and finish with some recommendations on what you should do.
Many experts believe that the characteristics of the project can be used to determine the approach. (I personally believe that agile approaches make sense for most things and wouldn't use a predictive approach unless there was a pressing reason.)
Those experts refer to analysis by Ralph D. Stacey, in his book Strategic Management and Organisational Dynamics: The Challenge of Complexity. Stacey created a set of regions or domains by plotting requirements alignment against technology certainty.
You could simplify this by thinking of it as knowing WHAT is to be built on the vertical, and knowing HOW to build it on the horizontal. [Note that this model is very similar to Dave Snowden’s Cynefin model, though Snowden provides a lot more nuance and guidance around what to do for each of the domains.]
Using this model, projects are broken down into Simple, Complicated, Complex and Anarchy.
When you have certainty about what you are building and how to build it, you are in the simple domain and you should use the predictive lifecycle. The predictive lifecycle involves requirements and planning up front with single pass execution in a series of steps or phases.
The predictive approach works best when the requirements are not likely to change and the work is something you have done before. That is, there is some certainty about WHAT you are building as well as HOW to build it.
For this type of work, a sequential single pass approach works fine. You gather requirements and plan the work up front, design a solution that meets all the requirements, develop and test it, and then deploy it to your customers. If you are lucky and things don’t change much, you just might succeed. Studies by the Standish Group show that only one in four waterfall projects succeed.
For those projects that fall into the complicated and complex domains, predictive approaches will almost certainly fail. Predictive approaches don’t support change. They don’t provide a method to harness requirement changes. They also don’t allow for learning, for example, for the team to learn how to solve the business problem. So for these domains, an agile approach is recommended.
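The WHAT/HOW lookup described above can be sketched as a simple decision rule. This is a rough illustration only: the numeric certainty scale and thresholds are my own assumptions, not part of Stacey's model.

```python
# Rough sketch of the Stacey-style domain lookup described above.
# WHAT-certainty (requirements) and HOW-certainty (technology) map to
# a domain, which in turn suggests an approach. The 0.0-1.0 scale and
# the thresholds are illustrative assumptions, not from Stacey.

def domain(what_certainty, how_certainty):
    """Certainties range from 0.0 (no certainty) to 1.0 (full certainty)."""
    if what_certainty >= 0.8 and how_certainty >= 0.8:
        return "simple"       # certain on both axes: predictive can work
    if what_certainty <= 0.2 and how_certainty <= 0.2:
        return "anarchy"      # far from agreement and far from certainty
    if what_certainty >= 0.8 or how_certainty >= 0.8:
        return "complicated"  # certain on one axis: agile recommended
    return "complex"          # uncertain on both: agile recommended

print(domain(0.9, 0.9))  # → simple
print(domain(0.5, 0.5))  # → complex
```

The point of the sketch is the shape of the rule, not the numbers: only the corner where both WHAT and HOW are known favors a predictive lifecycle.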
The PMI Agile Practice Guide (2017) uses a slightly different version of the Stacey model that plots the frequency of delivery against the degree of change. Frequent deliveries help us validate that WHAT we are building is what the customer needs. Frequent delivery can also help us validate the HOW: that we are building it correctly.
This model shows predictive and Agile as opposite ends of the spectrum, and incremental and iterative as possible solutions.
If you have been part of PMI or studied for the PMP, you know that PMI's guidance is to use the PMBOK® Guide as a book of "good practices" and tailor those practices to meet the needs of your project. I suspect that this 'process tailoring' is one of the underlying drivers for the push to use blended methods. "Just take this list of good practices and pick and choose what you think will work best for you." Hmm, does anyone else see a problem with this? I see several.
Here are some of the pain points or anti-patterns I have seen when people are trying to mix Agile and Waterfall approaches.
I’ve talked about agile rebranding before in this post. A popular rebranding approach is to use the sequential phases that are a key aspect of predictive approaches but to label those phases as sprints. So there is an analysis sprint, a design sprint, a development sprint, a testing sprint, and so on. This is a complete misuse of the concept of the sprint.
A sprint is a timeboxed development cycle from the Scrum Framework. Sprints aren’t sequential phases.
Calling something by a different name doesn’t change what it is.
Other examples of rebranding include calling your status meeting a daily scrum or daily standup. Another of my personal pet peeves is calling any meeting the Scrum of Scrums. 90% of the time that I hear someone use the term ‘Scrum of Scrums’, they are using it incorrectly to refer to their weekly status meeting or a new meeting they have created for project managers.
Rebranding gives the appearance of change. People sprinkle in bits and pieces from Scrum or other agile approaches so that they can say they are indeed using agile. They pick the pieces they like or the ones they understand and ignore the rest.
Changing the name of something doesn’t change what it is. I can call myself a world class athlete but that doesn’t make me one. And changing the name of something doesn’t make it a more effective approach, or make it more agile.
Another common anti-pattern is to use sprints, but to plan all the sprints out in advance. So you are essentially creating a plan-driven approach.
This is particularly troublesome when the plan is created without the input of the development team, without using velocity or some other measure of team productivity, and when all the sprints are pre-loaded with all the user stories/requirements.
When sprints are planned out in advance, they are usually overloaded and unachievable. What frequently happens is that the work snowplows, with each sprint accomplishing less and less of the original plan. Stories that are not completed in sprint 1 get added to the scope of sprint 2, and things snowball from there. Imagine how demoralizing that is for a team that keeps falling further behind using this new “agile” approach!
The other problem with planning in advance is that it doesn’t support change or learning. Instead, the user stories are all considered ‘must haves’ and no variance is expected or accepted.
Planning up front is a predictive approach that only works if you have certainty about WHAT you are building and HOW you are going to build it. It doesn’t work when you have uncertainty.
This is a tricky one, and closely related to the previous item. Many organizations will argue that in their business, there are fixed dates that cannot be missed.
There are certainly many situations where fixed dates for delivery may be needed to meet customer needs or comply with regulations. However, if you have a fixed date and you tell the team members that they have to hit the date without regard for their velocity and without flexibility of the scope, then you have the classic death march project. This is particularly egregious when the team did not have input to the plan or did not agree to the scope and dates.
When delivery dates and scope are fixed, the team has few options for hitting those dates. The plan may be completely unrealistic or fail to accommodate learning or change. The team may feel set up for failure.
The team members who don’t quit will likely use the only lever they have: cutting quality to hit the date. Cutting quality to hit the date is likely to lead to problems in production and a buildup of technical debt. That, in turn, leads to more fixed delivery dates, emergencies, and death march projects. And the cycle continues.
“What gets us into trouble is not what we don’t know. It’s what we know for sure that just ain’t so.”
— Mark Twain
One of the great lessons from Eric Ries and the Lean Startup approach is that we are frequently wrong in our assumptions and beliefs. And the only way to be sure about our assumptions is to test them.
We get in trouble when we assume that we know exactly what the customer needs or that we have all the information up front. In fact, Ries would argue that it is better to assume we are wrong and that we don’t know everything. That means, our top priority should be to deliver frequently and get feedback from our customers to validate our assumptions. If we haven’t built time for learning and flexibility to accommodate change into our approach, then we don’t really have agility.
This is not so much an anti-pattern as an annoying and predictable response that some organizations have to agile. Those that are wedded to predictive/waterfall approaches tend to put a label on agile. By labeling it as pure or religious or militant, they make it seem unreasonable and use that to justify sticking with the status quo, or some vague practical or pragmatic approach.
This has been well documented as one of Craig Larman’s laws of organizational behavior.
“…any change initiative will be derided as “purist”, “theoretical”, “revolutionary”, “religion”, and “needing pragmatic customization for local concerns” — which deflects from addressing weaknesses and manager/specialist status quo.”
— Craig Larman
Change control seems to be an obsession with some individuals and is a common objection to agile approaches. Success with a predictive approach is highly dependent on the ability to control change and reduce or eliminate the dreaded “scope creep”.
Scope creep, like ‘pure agile’, is another negative label meant to communicate an undesirable or negative outcome and to justify status quo behaviors. “We had a lot of scope creep on this project” certainly does not sound like something we would want to happen.
Using the term scope creep implies that the customer or requester is using some sort of trick to take advantage of the team. It’s like the customer is getting away with something.
The reality is that most changes, or scope creep, simply reflect our inability to foresee exactly what is required in a dynamic and highly competitive world. Change can reflect things we have learned, shifts in the marketplace, and even insights we get from customers by showing them what we have built so far. These are things that would be impossible to determine up front when planning the project.
So why label this as scope creep?
This is one of those fundamental differences between predictive and adaptive approaches. To succeed with a predictive approach, we need to control change so that we don’t incur costly rework. We need to stick to the original plan, even if we have to defer features that the customer really needs until the elusive phase two. And we all know that ‘phase two’ will never happen.
Adaptive approaches recognize that it is impossible for anyone to know the requirements in their entirety up front. Adaptive approaches place a premium on satisfying the customer, and not simply sticking to the original plan.
The fundamental conflict here is that predictive / waterfall approaches require you to lock the scope. Otherwise, you will have costly rework or perhaps never finish your project.
Rather than make scope creep a negative, Agile approaches ‘harness change for the customer’s competitive advantage’. Agile expects that during the project you will learn more about both WHAT the customer needs and HOW you are going to deliver it. Locking down scope doesn’t make any sense in a context where you don’t know everything up front. And if you know everything up front, then you are in the simple domain and you should use a predictive approach. [For more on scope creep, please see my related post Why Scope Creep is Complete Bullshit.]
Lastly, use of a hybrid or blended approach doesn’t address the fundamental mindset change needed to use Agile effectively and get its benefits. Blended approaches are not going to be consistent with the 4 Agile Values and 12 Agile Principles.
It simply doesn’t work because we are talking about two fundamentally different and oppositional mindsets about people and teamwork.
So what do you do to avoid these anti-patterns? Here are my recommendations.
When deciding what development approach to use, be clear about your goals. Be honest about your reasons for wanting to use a blended approach. Is it about optics? Are you striving to meet an unreasonable organizational goal like “by the end of this year, 50% of our projects will use Agile”?
Perhaps you see a blended approach as a way you are going to evolve or transition from predictive to adaptive. As noted previously, this agile transition approach is recommended in the PMI Agile Practice Guide (2017).
“A gradual transition involves adding more iterative techniques to improve learning and alignment among teams and stakeholders. Later, consider adding more incremental techniques to accelerate value and return on investment to sponsors.”
— PMI Agile Practice Guide (2017)
More about using blended approaches as an agile transformation strategy in a moment.
Perhaps you are trying to create a special blend of waterfall and agile techniques that will yield better results than either approach alone? A recent survey of PMI Chicagoland members showed that 81% felt that some combination of waterfall and agile approaches should be used.
I think they are misguided. They may get better results than using waterfall alone but are unlikely to improve on a more pure agile adoption. They won’t get the business agility that their organization needs to compete.
Using Agile approaches is straightforward, once you know what you are doing. And that requires learning. The best learning is hands on experience in a true agile environment. Creating a true agile environment in your waterfall/predictive world may be difficult and may require bringing in outside help. It is unlikely that you can read the Scrum Guide and effectively apply it without help.
You can also learn from others in the form of classes, conferences or meetups. Blogs and books can provide a lot of additional context.
It’s not helpful to pick and choose only the parts of an agile approach that you like, understand, or want to implement. Unfortunately, PMI continues to promote the idea of mixing and matching traditional and agile approaches in its 2017 Agile Practice Guide. Whether out of a lack of understanding or a stubborn resistance to letting go of traditional approaches, they seem to think it is OK to take bits and pieces and mix them together.
Don’t take bits and pieces of Scrum. Try to avoid a Frankenstein mix on one project. If you do use a mix on one project, be clear about what you are adding from each and why. Don’t relabel something to call it agile.
Again, match the approach to the project. If you have a project with uncertainty around the requirements and how you will deliver it, use adaptive approaches. Not a blend.
Don’t pick and choose parts of the Scrum framework. Scrum is a framework for organizing teams and building solutions based on empirical process control and the Agile Values and Principles. Adding bits and pieces of Scrum to something else doesn’t make it Agile.
Sprinkling Agile into waterfall is not an evolutionary approach to move toward agile. Unfortunately, the PMI Agile Practice Guide (2017) cites these blended approaches as a transition strategy or a way to evolve from predictive to adaptive approaches.
“Try these new techniques on a less risky project with a medium- to low-degree of uncertainty. Then, when the organization is successful with a hybrid approach, try more complex projects that require more of those techniques to be added. This is a way to tailor the progressive hybrid transition to the organization’s situation and specific risks and the team’s readiness to adapt and embrace the changes.”
— PMI Agile Practice Guide (2017)
In my opinion, you don’t evolve to agility. There are too many deep seated beliefs and organizational processes that support and reinforce predictive/waterfall ways of working.
You don’t gradually adopt the 12 agile principles. How would you even do that? Would you set a goal to adopt just two or three principles per quarter?
I actually had a client that stated that they would like to use 10 of the 12 agile principles. Wait, what?
An evolutionary approach is hopeful but misguided and ultimately it will fall short. Plus, an evolutionary approach will incur the cost of change (no matter how gradual) without gaining the benefits. This makes it easy for the naysayers and resistors to say, “see, I told you agile wouldn’t work here”.
It would be far better to start with a small project well suited to adaptive methods, like a customer portal or website launch. Form a pilot team and use the Scrum Framework as it is, without modification. Run a true pilot and give it a fair shot. Then use those results to determine how to proceed.
Using a blended approach represents a half measure. You have one foot firmly in each camp. You get the cost and not the benefit.
Instead, do one of two things. If it makes sense for you, stay with predictive. Change is slow, costly, and disruptive. Stick with what you know.
Or, go all in on agile. Start with a pilot, but do a real pilot.
Don’t take half measures. Don’t rebrand things as agile unless you are prepared to make a change. You will only do yourself and the organization a disservice. Everyone in your organization will be able to tell the difference.
Some organizations have had success by leveraging different methods or modes for different projects. This is different from blending. Bi-modal means having different ways of executing projects and choosing the one that makes sense on a project-by-project basis.
One of my clients is successfully using bi-modal approaches. They have a playbook and process guide for traditional/predictive projects and a different playbook for using agile. They don’t blend the two within one project, though they do have programs that include a mix of agile and predictive projects.
If you are using water-Scrum-fall to deliver, make an honest assessment of how it is working. Many organizations have found that using water-Scrum-fall means taking 18 to 24 months or more to go from customer request to satisfaction of that request. For most organizations today, that is unacceptable.
A technique that organizations have found helpful is to use value stream analysis to evaluate and improve their process. This involves mapping out all the steps from customer request to the point where you’ve delivered the solution to that customer. For each step, you look at the value-added (from the customer perspective), time to execute, quality, and delays and waiting time. [You can read more about using value stream analysis here.]
Value stream mapping can be quite revealing. When I’ve conducted this with clients, we found that even a simple project can take up to 24 months. The actual development time was less than 2 months; the rest of the time was spent in upfront planning, justification, and committee meetings, or in long test and deployment cycles after the solution was built.
Long delivery cycles have a knock-on effect: after 24 months, change is almost certain and the original request may no longer satisfy the customer. So we waste precious company resources on work that is no longer valuable.
No matter what approach you are using – predictive, agile or what you call hybrid agile – stay focused on achieving the business goals and satisfying the customer.
Many people lose sight of the achievement of the business goals and just focus on delivering to the plan. This is especially prevalent with predictive approaches. And traditional status reporting in organizations reinforces this approach.
The most recent PMI Pulse of the Profession showed that 1 in 3 projects do not meet the original business goal. If you don’t meet the goal, does it matter if it is on time and on budget?
Further, organizations lack appropriate methods to track whether they are achieving, or are going to achieve, the benefits of a project. One recent client pushed back when I suggested that they incorporate benefits tracking into their PPM tool, saying that executives were not interested in tracking benefits. They had invested heavily in tracking budget, scope, and timeline, and in stoplight reporting, but nothing in determining whether they were achieving the business goal of the project!
And that is not an isolated incident. PMI recognized this lack of focus on benefits tracking in the 2018 Pulse of the Profession:
“Despite the proven value of benefits management, our Pulse data reveals that a staggering 83 percent of organizations lack maturity with benefits realization.”
— PMI 2018 Pulse of the Profession
This is the problem with the belief system that your job is to deliver to the plan, rather than achieve the business goal.
I hope that this exploration of hybrid agile and blended approaches is helpful to you. I welcome your comments – please share your own experience with blended approaches.