The way of working history – part 4/5 – software projects have problems

This is the fourth part of a series of blog posts regarding the way of working history.

I believe many of you have already heard of the problems that large software projects experienced when software development was introduced. Problems that have no equivalent in hardware development. In fact, it was sometimes said that for software development, you should multiply the time by pi and the cost by pi squared! This led to an update of the Cone of Uncertainty by Barry Boehm in 1981 [1], where he stated an estimate variance of 0.25x to 4x for software development projects [1]. Steve McConnell, who also coined the expression Cone of Uncertainty in 2006 [1], states that the decrease of estimate variance towards the end of the project cannot be guaranteed. The updated Cone of Uncertainty with values for software product development looks like this:


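To make the variance figures above concrete, here is a minimal sketch of the arithmetic behind the cone: a point estimate made at project start carries the 0.25x to 4x multipliers quoted from Boehm. The function name and the 12-month example estimate are my own illustration, not something from the original figures.

```python
def estimate_range(estimate, low_factor=0.25, high_factor=4.0):
    """Return the (low, high) bounds around a point estimate,
    using the start-of-project Cone of Uncertainty multipliers."""
    return (estimate * low_factor, estimate * high_factor)

# A 12-month point estimate made at the very start of a software
# project could, per the cone, land anywhere from 3 to 48 months:
low, high = estimate_range(12)
print(low, high)  # → 3.0 48.0
```

The 16x spread between the bounds is exactly why a fixed contract written at this point in the cone is such a gamble.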
Hardware projects doing Technology Push, normally with pre-development not being part of the project, started many decades earlier, and of course they also faced their problems, still to this day, especially now when more speed is needed. But even though silos with KPIs are inappropriate for high speed, due to the sub-optimisation inherent in KPIs, the problems for hardware projects were never of the same magnitude as for software projects, which normally were Consumer Pull. Important to add – software development used the same waterfall way of working.

From my own software engineering days, the easy answer is that software is more complicated. It has more degrees of freedom and is therefore also more prone to errors. The architecture is ‘hidden’ and can be your own entanglement, and it is much easier to skip an integration point if you are late. Other things that are considered today are Daily Build, Continuous Everything, etc. In hardware you know that you need several prototypes with integration points, due to the complexity of following legal aspects like EMC, lightning and mechanical issues, IP, but also visible quality for example. Your architecture is directly visible as well, and the customers don’t like an entanglement of cables. But in software development it is possible to think that maybe this time we won’t have the big bang when we only do the very last integration. Because this time is different; our specifications are better and our staff don’t make any mistakes anymore. Right.

Another answer that we have already touched on is that hardware development is often more of a Technology Push and software development more of a Consumer Pull. And it makes a big difference if you make this distinction.

Because the easy answers above, and the problems the waterfall way of working had in meeting the needed speed for Technology Push, are all about efficiency: you need to do things right in the project, since you already know what to do (pre-development and innovations are already done). Consumer Pull also brings a lot of effectiveness uncertainties inside the project: you need to do the right thing, because the requirements are not always settled with the customers. Consumer Pull was there almost from the beginning for software development, but not to the same degree for hardware development, even though it is becoming more and more common in today’s market.

So, at the start-up of large software projects, it was not only that a new technology was sometimes in the picture; we also had immaturity on both the software development side and the customer side. The customer side didn’t know what to ask for, and the software development side didn’t know that the customer side didn’t know what to ask for. We can also add the end user, to make it more complicated, but you get the point. And if we are not clear on our customers’ needs, how on earth can we then choose technology? And then, with all this uncertainty, a contract was made. And contracts must be kept. In the Cone of Uncertainty it will look like this when comparing hardware and software development projects:


This means that from the beginning of the software development projects, we had not only uncertainty but also complexity in the picture, since the requirements were not yet set. Most of the time this meant only increased uncertainty, but it could also mean that at the beginning of the software projects we needed to start gaining knowledge over a long period of time in order to know what to do. And with the long phases of the waterfall way of working, a considerable amount of time then passed. This complexity increased the uncertainty, but if the complexity could not be resolved fast enough, it also meant that more of the projects failed, which indeed was the case, even though it became better over the years, mostly for smaller projects [2]. If we also consider the above about complexity, an update of the Cone of Uncertainty would look like this:


We have a red area where it is not healthy to be, and if we make a comparison with the Cynefin™ framework [3], it means that we believe we are in the ordered domains, but in reality we are in the Complex domain. This goes hand in hand with the fact that the estimate variance of 0.25x to 4x for software development projects is only the best-case scenario, meaning that with complexity out of control, it will easily be worse. Below the x-axis the red colour is faded, since it is more likely that the complexity will add considerable time, even though a disruptive innovation could sometimes be a game changer.

When we have complexity, we need to have enough cross-functional competence in the same boat, as well as the customer and end user from the beginning, work trans-disciplinarily and do as short iterations as possible in order to gain knowledge. And then we make the perfect specification and our programmers make no mistakes, so we can follow the waterfall way of working. I would be happy if I was only joking here. But this was, and still is, too common if you did not want to change the waterfall way of working. Very few companies at that time did, but fortunately more and more companies today are changing their mindset. This gives us the following picture:

The picture makes no sense. First you learn that you need to sit everybody together, which hopefully decreases some complexity, but then you still don’t talk to each other in the coming phases. But waterfall was the current best way of working, and perfect for contracts written in stone, with all their specifications up-front. And of course everything needs to follow the plan, no matter what. This was better than nothing, but of course not good enough, with considerable time and cost overruns as the aftermath.

Now we need to put everything together, but we must not forget how the market has changed up until today. It is all wrapped up in the next blog post, part five and the last one in this series about the way of working history. C u then.

 

References:

[1] Boehm, Barry. The Funnel Curve. The Cone of Uncertainty.
https://en.wikipedia.org/wiki/Cone_of_Uncertainty

[2] The Standish Group. CHAOS Report 2015.
https://www.infoq.com/articles/standish-chaos-2015

[3] Snowden, Dave. The Cynefin™ framework. Link copied 2018-10-04:
https://www.youtube.com/watch?v=N7oz366X0-8
