Posts Tagged ‘Plan-driven’

A Historical View to Organizing Software Development [updated version]


When Alan Turing wrote his paper “On Computable Numbers, with an Application to the Entscheidungsproblem” in 1936, he laid the groundwork for computer science. The first “universal machines” were developed at Bletchley Park to decipher the Germans’ encryptions during the Second World War.

Between the 40s and the 70s, computer science was more of a scientific instrument than a well-established business technology. It wasn’t until IBM and Apple introduced the PC and the Macintosh that computing began to spread outside the scientific institutions, making its way to the largest companies. It was at this time that software development really started to take off, and many of the software companies that are large today were founded.

Up until the 70s, programs were often quite simple and were operated only by those who created them. But as systems grew larger, it became more difficult both to develop them and to organize the development work.

In 1970, a director at the Lockheed Software Technology Center, Dr. Winston W. Royce, published a paper entitled “Managing the Development of Large Software Systems: Concepts and Techniques”. Dr. Royce presented a more structured method for organizing software development, inspired by the way fields like civil engineering and manufacturing organized their work.

The basic idea is that everything is done in sequential phases: you need to understand everything in one phase before you can start the next. If you change your mind in a later phase, it will be costly and make it hard to finish the project on time. First you need to understand all the requirements, then you do all the design (“big design up front”), and so on.

Each phase was handled by specialized groups like business analysts (for defining the requirements), system analysts (for designing the programs), programmers (for developing applications), testers (for testing applications) and deployment personnel (for overseeing operations). These groups communicated mostly in writing, and handed over work from group to group.

Managing software development with the Waterfall model (which I discuss later in this section) means investigating what the system is supposed to do, making plans so that it does what it is supposed to do, and sticking to that plan. This model, however, had its setbacks. First, people learned a lot between the time the first requirements were written and the time the system went into production and was used by real users, and it was difficult to take advantage of what was learned along the way.

Second, it often took a long time between the requirements phase and the user feedback phase. If you didn’t figure out what the users wanted, or the users themselves didn’t know what they wanted, more time and money had to be spent to change or adapt the system to the users’ needs.

In defense of Royce, it is fair to say that he actually did warn that these things could happen and therefore proposed an iterative way of working. But no one adopted that part of his model, and that is how it came to be called the Waterfall.

When the US Department of Defense needed a software development process, they looked at Royce’s paper and they adopted a part of it (unfortunately they adopted the worst part) and named it DOD-STD-2167 (Department of Defense Standard 2167).

When NATO later needed a model, they reasoned that if this was the best model the US military could find, it ought to be adopted. From there, more and more people adopted the theories of the Waterfall. Even though the US Department of Defense changed the standard in 1995, it remains the basis of what the academic world teaches to this day.

The rise of plan-driven methodologies
In the 80s and early 90s, a myriad of new methodologies were invented that focused on design. They gained popularity at the same speed that object-oriented programming languages like C++, Ada and Smalltalk gained practitioners.

Naturally there were design methods before this time, even object-oriented ones, but the popularity of C++ created the need for a new approach. Most earlier design methodologies were data-driven and/or functional in nature, and they proved inadequate when programming in an object-oriented language.

Methodologies that became popular were Rumbaugh’s OMT, Booch, Coad-Yourdon, Jacobson’s OOSE and Shlaer-Mellor, to name a few. All were quite good in certain areas, but none seemed to cover the whole design process. Each methodology had its own notation and often concentrated only on the sequence of events in a system.

Because of this it was hard to use only one method; developers adapted their favorite one and added other tools into their own hybrid design methodology, splintering the industry even more. The so-called “Method Wars” arose, and people argued endlessly about the pros and cons of their adapted methodologies.

But in the mid-90s, three method creators, Jim Rumbaugh, Grady Booch and Ivar Jacobson, joined forces at Rational, a company specializing in design tools. They became known as the “three amigos”. They declared the “Method Wars” over and soon came out with the first release of the Unified Modeling Language (UML).

The RUP (Rational Unified Process) and other methodologies from this time were plan-driven but iterative. The criticism was mostly that they were too document-focused: you still had to understand the whole problem before starting the next step, everything had to be documented, and this created a large overhead that did not add business value. Large, complex problems were documented and explained rather well, but small, simple problems became just as big to administer.

The rise of agile methodologies
During the late 90s, people started to react against the plan-driven models. Many were frustrated and demanded that developers become more responsive to business needs and adapt better to the knowledge gained during a project. Plan-driven methodologies were considered bureaucratic, slow, demanding, and inconsistent with the way software developers actually perform effective work.

New ways of working like XP, Scrum, DSDM, ASD, Crystal, FDD and Pragmatic Programming were developed as alternatives to documentation-driven, heavyweight software development processes, which lock in a construction plan in the beginning phase and stay with that plan. Many of the agile methodologies were inspired by “The New Product Development Game”, an article written by Hirotaka Takeuchi and Ikujiro Nonaka and published in the Harvard Business Review in 1986. This article is often used as a reference and could be considered the birth of agile methodologies.

In February 2001, a group of methodology developers met at a ski resort in Utah to talk, ski, relax, and try to find common ground on what they were trying to accomplish. The output of that weekend became known as the “Manifesto for Agile Software Development”.

Another movement that has gained substantial support in the organization of software development in the 21st century is Lean Software Development, with Mary and Tom Poppendieck as its main figures. Lean is based on systems thinking, which sees an organization as a system. The system should fulfil a clear, customer-focused purpose as productively as possible. The Poppendiecks point out that your purpose is probably not to develop software: your organization’s customers want their demands satisfied or their problems fixed, and if they could solve those problems without software, they would be delighted. The way Lean works is to analyze the system in which the software will be used, and how to produce it as productively as possible, with a focus on adaptive behavior, knowledge acquisition and knowledge workers.

To summarize the history of organizing software development, I will use the words of agile guru Martin Fowler: “from nothing, to monumental, to Agile”.


Plan-driven versus Agile = Predictive versus Adaptive


The inspiration for plan-driven methodologies came from other engineering disciplines such as civil and mechanical engineering. In civil engineering, it is less costly to change requirements during the design stage and it is more expensive to adapt to changes when construction has already started.

Therefore a lot of energy is put into the planning phase. When the drawings are specified and finalized, it is reasonably easy to predict the schedule and budget for the rest of the process, because both the requirements and the technology are known. Once we have the construction plan, the construction is more predictable. The nature of these projects is to resist changes, because making changes after construction has started costs too much.

Software development is different. There is no guarantee that a good design will make construction predictable. In civil engineering, the time spent on construction design is often less than 10% of the project’s total budget. In software development, it is common that over 50% of the total budget is spent on design and understanding requirements. When we adopt the civil engineer’s approach, anything that wasn’t planned causes a problem: we need to reach (B), the goal that was set when the project plan (time, money and scope) was approved.
[Figure: Predictive vs Adaptive]

Construction is less costly in software development, so why not adapt to change as we learn more during the process? Changes in the requirements while a system is being developed reflect increased information about the system, and that additional information enables better decisions. This is how competitive advantage is gained.

If we use a methodology that allows us to change, we can act on what we learn without incurring major costs, so it would be a mistake not to make the required changes. If our methodology allows this, the output (the system or product) will be worth more than the product we would have had from the original design: (C) >= (B).

A Different View of Success

The plan-driven methodologies defined success as delivering the agreed functions on time and on budget. But agile methodologies promote the idea that there is something nobler to strive for, because sometimes the customer remains dissatisfied even when he gets everything he asked for! This is because he got (B), but at the end of the project he now wants (C).

Customers also adapt, and if they change their mind about what they want and you prefer not to make the requested changes, problems arise: either you or the customer will be disappointed. Success for agile methodologies builds on knowing more and adapting to new goals, which gives the customer what he actually wants at the end of the project (C).

Conclusion

If customers don’t know what they want until the project ends, how can you design the system at the beginning, before that understanding exists? And if your design isn’t good enough to predict the rest of the work, it isn’t such a good idea to do too much design at the beginning of the project.

And if you cannot establish the requirements, you cannot make a good design, and in turn you cannot get a predictable plan. The agile approach is to welcome change: if your assumptions are proven wrong, you redefine the goal and set new priorities. At the end of the project journey, this new goal (C) will be worth more than the goal that was first defined (B).

Because the customer and stakeholders are allowed to change their minds as they learn more, the value of (C) becomes greater than (B). This, of course, builds on the assumption that you manage to deliver (C) within the time and budget restrictions.

The conclusion of 2 be or not 2 be


A Historical View to Organizing Software Development [old obsolete version]


THERE IS A NEW VERSION OF THIS POST HERE!

When Alan Turing wrote his paper “On Computable Numbers, with an Application to the Entscheidungsproblem” in 1936, he laid the groundwork for computer science. The first “universal machines” were developed at Bletchley Park to decipher the Germans’ encryptions during the Second World War.

Between the 40s and the 70s, computer science was more of a scientific instrument than a well-established business technology. It wasn’t until IBM and Apple introduced the PC and the Macintosh that computers became widely spread outside the scientific institutions and the largest companies. It was then that software development really started to take off, and a lot of the software development companies that are large today were established.

Up until the 70s, programs were often quite simple and were operated only by those who created them. But as systems grew larger, it became more difficult both to develop them and to organize the development work.

In 1970, a director at the Lockheed Software Technology Center, Dr. Winston W. Royce, published a paper entitled “Managing the Development of Large Software Systems: Concepts and Techniques”. Dr. Royce presented a more structured method for organizing software development, inspired by the way engineering fields like civil engineering and manufacturing organized their work.

The basic idea is that everything is done in sequential phases. This means that every phase of a project must be completed before the next phase can begin. First, all the requirements are documented, then everything is designed up front (the so-called “big design up front”), and so on through the remaining stages.

Each phase was handled by specialized groups like business analysts (for defining the requirements), system analysts (for designing the programs), programmers (for developing applications), testers (for testing applications) and deployment personnel (for overseeing operations). These groups communicated mostly in writing, and handed over work from group to group.

Managing software development with the waterfall model means investigating what the system is supposed to do, making plans so that it does what it is supposed to do, and sticking to that plan. This model had its setbacks. First, people learned a lot between the time the first requirements were written and the time the system went into production and was used by real users, and it was difficult to take advantage of what was learned along the way.

Another setback was that it often took a long time between the requirements phase and the user feedback phase. If you didn’t figure out what the users wanted, or the users themselves didn’t know what they wanted, more time and money had to be spent to change or adapt the system to the users’ needs.

In defense of Royce, it is fair to say that he actually did warn that these things could happen and therefore proposed an iterative way of working. But no one adopted that part of his model, and that is how it came to be called the Waterfall.

When the US Department of Defense needed a software development process, they looked at Royce’s paper and they adopted a part of it (unfortunately they adopted the worst part) and named it DOD-STD-2167 (Department of Defense Standard 2167).

When NATO later needed a model, they reasoned that if this was the best model the US military could find, it ought to be adopted. From there, more and more people adopted the theories of the waterfall. Even though the US Department of Defense changed the standard in 1995, it remains the basis of what the academic world teaches to this day.

During the 90s, new methodologies were developed as a reaction to the plan-driven waterfall model. Plan-driven methodologies follow either the waterfall model or an iterative model, like RUP (the Rational Unified Process).

The plan-driven methodologies had something in common: they settle on a construction plan in the beginning phase and stay with that plan. They were considered bureaucratic, slow, demanding, and inconsistent with the way software developers actually perform effective work.

Many people were frustrated and demanded that developers become more responsive to business needs and adapt better to the knowledge gained during a project.

New methodologies like XP, DSDM and Scrum conquered the world of software development in the 21st century. Many of the agile methodologies were inspired by “The New Product Development Game”, an article written by Hirotaka Takeuchi and Ikujiro Nonaka and published in the Harvard Business Review in 1986. This article is often used as a reference and could be considered the birth of agile methodologies.

To summarize the history of organizing software development, I will use the words of agile guru Martin Fowler: “from nothing, to monumental, to agile”.