Posts Tagged ‘Organization of Software Development’

Craftsmanship, Industrialization and Knowledge-Based Work


Up until the nineteenth century, most people were engaged in agriculture; others learned a craft like carpentry. They were organized in small communities and worked together from sowing seeds to harvesting, milling, and baking. To be successful they needed to learn and master a specific craft that was passed from person to person, often within the same family. Occasionally, a true master of the community would leave this comfort zone to explore things that no one had dared to explore. Most of the time these ventures ended in failure. But sometimes the adventurer would stumble upon something that opened new doors, for themselves and eventually for future generations as well.

As late as 1870, 80% of Europe’s population worked in agriculture. In 2010, less than 0.8% of Europeans worked in farming. This represents an enormous productivity gain in farming. During the 18th and 19th centuries, a major change happened: manufacturing shifted from manual labor toward machine-based production. It started with the mechanization of the textile industries and the introduction of steam power. This also changed the organization of work: people specialized in a particular phase of the work instead of being involved in the whole process. Building on this kind of phase-based specialization, Henry Ford increased productivity in car manufacturing by introducing assembly lines for mass production.

To make manufacturing as efficient as possible, managers were needed to oversee the whole process. It was during this era that the first management theories emerged and business schools began teaching what Mary Parker Follett (1868–1933) termed “the art of getting things done through people”.

 

Lessons Learned from Craftsmanship & Industrialization

Craftsmanship builds on experience and can lead us in the right direction, but experience will only take us so far into uncharted territory. Beyond that point, we must take what we started with and rely on controlled methodologies and engineering tools. Engineering is the application of tools and methodologies to handle the unexplored.

 

Craftsmanship is critical to knowledge-based work in terms of providing quality, whereas in industrialization quantity was the primary measurement. Alan Cooper, author of “The Inmates Are Running the Asylum” and the “father of Visual Basic”, said the following about craftsmanship:

 

“Craftsmanship is all about quality – it’s all about getting it right, not getting it fast. It’s measured by quality, not speed. It’s a pure measurement, and a delightful one.”

“Craftsmen do it over and over again until they get it right. In their training, they build things over and over so they get the experience they need to get it right.”

 

I do not fully agree with Alan Cooper, because good craftsmanship must strike a balance between quality and time/cost. Craftsmanship demonstrates the value of personal competence, but this also has a disadvantage. I’d like to quote one of my favorite bloggers, Joel Spolsky, on his view of craftsmanship: “Craftsmanship is, of course, incredibly expensive. The only way you can afford it is when you are developing software for a mass audience. Sorry, but internal HR applications developed at insurance companies are never going to reach this level of craftsmanship because there simply aren’t enough users to spread the extra cost out.”

 

A craftsman takes pride in his profession, his experience, and his tools. Because his performance is based on his personal competence, he prioritizes mastering his tools, upgrading them, and improving his work methods in an evolutionary way. Craftsmen prioritize continuous improvement and appreciate quality because they know their product becomes more valuable when done the right way. They know that quick fixes rarely succeed in the long run, because remedying them later can mean more work and less profit.

The most important lesson learned from industrialization was that deterministic goals and processes are best achieved by investing in structural capital: developing processes and routines and establishing them. This applies especially to a process that is 10% design/analysis and 90% production, where a changed requirement after production has begun can be very expensive. Software development of products and unique systems isn’t like this: design isn’t just one thing you do before production starts.

For example, when building a bridge, you can’t consider adding a few new highway lanes once construction has started. Complex software development, by contrast, isn’t a deterministic process; it is closer to 50% design (where you figure out what to do and where to go) and 50% production. Following a plan is therefore not the ultimate solution. The key is to be prepared and to know your domain and your tools, so that when new knowledge presents itself you can seize the opportunity.
 
 
Conclusions

Throughout history, the success factors for work have changed. For centuries they were individually mastered methods of efficiency and quality, with craftsmanship learned and handed down from generation to generation. During the 18th and 19th centuries, the industrial revolution changed the key success factors into pre-defined processes that were mechanized, automated, and made as efficient as possible.

 

In a world that changes every day, differentiation and uniqueness come from transforming information into products through creativity and knowledge. The most important consideration for knowledge-based companies is how good they are at acquiring the best information and then transforming it into products or services. And in a complex world where goals are uncertain and possibilities must be explored, success depends increasingly on collaboration, creativity, and knowledge.


A Historical View of Organizing Software Development [updated version]


When Alan Turing published his paper “On Computable Numbers, with an Application to the Entscheidungsproblem” in 1936, he laid the groundwork for computer science. The first “universal machines” were developed at Bletchley Park to break the Germans’ ciphers during the Second World War.

Between the 1940s and the 1970s, computing was more a scientific instrument than a well-established business technology. It wasn’t until IBM and Apple introduced the PC and the Macintosh that computers began to spread beyond the scientific institutions and the largest companies. It was at this time that software development started to take off, and many of today’s large software companies were founded.

Up until the 1970s, programs were often quite simple and were operated only by the people who created them. But as systems grew larger, it became more difficult both to develop them and to organize software development.

In 1970, Dr. Winston W. Royce, a director at the Lockheed Software Technology Center, published a paper entitled “Managing the Development of Large Software Systems: Concepts and Techniques”, in which he presented a more structured method for organizing software development. The technique was inspired by the way fields like civil engineering and manufacturing organized their development.

The basic idea is that everything is done in sequential phases: you need to understand everything in one phase before you can start the next. First you need to understand all the requirements, then do all the design (“big design up front”), and so on. If you change your mind in a later phase, it will be costly and make it hard to finish the project on time.

Each phase was handled by specialized groups like business analysts (for defining the requirements), system analysts (for designing the programs), programmers (for developing applications), testers (for testing applications) and deployment personnel (for overseeing operations). These groups communicated mostly in writing, and handed over work from group to group.

Managing software development with the waterfall model (more on the name below) means investigating what the system is supposed to do, making plans so that it does what it is supposed to do, and sticking to that plan. This model, however, had its setbacks. First, people learned a lot between writing the initial system requirements and the moment the system went into production and was used by real users, and it was difficult to take advantage of what was learned along the way.

Second, it often took a long time between the requirements phase and user feedback. If you hadn’t figured out what the users wanted, or the users themselves didn’t know what they wanted, more time and money had to be spent changing or adapting the system to the users’ needs.

In defense of Royce, it is fair to say that he actually warned that these things could happen, and he therefore proposed an iterative way of working. But no one adopted that part of his model. That is how the approach came to be called the waterfall.

When the US Department of Defense needed a software development process, they looked at Royce’s paper, adopted a part of it (unfortunately the worst part), and named it DOD-STD-2167 (Department of Defense Standard 2167).

When NATO later needed a model, they reasoned that if this was the best model the US military could find, then it ought to be adopted. From there, more and more people adopted the theories of the waterfall. Even though the US Department of Defense changed the standard in 1995, it remains the basis of what the academic world teaches to this day.

The rise of plan-driven methodologies
In the 1980s and early 1990s, a myriad of new methodologies focusing on design were invented. These gained popularity at the same speed that object-oriented programming languages like C++, Ada and Smalltalk gained practitioners.

Naturally there were design methods before this time, even object-oriented ones, but the popularity of C++ created the need for a new approach. Most earlier design methodologies were data-driven and/or functional in nature, and they proved inadequate for programming in an object-oriented language.

Methodologies that became popular included Rumbaugh’s OMT, Booch, Coad–Yourdon, Jacobson’s OOSE and Shlaer–Mellor, to name a few. All were quite good in certain areas, but none seemed to cover the whole design process. Each methodology had its own notation and often concentrated on only one aspect of a system, such as its sequence of events.

Because of this it was hard to use only one method: developers adopted their favorite one and added other tools to form their own hybrid design methodology, splintering the industry even more. The so-called “Method Wars” arose, and people argued endlessly about the pros and cons of their adapted methodologies.

But in the mid-1990s, three methodology creators, Jim Rumbaugh, Grady Booch and Ivar Jacobson, joined forces at Rational, a company specializing in design tools. They became known as the famous “three amigos”. They declared the “Method Wars” over and soon released the first version of the Unified Modeling Language (UML).

RUP and the other methodologies of this era were plan-driven but iterative. The criticism was mostly that they were too document-focused: you still needed to understand the whole problem before starting the next step, and everything had to be documented, which created a large overhead that added no business value. A large, complex problem was documented and explained rather well, but a simple, less complex problem became just as big to administer.

The rise of agile methodologies
During the late 1990s, people started to react against the plan-driven models. There was growing frustration and demand that developers become more agile in responding to business needs, adapting better to knowledge gained during the project. Plan-driven methodologies were considered bureaucratic, slow, demanding, and inconsistent with the way software developers actually perform effective work.

New ways of working like XP, Scrum, DSDM, ASD, Crystal, FDD and Pragmatic Programming were developed as alternatives to documentation-driven, heavyweight software development processes. What these new methodologies had in common was that, instead of constructing a plan in the beginning phase and sticking to it, they adapt to what is learned along the way. Many of the agile methodologies were inspired by “The New Product Development Game”, an article written by Hirotaka Takeuchi and Ikujiro Nonaka and published in the Harvard Business Review in 1986. The article is often used as a reference and could be considered the birth of agile methodologies.

In February 2001, a group of methodology developers met at a ski resort in Utah to talk, ski, relax, and try to find common ground on what they were trying to accomplish. The output of that weekend became known as the “Manifesto for Agile Software Development”.

Another movement that has gained substantial support in the organization of software development in the 21st century is Lean Software Development, with Mary and Tom Poppendieck as its main figures. Lean is based on systems thinking, which sees an organization as a system that should fulfill a clear customer-focused purpose as productively as possible. The Poppendiecks argue that your purpose is probably not to develop software: your organization’s customers want their demands satisfied or their problems fixed, and if they could solve those problems without software, they would be delighted. Lean therefore analyzes the system in which the software will be used, and how to produce the software as productively as possible, with a focus on adaptive behavior, knowledge acquisition and knowledge workers.

To summarize the history of organizing software development, I will use the words of agile guru Martin Fowler: “from nothing, to monumental, to agile”.

