Archive for September, 2011

Agile means that you are Business Value-Oriented

2 Comments »

Traditional plan-driven methods focus on keeping to the timetable. Agile methodologies focus on consistently delivering maximum business value. New information and knowledge about a problem are more valuable than the assumptions the original plan was based on; therefore, Agile methodologies encourage change and try to make changing requirements and systems less painful. If the new knowledge is better, let's adapt to it instead of sticking to a plan that is not based on it. To do this, you need to adopt certain practices; otherwise you face too many risks. For example, you need a lot of automated tests to verify that changes introduce no new bugs.

Don’t Waste Money on Features Never Used

If you're working with the waterfall model and are in the requirements capture phase, you tell the stakeholders, "Give me all your requirements now or it will cost you much more later if you change your mind". Not planning everything from the start will entail additional costs later. The next phase is the design phase, in which all the drawings of what's to be built are produced. If new things are learned during the construction phase, going back and changing EVERYTHING becomes too expensive for the customer. So what does the poor customer do? He tells you everything he might ever need, for fear of not getting what he wants. In a 2002 report, The Standish Group investigated failed projects to see how much the built-in features were actually used. They found that 20% of the features were used either always or often. The interesting part was that 45% of the features were never used. There could be many explanations for why features are never used in a system or product, but if you ask stakeholders to give you all the requirements up front, you end up with a lot of things that will never be used. If you deliver work in small pieces, work on the highest priority first, and allow stakeholders to re-prioritize, this problem can be addressed.


Top 10 posts on my blog!

Comments Off on Top 10 posts on my blog!

I have looked at the statistics on what people read on my site, and these are currently my most read posts:

Happy reading!


SECI: a Model of Dynamic Organizational Knowledge Creation

2 Comments »

To understand how knowledge is spread throughout an organization, we need to understand the SECI model by Prof. Ikujiro Nonaka (Hitotsubashi University). When Prof. Ikujiro Nonaka introduced the SECI model (Nonaka & Takeuchi 1996), it became the cornerstone of knowledge creation and knowledge transfer theories. He proposed four ways to combine and convert knowledge types, showing how knowledge is shared and created in organizations. The model is based on two types of knowledge – explicit knowledge and tacit knowledge. Explicit knowledge is visible knowledge; it is easily explained, quantified and documented. Tacit knowledge is unseen and grows through habits and hands-on work, and it is not easy to share or document.
The SECI model
The model also consists of four processes: Socialization, Externalization, Combination and Internalization.

  1. Socialization
    This process focuses on tacit to tacit knowledge transfer. It happens when knowledge is passed on through practice, guidance, imitation and observation. This is when someone who is learning a new skill can interact with a more experienced person, ask questions and observe. It occurs in traditional environments where, for example, a son learns the craft of woodworking from his father by working with him (rather than by reading books or manuals on woodworking).
     
  2. Externalization
    This process focuses on tacit to explicit knowledge transfer. Externalization is about making an internal understanding more tangible, for example by writing documents and manuals, so that the knowledge can spread more easily through the organization. Externalization is good at distributing knowledge for repetitive work or processes: an expert describes the different steps so that readers can understand "if this happens, do the following in order to succeed".
     
  3. Combination
    The process of combination is about combining and transforming explicit knowledge into new explicit knowledge. A typical case is when a financial department collects financial information from all departments and consolidates it to provide an overall profile of the company.

     
     
  4. Internalization
    The process of internalization is about transforming explicit knowledge into tacit knowledge. By reading books and manuals or searching the web, explicit knowledge can be learned and gradually turned into personal, tacit knowledge.

There is a spiral of knowledge in this model, where explicit and tacit knowledge interact in a continuous process. This process leads to the creation of new knowledge. The central idea of the model is that knowledge held by individuals is shared with other individuals and combined into new knowledge. The spiral of knowledge, or the amount of knowledge, grows with every round through the model.

The basis of all change is that the need for change is known and communicated. If it's not on the agenda, it will probably not be valued. If spreading knowledge is important to your organization, talk about it with the people involved.


What is the difference between Scrum Master and Scrum Product Owner?

Comments Off on What is the difference between Scrum Master and Scrum Product Owner?

In Scrum, there are basically three roles: Product Owner, Scrum Master and Team Member. The purpose of the Product Owner role is to connect the product line with product development. As a Product Owner in Scrum, you control the project or development work yourself instead of relying on a project manager, and you have a coach (the Scrum Master) who helps the team work efficiently and supports collaboration within and outside the team. Compared with the OGC model of Steering Committee – Project Manager – Team Leader, Scrum roughly maps to Management – Product Owner – Scrum Master.


A Historical View to Organizing Software Development [updated version]

1 Comment »

When Alan Turing wrote his paper "On Computable Numbers, with an Application to the Entscheidungsproblem" in 1936, he laid the groundwork for computer science. The first "universal machines" were developed at Bletchley Park to decipher the Germans' encrypted messages during the Second World War.

Between the 1940s and the 1970s, computing remained more of a scientific instrument than a well-established business technology. It wasn't until IBM and Apple introduced the PC and the Macintosh that computers began to spread outside the scientific institutions, making their way into the largest companies. It was at this time that software development started to take off and we saw the birth of large software companies.

Up until the 70s, programs were often quite simple and were operated only by those who created them. But as systems became larger, it became more difficult to develop software and to organize the development work.

In 1970, a director at the Lockheed Software Technology Center, Dr. Winston W. Royce, published a paper entitled "Managing the Development of Large Software Systems: Concepts and Techniques". In it, Royce presented a more structured method for organizing software development, inspired by the way fields like civil engineering and manufacturing organize their work.

The basic idea is that everything is done in sequential phases: you need to understand everything in one phase before you can start the next. If you change your mind in a later phase, it will be costly and hard to finish the project on time. First you capture all the requirements, then you do all the design (big design up front), and so on.

Each phase was handled by specialized groups like business analysts (for defining the requirements), system analysts (for designing the programs), programmers (for developing applications), testers (for testing applications) and deployment personnel (for overseeing operations). These groups communicated mostly in writing, and handed over work from group to group.

Managing software development with the Waterfall Model (I discuss this model later in this section) means investigating what the system is supposed to do, making plans so that it does what it is supposed to do, and sticking to that plan. This model, however, had its setbacks. First, people learned a lot between capturing the initial system requirements and the moment the system went into production and was used by real users, and it was difficult to take advantage of what was learned along the way.

Second, a long time often passed between the requirements phase and the user feedback phase. If you didn't figure out what the users wanted, or the users themselves didn't know what they wanted, more time and money had to be spent to change or adapt the system to the users' needs.

In Royce's defense, it is fair to say that he actually warned that these things could happen, and he therefore proposed an iterative way of working. But no one adopted that part of his model, and that is how it came to be called the Waterfall.

When the US Department of Defense needed a software development process, they looked at Royce's paper and adopted a part of it (unfortunately the worst part) and named it DOD-STD-2167 (Department of Defense Standard 2167).

When NATO later needed a model, they reasoned that if this was the best model the US military could find, then it ought to be adopted. From there, more and more people adopted the Waterfall theories. Even though the US Department of Defense replaced the standard in 1995, it remains the basis of what the academic world teaches to this day.

The rise of plan-driven methodologies
In the 1980s and early 1990s, a myriad of new methodologies focusing on design were invented. These gained popularity at the same speed that object-oriented programming languages like C++, Ada and Smalltalk gained practitioners.

Naturally, there were design methods before this time, even object-oriented ones, but the popularity of C++ created the need for a new approach. Most earlier design methodologies were data-driven and/or functional in nature, and when programming in an object-oriented language they were found to be inadequate.

Methodologies that became popular were Rumbaugh's OMT, Booch, Coad-Yourdon, Jacobson's OOSE and Shlaer-Mellor, to name a few. All were quite good in certain areas, but none seemed to cover the whole design process. Each methodology had its own notation and often concentrated only on one aspect, such as the sequence of events in a system.

Because of this it was hard to use only one method; developers adopted their favorite method and added other tools into their own hybrid design methodology, splintering the industry even more. The so-called "Method Wars" arose, and people argued endlessly about the pros and cons of their adapted methodologies.

But in the mid-1990s, three design method creators – Jim Rumbaugh, Grady Booch and Ivar Jacobson – joined forces at Rational, a company specializing in design tools. They became known as the famous "three amigos". They declared the "Method Wars" over and soon came out with the first release of the Unified Modeling Language.

The RUP process and other methodologies from this time were based on a plan-driven but iterative assumption. The criticism was mostly that these methodologies were too document-focused. You still needed to understand the whole problem before you started the next step, everything had to be documented, and this created a big overhead that didn't create business value. A large, complex problem was documented and explained rather well, but a simpler, less complex problem became just as big to administrate.

The rise of agile methodologies
During the late 1990s, people started to react against the plan-driven models. Many were frustrated and demanded that developers become more agile in responding to business demands and better at adapting to knowledge gained during the project. Plan-driven methodologies were considered bureaucratic, slow, demanding, and inconsistent with the way software developers actually perform effective work.

New ways of working like XP, Scrum, DSDM, ASD, Crystal, FDD and Pragmatic Programming were developed as alternatives to documentation-driven, heavyweight software development processes – processes that all had something in common: they built a complete plan in the beginning phase and then stuck with that plan. Many of the Agile methodologies were inspired by "The New New Product Development Game", an article written by Hirotaka Takeuchi and Ikujiro Nonaka and published in the Harvard Business Review in 1986. This article is often used as a reference and could be considered the birth of agile methodologies.

In February 2001, a group of methodology developers met at a ski resort in Utah to talk, ski, relax and try to find common ground on what they were trying to accomplish. The output of that weekend became known as the "Manifesto for Agile Software Development".

Another movement that has gained substantial support in the organization of software development in the 21st century is Lean Software Development, with Mary and Tom Poppendieck as the main figures. Lean is based on systems thinking, which sees an organization as a system. The system shall fulfill a clear, customer-focused purpose as productively as possible. They argue that your purpose is probably not to develop software: your organization's customers want their demand satisfied or a problem solved, and if they could solve their problems without software, they would be delighted. The way Lean works is to analyze the system in which the software will be used and how to produce it as productively as possible, with a focus on adaptive behavior, knowledge acquisition and knowledge workers.

To summarize the history of organizing software development, I will use the words of agile guru Martin Fowler: "from nothing, to monumental, to Agile".


Source code is for humans, not computers!

Comments Off on Source code is for humans, not computers!

It might seem like programming is about writing code for computers to understand, that what a programmer does is translate requirements into a language that computers can understand. But no programmer sees himself as a translator; we see ourselves as creative writers. If two programmers are given the same requirement to implement, their source code will differ, so it is not about translating from one language to another. The language used to produce the program isn't really for the computers either. Most programmers write in a language that needs to be compiled and translated into an executable that computers understand, so the purpose of the language is for people to be able to collaborate and understand each other's intentions.

In a complex solution, the time and effort to deliver, maintain and extend software are directly related to the clarity of the code. Lack of clarity creates technical debt that will eventually have to be paid off with interest, debt that can overwhelm your ability to develop new features. Failure to pay off technical debt often results in bankruptcy: the system must be abandoned because it is no longer worth maintaining. It is far better not to go into debt in the first place. Keep it simple, keep it clear and keep it clean. Grady Booch, author of Object-Oriented Analysis and Design with Applications, writes that you recognize clean code because it is simple and direct: "Clean code reads like well-written prose. Clean code never obscures the designer's intent but rather is full of crisp abstractions and straightforward lines of control".
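As a small illustration of what this means in practice (the example below is my own sketch; the class and the names in it are made up and not taken from Booch), compare a compact but opaque piece of Java with a version that reads more like prose:

    // Hypothetical example class, not from the original post.
    class PriceCalculator {

        // Hard to read: abbreviations and magic numbers hide the intent.
        double calc(double a, int d) {
            return d > 30 ? a * 0.9 : a;
        }

        // Reads like prose: names and constants make the business rule explicit.
        static final double LOYALTY_DISCOUNT = 0.10;
        static final int LOYALTY_THRESHOLD_IN_MONTHS = 30;

        double priceAfterLoyaltyDiscount(double basePrice, int monthsAsCustomer) {
            boolean isLongTermCustomer = monthsAsCustomer > LOYALTY_THRESHOLD_IN_MONTHS;
            return isLongTermCustomer ? basePrice * (1 - LOYALTY_DISCOUNT) : basePrice;
        }
    }

Both methods compute exactly the same thing; the second simply makes the designer's intent readable to the next human.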


Agile Testing Practices – Stay Agile not Fragile

Comments Off on Agile Testing Practices – Stay Agile not Fragile

To be agile and stay agile, you need to be able to change things with confidence, without risking that a change results in unpredictable behavior or a bug. To do this, verify that previously built features are not affected by newly introduced code. If you don't create automated tests, you have to do a lot of manual work and you incur unnecessary risks when introducing new features. As a result, your system becomes very fragile.

Another argument in favor of having many automated tests is that unit testing and test-driven development reduce the cost of handling bugs. The cost of fixing a bug that has already reached the production environment can be very high. Ideally, a bug should never get that far.
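As a rough sketch of what such tests can look like (assuming JUnit 4 and the hypothetical PriceCalculator from the clean-code example above; neither comes from the original post), a couple of unit tests pin down the behavior you rely on, so a regression shows up the moment new code breaks it:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class PriceCalculatorTest {

        // PriceCalculator is the hypothetical class from the clean-code example above.

        // Regression protection: documents behavior that already works today.
        @Test
        public void longTermCustomerGetsTenPercentDiscount() {
            double price = new PriceCalculator().priceAfterLoyaltyDiscount(100.0, 36);
            assertEquals(90.0, price, 0.001);
        }

        // In TDD the test is written first and fails until the feature exists.
        @Test
        public void newCustomerPaysFullPrice() {
            double price = new PriceCalculator().priceAfterLoyaltyDiscount(100.0, 3);
            assertEquals(100.0, price, 0.001);
        }
    }

Run automatically on every check-in (continuous integration), tests like these turn "did I break anything?" from a manual investigation into a few seconds of feedback.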

Stay agile not fragile

Practices like unit testing, automated tests, continuous integration, refactoring, TDD, iterative and incremental development, and simple design where separation of concerns is applied are great insurance for staying agile without becoming fragile. As soon as you start tampering with these practices, your risks increase and your flexibility decreases.


No more comments!

Comments Off on No more comments!

I have been completely spammed with comments during the summer! Even though I have enabled Captcha and comments need to be approved, I have decided to turn off comments for a while. I can't handle hundreds of comments per week, and most are just link robots from non-serious SEO companies.

/Patrik


Accept that you can’t predict the BEST solution.

Comments Off on Accept that you can’t predict the BEST solution.

"The ones who demand that you as a project manager shall know what's behind the next rock will probably appreciate a horoscope more than reality."

If things keep changing, the best solution you predicted will not be the best solution at the end of the project, if it even solves the problem. Instead, an empirical process must be used in complex situations. You have to evaluate many possible solutions, discussing the pros and cons of different kinds of designs. A complex problem has multiple solutions; some are just better than others, but you can't predict which. Some solutions are more efficient and more productive than others, just like when you solve Rubik's cube.