Wednesday, December 19, 2007

Nokia Test: Are you really agile?

I have recently had discussions with several management colleagues about agile and find that most are not going far enough to realize the full value of agile. Jeff Sutherland and many other champions of the Scrum model prescribe the Nokia Agile Test as a litmus test to determine whether a team really is agile. This test was developed internally by Nokia to assess its own development teams and those of its partners. Nokia has more certified ScrumMasters than any other company in the world today. Nokia first determines whether a team is ready to adopt Scrum by checking that it is doing iterative development:
  • Iterations must be timeboxed to less than six weeks
  • Software must be tested and working at the end of an iteration
  • Iteration must start before specification is complete
Next, the Nokia Scrum Test...
  • You know who the product owner is
  • There is a product backlog prioritized by business value
  • The product backlog has estimates created by the team
  • The team generates burndown charts and knows their velocity (see the sketch after this list)
  • There are no project managers (or anyone else) disrupting the work of the team
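
To make the burndown/velocity item concrete, here is a minimal sketch (the numbers and names are hypothetical, not part of Nokia's test): velocity is simply the story points a team actually completes per iteration, and the burndown is the work remaining, recorded each day of the iteration.

```java
// Hypothetical numbers, for illustration only.
public class VelocityExample {
    public static void main(String[] args) {
        // Story points completed in the last three iterations.
        int[] completedPerIteration = {18, 22, 20};

        int total = 0;
        for (int points : completedPerIteration) {
            total += points;
        }
        double velocity = (double) total / completedPerIteration.length;
        System.out.println("Average velocity: " + velocity + " points/iteration");

        // Burndown: points remaining at the end of each day of one iteration.
        int[] remainingByDay = {40, 36, 33, 27, 22, 18, 12, 7, 3, 0};
        for (int day = 0; day < remainingByDay.length; day++) {
            System.out.println("Day " + (day + 1) + ": " + remainingByDay[day] + " points remaining");
        }
    }
}
```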

I would also add some items to the list...

  • The story includes clearly defined acceptance test(s) [validating that the requirement is complete]
  • The acceptance test(s) are automated, part of the code base, and run as a regression suite on a regular basis
  • The story is not fully complete (implemented) until the acceptance test(s) are automated
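
Here is a minimal sketch of what I mean by those three items, assuming a JUnit 4 code base; MeetingScheduler and Meeting are hypothetical stand-ins for whatever system is under test. The point is that the story's acceptance criteria become executable tests that live with the code and run in the regression suite.

```java
import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical example: the story "As a host, I need to invite a participant,
// such that they receive a join link" is not done until this test is automated
// and passing in the regular regression run.
public class InviteParticipantAcceptanceTest {

    @Test
    public void invitedParticipantReceivesJoinLink() {
        MeetingScheduler scheduler = new MeetingScheduler();   // hypothetical class
        Meeting meeting = scheduler.schedule("Design review", "host@example.com");
        meeting.invite("guest@example.com");

        assertTrue("Invitee should be on the participant list",
                   meeting.getParticipants().contains("guest@example.com"));
        assertNotNull("Invitee should have a join link",
                      meeting.getJoinLinkFor("guest@example.com"));
    }
}
```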
The bottom line is that many companies think they are doing Scrum but really aren't, and remain plagued by legacy processes and measures. It is fine to take small steps to migrate your organization to agile, but keep going until you really get there.

Sunday, December 16, 2007

Agile Lesson Learned: Iterate in the Marketplace

Convoq (a.k.a. Applied Messaging, Zingdom Communications) closed its doors on Nov 30, 2007. I feel very fortunate to have worked with a very talented and skilled group of professionals over these five years. I had a tremendous experience in leadership and management roles that fostered career growth for me. For a timeline of this business and a summary of what happened at Convoq, read the blog entry "Convoq and Zingdom - Five Years" by Chris Herot, CTO and co-founder: http://herot.typepad.com/cherot/2007/12/convoq-and-zing.html . I have to comment on Chris Herot's list of lessons learned...

  • I can't say enough about the first bullet in Chris' list of lessons learned: "Iterate in the marketplace and not in the conference room. Agile is the only way to go." Especially in startup companies or divisions, where the need for a product or service has been identified but it is not yet clear how to meet that need, you have to get working product into the marketplace to fail fast. You need a culture that can work with those learnings and let them drive the iteration priorities, along with the larger strategic goals. This doesn't mean reacting to everything people ask for; it means responding to what people will actually use and how they use it.

  • The second bullet in Chris' list of lessons learned is also important to highlight: "Just because you are using agile methods doesn't mean you don't have to plan. Write your stories before you begin an iteration, but don't waste a lot of time on the details that aren't needed until later." This plays well into my blog entry "Agile peak performance in early startups" http://djellison.blogspot.com/2007/10/agile-peak-performance-in-early.html . Not having stories ready and prioritized by the product owner (the voice of the business and the customer) before the beginning of an iteration breaks the cadence that is so critical to consistent agile velocity. Keep the stories written at the requirements level; include the user statement for each use case (As an 'actor', I need to 'task to perform', such that 'goal to accomplish'), the known acceptance tests (key tests that the requirement is built right), and the adjusted priority (1 to 10), as in the sample below.
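
For example, a story written at that level (the product and details are invented here purely for illustration) might look like:

```
Story: Invite a participant to a meeting                    Priority: 2
As a meeting host, I need to send an invitation by email,
such that the invitee can join the session from a browser.
Acceptance tests:
  - The invitation email contains a working join link
  - An invitee with nothing but a browser can join the meeting
```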

Iterating in the marketplace, and rapidly acting on the findings to adjust the stories and their relative order of priority for the subsequent iteration, allows you to "build the right product."

Friday, December 14, 2007

Hiring Quality Engineers for Scrum Teams

I have frequently been asked, "How do you go about hiring good Quality Engineers for an agile team?" I feel strongly that several proficiencies are needed to round out a strong test group for agile teams. These proficiencies can be embodied in more than one person, but rarely is any one engineer proficient in all of these disciplines.
  • Domain Knowledge:
    • As with any Quality Assurance (Engineering) team, there needs to be enough representation of the customer for each actor (user) of the system. This requires insight into the goals and intended use of the system, and this person may well be hired from the user community.
  • Development Proficiency: (see the sketch after this list)
    • Unit Tests and representative Acceptance Tests in an XUnit Framework -- This is a foundational capability for working with the developer on a story to write story acceptance tests first (or during story development).
    • Separation of the GUI/Presentation tier from the Business Logic tier -- This is key to being able to exercise the business logic (where most of the work is accomplished) in tests that reside with the source of that production code. There is no need to drive these tests with the overhead of the client presentation tiers involved. The client and presentation tiers need their own representative acceptance tests for regression, plus exhaustive tests (e.g. different environments).
    • Exhaustive Tests through data-driven tables -- There are several approaches and tools for exercising acceptance tests with many combinations of data sets (e.g. FitNesse or Selenium tables, XML, CSV, etc.).
    • Test Fixtures -- This is key to exposing all of the system for comprehensive story acceptance tests. As a company matures with agile, I prefer to create a Test Automation Architect role who is responsible for oversight and creation of enough test fixture infrastructure to enable the team to develop comprehensive regression test coverage that can run daily, after the build and unit tests complete.
  • Traditional Tests:
    • Performance Test Automation -- This includes both local and load tests as found in traditional waterfall and other iterative SDLCs (Software Development Lifecycles).
    • Manual Tests -- There are some tests that are purely aesthetic or user-experience oriented, to assure that the application is pleasing to use and accomplishes all the user tasks and goals of each actor identified for the system.
  • Process Engineering:
    • This is the discipline of best practices and well-defined procedures that need to be in place for consistency across the team, and to maintain a project cadence. As with software, process can be over-engineered and requires a sensitivity to what is working efficiently, when to turn up the process to improve consistency and predictability, and when to turn down the process to eliminate unnecessary overhead.
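
To illustrate the Development Proficiency items above, here is a minimal sketch, assuming JUnit 4; the PricingService class and its per-minute rate are invented for the example. It exercises the business logic tier directly (no GUI or presentation code), uses a fixture to set up the system, and drives the assertions from a small data table of the kind that could also live in a FitNesse page or a CSV file.

```java
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Hypothetical example: acceptance tests exercising the business logic tier
// directly, with no GUI or presentation code involved.
public class PricingAcceptanceTest {

    // Hypothetical business-tier class, sketched inline so the example runs;
    // in a real project this lives with the production code.
    static class PricingService {
        double priceFor(int minutes) {
            return minutes * 0.15;   // assumed flat per-minute rate, for illustration only
        }
    }

    private PricingService pricing;

    // Test fixture: set up just enough of the system for the tests to run.
    @Before
    public void setUp() {
        pricing = new PricingService();
    }

    // Data-driven table of inputs and expected results (hypothetical numbers).
    private static final double[][] CASES = {
        // minutes, expected price
        {   0,  0.00 },
        {  10,  1.50 },
        {  60,  9.00 },
        { 600, 90.00 },
    };

    @Test
    public void chargesMatchTheRateTable() {
        for (double[] row : CASES) {
            int minutes = (int) row[0];
            double expected = row[1];
            assertEquals("Price for " + minutes + " minutes",
                         expected, pricing.priceFor(minutes), 0.001);
        }
    }
}
```

Because it runs below the presentation tier, a test like this can join the daily regression run as soon as the build and unit tests complete.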
The question that usually follows this list is, "How do you find a Quality Engineer with development proficiency?" There are talented Quality Engineers with CS and MSCS degrees and years of experience in both development and test engineering. If you can't find them, create them. Find a developer who has an affinity for methodical test coverage in their own work, or who is good at and enjoys testing their own work, to step into a Quality Engineering role. I have been able to convert a few developers to this role, and the results are fantastic.