Sunday, March 27, 2011

Embedded Coaches

A significant portion of the team was composed of the original development team. The ones who gave us all the technical debt to begin with.

Not only were we coaching a new product owner and business partner, we were also coaching a new team.

So much to learn: developer pairing, iterations, show and tells, iteration planning meetings. Would it be possible to take the group of developers who created the mess in the first place and build an agile team that would deliver on time and on budget, with quality?

It was clear. We needed help. Our engineering practices were weak. Given all the side effects in the software, we decided to introduce test-driven development, or TDD. But how? Who knew it?

Thus we sought the sage advice of a seasoned agile developer and put a plan in place that included an embedded coach.

Teaching a TDD class would not be enough. No developer was going to have the courage to work through the learning curve of TDD under project pressure. Frustration would drive everyone back to the old way. The class would soon be forgotten.

So we placed an embedded TDD coach on the line, one who would spend one week of every month with the team, pairing with developers. We would do this for four months.

We also made TDD visible. At each standup, each developer would report the number of tests they wrote. This drove accountability to the practice and to the team. The team kept a running count of automated tests on a big visible chart. As the numbers increased, so did the team's pride and commitment to the practice.

Thus, classroom teaching, followed by on-the-job training from the embedded coach and making the change visible, enabled the culture of the team to change. Like a kata, the team used repetition to make TDD its own, all guided by the embedded coach, who showed them how to use tests to drive development and make commitments.
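
To make the rhythm concrete, here is a minimal red-green sketch of the kind of step the coach would pair on. The class and the discount rule are invented for illustration, not the team's actual domain; any JUnit 4 setup will do.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Red: write the failing test first, before any production code exists.
    public class DiscountCalculatorTest {
        @Test
        public void ordersOfOneHundredDollarsOrMoreGetTenPercentOff() {
            DiscountCalculator calc = new DiscountCalculator();
            assertEquals(90.00, calc.priceAfterDiscount(100.00), 0.001);
        }
    }

    // Green: write just enough code to make the test pass.
    // Refactor afterward, with the test as the safety net.
    class DiscountCalculator {
        double priceAfterDiscount(double orderTotal) {
            return orderTotal >= 100.00 ? orderTotal * 0.90 : orderTotal;
        }
    }

Repeat that loop, test, pass, clean up, until it is habit. That repetition is the kata.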

As code was touched, TDD was laid down. Defects dropped. Test automation led to full unit testing with every check-in. Over the next several months, the side effects disappeared and became but a bad memory.

Technical Debt

It was everywhere. The team could not touch any part of the system without causing side effects elsewhere. Half the team's velocity was spent cleaning up side effects. The code was in such bad shape. All unit testing done by the team's predecessor was manual.

The web app's session data was over a megabyte per user. Business logic was spread through multiple tiers. The app would support fewer than 50 concurrent users before crashing. It had to support tens of thousands. So many bugs, so little time.
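
The usual cure for session bloat of that size is to stop caching object graphs in the session at all: keep a small key there and load the heavy object per request. A minimal sketch of the idea, assuming a plain Java servlet app; Customer, CustomerRepository, and the helper are hypothetical stand-ins, not the team's code.

    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpSession;

    // Hypothetical domain types, included only so the sketch is self-contained.
    class Customer {
        private final Long id;
        Customer(Long id) { this.id = id; }
        Long getId() { return id; }
    }

    interface CustomerRepository {
        Customer findById(Long id);
    }

    class CustomerSessionHelper {
        private final CustomerRepository repository;

        CustomerSessionHelper(CustomerRepository repository) {
            this.repository = repository;
        }

        // Instead of session.setAttribute("customer", customer), which parks
        // an entire object graph in server memory for every user, store only
        // the small serializable key.
        void rememberCustomer(HttpSession session, Customer customer) {
            session.setAttribute("customerId", customer.getId());
        }

        // Reload the heavy object on each request; the session stays tiny.
        Customer currentCustomer(HttpServletRequest request) {
            Long id = (Long) request.getSession().getAttribute("customerId");
            return (id == null) ? null : repository.findById(id);
        }
    }

A few bytes of session per user instead of a megabyte is the difference between a 50-user ceiling and tens of thousands of users.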

As agile teams do, we made our technical debt visible. The team broke the news to our product owner. It was serious. The message was, "It would do very little good to add features given that the app would not scale." However, our business partner was on the hook to show new functionality to the key users for the pilot. What to do?

After much hand-wringing, a decision was reached. The team would finish the targeted features for the pilot. Given that there were only 20 users, performance would not be an issue. The business in turn agreed to let the team play the performance technical debt cards after the new features were completed. After all, it would take 30 days to play the performance cards, and without the performance fixes the application would never deploy to all the users. Prioritization worked. Technical debt was explained in terms of business impact. The business in turn made a priority call. It was textbook.

The partnership was growing stronger.

The First Iteration

The business partner had grown fond of seeing working software every two weeks. When they first started, they said, "We will try this agile thing, but we want a 2-hour project plan review with everyone every week." We said OK.

Then came the first show and tell. The shock on our business partner's face when she saw working software was priceless. So much for IT being incompetent.

Then came the reflexive response, "That's not what we asked for." The team reviewed the story cards with our product owner. It was exactly what she had asked for. Then she gasped, "Oh my gosh, we better be more careful what we ask for." Our business partner did not expect to see anything done on time. Not in two weeks. So she and her subject matter experts did not put much effort into specifying what they wanted. But in front of them was the working software, exactly as they asked. Accountability was born.

What happened to the weekly project reviews? After seeing working software, no one wanted to hear that task 2B was 27% done. No one ever knew what that meant anyway.
