Tuesday, February 12, 2008

Training

One of the most powerful roles the small core UX team can play is that of trainer. The team will need to, over time, implement a comprehensive training program that's repeatable enough to be efficient and flexible enough to meet the needs of each team. Here are some early thoughts about the training program:

Run through everybody in the following rough sequence:

  1. Surrogate evangelists (initial development team, key sponsors, product owners)
  2. People who need specific skills (scrummasters and business analysts)
  3. Everybody else, one team at a time

Early on, pick one or two development teams and do a deep UX initiation with the whole team. The best team for this would be one that is highly respected by other teams and that's already very open to UX values. This team becomes a key evangelist and thought-partner in creating future trainings and in evolving the model.

Next, bring key sponsors on board. These are people in our web organization who manage the overall business portfolios--they manage the overall demand coming in from customers, set the high level priorities, oversee product management, and they usually supervise the product owners and SMEs. Getting these folks on board with the overall UX movement will grease a lot of wheels, help me get a good budget, and prime the pump with everyone else. (After a year of informal evangelism targeted at these folks, along with a great directive from our director, these folks basically understand the importance of user experience.)

I want to do one formal session with key sponsors to give them three things:

  • Effective language and approaches to use with their stakeholders and their teams. We've learned a lot about how to evangelize UX, and I want these key managers to be as effective as possible.
  • An introduction to the base tool set, so they have some background when their staff start talking to them about personas, card sorts, IA, etc.
  • Specific assignments. I haven't worked this out yet, but I want to help them channel their support for UX. Some possibilities are...
  • Each manager should incorporate some standard language into the formal objectives of themselves and their staff, to give everyone financial incentives to address UX issues. We can give them some boilerplate language as a starting point.
  • Each manager should ask some specific questions during weekly demos to help the team stay focused on UX, and to help the managers maintain a good sense of where each team is at in this regard. We can help them formulate these questions.
  • Each manager should let their staff know that they are encouraged, and in some cases required, to attend UX-related training. This gives staff permission to take an hour or two (or a day or two) away from their deadlines to do UX-related training.

Next, Product Owners:

  • Evangelize them on user experience
  • Introduce them to the core toolset and the support available
  • Think together about how to build UX into the product backlog and sprint exit criteria
  • Work with them on an ongoing shared product backlog for things like creating and implementing design patterns or templates across the site

Next (or at the same time), Scrummasters:

  • Start with an introduction to the kinds of help we have to offer and how to recognize when they need that help.
  • Then transition the "training session" to a collaborative working session to come up with some shared best practices and places to innovate, using as a starting point the "UX-in-the-product lifecycle" model (I'll blog on this soon).

After Product Owners & Scrummasters, hit the Business Analysts:

  • Evangelize--give them something to live for to replace the requirements documents that have dominated their work lives for the last 3 years
  • Introduce them to the core toolset and the support available
  • Give them the hands-on skills to do things like user-centered stories, effective use of personas, and card sorts.

By this time we'll have a pretty decent skeleton of a support system in place, and three people on each team who can tangibly channel the general excitement about the user experience. We then go from team to team showing them:

  • how valuable UX is
  • how easy we're making it for them to be successful
  • how to know when they need help

Sound like a lot of work just on training? Yes, but less work than it would be to embed a UX specialist in every team, and in the long run everything will be easier if the development teams are convinced they need user-centered design in order to produce optimal results.

I'm guessing that the core UX team could create the bulk of the training curricula in one or two short (2-week) sprints. Using 3 or 4 people from the core UX team as trainers/evangelists, we could blow through this program pretty quickly, simultaneously creating demand for our services, enabling self-service, and establishing some shared agreements for how we'll all work together.

What do you think? Is this crazy? Way too much overhead for agile? Or is it a sensible way to leverage a small UX team across many development teams? We'll give it a try and find out. In the mean time, what's your advice?

Thursday, February 7, 2008

Core User Experience Team (part 2 of 7)

When trying to leverage a handful of UX specialists across a large number of agile development teams, the first question is: should we just divvy up the user experience (UX) specialists across all the teams? I say no. That would give each UX specialist 5 teams to support, and would more or less require each UX specialist to be good at the whole range of UX tools, from IA to interaction design to user research. I think spreading one specialist across five teams is stretching too thin for this to work. So I'm starting from the premise of a core team that provides support as a team.

Who's on the core UX team?
This team will include...
  • UX specialists
  • Creative specialist
  • Page developer
  • Product Owner
  • Scrummaster
  • Business Analyst
  • SME who understands the web sites built so far

We start with 2 people with formal training and significant experience doing things like IA, usability testing, interaction design, etc. We've gotten approval to hire two more.

We'll start with one creative specialist (graphic design of UI). He's formally part of a small creative group, so he can call in help as needed, and he'll have help from the creative group in terms of staying on brand, etc.

We'll start with one page developer who will do HTML, javascript, CSS, Ajax, etc. In our case, the page developer(s) will need to really understand how to work with WebSphere Portal (themes, skins, page layouts, etc.), since that will be the presentation layer for much of what we do.

This core UX team will operate as a scrum. Just like any other scrum, they'll have a product backlog (more on that below), sprints, releases, daily scrums, etc. The key difference between this core UX team and the more traditional scrums (are we allowed to call scrums "traditional" yet?) is the nature of the product and the backlog.



A new spin on the product backlog

In most development teams, the product is a functioning collection of software that meets specific non-functional requirements and produces business value. A roadmap might typically include a list of major batches of features, with some infrastructure along the way. And the typical product backlog would be specific bits of functionality that can be coded.

The core UX team I'm proposing will still have a roadmap and a backlog, but the product might be described as "user centered design across the enterprise." An initial product roadmap might include things like...

  • Training program
  • Repeatable process for usability testing
  • High-level IA and wireframes
  • Design pattern library
  • Model for financing work requests from development teams
  • Reusable navigation widgets

The product backlog would include relatively tangible things, like page templates or reusable navigation widgets, but it will also include less tangible "services," like training programs and user research. I could imagine the team focusing a release on an initial training program. In the first sprint of the release, the team might produce all the materials required to train one development team in the art and science of personas and scenarios. Just like "potentially shippable software" of a traditional scrum, the Product Owner could decide whether to go ahead and conduct that training, or to wait for the next sprint to produce training materials that enable a business analyst to conduct card sorts.

Sprints that don't produce code

The team would figure out how to maximize the use of all team members during the sprint--if the creative person isn't needed for the card sort activities, he or she could spend that sprint preparing reusable CSS and background graphics, for which a training could be quickly developed in the following sprint.

We'll have a challenge figuring out how to maintain a unifying theme for each sprint, when not all activities involve graphic design or page development, but this sounds similar to a typical agile team when they're in a sprint focusing on something like upgrading the hardware infrastructure. Team members will go in seemingly unrelated directions, but they'll understand amongst themselves how it will all come together.

Ideally, the team would prioritize repeatable processes that could be used immediately by specific development teams, and would create just enough organizational infrastructure to support the immediate needs (e.g., initially just a sign-up sheet for training; a later release might include a recharge mechanism and a schedule for giving the core training to 200 people.)

These are just some quick examples. The point is that the product backlog contains both software and services, both of which are "potentially shippable" at the end of each sprint. A release could consist not only of software, but of a repeatable process with the infrastructure in place to support it.
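The mixed backlog described above could be sketched as a simple data structure. This is just an illustrative sketch: the item names come from the examples in this post, but the `BacklogItem` shape and the capacity-based sprint planning are assumptions, not our actual tooling.

```python
from dataclasses import dataclass

@dataclass
class BacklogItem:
    title: str
    kind: str          # "software" or "service" -- both are potentially shippable
    priority: int      # lower number = higher priority

# Example backlog mixing tangible software with less tangible services
backlog = [
    BacklogItem("Reusable navigation widget", "software", 1),
    BacklogItem("Persona training materials for one dev team", "service", 2),
    BacklogItem("Card-sort training for business analysts", "service", 3),
]

def sprint_plan(backlog, capacity):
    """Take the highest-priority items that fit into this sprint."""
    return sorted(backlog, key=lambda item: item.priority)[:capacity]
```

The point of the sketch is only that "software" and "service" items live in one prioritized list and compete for the same sprint capacity.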

What do you think? Does this sound like an agile team? Will it work? What am I leaving out?

Next post will be a draft approach to UX training throughout the enterprise.


Wednesday, February 6, 2008

Leveraging a small user experience team (1 of 7)

We have a small handful of user experience (UX) specialists, and we're ramping up our agile development teams. How can 4 UX specialists support 20 teams when the 20 teams are dedicated, co-located, moving on their own schedules, constantly changing direction, and not necessarily on board with the importance of user experience?

We had a good all-day session today mapping out how we want to start approaching this. The central theme is that the small team of UX specialists will operate as an agile product team. But unlike a typical agile development team dedicated to producing executable code, our product will be user-centered design. Each sprint will produce "shippable user-centered design." Sometimes this will look like software, sometimes it will look more like a service. The result will be a large number of teams producing software designed to produce a great user experience.

Here are the key ideas so far...

We are an agile team, and user-centered design is our product.

  • We form a small core team that uses standard agile methodologies in order to provide user-centered design services and products to all the development teams.

We empower the development teams to do user-centered design.

  • We put a lot of energy into training and UX evangelism.
  • We make it super-easy for the development teams to get face-time with end-users throughout their work.
  • We inject the core UX team into development teams at key leverage points in the product lifecycle.

We do hands-on user-centered design.

  • We create reusable artifacts that have good UX principles built in.
  • We provide ad hoc consulting and design services to the application teams.

I'll flesh out each of these bullets in subsequent posts. How does this look as a starting point?




Tuesday, February 5, 2008

User Research as a Commodity (part 3 of 7)

The Problem
I’d like to describe an approach we’ve been using at Kaiser to make it easier for development teams to incorporate user insights into their work. Just about every development team on the planet could benefit from more user research—usability testing, card sorts, label tests, brand reactions, cognitive interviews, etc. The more exposure teams get to their end users, the more user-centered their work will be, and the better the user experience (UX).

In my experience, there are two main reasons why teams don’t do more user research:
  1. They don’t understand its importance
  2. It’s too hard. You have to schedule it into the project plan, find a place to do it, prepare stimuli, recruit participants, deal with incentives, figure out what to test, and then spend time actually testing.


There’s a cool relationship between these two barriers to research: If we can make it super-easy for teams to do the research, they’re more likely to actually do it, and once a team does a little user research, they usually understand its importance and want more. The key is to prime this cycle.

So how do we get this cycle going? How do we make it incredibly easy? How do we change user research from a hassle that interrupts the work to a commodity that can be easily acquired on-demand?

The Challenge
Last year we commissioned an agile-like team to build a new web site for brokers--the professionals who help employers select and purchase health plans for their employees. The team needed to produce a beta in two months and a fully operational site in four months, and they needed to do it on an entirely new technology platform.

The pressure was on. At that time, our typical waterfall timeline for even simple projects was over a year. We typically did a week or two of usability testing in conjunction with the requirements phase, before development, or even technical design, started.

We had already done some ethnography with brokers, and the Product Manager was totally convinced of the need to incorporate usability testing, etc. into the design & development work. But many others, including developers, the project manager, and sponsors, thought of user research as a nice-to-have that was likely to blow the tight schedule.

Everybody was excited about moving to agile, but we didn’t have a clue how we were going to fit two weeks of usability testing into the work. Should we do it mid-way through when we’d have good comps to show users? But we couldn’t afford that kind of a break in the project plan. Or maybe front-load the testing, because the developers needed to take several weeks early on to get their environments in order? But we didn’t yet have any idea what the new technology would allow in terms of the UI. Complicating matters, our usability specialists were in Pasadena, CA, while the agile-like team was in Pleasanton, CA—350 miles away. Not only that, but brokers are difficult participants to recruit—they work in offices all over the country and they have very busy schedules, so the team couldn’t just take a paper prototype down to the nearest Starbucks and ask them what they think (as we’ve done with some other audiences).

When asked, the sponsors and most of the team thought the best thing to do would be to crank out an initial site on a tight timeline, doing the best we could based on intuition and a few heuristics, and then test it with users after we went live. We were going agile, they figured, so we can change it easily after we’re live. After all—it’s only UI. ;-)

Clearly, the UX people didn’t understand agile, and the software people didn’t understand UX.

The Solution: Testing as a commodity
The Product Manager and I were convinced that we needed to expose our work to end users prior to going live, and we could see there was no way we’d get even a week out of the schedule. So we decided to try something new—prescheduled testing. Here’s how it worked:

Every other Thursday morning, four or five brokers (our target audience) would show up at our offices in Pasadena. Our user research specialist would work with each participant for about an hour. While this testing took place in Pasadena, we piped audio and video of the testing up north to Pleasanton so the team could watch in real-time and IM questions and comments to the moderator in Pasadena.

Since we knew the testing schedule several weeks, and even months, in advance, we were able to easily schedule a room for the testing. One of our admin staff, who’s particularly good on the phone, took on recruitment. With multiple dates prescheduled, recruitment was easier—“OK, you can’t make it next Thursday; how about two weeks later? How about the Thursday of the following month?” We also had an admin person manage all the incentive checks, greet the participants in the lobby, and help with set-up and tear-down, all of which was essentially the same each week.

In the early stages testing consisted mostly of card sorts and cognitive interviews. As the project progressed, we moved to various stages of UI, focusing on whatever the team had just built or was about to build—one week working on a page layout; the next week focusing on a particular widget.

Results
The key breakthrough for us was that we made it super-easy for the agile team to get the benefit of user research. They didn’t have to stop what they were doing, they didn’t have to deal with logistics, and the only planning they needed to do was to be sure they had something to show users by Thursday morning, along with some good questions to ask.

It worked great. The team made constant course corrections based on the research. We would typically end sessions by asking the participants, “on a scale of 1-5, how easy was this site to use? (5 is easiest)” In two months of iteration we moved this from a 2.5 to a 4.5. The team could come up with ideas and never have to wait more than two weeks to test the ideas with users.
Sponsors could tune in to view the testing whenever they wanted, and they were delighted to see their customers, the brokers, delighted. The personas became real people. We virtually eliminated arguments within the team about what would work best for users. Instead of pressing the point, the Product Manager could just say, “I’ll ask them this Thursday.”

Why it worked
I can’t stress enough the importance of two key elements:

  1. A regularly scheduled research time, scheduled up to months in advance.
  2. Logistical support from outside of the project team

Scaling

So the pilot was successful. Everybody loves the initial site. Everybody wants to go agile. That means we’ll soon be looking at up to 20 agile teams operating simultaneously, working on multiple sites that support several different audience segments (brokers, Kaiser members, employers, etc.). Can we scale this approach to work in that environment? Here are my thoughts so far…

Brokers are now actively involved in a beta program, and we still have every-other-Thursday prescheduled research available to them as a tool. As we start up an agile team for employer groups, I think we can use a very similar model. The challenge will be with Kaiser members, because we’re likely to have many agile teams running simultaneously, focusing on different content and functionality, and attending to different subsegments of the member audience. It sounds pretty unwieldy to give each team its own half-day slot every other week.

What if we instead treat the user research environment and participants as a commodity available to all the teams? Every Monday and Wednesday we bring in members all day long. Every Tuesday we bring in employers. Every Thursday we bring in brokers. We post a schedule, and individual agile teams can sign up for time.

Not every agile team will have an hour’s worth of user research tasks for participants each week, but the commodity approach helps out again: We can bring in 8 participants in one day and spend an hour with each. During that hour, we may do a simple 5 minute “find this content” task for Team A, a longer “complete this transaction” task for Team B, and a broad “see if we just broke the UI” task for Team C.

What if teams don’t generate enough tasks to fill up the time this week? No problem--our “Platform UI Team” will maintain a backlog of non-urgent areas to test with each audience, so they can fill in as needed. What if Team D has some questions, but next Monday’s schedule is already filled by other teams? No problem--we’ve got open time on Wednesday’s schedule.
With the recruitment and logistics down to a routine managed by admin staff, and the bulk of the testing budget managed as a shared service (and thus “free” to the teams), this leaves the user research specialists and Product Owners free to formulate good tasks and questions and to apply the results of the research.
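The sign-up mechanics above could be sketched roughly like this. The audience-to-weekday mapping comes from the post; the class design, slot count, and team names are purely illustrative assumptions.

```python
from collections import defaultdict

# Audience-day mapping from the post; everything else is a sketch.
AUDIENCE_DAYS = {
    "members": ["Monday", "Wednesday"],
    "employers": ["Tuesday"],
    "brokers": ["Thursday"],
}
ALL_DAYS = [day for days in AUDIENCE_DAYS.values() for day in days]
SLOTS_PER_DAY = 8  # roughly one participant per hour, all day

class ResearchCalendar:
    def __init__(self):
        # day -> list of (team, task) bookings for the coming week
        self.bookings = defaultdict(list)

    def sign_up(self, team, audience, task):
        """Book the first open slot on a day that sees this audience;
        return the day, or None if this week is full."""
        for day in AUDIENCE_DAYS[audience]:
            if len(self.bookings[day]) < SLOTS_PER_DAY:
                self.bookings[day].append((team, task))
                return day
        return None

    def fill_open_slots(self, platform_backlog):
        """Let the Platform UI Team fill unclaimed hours from its
        backlog of non-urgent areas to test."""
        for day in ALL_DAYS:
            while len(self.bookings[day]) < SLOTS_PER_DAY and platform_backlog:
                self.bookings[day].append(
                    ("Platform UI Team", platform_backlog.pop(0)))
```

So if Team D finds Monday full, `sign_up` spills them to Wednesday automatically, and `fill_open_slots` models the Platform UI Team backfilling whatever hours no team claimed.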

Back to metrics
Here’s one more possible extension—we haven’t tried it yet, but we’re toying around with the idea.

Kaiser Permanente as a whole is struggling to become a more metrics-driven organization. Our physicians are internationally recognized for how well they practice evidence-based care, but our business practices haven’t yet caught up. Over the next months and years, our performance will be increasingly judged on metrics, and our agile teams will be measured and incented based on metrics like:

  • Health outcomes
  • Sales
  • Operational efficiencies
  • Time-to-market
  • Cost
  • Backlog burndown

Along with these “bottom line” metrics, we’ll also pay a lot of attention to leading indicators—the metrics that show early on whether we’re moving toward improvements in the bottom line metrics. As we become a more metrics-driven organization, how can we ensure that people pay attention to user experience?

Those who already “get it” will know that the best way to achieve great business results is to provide a fabulous user experience designed around the needs and perspectives of the end-users. But for those who don’t yet get it, time-to-market, cost efficiencies, and even sales can seem to be more important than user experience. How can we use the metrics system to incent people to both perform good user research and to use that research to improve the user experience?

SUS Tracking as a Commodity
A while back we had some significant availability issues with one of our sites—too many unplanned outages. A one-page dashboard of metrics was arguably the most powerful influence for fixing the problems. Suddenly, people at every level of the organization could easily see, on a weekly basis, how many times the site experienced slowness, how many minutes it was down for planned outages, and how many minutes it was down for unplanned outages. The numbers were bad. The numbers were visible. The numbers got executives and team members to come together to make those numbers improve dramatically.


What gets measured gets managed.


What if we published a similar metric for user experience? What if teams were incented to “bring your user experience metric up a point?” What if teams could see that their recent release moved their score up (or down), and what if all of our peers could see our user experience scores? How could we do that without wasting a whole bunch of time and getting in everybody’s way?

What if each site (or product) had a regularly scheduled UX Checkup? I’m thinking maybe every 2-3 months. This would be part of the user research schedule. The user research specialists could work with Product Owners to create a script. Then every couple of months, without the product team needing to lift a finger, we would take participants to the site in the production environment and run through the script.

  • Finally fixed that annoying bug? Score goes up.
  • Haven’t yet implemented the feature the users most want? Score stays the same.
  • Rushed to get a new feature in on time and screwed up the IA in the process? Score goes down.

My incentive pay in 2007 was based in part on an objective of brokers scoring the site at least ‘4’ on a 1-5 usability scale. Could we possibly scale this so that everyone is incented to provide a measurably excellent user experience? There are two fairly obvious candidates for this metric:

  • The System Usability Scale (SUS)
  • A standard satisfaction question

SUS deals with usability quite nicely, but leaves out other critical aspects of the user experience, whereas satisfaction questions are notoriously unfocused.

The two might work well in combination, with the SUS score assessed in-person and the satisfaction question routinely asked via survey.
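For anyone unfamiliar with it, the SUS score itself is easy to compute: ten items rated 1-5, where odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is scaled by 2.5 to land on a 0-100 scale. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from the ten
    standard SUS item responses, each on a 1-5 scale."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each from 1 to 5")
    # Items 1, 3, 5, 7, 9 are positively worded: contribute (response - 1).
    # Items 2, 4, 6, 8, 10 are negatively worded: contribute (5 - response).
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5
```

So a participant who strongly agrees with every positive item and strongly disagrees with every negative one scores 100; uniformly neutral answers score 50. That makes it a handy single number to put on a dashboard next to availability metrics.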

Inspirational Conclusion

Creating a fabulous user experience is all about a user-centered culture with access to the tools of user-centered design, combined with the ability to deliver software and operations support. The biggest lever we have for creating a user-centered culture is exposing everyone involved to their end-users in valuable ways. If we can do that, the users will thrive, and so will the business.

One way to promote exposure to end-users is to remove the main barriers to basic user research by offering pre-scheduled user research that is easy, scalable, and measurable.

Please comment!