Sakai Conference: User-Centered Design

A note of apology: I took notes in several of the sessions I attended at the Sakai Conference this week, but some of these notes will be of more use to me than to anyone else. Still, I hope they give a sense of the session content. It was a really good conference.

User-Centered Design in IT: The Low-Hanging Fruit
Allison Bloodworth & colleague, University of California, Berkeley

User-centered design = making the user’s experience simple and intuitive.

User experience = meeting user goals and tasks while satisfying business and functional requirements (the overall experience and satisfaction of the user while using the system).

User-centered design increases buy-in and momentum, user satisfaction, and user productivity; it reduces tech-support investment and development & maintenance time and cost (because you develop only what users need).

Cf. Jakob Nielsen: Usability maturity model

UCD activities: user research (surveys, interviews, interactions with users), information architecture (organization of information for intuitive access and ease of use), interaction design (defines behavior of interactive products), usability analysis (usability tests and research), visual design (concerned with visual product, styling, look & feel of product), graphic design (arrangement of images and text to convey a message).

Case study: Fluid Lightbox (Fluid is an open, collaborative project to improve the user experience of community source software, including Moodle, Sakai, etc.) Fluid Lightbox is a JS-based tool for manipulating images on screen: a drag & drop tool for images.
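The session didn't show code, but the heart of a drag-and-drop image reorderer like Lightbox can be sketched as a pure function. This is an illustrative sketch only; the function and variable names are my own, not Fluid's actual API:

```javascript
// Reorder an array of image ids after a drag-and-drop:
// move the item at fromIndex so it lands at toIndex.
// (Illustrative sketch only -- not Fluid's actual API.)
function reorder(items, fromIndex, toIndex) {
  const result = items.slice();            // don't mutate the caller's array
  const [moved] = result.splice(fromIndex, 1);
  result.splice(toIndex, 0, moved);
  return result;
}

// Example: drag the first thumbnail to the third slot.
console.log(reorder(["a.jpg", "b.jpg", "c.jpg", "d.jpg"], 0, 2));
// -> ["b.jpg", "c.jpg", "a.jpg", "d.jpg"]
```

The UI layer (mouse/keyboard handling, visual feedback) sits on top of a small core like this; keeping it a pure function also makes keyboard-accessible reordering easier to support.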

UCD & usability evaluation techniques:

1. User needs assessment (surveys, interviews, focus groups, field studies, contextual inquiries, ethnography.)

2. Interviews: structured or open-ended, talk to actual end users, encourage the user to speak freely and to give honest answers and feedback, determine the user’s needs, goals, and tasks. (Goals are at a higher level than tasks, which lead to goals.) Don’t ask yes/no questions or leading questions; don’t draw attention to specific issues that you care about; don’t use jargon; don’t react (be neutral); distance yourself from the product.

3. Competitive/comparative analysis: what have other people done? try using other similar services/products to discover what to do & what not to do, interface conventions, “must have” standard features. Might be a list of important features, or a table showing how each product handles each task it should be able to do. Solving other products’ problems can give a great competitive advantage.

4. Heuristic evaluation: Cf. Jakob Nielsen’s ten heuristics (principles for user-centered design.)

5. Personas: After the data-gathering process (using techniques above), use personas to concretize the categories of important users. One persona per category. Avoids “elastic user.” Use images and background about user behavior & habits to flesh out the persona. Should be based on observed patterns in user behavior.

6. Task analysis: Determine tasks needed to achieve user goals. Rate tasks on frequency, importance, difficulty. Tells you what functionality is important. Create table for personas, with features ranked in order of importance & frequency. Can arrange personas in bull’s eye to place most important in center, then use task analysis to determine most important tasks.
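The frequency/importance/difficulty ratings can feed a simple ranking to decide what functionality matters most. A minimal sketch; the scoring scheme (importance × frequency) and the example tasks are my own illustration, not from the session:

```javascript
// Rank tasks by importance x frequency (both on 1-5 scales).
// Scoring scheme and sample data are illustrative, not from the talk.
function rankTasks(tasks) {
  return tasks
    .map(t => ({ ...t, score: t.importance * t.frequency }))
    .sort((a, b) => b.score - a.score);
}

const tasks = [
  { name: "upload image",   importance: 5, frequency: 5 },
  { name: "reorder images", importance: 4, frequency: 5 },
  { name: "delete image",   importance: 3, frequency: 2 },
];
console.log(rankTasks(tasks).map(t => t.name));
// -> ["upload image", "reorder images", "delete image"]
```

The same scores can populate the persona/feature table the speakers described, with one column per persona.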

7. Usability testing: Test early in the process. Test with 3-5 users (or fewer!) Ask the user to think out loud. Same facilitation rules as with interviews, plus: don’t help, and make clear that you’re testing the product, not the user. No need to write down exactly what each user does; trends are enough. The main focus of testing is to improve the design, not to come up with metrics (e.g. number of tasks completed successfully).
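The "3-5 users" rule of thumb comes from Nielsen and Landauer's model: the proportion of usability problems found with n users is 1 − (1 − L)^n, where L is the share of problems a single user uncovers (~0.31 in their data). A quick check of the numbers:

```javascript
// Expected share of usability problems found with n test users,
// per Nielsen & Landauer's model. L is the per-user detection
// rate (~0.31 was their empirical average).
function problemsFound(n, L = 0.31) {
  return 1 - Math.pow(1 - L, n);
}

for (const n of [1, 3, 5, 10]) {
  console.log(`${n} users: ${(problemsFound(n) * 100).toFixed(0)}% of problems`);
}
// 5 users already uncover roughly 84% of problems,
// which is why small, frequent tests beat one big test.
```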

7a: Card sorting: Helps figure out how to categorize items. Each card has an item name and brief explanation. Provide pre-defined and blank (make-your-own) category cards. Same facilitation rules as a usability test. Ask user to sort the cards into piles that make some kind of sense.
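One common way to analyze open card-sort results (analysis wasn't covered in the session, so this is my own addition) is a co-occurrence count: how often each pair of cards lands in the same pile across participants. High counts suggest items that belong in the same category. A minimal sketch, with hypothetical course-site cards:

```javascript
// Count, across participants, how often each pair of cards landed
// in the same pile. `sorts` is one entry per participant; each entry
// is a list of piles; each pile is a list of card names.
function coOccurrence(sorts) {
  const counts = {};
  for (const piles of sorts) {
    for (const pile of piles) {
      for (let i = 0; i < pile.length; i++) {
        for (let j = i + 1; j < pile.length; j++) {
          const key = [pile[i], pile[j]].sort().join("|"); // order-independent key
          counts[key] = (counts[key] || 0) + 1;
        }
      }
    }
  }
  return counts;
}

// Two participants sorting four (hypothetical) cards:
const sorts = [
  [["syllabus", "schedule"], ["grades", "roster"]],
  [["syllabus", "schedule", "grades"], ["roster"]],
];
console.log(coOccurrence(sorts));
// "schedule|syllabus" co-occurs twice -> strong pairing.
```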

7b: Prototype testing: Scenario-based. Can be paper-based, low-fi or hi-fi.

Overall advice:

Designing a new service? Try user needs assessment and comparative analysis.

Improving an existing site? Try a heuristic evaluation.

Lots of information to organize? Try card sorting.

