Field studies are one of the most valuable methods for setting a design project's direction and discovering unmet user needs. But studying and questioning users does no good if you tell them the answers--because then you won't truly learn anything new.
The New York Times ran a long article about companies using anthropological techniques to study their customers. It's always great to see articles that promote field studies, but the information in the article perpetuated two common mistakes that not only produce bad data, but squander a company's research budget:
The reported studies emphasized interview questions, even though quietly observing users is more valuable and the real reason to go into the field.
All the talk about "anthropology" obscures the fact that all development teams should do field studies, and that teams can run studies on their own, without hiring a bunch of PhDs.
Don't Ask Leading Questions
The article contained the following snippet from a field study of barbecue-grill customers:
"So you feel that grilling outdoors fosters family togetherness?"
"Is there anyone in your family who doesn't enjoy grilling?"
"But you feel it's a bonding ritual all the same?"
"How does grilling work in the text of your life? Would charcoal have interfered with the process of social bonding?"
"I'm not sure, really. We just prefer gas."
Perhaps the researcher was simply hamming it up for the press. In any case, the above segment violates several basic interviewing principles:
Don't ask questions that can be answered with "yes" or "no." You can elicit more information from the respondent using open-ended questions, which encourage them to talk and provide salient details.
Whether open or closed, definitely don't ask leading questions. Once you state what the person supposedly feels, you bias any subsequent answers. People are reluctant to disagree with the interviewer's "authority."
Don't use jargon ("text of your life" and "social bonding" are obvious examples here). When talking to respondents, speak in their language; this draws them out and helps you understand how they truly feel.
Don't draw attention to specific issues that you care about (in this case, the "bonding ritual"). Doing so causes people to change their behavior and focus their answers on the issues you emphasize. This problem is particularly prevalent in interface design studies: the second you ask people about a specific design element, they notice it much more thereafter than they would have otherwise.
Beyond these interviewing methodology errors, the story highlights an even worse problem: it focuses exclusively on interviews instead of observation. Once you go through the hassle of setting up a field visit, the most important data you can collect is about actual behavior. In other words, you watch what people do, not what they say.
Did the family bond? Did the dad stay indoors the whole time? What was really going on?
Run Your Own Field Studies
Most articles on field studies make it seem like they are terribly complicated and require a team of anthropologists. Of course, because the average design team doesn't have such specialists around, they naturally dismiss field studies and proceed on the basis of speculation (or focus groups, which are almost as bad as sitting around a table and making up the data).
In reality, basic field study techniques are fairly simple, and everyone who works on a design team should go on customer visits from time to time. Visiting a real customer site is an invaluable experience for designers, programmers, and marketers alike.
Intranet projects need field studies as well, and have an easier time scheduling the visits since they typically involve setting foot in another department or building.
We teach a four-day course in field studies, where we take a design team on a small site visit with some of its customers. Although you're not likely to be as good after four days as somebody who spent four years studying field methods, your team can get good insights for its project from a four-day event. The basics are easy (though clearly violated in the interview protocol quoted above), and anyone can learn to conduct a simple site visit.
Well-funded projects might rely on elaborate field methods that take months or years and require specialized staff. Such projects will probably learn more than projects that opt for fast methods, but they will not necessarily be more successful, because the market opportunity may pass them by. Also, smaller studies permit more data collection at more project stages, and exposing team members to live data rather than digested reports is invaluable.
Intranet design teams in particular desperately need to observe actual employee behavior in the field; doing so shows them the real opportunities for improved task support.
Collecting field data and visiting live customers are not the exclusive preserve of a closed guild of experts. It's the duty of all those who plan to inflict their designs on others.