Summary: Web users generally prefer writing that is concise, easy to scan, and objective (rather than promotional) in style. We incorporated these and other attributes into a redesign of Web content. Doing so required trade-offs and some hard decisions, but the results were positive. The rewritten website scored 159% higher than the original in measured usability. Compared with original-site users, users of the rewritten site reported higher subjective satisfaction and performed better in terms of task time, task errors, and memory. Implications for website writing and design are discussed.
Our earlier studies of how people read on the Web indicated that users prefer to scan rather than read, want text to be short and to the point, and detest overly hyped promotional writing ("marketese"). We found usability improvements in new versions of a site that were scannable, concise, or objective (rather than promotional) in style. When all three writing-style improvements were combined in a final version of the site, usability increased 124%. These results prompted us to apply the improvements to pages from Sun's website.
Applying the Writing Guidelines
A common thread among conciseness, scannability, and objectivity is that each reduces the user's cognitive load, which results in faster, more efficient processing of information. (Concise text contains less information to process; scannable text calls attention to key information; and, as our earlier studies showed, questioning the credibility of promotional statements seems to distract users from processing the meaning.) Thus, our aim was to rework existing Web pages so that they would minimize cognitive load and enhance speed and efficiency.
We took two whitepapers (one on new-media processing and one on the market for Java) from Sun's website and used them to create two versions of a study website. Excerpts from both versions of the site are available at http://www.nngroup.com/articles/downloadable-files-to-replicate-web-reading-study/.
The original version of the test site consisted of three pages and used the existing whitepapers with only slight modifications: A special homepage and banner were created for the whitepapers, and external hypertext links were deleted so that evaluators would focus on only that site.
The rewritten version of the site consisted of eight pages that were much shorter on average (not counting the homepage, each page averaged 346 words, compared with 2,232 for the original). Total word count for the site was 2,425 words, 54% of the length of the original version.
Concise: This was the most difficult guideline to follow, because we were concerned about cutting out "too much." We began by separating the whitepapers using what seemed like natural section breaks. Then we cut, trying to strike a balance between keeping useful information and making the whitepapers easy and fast to read. Doing so required not only tightening of language, but also cutting of overly detailed information. Here is sample text from each site:
- Original: "Facilities management also portend high growth. To be sure, microprocessors can be found today in electronic thermostats, intercom systems, automatic sprinkler systems, stand-alone light timers and alarm systems that themselves are linked to a central monitoring station. But picture a home network that ties all these things, and more, together into a coordinated facilities and environmental control system. ..."
- Rewritten: "Facilities management also will rely on new devices. Electronic thermostats, intercom systems, automatic sprinkler systems and alarm systems all will be tied into a coordinated control system linked to a central monitoring system. ..."
Scannable: Several changes were made to summarize and call attention to important pieces of text. We added tables of contents and section summaries, as users in previous studies found them particularly useful. We also included bullets, numbered lists, boldface and colored text to highlight keywords, additional headings, and shorter paragraphs. These changes were relatively easy to make and gave the pages a cleaner, more open design.
Objective: Removing marketese from the text was not difficult to do. We removed adjectives (e.g., "great" and "overwhelming"), buzzwords (e.g., "paradigm"), and claims that were not supported with evidence. Of course, it may not be possible (or desirable) to remove all promotional writing from a corporate website. As with conciseness, we sometimes struggled to find what we considered a reasonable balance.
Evaluation of the Sites
To evaluate the original and rewritten websites, 21 technical users participated in a 2-condition (original or rewritten site) between-subjects experiment. Users' job titles included system administrator, systems analyst, software developer, and senior programmer.
The participant's first two tasks were to search for specific facts within the site. For example, one task was to find out: "According to the website, in the future, how will users of the new-media desktop perceive the LAN/WAN interface?" Next was a judgment task, suggested by Spool et al. (1997), in which the participant had to find relevant information and then make a judgment about it. The question was: "The 'Market for Java' whitepaper mentions several characteristics of Java. In your opinion, what is the most important characteristic that is mentioned? Why do you think so?" This task was followed by a questionnaire.
Next, the participant spent 8 minutes looking at the pages in the website, in preparation for a short exam. As an example, one of the questions read: "According to the site, which network-computing application area is the least developed? a) government b) commerce c) consumer d) education."
As predicted, the rewritten version of the site outperformed the original version on all four major measures, as t-tests showed (see table).
| Condition | Task Time | Task Errors | Memory | Subjective Satisfaction |
|-----------|-----------|-------------|--------|-------------------------|
The table shows mean scores (standard deviations appear in parentheses) for the following measures; the rewritten version outperformed the original on all of them:

- Task Time: the number of seconds users took to complete the three tasks
- Task Errors: a percentage score based on the number of incorrect answers given in the two search tasks
- Memory: comprises recognition (score on multiple-choice questions) and recall (percentage of Java characteristics recalled) measures from the exam
- Subjective Satisfaction: the mean score (on a 10-point scale) of ratings given by the users for four indices from the questionnaire: quality of the site, ease of use, likability of the site, and user affect
To determine, in percentage terms, how much better or worse the rewritten site was relative to the original, we normalized all mean scores for the major measures. For each measure, the original condition's mean score was set to 100, and the rewritten condition's mean score was transformed (by division) relative to that control. By this normalization, the rewritten version of the site was better on all four measures: task time (80% better), task errors (809% better), memory (100% better), and subjective satisfaction (37% better).
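The normalization step can be sketched as follows. The raw means below are hypothetical placeholders (the study's per-measure means are not reproduced here), and inverting the ratio for "lower is better" measures is our assumption about how scores such as task time were made comparable:

```python
def normalize(original_mean, rewritten_mean, higher_is_better=True):
    """Scale a pair of mean scores so the original condition equals 100."""
    if higher_is_better:
        return 100.0, 100.0 * rewritten_mean / original_mean
    # For measures where lower is better (e.g., task time in seconds),
    # invert the ratio so a larger normalized score still means "better".
    return 100.0, 100.0 * original_mean / rewritten_mean

# Hypothetical task-time means in seconds; not the study's actual data.
original_norm, rewritten_norm = normalize(300, 167, higher_is_better=False)
print(rewritten_norm - 100)  # percent improvement of the rewritten site
```

With these placeholder times, the rewritten site comes out roughly 80% better, matching the reported task-time improvement.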
An overall usability score was calculated for each version of the site by taking the geometric mean of the normalized scores for the four measures. On this overall measure, the rewritten version was 159% better than the original.
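As an arithmetic check, the 159% figure can be reproduced from the four normalized improvements reported above (a minimal sketch in Python):

```python
from math import prod

# Normalized scores for the rewritten site (original = 100 on every measure):
# task time 180, task errors 909, memory 200, subjective satisfaction 137.
normalized = [180, 909, 200, 137]

ratios = [score / 100 for score in normalized]
overall = prod(ratios) ** (1 / len(ratios))  # geometric mean of the ratios
improvement = round((overall - 1) * 100)
print(improvement)  # 159 -> the rewritten site is 159% better overall
```

The geometric mean is the natural choice here because it averages ratios, so one extreme measure (task errors at 909) cannot dominate the overall score the way it would in an arithmetic mean.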
Users' comments also confirmed their preference for the rewritten version. Users especially appreciated the changes that made the text easier to scan. A typical comment was, "The main ideas keep popping out at you. Boom. It's very easy to follow."
This study showed that reworking some of Sun's Web pages (to make the writing scannable, concise, and objective) made a major, positive difference in technical users' performance and subjective satisfaction, as well as overall usability.
Of course, "How concise is too concise?" is not easy to answer. We made the rewritten version 54% of the length of the original. We tried to cut carefully, but some of the information we cut may have been useful to some users. However, users preferred the shorter version and even thought it was more complete than the original. (For the question "How complete is the site's treatment of the topic?", the rewritten version scored 7 out of 10, compared with 6 for the original.) Thus, concise writing is not inconsistent with comprehensive writing.
The results for task errors are dramatic. Based on observation of participants, we think the errors are in large part due to original-version users' impatience and unwillingness to wade through long blocks of text, opting instead to guess at the answer. Finally, our studies suggest that in many cases, one can probably double usability of a website simply by rewriting the author's original text:
- Our first study increased usability of a site with tourist information by 124%
- The current case study increased usability of technical white papers by 159%
The first study simply made the text concise, scannable, and objective; the second study followed these guidelines as well as several others, including the use of hypertext to split long text into smaller and more focused pages.
An additional anecdote supports our claim that you end up communicating more to your readers by following our writing guidelines. Mike Garrison recently sent the following email:
I manage an internal web site inside of Boeing. I try to follow most of your suggestions about usability of my Web pages. I had an interesting experience a couple weeks ago.
References

- Morkes, J., & Nielsen, J. (1997). Concise, SCANNABLE, and Objective: How to Write for the Web. http://www.nngroup.com/articles/how-users-read-on-the-web/ and http://www.nngroup.com/articles/concise-scannable-and-objective-how-to-write-for-the-web/
- Spool, J. M., Scanlon, T., Schroeder, W., Snyder, C., & DeAngelo, T. (1997). Web Site Usability: A Designer's Guide. North Andover, MA: User Interface Engineering.