The Web Backlash of 1996

by Jakob Nielsen on April 1, 1996

The Web will get worse before it gets better.

Several experts have predicted the impending collapse of the Web and/or the Internet due to ever-increasing loads and ever-slowing response times. Most notably, Bob Metcalfe promised to eat his column at the next Web conference if the Internet had not collapsed by the end of 1996 (and indeed he did eat it in April 1997).

The Internet doubles every year and has done so ever since it was founded. Currently, the Web grows even faster (doubling every four months or so), though this higher growth rate will have to slow down eventually since the Web is a subset of the Internet and thus cannot outgrow it. In comparison, computers double in power approximately every 18 months, following Moore's Law.

Compare the two growth rates:

  • demand (number of Web users) doubles every 12 months or faster
  • supply (the power of servers) doubles every 18 months

It is obvious that websites will have to upgrade their servers at an unprecedented rate, certainly much faster than the traditional growth rate of computers. Very few websites currently run on the largest available servers, so part of the problem can be solved simply by upgrading to higher-end models (which seems fine to me, but then I work at Sun).
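To make the mismatch concrete, here is a quick sketch of the two growth curves. The 12- and 18-month doubling times are the estimates given above; the function name and everything else is purely illustrative:

```python
# Relative growth of Web demand vs. single-server supply, using the
# doubling times quoted above (assumptions from the text, not measurements).
DEMAND_DOUBLING_MONTHS = 12   # number of Web users
SUPPLY_DOUBLING_MONTHS = 18   # power of an individual server (Moore's Law)

def growth_factor(months, doubling_months):
    """How much a quantity grows over `months`, given its doubling time."""
    return 2 ** (months / doubling_months)

for years in (1, 2, 3):
    months = 12 * years
    demand = growth_factor(months, DEMAND_DOUBLING_MONTHS)
    supply = growth_factor(months, SUPPLY_DOUBLING_MONTHS)
    print(f"After {years} year(s): demand x{demand:.2f}, "
          f"supply x{supply:.2f}, gap x{demand / supply:.2f}")
```

The gap between the two curves compounds at a factor of 2^(1/3) ≈ 1.26 per year, which is why standing still on hardware is not an option for a popular site.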

In addition to server upgrades, the mismatch between Web growth and computer growth will be addressed by significantly increased use of mirror and cache servers to minimize traffic and serve the most popular pages in a distributed manner. Unfortunately, caching is only a partial solution because of the trend toward relationship services on the Web, where each user is provided with individualized pages that cannot be cached, since they differ for every user.

In other words, it is likely that the current problems with slow response times on the Web will continue in 1996 and probably even get worse. Capacity problems will eventually be solved, though, since more infrastructure is constantly being put in place and the growth in usage will slow down once all the early adopters are online.

A worse problem is the lack of usability for the novice Web user. The fact that the Internet doubles every year implies that at any time half of the users will have been on the net for less than a year. In other words, we are doomed to have 50 percent novice users for the foreseeable future.

Web browsers are getting harder to use: they have more features and users need to hunt down and install extensions before advanced websites can be accessed. Java may alleviate this particular problem with its notion of software-on-demand and applets that appear automatically without explicit user commands, though there is a definite risk that poorly designed Java interfaces will confuse users with new interaction styles for each website. We may have experienced a temporary sweet spot in Web usability during the period from the release of Mosaic to the release of Netscape 1.1. These early browsers had relatively few features (so users could figure out how to use them) and were more intuitive than the previous generation of text-only browsers.

As the Web adds content, features, plug-ins, and applets, I strongly encourage all site and software designers to take usability extremely seriously. Please don't consider leading-edge users your only design center: the vast majority of users are new, and even going through Yahoo's concept hierarchy is pretty tough for them. If you have been online since 1990, then 98% of current users have less experience than you do. More than 90% of the users who will be accessing your content in the Year 2000 have not started using the net yet.
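The percentages above all fall out of the doubling model: under yearly doubling, users who joined more than n years ago make up 2^-n of today's population. A quick sketch (assuming exact yearly doubling; the function name is mine):

```python
# Under yearly doubling of the user population, anyone who joined more
# than n years ago belongs to a cohort that is 2**-n of today's users.
def veteran_share(years_of_experience):
    """Fraction of current users with at least this many years online."""
    return 2 ** -years_of_experience

# Half of today's users joined within the last year:
print(f"{1 - veteran_share(1):.0%} are novices (< 1 year online)")        # 50%

# Online since 1990, measured in 1996 (6 years of experience):
print(f"{1 - veteran_share(6):.1%} of 1996 users have less experience")   # 98.4%

# Of the Year-2000 population (4 more doublings), 1996 users are 1/16:
print(f"{1 - 1 / 2**4:.1%} of Year-2000 users are not yet online")        # 93.8%
```

The same arithmetic yields the earlier claim that we are doomed to 50 percent novices for as long as the yearly doubling continues.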
