The Web page as initially conceived -- simple HTML code served from a single server -- has changed, but many companies haven't realized it yet.
Web pages today, especially ones that use Web 2.0 technologies, are assembled on the fly from files and code served by a variety of servers. They aren't just static pages but dynamic applications.
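That assembly can be made concrete with a minimal sketch. The markup and host names below are invented for illustration, but the pattern is typical: even a simple page pulls stylesheets, scripts, images, and ad tags from several unrelated servers, each of which shapes what the user finally sees.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ResourceHostParser(HTMLParser):
    """Collect the distinct hosts a page pulls resources from."""
    def __init__(self):
        super().__init__()
        self.hosts = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Scripts and images load via src; stylesheets via link href.
        url = attrs.get("src") or (attrs.get("href") if tag == "link" else None)
        if url:
            host = urlparse(url).netloc
            if host:
                self.hosts.add(host)

# Hypothetical markup standing in for a typical "Web 2.0" page.
page = """
<html><head>
  <link rel="stylesheet" href="https://cdn.example-store.com/site.css">
  <script src="https://apis.example-maps.com/maps.js"></script>
</head><body>
  <img src="https://images.example-store.com/hero.jpg">
  <script src="https://ads.example-network.com/tag.js"></script>
</body></html>
"""

parser = ResourceHostParser()
parser.feed(page)
print(sorted(parser.hosts))
# -> ['ads.example-network.com', 'apis.example-maps.com',
#     'cdn.example-store.com', 'images.example-store.com']
```

One HTML file, four different servers -- and the page owner directly controls only some of them.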
"The Web application is no longer what people used to think of it," said Imad Mouline, CTO of Gomez, a Web application performance management company. "It's not the piece of code the developers create. It's what the users see."
Content now is aggregated in real time in users' browsers. When done well, the experience can be compelling, as it generally is at leading sites like Amazon.com, Google, and Yahoo. For many companies, competing with these market leaders can be tough.
Mouline believes the increasing sophistication of Web sites is raising the bar for everyone. "More end users are expecting a certain amount of richness in their applications, and that's pushing companies to provide that experience," he said. "It doesn't matter whether you're a bank or a retailer, your competition on the Web might be Amazon or Facebook because you're pushed to provide that same level of richness to maintain your brand presence."
But companies trying to match the user experience at top-tier sites often fall short because they lack the ability to adequately assess how pages aggregated from multiple data sources perform in the real world.
Understanding bottlenecks on the Internet used to be fairly simple, said Mouline, but it's much more complicated today. Many companies are realizing that they have far less control over the user experience at their Web sites than they thought. Ad networks, for example, can dramatically degrade Web page load time, he said.
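A toy model shows why a single ad network can drag down a whole page (the timings below are invented for illustration): when third-party scripts block rendering, the page is only as fast as its slowest resource, no matter how well the first-party content performs.

```python
def page_load_time(base_ms, blocking_resources_ms):
    """Toy model: the page's own HTML arrives, then rendering
    waits on the slowest blocking resource (scripts, CSS)."""
    return base_ms + max(blocking_resources_ms, default=0)

# The first-party content is quick...
first_party = [120, 200, 180]            # own CSS, images, JS (ms)
print(page_load_time(300, first_party))  # -> 500 (ms)

# ...but one slow ad-network tag dominates the whole page.
with_ads = first_party + [3200]          # ad tag taking 3.2 s
print(page_load_time(300, with_ads))     # -> 3500 (ms)
```

In this sketch the site's own servers never got slower, yet total load time grew sevenfold -- which is why control over the user experience is so much weaker than it appears.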
Geography has a huge impact on Web application performance. "Even among very large players, you can have huge disparities of performance on the Internet between the East Coast and the West Coast," Mouline said.
A large hotel chain, Mouline explained, recently had to decide where to build a huge new data center because it was seeing such a large disparity in performance between the East and West coasts.
A consequence of Web application performance issues like this is that companies may be exposed to brand damage without even knowing it.
"For a retailer, it's the equivalent of seeing a broken window or a light that's out," said Mouline. "It's not going to leave you with a good feeling about the brand."
To help companies protect that feeling, Gomez relies on a network of about 40,000 peers -- PC users running the company's Web site performance monitoring software. Those peers gather data about how Web sites actually perform for end users, which often differs sharply from how they perform at the Internet backbone. Armed with that information, companies can spot online problems that simply aren't apparent otherwise.
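The gap between backbone and end-user measurements is visible even in a toy data set. All of the numbers and region names below are invented, not real Gomez data; they simply show how a backbone test agent can report a uniformly fast site while peers in different regions see slower medians and painful outliers.

```python
from statistics import median

# Invented page load times (ms), standing in for the kind of data a
# distributed network of end-user monitoring peers might collect.
samples = {
    "backbone":   [210, 220, 230, 215, 225],   # agent in a data center
    "east_coast": [480, 520, 510, 495, 2900],  # one peer on a bad route
    "west_coast": [900, 950, 1800, 880, 1700],
}

for region, times in samples.items():
    print(f"{region:10s} median={median(times):5.0f} ms  "
          f"worst peer={max(times):5.0f} ms")
```

Judged from the backbone alone, this hypothetical site looks fine; judged from the peers, the coastal disparity Mouline describes -- and the occasional user waiting nearly three seconds -- is plain.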
A large electronics retailer with a financing arm recently approached Gomez to test its site as it appeared to users, Mouline said. Though the site worked fine for broadband users, it turned out that the loan application portion of the site only worked for about one in four dial-up users.
"That was a huge surprise to both the folks on the IT side and the business side who were present," said Mouline. "We asked if they cared and they said absolutely," given that the dial-up users were likely to apply for financing.