The ABC television show Extreme Makeover: Home Edition sees sharp spikes in its Web site traffic as each home reconstruction gets underway, builds up cadres of volunteers and onlookers, and then airs Sunday evenings on the ABC network.
When Extreme Makeover arrived in Philo, Ill., however, Web developer Cybernautic decided it was time for an extreme makeover of the show's Web site. A unique site is built for each show, and each project forms connections to the area where the home reconstruction will occur by involving platoons of volunteer refurbishers, neighbors and onlookers. A local builder often sponsors the site.
Cybernautic, a design firm in nearby Normal, Ill., got the nod to overhaul the site for the Oct. 25 show.
For the Philo reconstruction, Cybernautic CEO Chad Parker met with the show's producers and asked what he should expect in the way of traffic. "You will need to make sure you have an unlimited supply of beer and pizza for your network administrator," executive producer Conrad Ricketts told him. It would be the administrator's job to reboot the site repeatedly for several days as it crashed from spikes in traffic, Ricketts explained. "Every site we have done for this season has crashed," he warned, Parker said in an interview.
Parker had one week to get a highly scalable site up and running. He looked at his current servers, which hosted his clients' 200 Web sites and were supported by dedicated hosting supplier The Planet. He envisioned all of his customers' Web sites crashing as spikes in Extreme Makeover traffic hit those servers, and decided he needed another solution.
"If our servers went down for five minutes during the Makeover episode, the phones would have rung off the hook. It would be pandemonium around here," said Parker, who was already working 18 hours a day trying to pull together all the necessary elements for a new, scalable site. He didn't relish the prospect.
"I considered a dedicated server at The Planet, but the cost would have gone through the roof" over the several months that the site needed to be maintained, even after the heavy traffic had departed, he said.
Instead he opted to launch the Web site on Rackspace Cloud servers. Rackspace would monitor traffic to all its servers, keeping a buffer of surplus capacity on hand at all times. More servers could be plugged in as needed and if circumstances warranted, Rackspace would spin up additional virtual servers to keep response times short, no matter how many visitors decided to use the Extreme Makeover site.
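Rackspace hasn't published the provisioning logic it used here, but the buffer-of-surplus-capacity approach the article describes can be sketched in a few lines. Everything below — the function name, the 25% headroom figure, the per-server capacity — is illustrative, not Rackspace's actual API or policy:

```python
import math

# Illustrative sketch of buffer-based elastic scaling, not Rackspace's logic:
# keep enough spare capacity above current load that a sudden spike can be
# absorbed while additional virtual servers spin up.

def servers_needed(current_load, capacity_per_server, buffer_fraction=0.25):
    """Return the server count that keeps `buffer_fraction` headroom above load."""
    target_capacity = current_load * (1 + buffer_fraction)
    return max(1, math.ceil(target_capacity / capacity_per_server))

# A quiet site needs one server; a Facebook-driven burst triggers more.
print(servers_needed(current_load=50, capacity_per_server=100))    # -> 1
print(servers_needed(current_load=500, capacity_per_server=100))   # -> 7
```

The point of the headroom is timing: virtual servers take minutes to start, so capacity has to be provisioned ahead of the spike rather than in reaction to it.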
"I didn't want to have to restart 15 times a day remotely (The Planet runs data centers in Dallas and Houston)," Parker explained. It was unclear to him that a cloud service would be elastic enough to keep up with Extreme Makeover traffic, even though it portrayed itself as a scalable resource. But he talked to Rackspace technical support, examined its data center resources, looked at its client list and tried a few simple tests.
"I thought, 'If it doesn't work, it doesn't work.'" He'd be no worse off than the previous sites that had crashed. But if it did work, Cybernautics would gain bragging rights about how it could cope with the TV show's traffic and other site designers couldn't. The logo of Extreme Makeover, Home Edition is now listed prominently on Cybernautic's site, listed as "Our latest project."
The Philo-focused site was up and running on time, and to Parker's surprise, the biggest traffic spikes came early, before the show aired Sunday evening, Oct. 25. Someone familiar with the project built a Facebook fan page on the upcoming episode "and overnight it had 12,600 fans," said Parker. Each fan, it seemed, wanted to visit the Makeover site several times a day.
Interest continued to zoom as newspapers in the nearby population centers of Champaign-Urbana and Bloomington picked up on the Philo reconstruction story. Some area residents broadcast updates on Twitter. Parker found himself updating the Web site 50-60 times a day, adding fresh pictures of foundation work, framing, whatever work was in progress, along with comments and stories to satisfy the demand.
In one 24-hour period, he had 41,466 unique visitors, many of them driven from Facebook links, who viewed a total of 168,873 pages, or about four pages per visitor, and stayed on the site an average of six minutes. Another surprising source of traffic was Hollywood entertainment writers blogging about the elements of the next episode of Extreme Makeover in Philo, Ill.
Rackspace Cloud general manager Emil Sayegh also watched the traffic rise and fall. To both his and Parker's surprise, it leveled off as the show aired and fell soon afterward. It was the buildup, the anticipation driven by participation of hundreds of local people and their access to information on the Web, that had been the main driver, he concluded.
Sayegh said Rackspace took the precaution of using its content delivery network supplier, Limelight, to cache pictures and content at multiple sites around the country, speeding access times once traffic built. After the first user's look, the bits were cached in the memory of a server that was likely to be located closer to the visitor than Rackspace's San Antonio data center.
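The caching pattern Sayegh describes — the first request for an asset goes back to the origin, after which copies are served from an edge server closer to the visitor — can be modeled in miniature. This is a toy in-memory sketch of the general CDN idea, not Limelight's implementation:

```python
# Toy model of CDN edge caching: a miss fetches from the origin server once;
# every later request for the same asset is served from the edge cache.

class EdgeCache:
    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin   # callable hitting the origin server
        self._cache = {}                  # url -> cached content
        self.origin_hits = 0              # how often the origin was contacted

    def get(self, url):
        if url not in self._cache:        # cache miss: go back to origin
            self.origin_hits += 1
            self._cache[url] = self._fetch(url)
        return self._cache[url]           # cache hit: served from the edge

# A thousand visitors request the same construction photo (hypothetical URL).
edge = EdgeCache(fetch_from_origin=lambda url: f"<bytes of {url}>")
for _ in range(1000):
    edge.get("/photos/foundation-day1.jpg")
print(edge.origin_hits)  # -> 1: only the first visitor reached the origin
```

This is why a "mad rush" poses little risk: after one fetch, the origin in San Antonio is out of the critical path and the load spreads across edge locations.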
"At times of peak demand, there could have been 100 servers working for Chad's Web site," said Sayegh in an interview. At the same time, the distributed content made sure "if there was a mad rush, there wouldn't be any problems," he added.
Parker says he's dropping dedicated hosting for all his other customers and moving their sites into the cloud, in case one or more of them develops traffic spikes that need to be served. "I don't need to worry anymore about whether I need to add another server. The cloud automatically scales to match what I need. That's Rackspace's problem," said Parker.