Week in, week out, we read stories about websites failing in the face of overwhelming success, whether it's online retailers unable to meet predictable rises in traffic or unexpected spikes that knock out news sites.
In an ideal world, every website would be designed and built to deal with traffic peaks, whether they’re planned or not. However, it just isn’t economically justifiable to invest in over-resourcing for a potential server overload caused by a one-off upsurge in visitor numbers. But flexible, cost-effective alternatives do exist.
The problem is that website slowdowns come in many flavours. Every year, the media fixates on stories about online sales surges around the main public holidays. They are quick to highlight the failure of e-commerce sites that struggle to cope with the rush of last-minute shoppers, or the launch of highly anticipated online services that crash under unpredicted demand. Add to this widely publicised distributed denial-of-service (DDoS) attacks, in which sites, typically high-profile ones, are deliberately flooded with requests until they crash or shut down.
However, it's not only these highly publicised incidents that cause websites to underperform. The Internet is home to ever more dynamic content, with pages that adapt to individual user behaviour. This, in turn, is increasing page load times, which by some estimates have quadrupled in the last ten years. Paradoxically, the continued increase in bandwidth has only diminished our patience with slow sites: if a site doesn't deliver fast enough, impatient users can quickly and easily look for alternatives.
Then there are a host of other reasons for poor website performance, including incorrect configurations, externally linked content and local ISP issues. But regardless of the cause, one fact remains: an underperforming website translates into a poor user experience. The consequences range from lower brand loyalty to lost business transactions, not to mention lower productivity and higher operating costs.
Solution: optimise web pages and prioritise traffic

So, the big question: what approaches are available to ensure that organisations are better prepared for online surges?
For starters, web page and content optimisation must be a key foundation of any website development where a consistent, high quality user experience is a priority regardless of the type of access device used. This is becoming increasingly important in the face of the growing amounts of complex and dynamic web content which, when properly optimised, can help draw visitors and increase stickiness while maintaining fast load times.
However, tuning each individual web page so that it loads faster is a time-consuming exercise, particularly if you need to deliver content to both PCs and mobile devices. Automated tools that provide just-in-time web page optimisation can be a good solution since they reduce the number of round trips between the web server and web browser, and compress images and content to match the client or mobile device.
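The idea behind such tools can be illustrated with a minimal sketch. The function and device classes below are hypothetical, not any particular product's API; the sketch simply shows the two levers described above: tailoring content to the client device and compressing it to reduce the bytes sent per round trip.

```python
import gzip

# Hypothetical sketch of just-in-time page optimisation: pick a content
# variant per device class, then compress it before it leaves the server.

IMAGE_QUALITY = {"desktop": 90, "mobile": 60}  # assumed JPEG quality per device class

def optimise_response(html: str, device: str) -> bytes:
    """Select an image quality for the device and gzip-compress the page."""
    quality = IMAGE_QUALITY.get(device, 75)
    # A real pipeline would re-encode the referenced images at `quality`;
    # here we only rewrite the placeholder in the markup to keep the sketch short.
    tagged = html.replace("{quality}", str(quality))
    return gzip.compress(tagged.encode("utf-8"))

page = "<img src='hero.jpg?q={quality}'>" * 50
compressed = optimise_response(page, "mobile")
print(len(page.encode()), "->", len(compressed))  # repetitive markup compresses well
```

In practice this logic sits in a proxy or CDN layer rather than the application, so individual pages never need hand-tuning.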
But how can you effectively deal with an online surge, whether planned or not? Managing and prioritising traffic can help cope with higher-than-usual traffic spikes or regular peak periods. Such tools work by identifying the highest-value traffic and ensuring that segment receives an exceptional online service, while still serving and monitoring less important, non-revenue-generating traffic, albeit at a reduced rate.
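The scheduling principle can be sketched in a few lines. The request categories and priority values below are assumptions for illustration; the point is that revenue traffic (checkout, payment) is dequeued first, while low-value traffic is still served, just later.

```python
import heapq
from itertools import count

# Hypothetical sketch of value-based traffic prioritisation: lower
# numbers are served first; ties keep first-in, first-out order.
PRIORITY = {"checkout": 0, "payment": 0, "browse": 1, "crawler": 2}

class TrafficScheduler:
    def __init__(self):
        self._queue = []
        self._seq = count()  # tie-breaker preserves FIFO within a class

    def enqueue(self, kind: str, request_id: str) -> None:
        heapq.heappush(self._queue, (PRIORITY.get(kind, 1), next(self._seq), request_id))

    def next_request(self) -> str:
        return heapq.heappop(self._queue)[2]

sched = TrafficScheduler()
for kind, rid in [("crawler", "r1"), ("checkout", "r2"),
                  ("browse", "r3"), ("payment", "r4")]:
    sched.enqueue(kind, rid)

print([sched.next_request() for _ in range(4)])  # ['r2', 'r4', 'r3', 'r1']
```

Crucially, the crawler request is delayed, not dropped, which matches the "reduced rate" behaviour described above.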
In addition, dynamically caching pages helps to ensure that there's no delay at critical points in the transaction process, such as online payment processing. It's no good if a customer fills their shopping basket only to find the site slowing down at the checkout page. At this point, there's a risk they will abandon the purchase and simply take their business elsewhere. Dynamically cached pages can also be used to ensure faster visits for repeat visitors by prioritising actions according to their previous site behaviour.
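A short sketch shows the mechanism, assuming a simple time-to-live (TTL) policy; the class and keys are hypothetical. A rendered page is kept for a short window so hot paths such as the checkout shell skip the slow render step on repeat requests.

```python
import time

# Hypothetical sketch of a dynamic page cache: rendered pages are kept
# for a short TTL; stale entries are re-rendered on demand.

class PageCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}   # key -> (rendered_page, expiry_timestamp)
        self.hits = 0
        self.misses = 0

    def get(self, key: str, render):
        entry = self._store.get(key)
        now = time.monotonic()
        if entry and entry[1] > now:
            self.hits += 1          # fresh copy: serve without re-rendering
            return entry[0]
        self.misses += 1            # missing or expired: take the slow path
        page = render()
        self._store[key] = (page, now + self.ttl)
        return page

cache = PageCache(ttl_seconds=60)
cache.get("/checkout", lambda: "<html>checkout</html>")  # first hit renders
cache.get("/checkout", lambda: "<html>checkout</html>")  # second is served from cache
print(cache.hits, cache.misses)  # 1 1
```

Keying the cache by user segment as well as URL would extend this to the per-visitor prioritisation mentioned above, at the cost of a lower hit rate.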
While organisations cannot predict every surge in website traffic, there are steps and technologies they can implement to minimise the impact and safeguard user experience.
Paul Sherry is regional director, MENA, at Riverbed Technology.