Coping with online highs and lows

Paul Sherry on how organisations can better prepare for website traffic peaks
Paul Sherry is regional director at Riverbed Technology in the MENA region.

Week in, week out, we read stories about websites failing in the face of overwhelming success, whether it's online retailers unable to meet predictable rises in traffic or unexpected spikes that knock out news sites.

In an ideal world, every website would be designed and built to handle traffic peaks, planned or not. However, it is rarely economically justifiable to over-resource against a potential server overload caused by a one-off upsurge in visitor numbers. But flexible, cost-effective alternatives do exist.

The problem is that website slowdowns come in many flavours. Every year, the media fixates on online sales surges around the main public holidays, quickly highlighting e-commerce sites that struggle to cope with last-minute shoppers, or highly anticipated online services that crash under unpredicted demand. Add to this widely publicised distributed denial of service (DDoS) attacks, in which typically high-profile sites are deliberately flooded with requests until they crash or are taken offline.

However, it's not only these highly publicised issues that cause websites to underperform. The Internet is home to more and more dynamic content, with pages that adapt to individual user behaviour. This, in turn, is increasing page load times, which by some estimates have quadrupled in the last ten years. Paradoxically, even as bandwidth continues to increase, our patience for slow-loading sites has diminished: if a site doesn't deliver fast enough, impatient users can quickly and easily look for alternatives.

Then there are a host of other reasons for poor website performance, including incorrect configurations, externally linked content and local ISP issues. But regardless of the cause, one fact remains: an underperforming website translates into a poor user experience. The consequences range from lower brand loyalty to lost business transactions, not to mention lower productivity and higher operating costs.

Solution: Optimise web pages and prioritise traffic

So, the big question: what approaches are available to ensure that organisations are better prepared for online surges?

For starters, web page and content optimisation must be a key foundation of any website development where a consistent, high-quality user experience is a priority, regardless of the access device used. This is becoming increasingly important in the face of growing amounts of complex and dynamic web content, which, when properly optimised, can help draw visitors and increase stickiness while maintaining fast load times.

However, tuning each individual web page so that it loads faster is a time-consuming exercise, particularly if you need to deliver content to both PCs and mobile devices. Automated tools that provide just-in-time web page optimisation can be a good solution, since they reduce the number of round trips between the web server and the browser, and compress images and content to match the client device.
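
As an illustration, here is a minimal Python sketch of the kind of just-in-time optimisation such tools apply: inlining small stylesheets to cut round trips and swapping in a lighter image variant for mobile clients before compressing the response. The file names, stylesheet contents and user-agent check are illustrative assumptions, not the behaviour of any specific product.

    # Minimal sketch of just-in-time page optimisation (illustrative only).
    import gzip
    import re

    # Hypothetical small stylesheets that are cheap to inline into the page.
    SMALL_CSS = {
        "/static/site.css": "body{font-family:sans-serif}",
    }

    def optimise_page(html: str, user_agent: str) -> bytes:
        # 1. Inline small stylesheets to cut round trips between browser and server.
        def inline_css(match: re.Match) -> str:
            css = SMALL_CSS.get(match.group(1))
            return f"<style>{css}</style>" if css else match.group(0)

        html = re.sub(r'<link[^>]+href="([^"]+\.css)"[^>]*>', inline_css, html)

        # 2. Serve a lighter image variant to mobile clients.
        if "Mobile" in user_agent:
            html = html.replace('src="/img/hero.jpg"', 'src="/img/hero-small.jpg"')

        # 3. Compress the response body; the web server would also send a
        #    "Content-Encoding: gzip" header alongside this payload.
        return gzip.compress(html.encode("utf-8"))

    if __name__ == "__main__":
        page = ('<html><head><link rel="stylesheet" href="/static/site.css"></head>'
                '<body><img src="/img/hero.jpg"></body></html>')
        print(len(optimise_page(page, "Mozilla/5.0 (iPhone; Mobile)")), "bytes after optimisation")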

But how can you deal effectively with an online surge, whether planned or not? Managing and prioritising traffic can help cope with higher-than-usual data spikes or regular peak periods. Such tools work by identifying the highest-value traffic and ensuring that segment receives an exceptional online service, while still serving and monitoring less important, non-revenue-generating traffic, albeit at a reduced rate.
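
To give a sense of how such prioritisation can work, the Python sketch below classifies requests by path into value tiers and applies a token-bucket limit per tier, so revenue-generating checkout traffic is served generously while lower-value traffic is throttled but not dropped. The tiers, paths and rates are illustrative assumptions rather than a real product configuration.

    # Minimal sketch of value-based traffic prioritisation (illustrative only).
    import time
    from dataclasses import dataclass, field

    @dataclass
    class TokenBucket:
        rate: float       # tokens added per second
        capacity: float   # maximum burst size
        tokens: float = 0.0
        last: float = field(default_factory=time.monotonic)

        def allow(self) -> bool:
            # Refill the bucket based on elapsed time, then spend one token if available.
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    # Revenue-generating traffic gets generous limits; everything else is still
    # served and monitored, just at a reduced rate.
    BUCKETS = {
        "checkout": TokenBucket(rate=500, capacity=1000, tokens=1000),
        "browse":   TokenBucket(rate=200, capacity=400, tokens=400),
        "other":    TokenBucket(rate=20, capacity=40, tokens=40),
    }

    def classify(path: str) -> str:
        if path.startswith(("/checkout", "/payment", "/basket")):
            return "checkout"
        if path.startswith(("/product", "/search")):
            return "browse"
        return "other"

    def admit(path: str) -> bool:
        """Return True if the request should be served now, False if deferred."""
        return BUCKETS[classify(path)].allow()

    if __name__ == "__main__":
        for p in ["/checkout/pay", "/search?q=tv", "/feeds/all.rss"]:
            print(p, "->", "serve" if admit(p) else "defer")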

In addition, dynamically caching pages helps ensure there is no delay at critical points in the transaction process, such as online payment. It's no good if customers fill their shopping baskets only to find the site slowing down at the checkout page; at that point, there is a risk they will abandon the purchase and simply take their business elsewhere. Dynamically cached pages can also be used to give repeat visitors faster visits by prioritising actions according to their previous behaviour on the site.
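
As a simple illustration of the idea, the Python sketch below caches rendered pages for a short time, keyed by URL and visitor segment, so a busy checkout page does not have to be rebuilt on every request. The segment names, TTL and render step are assumptions made for the example.

    # Minimal sketch of dynamic page caching with a short TTL (illustrative only).
    import time

    class PageCache:
        def __init__(self, ttl_seconds: float = 30.0):
            self.ttl = ttl_seconds
            self._store = {}  # (url, segment) -> (expiry_time, html)

        def get(self, url: str, segment: str):
            entry = self._store.get((url, segment))
            if entry and entry[0] > time.monotonic():
                return entry[1]  # fresh cache hit: no re-render needed
            return None

        def put(self, url: str, segment: str, html: str):
            self._store[(url, segment)] = (time.monotonic() + self.ttl, html)

    def render_checkout(segment: str) -> str:
        # Stand-in for an expensive dynamic render (templates, pricing, stock checks).
        time.sleep(0.2)
        return f"<html><body>Checkout ({segment} visitor)</body></html>"

    cache = PageCache(ttl_seconds=30)

    def serve_checkout(is_repeat_visitor: bool) -> str:
        segment = "repeat" if is_repeat_visitor else "new"
        page = cache.get("/checkout", segment)
        if page is None:
            page = render_checkout(segment)
            cache.put("/checkout", segment, page)
        return page

    if __name__ == "__main__":
        start = time.monotonic(); serve_checkout(True)   # cold: renders the page
        print(f"first request: {time.monotonic() - start:.2f}s")
        start = time.monotonic(); serve_checkout(True)   # warm: served from cache
        print(f"repeat request: {time.monotonic() - start:.2f}s")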

While organisations cannot predict every surge in website traffic, there are steps and technologies they can implement to minimise the impact and safeguard user experience.

Paul Sherry is regional director, MENA, at Riverbed Technology.
