The Overloaded Duck

Last night's BBC2 MasterChef: The Professionals featured the finalists preparing food at a world-renowned three-Michelin-star restaurant.

I’m sure (well, I know) I was not the only one who, while the programme was airing, tried to access the restaurant's website and found it not responding at all, just timing out. It took a while after the programme finished for the site to recover, no doubt because people were catching up on iPlayer and the like.

For a company that is a leader of technology in food, the same could not perhaps be said of its website. Are they concerned? Probably not: they are a world-renowned brand that is no doubt fully booked weeks in advance, so an unresponsive website would not harm their reputation.

They are of course not alone; many companies that feature on prime-time television experience the same issue when the audience descends on their websites en masse. Try looking up companies featured on Dragons' Den and the like at the time of broadcast and the same problem arises.

In the past there have been significant failures of heavily advertised public website launches, where the site crashed and it was found it could not handle the initial traffic; the Olympic site and Vue cinemas at the launch of the new Bond movie are recent examples that spring to mind. The question is how you measure what level of traffic you should be able to handle before there is damage to your brand, and where you are happy to accept failure. For a retail brand, perhaps your site only experiences high load four or five times a year during sale days, and in general the traffic subsides within a day: perhaps acceptable. If your site is unresponsive for days in the run-up to, say, Xmas, then questions will no doubt be asked and customer confidence could fall.

There are of course ways to mitigate failure: ensuring that load testing is adequate against the loads predicted for the year, that the server architecture is scalable, that caching techniques are used on both the application server and the database server, and that static pages are offloaded to CDNs. The cost of implementing all of this can be high. Even with load testing there is still an element of fingers crossed that nothing else creaks, i.e. back-end systems and links out to SOA services.

For a small company website, perhaps just putting up a static page would suffice for the periods when you think the peak would otherwise make for a bad experience.