Tradeoffs in website design
There are two primary objectives in web design that pull against each other:
- Usability: the appeal of the site and its ability to convert a visit into a sale
- Discoverability: designing according to what search engines favour
A typical example is our front page. It is far too big in terms of download size: we should typically aim for 100-200KB total, but ours is more like 350KB. As a result, one very important criterion in the search engine algorithm (page load time) will penalise this site and potentially demote it compared to equivalent sites. This is a typical tradeoff in website design. To overcome it we had to heavily optimise the images and modify the fonts; the end result was an A grade for page load, typically under 2 seconds. Doing so meant the image quality is not crystal clear and the fonts are ordinary, but it's a compromise we are willing to accept to minimise any penalty for having too many images.
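The budgeting idea above can be sketched in a few lines of code. This is a minimal illustration only: the asset names and sizes are hypothetical, and the 200KB page budget and ~60KB per-image guideline are the rules of thumb discussed in this article.

```python
# Hypothetical asset sizes (in KB) for a page; names are illustrative.
assets_kb = {
    "index.html": 18,
    "styles.css": 22,
    "hero.jpg": 95,
    "logo.png": 12,
    "main.js": 40,
}

BUDGET_KB = 200       # the 100-200KB total-page target
IMAGE_LIMIT_KB = 60   # the per-image compression guideline

# Sum the assets and compare against the page budget.
total_kb = sum(assets_kb.values())
print(f"Total page weight: {total_kb}KB (budget {BUDGET_KB}KB)")

# Flag individual assets that exceed the per-image guideline.
for name, size in assets_kb.items():
    if size > IMAGE_LIMIT_KB:
        print(f"  consider compressing {name} ({size}KB)")
```

In this sketch the page comes in under budget overall, but the hero image is still flagged as a compression candidate, which mirrors the tradeoff we faced on the front page.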
However, because the site is new with few links and a viral campaign is about to be launched, we decided it was better, for now, to be impressive and achieve a higher conversion rate than to chase pure rankings. When designing pages these are the sorts of tradeoffs we must always consider.
Another example is our blog post on the 10 steps to building a website. As you can see, it is laid out as a sequential list; originally it was a table designed for easy human reading. The problem is that search engines don't see all the pretty formatting and graphics that we humans see: they like text. So, while keeping the page attractive to a human, we made the decision to design it for the search engine. These sorts of tradeoffs happen all the time in website design. The general rule is to design and write content for humans, since they are, after all, the target market. But at times we must consider what is optimal for SEO.
There is no such thing as the perfect site
It is easy to design a beautiful site with high-resolution photos, lots of Flash animations, and heavy multimedia. But is it really functional?
The reason search engines demote sites with poor performance is partly historical. In a land far, far away, we all used to have 56k dial-up modems; compare that to our 100Mbit optical fibre lines now and it seems like a different lifetime. But in many countries (especially developing nations), low-speed connections are still the norm, and so the internet should cater for both low and high speeds. Our site is designed for high-end users, but the search engine will still lower its rank because the user experience is too slow for everyone else.
The general rule of thumb is a compromise. The front page can be flashy, but individual pages should be around 100-200KB at most. Images should be compressed so they come in under 60KB or so, and reused wherever possible so browsers serve the cached copy instead of downloading a fresh one.
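The image-reuse advice above can be checked mechanically: if a page references near-duplicate images under different URLs, the browser cannot reuse its cached copy. Below is a small sketch, using only the Python standard library, that counts `img` sources on a page; the markup and filenames are hypothetical examples.

```python
from collections import Counter
from html.parser import HTMLParser

# Hypothetical page markup; the image paths are illustrative.
PAGE = """
<img src="/img/hero.jpg">
<img src="/img/divider.png">
<img src="/img/divider.png">
<img src="/img/footer-divider.png">
"""

class ImgCollector(HTMLParser):
    """Collect every <img> src and how often it appears."""
    def __init__(self):
        super().__init__()
        self.srcs = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs[src] += 1

collector = ImgCollector()
collector.feed(PAGE)

# A repeated src (divider.png) is free: the browser downloads it once
# and serves the cached copy afterwards. A near-duplicate under a
# different name (footer-divider.png) is a candidate for consolidation.
for src, count in collector.srcs.items():
    print(src, count)
```

Here `divider.png` appears twice but costs only one download, while `footer-divider.png` would be worth merging into the shared asset if the two images are visually interchangeable.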