Whether you’re looking to gain more exposure for your existing website or you’re building a new site from scratch, there are some simple considerations that will help ensure your target audience finds your site.
Structure
If your site has a clear hierarchy and text links (as opposed to image, Flash or JavaScript links), both users and search engines will be able to navigate your content more easily. Ensure that every page is reachable from at least one static text link and that your most important pages are linked from several places (for example, the menu, footer and sidebar).
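For example, a plain text-link menu that both users and spiders can follow is just standard HTML (the page names below are made up for illustration):

    <ul>
      <li><a href="/">Home</a></li>
      <li><a href="/services.html">Services</a></li>
      <li><a href="/about.html">About us</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>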
Write good content
The quality of your content has never been more important: Google invests millions upon millions of pounds in developing algorithms that drive the best content to the top. Even if you’re promoting a business – whether it’s window cleaning, internet marketing or your local caving group – you need to create a genuinely useful, original, information-rich site that adds value to the net as a whole.
Google encourage you to consider which words your target audience might type in if they were looking for the sort of content you’re offering, and to use those words on your site. Avoid putting this text in JavaScript, images or Flash, as you’ll make it more difficult for the search engines to read.
It will further help search engines to classify and rank your content if you label it well. Use ‘alt’ attributes that accurately describe your images, and mark up videos and contact information using formats such as RDFa.
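As a rough sketch (the business details below are invented for illustration), a descriptive alt attribute and contact info marked up with RDFa Lite and the schema.org vocabulary might look like this:

    <img src="sash-window.jpg" alt="Cleaning a Victorian sash window">

    <div vocab="https://schema.org/" typeof="LocalBusiness">
      <span property="name">Acme Window Cleaning</span> –
      call us on <span property="telephone">01234 567890</span>.
    </div>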
Pay attention to detail
Google cares above all about delivering a quality user experience. If it can’t crawl half your pages, not only will you miss out on ranking for that content, but you’ll likely rank lower overall. So check for broken links and correct HTML – I like Xenu’s Link Sleuth, which is free, and you can validate your HTML with an online validator such as the W3C’s.
If you are using dynamic pages (i.e., where the URL contains a “?” character), be extremely careful that they are not causing a duplicate content issue, for which you will almost certainly get penalised. Ensure there is only one way of reaching each page on your website. Make use of your robots.txt file to stop Google and the other search engines crawling search results pages and other auto-generated pages (such as WordPress tag pages) that add very little value for users.
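As a minimal sketch – the exact paths are hypothetical and depend on how your site generates these pages – a robots.txt that keeps spiders out of internal search results and tag pages looks like this:

    # applies to all crawlers
    User-agent: *
    # block internal search results and auto-generated tag pages
    Disallow: /search/
    Disallow: /tag/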
Remember, it’s about user experience – so check your site carefully in a number of browsers, including a text browser such as Lynx. This is both for appearance and to ensure that your site can be crawled fully. Most search engine spiders see your site much as Lynx does, so it is very useful for identifying any features (such as JavaScript, cookies, session IDs, frames, DHTML or Flash) that are preventing the engines from crawling your site properly. Another way to identify crawl errors is Google’s Webmaster Tools, which will not only tell you if there are any problems, but will even give you advice on how to better meet Google’s guidelines.
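If you have Lynx installed, one quick check (with example.com standing in for your own domain) is:

    lynx -dump http://www.example.com/

The -dump option prints the page as plain text followed by a numbered list of its links; any content or links missing from that output are probably invisible to search engine spiders too.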
Speed up
As part of delivering a great user experience, Google expects sites to load fairly quickly: fast sites increase user satisfaction and improve the overall quality of the web. Google gives a list of recommended tools for improving site performance, including Page Speed, YSlow and WebPagetest – and there is a Site Performance tool in Webmaster Tools which gives data on the speed of your website as experienced by users around the world. I’m a big fan of Web Page Analyser, a free online tool for checking your site speed which gives easy-to-follow recommendations.
What to avoid
Google’s list of website no-nos is a lot of common sense when you recall (once again) that they’re simply trying to deliver good results to their users. Here’s what they won’t allow:
- Cloaking – presenting one set of content to the search engine and another to the user.
- Link schemes – link exchange sites, link farms and the like – these are designed purely to manipulate the search engines, and having your site listed on them can adversely affect your rankings.
- Use of ‘black hat’ and unauthorised software – this even includes programs that submit pages and check rankings, such as WebPosition Gold™, as well as programs that automatically build links through spamming.
- Hidden text or hidden links – search engines can easily identify if your text is the same colour as your background.
- Keyword stuffing – industry opinion is that this won’t necessarily harm your site, but it won’t help it either: stuffed content reads poorly, so it won’t rank well.
- Duplicate content – Google’s recent major updates have been all about ridding the web of as much duplicate content as possible, so keep your content original and be very careful that each web page has only one URL (see the snippet after this list).
- Sites that exhibit malicious behavior, such as phishing or installing viruses, trojans, or other badware.
- “Doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
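One widely supported way to signal the preferred URL for a page that can be reached at more than one address – a supplementary technique, not part of Google’s list above – is a rel="canonical" link element in the page’s head:

    <link rel="canonical" href="http://www.example.com/services.html">

Search engines then treat the other addresses as variants of that single URL.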
Contrary to what some think, SEO and link building are not necessarily bad things. Google actually advise that you “make sure all the sites that should know about your pages are aware your site is online”. Search engine optimisation and good link building actually help Google deliver better content to its users. It’s only a problem when your site isn’t worth all the links it has, or when you use underhand techniques to push your site far above where it would rank after a human review.
Sitemaps
Create a sitemap for your users to help them find all of your pages. If you have a large site and the sitemap would therefore contain an extremely large number of links, break it up into multiple pages. Generally, I’d advise no more than 100 links on any page (sitemaps or otherwise).
Also create sitemaps for the search engines for the same reason, and keep them updated. My favourite tool is ‘XML sitemap generator’, a free online tool that handles large sites very well and creates HTML, XML, TXT and ROR sitemaps for you.
Make sure you add links to your sitemaps somewhere on the site (I usually add mine to my footer) and submit the XML sitemap to Google through Webmaster Tools. This type of sitemap can contain far more links – up to 50,000 – as it’s not part of the user experience.
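For reference, a bare-bones XML sitemap (the URLs here are placeholders) has this shape:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/services.html</loc>
      </url>
    </urlset>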