SEO - Website Design and Promotion
As this seems to be one of the forum's hot topics right now, I thought I should provide a series of posts so that anyone interested can participate in the different aspects individually.
I do not profess to be an expert. Hopefully, those who earn a living doing SEO will provide their own comments and the conglomeration of ideas will produce a valid reference point for other "aficionados" of SEO.
I have provided the following separate post in order to start a thread on each topic.
Please note that the info supplied is not meant to be particularly authoritative, but simply to start an interesting discussion.
2. Scripted Pages -
Most estate agents' sites use some script or other to get the details from a database; the script builds the page from scratch, which is to say that, until the script is run, the page does not exist. Robots are able to select parameters for a search and push buttons. However, for some reason I cannot quite understand, there is often no way out of a scripted page for a robot.
How can I tell if a page is scripted? Generally, if the file extension is not .htm, .html, .pdf or .doc then it has been produced by a script, and if the URL contains a "?" character then it certainly has.
Google tells us that pages with a "?" in the URL are not robot-friendly and advises us to make static HTML copies of important pages and, in order to avoid duplicated pages (another mortal sin), to exclude all the scripted pages from the index with robots.txt.
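To make that advice concrete, here is a sketch of what such a robots.txt might look like. The paths are made up for illustration; yours will differ.

```
# robots.txt sketch: keep the scripted search results out of the index
# while the static HTML copies get crawled. Paths are illustrative only.
User-agent: *
Disallow: /search.php
Disallow: /cgi-bin/
```

The file goes in the site root; robots read it before crawling anything else.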
Obviously this solution is totally impractical because, if there has to be a copy of each scripted page, then there is little point in having the scripted page in the first place.
You are probably thinking that this cannot be true and that any site with scripted pages does not stand a chance with Google! Not quite, but almost. If you don't believe me, do a Google search for pages with a "&" in the URL: disregarding the supplementary results, you won't find many. Although it is not really a case of "not standing a chance", any site with only scripted pages is at a big disadvantage.
There are a couple of solutions -
Have a static index page and a number of link- and spider-baiting pages which are static HTML. See my post "?"
I believe there is a way for a .php script to produce a spider-friendly URL without the "?". Are such pages actually spider-friendly? I am looking into this. Does anyone have any info?
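One common way of doing this (assuming the site runs on Apache with mod_rewrite enabled) is to rewrite a static-looking URL onto the real query-string script. The file and parameter names below are invented for illustration:

```apache
# .htaccess sketch - assumes Apache with mod_rewrite enabled.
# Maps a spider-friendly URL like /widgets/calpe-reconditioned.html
# onto the real script search.php?area=calpe&type=reconditioned
RewriteEngine On
RewriteRule ^widgets/([a-z-]+)-([a-z-]+)\.html$ search.php?area=$1&type=$2 [L,QSA]
```

The robot only ever sees the .html address, with no "?" in it, while the script still builds the page from the database.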
3. Keywords -
File and folder names are important - yes! Robots read everything in order to determine relevancy, e.g. /images (better: /blue-widget-images), search.php (better: blue-widget-search.php), pic1234.gif (better: blue-widget.gif). The only file name you are stuck with is index.* - everything else can carry your keyword message to the robots.
How many and which keywords to optimise for?
The best course of action is bottom-up (as opposed to top-down). As I demonstrated in a previous post, "widgets" is an impossible target, at least to begin with, because there are 44,400,000 pages optimised for this keyword. "calpe reconditioned widgets", however, would take just a few days to get above the fold, as only 8 pages are indexed.
Therefore, start at the bottom with extremely obscure keywords. Getting above the fold and a PageRank should happen around the same time; then you can work your way up, through "costa-blanca widgets" and "spanish widgets", to "widgets".
The best way is to decide on main (easy) keywords and ancillary keywords: optimise the index page for the main keyword and get a result, then push the ancillary pages with the more difficult keywords.
Be patient - nothing happens overnight. It could be months or years before you get there.
Apart from technically correct optimisation, traffic is important for your position in the SERPs for a particular keyword. A new site will probably not have good links or a PageRank, but it could generate a relatively large number of clicks for its niche keyword; e.g. with only 8 pages indexed for "calpe reconditioned widgets" you cannot fail to get a fair proportion of the hits.
4. Optimising Text -
There is very little that can be said about this that has not been repeated time and time again. Keyword density is vital, but you will be penalised for stuffing. How far can you go? Personally, I have a simple remedy when writing my text: I repeat the keyword on the page as many times as possible and then read it back. At the end of the day it has to read naturally (perhaps just a little strangely) to my clients, who are human. If that is the case it won't upset Google.
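If you want a rough number to go with the read-it-back test, a little script can measure keyword density for you. This is a quick sketch, not any official tool; the function name and the sample text are made up for illustration:

```python
# Hypothetical helper: rough keyword-density check for a page's visible text.
# Density here = words belonging to keyword occurrences / total words, as a %.
import re

def keyword_density(text, keyword):
    """Return the keyword's share of the total word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Count exact phrase occurrences by sliding a window over the word list
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

sample = ("Reconditioned widgets in Calpe. Our widget shop sells "
          "reconditioned widgets and new widgets at fair prices.")
print(round(keyword_density(sample, "widgets"), 1))
```

Anything in double figures is probably heading towards "stuffing" territory, but the read-it-back test is still the final judge.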
Apart from this consider tactical use of keyword rich H-text and bold-text titles, Alt-text, anchor-text and hyperlink title-tags.
A good idea is to see what others are doing. Look at the first 5 results in the SERPs for a particular keyword and click to see the cache of each page; all of the keywords will be highlighted. Then use View > Source to see how many times, and where, the competition uses the keywords.
Optimising Photos and Graphics -
Every graphic or photo has its ALT text and every hyperlink has its title (the Windows screen tip); these should always be considered an opportunity to introduce keywords. Anchor-text keywords are also important: why use "Click Here" when "The Selection of Blue Widgets" tells the robot so much more? Visitors don't need to be told where to click any more. If your hyperlink colour and style are distinctive and consistent throughout your site, they will easily recognise the "hot text".
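Putting all of that together, a keyword-carrying image and link might look something like this (the file names and wording are illustrative only):

```html
<!-- Illustrative only: keyword-rich ALT text, link title and anchor text
     instead of a bare "Click Here" link. -->
<img src="blue-widget.gif" alt="Reconditioned blue widget from our Calpe shop">
<a href="blue-widget-search.php"
   title="Search our reconditioned blue widgets">The Selection of Blue Widgets</a>
```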
5. Linking -
A very important aspect of SEO - but to get it right takes time. Google have a lot to say about this. The worst thing you can do is join a link farm. My opinion is that swapping links is not much better.
Google have a lot to say about "natural" links and they even mention a tactic called "link-baiting". This is to present such useful content and good SEO that other sites find you and link of their own accord, because you have something that their visitors might find interesting.
Obviously this is a long haul and can take years, but a shortcut is to set up a linking club and link in such a way that none of the links is reciprocal, i.e. site A links to B, B links to C and C links to A.
An alternative is pop-unders organised the same way, which provide links and could also produce traffic.
6. Title and Description Tags -
These should really be different for every page. Nobody knows exactly how far Google takes the duplication rule but it is best to be on the safe side.
The Title should consist of the best arrangement of the page keywords, including plurals, without stuffing e.g. "Calpe Reconditioned Widgets | Calpe Widget Shop" or "The Calpe Widget for Reconditioned Widgets"
The Description is not always used by Google in the SERPs; sometimes a selection of words from the page content is shown instead. However, it should be robot- and human-friendly, because you want browsers to select your site over the others. Repeat the keywords without creating an ugly description: "Reconditioned Widgets in Calpe, Costa Blanca: Visit our Widget Shop in Calle Mayor (Calpe) or see our selection of widgets here on-line. Choose from new widgets or our range of guaranteed reconditioned widgets. Our decorative miniature key ring widgets are an ideal gift."
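In the page head, the two tags for the widget-shop example would look something like this (the wording is just the example above, not a template to copy):

```html
<!-- Sketch of the <head> tags for the example page; wording is illustrative. -->
<title>Calpe Reconditioned Widgets | Calpe Widget Shop</title>
<meta name="description"
      content="Reconditioned Widgets in Calpe, Costa Blanca: Visit our Widget
      Shop in Calle Mayor (Calpe) or see our selection of widgets here on-line.
      Choose from new widgets or our range of guaranteed reconditioned widgets.">
```

Remember: a different pair on every page.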
7. Site Maps -
There is no point in even thinking about SEO unless you have a sitemap. You will find all the info you need at Google Webmaster Tools - just follow your nose!
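For reference, a minimal XML sitemap is nothing more than a list of URLs; the domain and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; the domain and date are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/calpe-reconditioned-widgets.html</loc>
  </url>
</urlset>
```

Save it as sitemap.xml in the site root and submit it through Google Webmaster Tools.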