Search Engine Optimization Stuff #2
Just when you thought that, having put up meta tags, carefully chosen a page description, added alt tags, and linked properly, your site would sail smoothly to the top of the search engine results, you could be wrong. Even with a seemingly search-engine-friendly setup, you may still fail to gain ground and find yourself stuck in the middle of the results, or nowhere at all.
Here are some possible reasons that keep you away from the coveted first-page placements:
1) Changing the site's IP address - This happens when you switch web hosts, reorganize your network infrastructure, or for any other reason move your site to a different IP address. The change partially disrupts the search engines' indexing, so your content may not be spidered for a while. The problem hits hardest for sites that were performing well before the switch, but it is usually temporary.
2) IPs banned by search engines - IP addresses that originate spam e-mail often end up on blacklists and get blocked by mail servers. You are in deep trouble if you performed 1) and landed on a blocked IP address. You can check the blacklists from the websites listed at the following site: http://www.dnsstuff.com/
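As an aside, most of these blacklists work over DNS: the IP's octets are reversed and appended to the blacklist's zone name, and if that name resolves, the IP is listed. A minimal sketch in Python (the zone and the IP address below are just examples, and the actual lookup needs network access):

```python
import socket

def dnsbl_query_name(ip, zone="zen.spamhaus.org"):
    """Build the DNS name used to query a blacklist: reverse the
    IPv4 octets and append the blacklist zone, e.g.
    203.0.113.7 -> 7.113.0.203.zen.spamhaus.org."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_blacklisted(ip, zone="zen.spamhaus.org"):
    """The IP is listed if the query name resolves to an address;
    an NXDOMAIN error means it is clean. Needs network access."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True
    except socket.gaierror:
        return False

print(dnsbl_query_name("203.0.113.7"))  # 7.113.0.203.zen.spamhaus.org
```

Websites like the one linked above simply run this kind of query against many blacklist zones at once.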
3) robots.txt - What is robots.txt? It is the instruction file that tells search engine robots what to do while they index the contents of your website. Typically you want prominent pages, such as sales pages or promotional articles, to be indexed, but you do not want sensitive files (such as members-only folders or privacy-related pages) in the index. That is what the instruction file is for. To create robots.txt, save the following in Notepad or any text editor under the filename robots.txt (yes, this is not just a term for the robots but the actual filename of the instruction file):
User-agent: *
Disallow:
Explanation: The first line specifies the user-agent, that is, the name of the robot the rules apply to. The asterisk "*" works like a wildcard and means "all robots". The second line then disallows nothing (represented by the blank value after the colon), so every robot may index everything.
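If you want to verify what such a file actually permits, Python's standard library ships a robots.txt parser. A small sketch (the example URLs and the /members/ folder are made up):

```python
from urllib.robotparser import RobotFileParser

# The two-line file above: allow every robot to fetch everything.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])
print(rp.can_fetch("*", "http://example.com/anything.html"))  # True

# A variant that keeps a members-only folder out of the index
# (a fresh parser, since parse() appends rules rather than replacing them).
rp2 = RobotFileParser()
rp2.parse(["User-agent: *", "Disallow: /members/"])
print(rp2.can_fetch("*", "http://example.com/members/page.html"))  # False
print(rp2.can_fetch("*", "http://example.com/index.html"))         # True
```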
This lets search engine robots scour every page on the site (including documents in MS Word or Adobe Acrobat format). Bots typically visit a site about once every four to six weeks, and more often when the site is updated frequently. That is why I advocate keeping content as fresh as possible, to stay attractive to search engines.
When you're done typing that 22-character file and saving it, place it in the root folder of your website (where your index.html or default.asp is found). It must live there because robots always request the file from a fixed location, the site root (/robots.txt), the same entry point from which links to the various sections of the website are assumed to be reachable.
A slight mistake in this simple instruction file can be disastrous for your SEO efforts.
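For instance, a single stray "/" after Disallow turns the permissive two-line file above into one that bans robots from the entire site, which Python's standard-library parser confirms:

```python
from urllib.robotparser import RobotFileParser

# "Disallow: /" (note the slash) blocks every path on the site.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])
print(rp.can_fetch("*", "http://example.com/index.html"))  # False
```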
Reasons 4) and 5) will follow in the next edition.