Search Engine Rules: A Must for SEO | Contentwriting.Blog
SEO must follow search engine rules for online visibility
Knowledge of the search engine rules is your edge over your competitors. In any game, knowing the rules is important, and following them is even more so. Similarly, knowing the search engine rules will save you from crawlability problems, and implementing them will improve your website's SEO.
Normally, search engines send a robot (also called a bot, spider, or crawler) to look around the web and see:
- What's new on the web?
- How the web is performing in general and what's going on.
- Which web pages exist, so it can index and evaluate them and find out how useful they are for visitors.
Whenever you launch a website, invite the search engines by following the set protocols, as search engines will only come to your website if they know about it. To check whether a search engine is visiting your website or not, just enter your domain name with Google's “site:” search operator in the browser's search bar.
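A minimal sketch of the query (replace the domain with your own):

```
site:yourdomain.com
```

Google will then list every page of that domain it currently has in its index.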
One of the following four situations will come up:
1. No results are found.
2. Only some of your pages are listed.
3. The number of pages shown as indexed is greater than the number of pages your website actually has.
4. You find that your website is listed complete in all respects.
Only situation 4 is satisfactory; situations 1, 2, and 3 are worrisome, and you will have to do something to solve the issue.
The possible problems could be:
Your site may not have been submitted to the search engines, which is why you are getting no results.
The solution is to submit your site to the search engines: just search for “Add URL to Google” and follow the link that appears.
If you have access to your site's webmaster tools, add the page there; you can see detailed instructions inside the tools themselves.
For Bing, follow its submission link and the instructions there. It's easy and simple.
Similarly, submit your website to all the search engines that matter to you.
Another way is to get a link to your website from a respected website with good page ranking. Search engines will gladly reach your site through that referring link.
The search engine may be having difficulty crawling your website. This problem occurs when your website is not search engine friendly: the site is simply inconvenient for the spider, or the spider cannot crawl through it at all.
Websites and search engines work on a certain logic. The search engines' robots are always on the web, visiting websites, web pages, and the links between them.
As a robot visits a website, it evaluates each web page against the search engine's criteria and reports back; the search engine then updates its index and the data for that page. This process continues: robots recheck old spots and explore new ones to keep the search engine's index and data up to date.
The following points are important for the robots to consider when crawling, and at the same time they play an important role for SEO purposes as well.
1. Robots.txt File:
This file is the guide for robots visiting the website. In the specified format, it tells them what to visit and what not to visit. If you think a page is not good for your search engine ranking, simply disallow it in your robots.txt file and the robot will not visit it.
Although making the robots.txt file is usually your webmaster's job, you can also create one yourself with an online generator.
Important: Be careful when making a robots.txt file on your own; a small mistake may make your website disappear from the search engine's index or create other complications. Once you are done, always run it through a validator to see that it is written correctly. Google's tool works well, and other free validators exist as well.
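As a minimal sketch, a robots.txt file might look like this (the paths and domain are hypothetical; adjust them to your own site):

```
# Applies to all crawlers
User-agent: *
# Keep admin and checkout pages out of the index
Disallow: /admin/
Disallow: /checkout/

# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of the domain; anything not disallowed here remains open to crawlers.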
2. Website Hosting:
Always get a fast and reliable hosting service so that the server is not down when the search engine spider visits your website. Websites that are frequently down are not visible to users, which makes for a poor user experience, and robots treat such sites as unfriendly.
3. Website Performance:
Always measure the performance of your website. Google's performance tool is free and effective; use it to optimize the website's performance.
4. Use Static URLs:
Pages on many websites have unclear or dynamic URLs, meaning you cannot determine which goods or services the URL leads to. For example:
A. www.seo.com/?item=32554 (Unclear)
B. www.seo.com/keyword-research.html (Clear)
C. www.seo.com/links.html (Clear)
You can see from the illustration above that the dynamic URL at “A” is hard to understand, while “B” and “C” are static URLs whose content can easily be determined from the descriptive words they contain. The advantages of static URLs are:
- A static URL contains a keyword, which is a positive signal for the search engine and good for improving your ranking.
- Search engines want to list pages in their index that are unique.
- Descriptive, readable URLs are understood by users and robots alike, which makes for a good user experience.
- Special characters and parameters are hard for search engines to understand, so static URLs rank better.
Important: Old URLs with parameters should be hidden from search engines by using robots.txt, or use a tool to rewrite the URLs, or ask your webmaster.
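One common way to rewrite parameter URLs into static ones is Apache's mod_rewrite. The sketch below is only an illustration, assuming a hypothetical “?item=” parameter scheme like the one shown above:

```
# .htaccess — assumes mod_rewrite is enabled on the server
RewriteEngine On

# Permanently redirect the old dynamic URL to its static equivalent,
# e.g. /?item=32554 -> /keyword-research.html
RewriteCond %{QUERY_STRING} ^item=32554$
RewriteRule ^$ /keyword-research.html? [R=301,L]
```

The trailing “?” in the rule strips the old query string, and “R=301” tells search engines the move is permanent.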
5. Website Navigation:
Website navigation matters, and it matters a lot. Whenever you place content, place it in groups and categories. You can go to subcategories, but not to sub-subcategories; make it easy for visitors to access the desired information.
Do not make the navigation complex. Provide several paths to reach a page, i.e., from many pages within the domain and from other relevant outside domains.
6. Links in Text Are Better than Links in Images:
Search engines do not recognize images, so wherever possible use text links. Use alt text on images to make them understandable to the search engines, and try to use keywords in it; this will help your ranking.
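As a small sketch, here is an image with descriptive, keyword-bearing alt text next to a plain text link (the file names and wording are just examples):

```html
<!-- The alt attribute describes the image to search engines and screen readers -->
<img src="keyword-research-chart.png"
     alt="Keyword research chart comparing search volume and competition">

<!-- A plain text link, which crawlers can follow and interpret -->
<a href="/keyword-research.html">Keyword research guide</a>
```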
7. Avoid Flash and Frames:
Flash animations and frames can make the design of your website very nice and impressive, but for search engines they are like broken links: hard to read. As a result, your web pages are not indexed by the search engines. A slightly deeper understanding of both is important:
Flash: It is graphic animation, often used by people without any real purpose. It is bad for the website's health: if there is text in it, the search engine will not recognize it, and any keyword or anchor text inside it will not be read. As a result, the website is deprived of the ranking benefit.
Frames: Frames allow more than one HTML document in the same browser window. If your browser window is broken into more than one part, each of which looks like a separate web page and can be scrolled independently, the site is using frames.
Another check: if the URL does not change as you click on different menu items and tabs, the website is built with frames. Frames make it inconvenient for the robot to reach all the pages of the website.
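A minimal sketch of a framed page, just to illustrate why the address bar never changes (the file names are hypothetical):

```html
<!-- The browser URL stays on this page while menu.html and content.html
     change inside their frames, so crawlers see only one address -->
<frameset cols="25%,75%">
  <frame src="menu.html">
  <frame src="content.html">
</frameset>
```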
8. Never Use a Welcome Page:
Never, ever use a welcome page. Some people make the home page a welcome page and then ask for another click to direct the visitor to their main website. Users hate it, and so do the robots.
9. Broken Links:
A broken link is a link whose HTML code is missing or contains incorrect elements, or a link that leads to a non-existent page.
Fix broken links by all means!
A free online tool can help you find and fix these.
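If you prefer to check for yourself, the first step is extracting every link from a page so each URL can then be requested and its status verified. A minimal sketch using only Python's standard library (the HTML string below is a stand-in for your own pages):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag so each URL can be checked later."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record only anchor tags that actually carry an href value
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Stand-in page; in practice, feed in the HTML fetched from your site.
page = '<p><a href="/keyword-research.html">Guide</a> <a href="/links.html">Links</a></p>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # every candidate URL to verify
```

Requesting each collected URL and flagging non-200 responses completes the check.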
10. No Duplicate Content:
Duplicate content on your website lowers its ranking: the pages compete with each other, and the search engine finds it difficult to decide which one is the original and more important. Get rid of duplicates immediately.
Make use of rel="canonical".
Let's dive deeper into duplicate content to make things easy to comprehend. We will frame a few questions and answer them:
- What is duplicate content?
- How do you detect it?
- Once diagnosed, how do you handle it?
- What are internal and external duplicate content, and how do you handle each?
To start with the definition of duplicate content:
It is a block of content on your website which completely or partially matches other content, either within your domain or on some outside domain. The bad part of duplicate content is that Google treats it as second-class content from an SEO perspective and gives it no ranking benefit, so ensure that only the right pages appear in the search engine index.
Besides this, Google may rank the wrong page on your website: its algorithm may consider another page more authoritative while you want a different page to rank. Because both pages have similar content, the decision is entirely up to the algorithm.
How do you detect or identify duplicate content?
The first thing is that your site structure should not allow duplicate content to appear. If it happens accidentally and some duplicate pages are generated, the first step is to systematically navigate your site to find the duplicates. You can also use various free tools, such as:
- Google Webmaster Tools: the best, and free.
- Google Search Operator: in Google Search, put the content or phrase you want to check inside quotation marks.
- Copyscape: another useful tool for finding duplicate content.
How do you handle duplicate content?
For the sake of clarity, we will take internal and then external content separately:
Internal duplicate content, reasons and remedies:
Reasons for internal content duplication:
- Different URLs leading to the same content (i.e., duplicate paths in the URL).
The solution is a canonical link element. Add this tag in the head element; it sets that URL as the canonical one, and the other versions will not be counted.
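A minimal sketch of a canonical tag, assuming a hypothetical example.com site with a duplicate print version of a page:

```html
<!-- Placed in the <head> of the duplicate page, e.g. /keyword-research-print.html -->
<!-- It tells search engines that the URL below is the version to index -->
<link rel="canonical" href="https://www.example.com/keyword-research.html">
```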
- The same page with the same content available at several URLs.
This can be handled with a 301 redirect: you redirect the second version to the main page, and the link juice is passed from the redirected page to the chosen one. It tells the search engine that the other pages carry identical content while the main page is the most useful one.
You can find further detail about this on this link:
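On Apache, such a redirect is often a one-liner in .htaccess (the paths here are hypothetical):

```
# Permanently redirect the duplicate URL to the main page
Redirect 301 /keyword-research-old.html /keyword-research.html
```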
- Some online businesses tag visitors: tagging each new visitor with a tracking parameter in the URL results in those URLs being indexed, which can consequently add thousands of duplicate pages to the index.
The solution is to remove the session ID from the URL and store it in a cookie. Whenever the user browses your site in the future, the cookie will notify you about the user's activity.
Use canonical tags.
Use Google Webmaster Tools to specify URL parameters you want Google to ignore.
External Duplicate Content:
Some of the reasons may be:
- The webmaster sets up a new website and forgets to delete the old one.
- Affiliates or resellers copy your content with permission.
- You syndicate large amounts of content from other sites.
The remedies:
- Use a cross-domain canonical tag; don't hand over your descriptions, rather write a unique one to give to the affiliate.
- Educate affiliates and resellers.
- In any case, Google's algorithm gives credit to the original pages.
Fix HTML and CSS Mistakes:
It's bad to have mistakes in your HTML. Any mistake in an important tag will prevent the search engine from identifying that tag and the content inside it.
In any case, check it for free with a validator; it will show you the flaws in the coding and explain how to fix them.
Similarly, check the CSS code with an online validator.
Create a Sitemap:
Sitemaps are of two types:
- HTML sitemap: made for human users.
- XML sitemap: made for search engines (written as “Sitemap” with a capital “S”).
You must create sitemaps and submit them. You can create one with the help of an online generator, and once you have made it, you have to submit it to the search engine. For Google, this is done through Google Webmaster Tools.
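A minimal sketch of an XML sitemap with a single entry (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One url block per page you want indexed -->
    <loc>https://www.example.com/keyword-research.html</loc>
    <lastmod>2016-01-01</lastmod>
  </url>
</urlset>
```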
After fixing these aspects of your site, you will see a marked improvement in its performance. We have discussed mostly manual and free resources, although there are many paid tools on the market to help you and make things faster.
- We have learned how the search engines get to know about your site.
- What issues should be fixed to make it convenient for the search engines to crawl your website fast and without any problem.
- How your visibility can be enhanced just by fixing these issues.