Seattle Web Design

SEO Tips and Tricks #2

 

Improve your website's rankings on search engines

In part 1, I talked about the meaning of SEO and seven useful SEO tricks. You learned how to make some easy changes to your website to improve its search engine rankings. Now it is time to learn how to introduce your website to Google and Bing. I am also going to talk about “sitemap.xml” and “robots.txt”.


Google Webmaster Tools

Google Webmaster Tools is a free Google service for all website owners that lets them check their website's indexing status and optimize its visibility. You can submit and check your website's sitemap, monitor the crawl rate, test your “robots.txt” file, and find broken links on your website. If you are optimizing your website, you definitely need to create an account and submit your website to Google Webmaster Tools. To access Google Webmaster Tools, please visit https://www.google.com/webmasters/tools .
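
When you add your website, Google first asks you to verify that you own it. One of the verification options is an HTML meta tag placed in the <head> section of your home page. The snippet below is only an illustration; the content value is a placeholder for the code Google generates for your account.

<meta name="google-site-verification" content="YOUR-VERIFICATION-CODE" />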

 

Bing Webmaster Tools

As you know, Bing is the second most popular search engine, and many people use it to search. Just like Google Webmaster Tools, Bing Webmaster Tools helps you monitor your website and optimize it for Microsoft's search engine. You can check the traffic on your website, submit a sitemap, submit URLs or ask Bing to ignore them, and manage crawl control. I also have good news for you: Bing Webmaster Tools includes an “SEO Analyzer”. Just enter your website's URL and see what errors it finds for you to fix.

To access Bing Webmaster Tools, please visit http://www.bing.com/webmaster .
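
Bing Webmaster Tools also asks you to verify ownership, and one of its options is a similar meta tag in your home page's <head>. Again, the content value below is just a placeholder for the code Bing generates for you.

<meta name="msvalidate.01" content="YOUR-BING-VERIFICATION-CODE" />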

 

Sitemap.xml

This is an important and useful file for your website's SEO. In fact, it is a map of your website that lists all your web pages in XML format. “sitemap.xml” should be located in the root of your website.

Connect to your website over FTP and check whether this file already exists. If you find it, download it to your desktop, open it with a text editor, and check the pages' paths. If they are not correct, you should create an updated one. If there is no such file, generate one and upload it to the root of your website. A minimal example is shown below.
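
To give you an idea of what to look for, here is a minimal sketch of a “sitemap.xml” with a single page. The URL and date are placeholders; your real file should list every page you want indexed.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>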

 

Here is a free tool to generate “sitemap.xml” for your website:

http://www.xml-sitemaps.com

 

Robots.txt

This file is the part of your website that talks to search engines. As you know, search engines crawl (visit) your website to index your content. With “robots.txt” you can control which parts of your website should be visited and which parts should be excluded. In other words, with “robots.txt” you tell search engines which pages you do not want them to visit and index.

 

How to create robots.txt

Creating a “robots.txt” file is pretty simple. It is a plain-text list of user agents and disallowed files and directories. The syntax uses two main directives, “User-agent” and “Disallow”. “User-agent” names a search engine's crawler, and “Disallow” lists the files and directories it should not visit.

 

Example:

 

# All user agents are disallowed from the /personal-files/ directory.

User-agent: *

Disallow: /personal-files/
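
You can also write rules for a single crawler by naming it instead of using the “*” wildcard, and you can list more than one excluded directory. In the sketch below the directory names are only placeholders; “Googlebot” is the name of Google's crawler.

# Only Googlebot is kept out of these two directories.
User-agent: Googlebot
Disallow: /temp/
Disallow: /drafts/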

 

Note: If you want to create this file manually, pay special attention to punctuation and typing. A typo or a missing symbol, for example a colon, can cause serious problems for your website's indexing. If you do not know how to create it, just use an online tool, then upload the resulting “robots.txt” to your web root.
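
To see how small a dangerous mistake can be, compare the two rules below. The only difference is a single “/”, but the first one blocks nothing while the second one blocks your entire site.

# Nothing is disallowed; crawlers may visit every page.
User-agent: *
Disallow:

# The extra “/” disallows the whole site.
User-agent: *
Disallow: /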

 

Check or Generate

If your website already has a “robots.txt” file, you can check it online to see whether it is what you expect to have on your website:

 

http://tool.motoricerca.info/robots-checker.phtml

 

To generate a “robots.txt” file, you can use this online tool:

 

http://www.mcanerin.com/en/search-engine/robots-txt.asp

 

 

Read more SEO tips and tricks in part 3 of my SEO tutorial. Please email me and let me know if you have any questions.

 

 

 

 

 

 




