A robots.txt file tells search engine robots which pages or sections of your website or blog they should read and index, and which they should skip. Most sites and blogs contain folders that are irrelevant to search engines, such as image directories and admin files. Creating a robots.txt file can therefore improve how your website or blog is indexed.
A robots.txt file is a simple text file that you can create with Notepad. If you use WordPress, your robots.txt can follow this pattern:
User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
"User-agent: *": this line means that all search bots (from Google, Yahoo, MSN, and other search engines) should follow these rules when reading your website or blog. Unless your site is unusually complex, you will not need separate rules for different spiders; a single block like the one above is enough.
"Disallow: /wp-": this line tells search engine bots to skip all WordPress system files (those in folders such as wp-admin, wp-content, and wp-includes) when indexing your blog. It helps you avoid duplicate content and keeps bots from reading sensitive admin files.
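If you want to confirm that rules like these behave as intended before uploading anything, you can test them with Python's standard-library robots.txt parser. This is just a minimal sketch; the example.com URLs are placeholders, not your real site.

```python
# Minimal sketch: verify the WordPress-style rules with urllib.robotparser.
# The rule text mirrors the pattern above; example.com is a placeholder domain.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ordinary blog posts remain crawlable...
print(parser.can_fetch("*", "https://example.com/my-post/"))    # True
# ...while WordPress system paths and feeds are blocked.
print(parser.can_fetch("*", "https://example.com/wp-admin/"))   # False
print(parser.can_fetch("*", "https://example.com/feed/"))       # False
```

Because `Disallow` matches by path prefix, the single `/wp-` rule covers wp-admin, wp-content, and wp-includes at once.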
If you do not use WordPress, replace the Disallow lines with the files or folders on your website that you do not want bots to crawl. For example:
Disallow: /images/
Disallow: /cgi-bin/
Disallow: /any file or folder that you do not want crawlers to read/
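One detail worth checking with such custom rules is that matching is by path prefix, so `Disallow: /images/` blocks everything under that folder but nothing else. A quick sketch, again with placeholder paths and domain:

```python
# Minimal sketch: check generic Disallow rules; paths and domain are placeholders.
from urllib.robotparser import RobotFileParser

generic_rules = """\
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/
"""

rp = RobotFileParser()
rp.parse(generic_rules.splitlines())

print(rp.can_fetch("*", "https://example.com/images/logo.png"))   # False: inside /images/
print(rp.can_fetch("*", "https://example.com/about.html"))        # True: not listed
print(rp.can_fetch("*", "https://example.com/images-archive/"))   # True: /images-archive/
                                                                  # does not start with /images/
```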
Once you have created the robots.txt file, upload it to the root directory of your web host. That is all there is to it. You can now be confident that search engine robots will read only your blog's content, not the WordPress system files. This matters because it keeps your blog content from being indexed as duplicates and keeps sensitive files on your web host away from prying bots.
Poster : beStyle