How to Use and Create a Robots.txt File for Human SEO

Creating and installing a robots.txt file for your website is one of the simplest things you can do.


This file is one of the pivotal elements of white hat search engine optimization (SEO): it tells the search engines what they may crawl on your website. Crawlers automatically request the file when they reach your site, so having one in place also prevents a 404 error from appearing in your server logs every time a spider looks for it. Follow these simple steps to help those search engine spiders know what content you want them to access and index.
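For instance, when a crawler arrives at your site, it issues a request like the one below before fetching anything else (example.com stands in for your own domain); if no robots.txt file exists, this is the request that produces the 404 entry in your logs:

    GET /robots.txt HTTP/1.1
    Host: example.com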

Steps to Creating and Installing a Robots.txt File

  1. Open up a text editor such as Notepad
  2. Copy and paste the following code to allow all spiders to visit all files (a cautionary variant is shown just after these steps):
    User-agent: *
    Disallow:
  3. Save the file as robots.txt
  4. Upload the file to your website’s root directory
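
A note on those two lines: the empty Disallow value in step 2 means "block nothing." The easily confused variant below, with a trailing slash, does the opposite and blocks well-behaved spiders from your entire site, so double-check which version you upload:

    User-agent: *
    Disallow: /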

With the robots.txt file in place, search engine spiders are more likely to index your website the way you intend. Note, however, that there is no guarantee: some search engine spiders adhere to robots.txt directives better than others. Remember, too, that this file is just one piece of white hat SEO and should not be relied on by itself to improve your search engine rankings.

Lastly, be careful not to block pages or directories that have numerous inbound links pointing to them, as doing so can hurt your rankings. Of course, there may be pages you do want to exclude from indexing; this article on the Robots Exclusion Standard provides further examples of correctly writing exclusion statements in your robots.txt file.
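As a quick illustration, the directives below block all spiders from one directory and block a single named crawler from another (/private/ and /internal-reports/ are placeholder paths, and Googlebot is simply a well-known user-agent token used for the example):

    # Block all spiders from a hypothetical private directory
    User-agent: *
    Disallow: /private/

    # Block one named crawler from another hypothetical area
    User-agent: Googlebot
    Disallow: /internal-reports/

One thing to keep in mind: a crawler obeys only the most specific User-agent group that matches it, so in the example above Googlebot would follow its own group and ignore the wildcard one. Any rule you want a named crawler to honor must be repeated inside that crawler's own group.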
