Adding a robots.txt file to a Django Application


robots.txt is a standard used by websites to communicate with web crawlers and other web robots. It specifies how to tell a robot which areas of the website should not be processed or scanned.

Why robots.txt is important:

Before a search engine crawls your site, it looks at your robots.txt file for instructions on which pages it is allowed to crawl and index in its search results. If you want search engines to ignore any pages on your website, you list them in your robots.txt file.

Basic Format:
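Each rule consists of a User-agent line naming the crawler it applies to, followed by one or more Disallow lines listing the paths that crawler should skip:

    User-agent: [name of the crawler the rule applies to, or * for all]
    Disallow: [URL path that should not be crawled]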
Example:
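For instance, the rules below tell Googlebot not to crawl anything under /private/ (the path is only an illustration):

    User-agent: Googlebot
    Disallow: /private/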
Steps to add robots.txt to Your Django Project:

Let's say your project's name is myproject.

Create a templates directory in the root location of your project. Inside the templates directory, create another directory with the same name as your project.

Place a text file named robots.txt in it.

Your project structure should look something like this.
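A rough sketch of the layout, assuming the default project created by django-admin startproject myproject:

    myproject/
    ├── manage.py
    ├── myproject/
    │   ├── __init__.py
    │   ├── settings.py
    │   ├── urls.py
    │   └── wsgi.py
    └── templates/
        └── myproject/
            └── robots.txt

Note that a project-level templates directory is only searched if it is listed in the DIRS option of the TEMPLATES setting in settings.py, for example 'DIRS': [BASE_DIR / 'templates'].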

Add the user-agent and the URLs you want to disallow to it.
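For instance, the rules below ask all crawlers to skip the Django admin; the /admin/ path is only an illustration, so list whichever sections of your own site you want hidden:

    User-agent: *
    Disallow: /admin/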


Now go to your project's urls.py file and add the import statement below.
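A common approach is to render the file with Django's generic TemplateView (an assumption here; any view that returns the file's contents works just as well):

    from django.views.generic.base import TemplateView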

Add the URL pattern below.
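A sketch using path() from django.urls, assuming the template lives at templates/myproject/robots.txt as in the structure above:

    from django.urls import path
    from django.views.generic.base import TemplateView

    urlpatterns = [
        # Render templates/myproject/robots.txt as plain text at /robots.txt
        path(
            "robots.txt",
            TemplateView.as_view(template_name="myproject/robots.txt", content_type="text/plain"),
        ),
        # ... your existing URL patterns go here
    ]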

Now restart the server and go to localhost:8000/robots.txt in your browser; you should see the contents of your robots.txt file.

Serving robots.txt from the web server:

You can also serve robots.txt directly from your web server. Below is a sample configuration for Apache.
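A minimal sketch using Apache's Alias directive; the /var/www/myproject/static/ path is only an assumption, so point it at wherever your robots.txt actually lives:

    # Serve /robots.txt directly from disk, bypassing Django
    Alias /robots.txt /var/www/myproject/static/robots.txt

    <Directory "/var/www/myproject/static">
        Require all granted
    </Directory>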

Quick Tips:
  1. robots.txt is case sensitive. The file must be named robots.txt, not Robots.txt or robots.TXT.
  2. robots.txt file must be placed in a website’s top-level directory.
  3. Make sure you're not blocking any content or sections of your website that you want crawled, as this will hurt your SEO.


Host your Django App for Free.

