
How to protect UAT or Dev environment from indexing by search engines in Drupal

This how-to applies to a very common scenario where you have one or more dev/test/UAT environments. Without these steps, your Drupal website will eventually be discovered and indexed by search engines, sending real user traffic to those environments.

There are two ways to protect your test environments from search engines:

Shield module to keep search engines away

Using the Shield module is the recommended way to put a username and password in front of your website. Protecting your test website with a username and password prevents search bots from indexing it, and it also protects it from being accessed accidentally.

Shield module in production website

Once you install the Shield module, it becomes available on your production website too. The method described below transparently overcomes this: it lets you synchronise your production database to any of your test environments without having to worry about updating the Shield module configuration.

To protect your Drupal website from search engines, follow these steps:

  1. Download and install the Shield Drupal module into your production website first. If you have Drush installed, execute `drush en shield -y` in the terminal from the Drupal docroot. Depending on your release process, you may need to install the Shield module in your test/dev/UAT environment first, in which case ensure the Shield module is enabled after the production release.
  2. Update the settings.php in your test/dev/UAT websites, adding this code to the bottom of the file:
    1. Drupal 7:
      $conf['shield_enabled'] = 1;
      $conf['shield_user'] = 'testuser';
      $conf['shield_pass'] = 'somepassword';

      It is not necessary to provide a secure or complex password here, since this protection is not security related.

    2. Drupal 8:
      Drupal 8 has a different way to specify the configuration variables.

      $config['shield.settings']['user'] = 'testuser';
      $config['shield.settings']['pass'] = 'somepassword';
  3. Update your settings.php file for your production site, following this example:
    1. Drupal 7, provide empty values for the Shield user and password:
      $conf['shield_enabled'] = 0;
      $conf['shield_user'] = '';
      $conf['shield_pass'] = '';
    2. Drupal 8, provide empty values here as well:
      $config['shield.settings']['user'] = '';
      $config['shield.settings']['pass'] = '';
  4. Test your configuration. Once you have configured the module on the production website, copy your production database over to your test site and verify that a username/password prompt appears (see the example after this list).

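If you prefer to check from the command line, a quick way is to request the test site with and without the Shield credentials, for example with curl. The hostname below is only a placeholder for your own test/dev/UAT URL, and the credentials are the ones you placed in that environment's settings.php:

  # Request without credentials: Shield should reject it with a 401 Unauthorized response
  curl -I https://test.example.com/

  # Request with the Shield credentials from settings.php: the page should be served normally
  curl -I -u testuser:somepassword https://test.example.com/

If the first request does not come back with a 401 status, double-check that the Shield settings were added to the correct settings.php file for that environment.
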
Robots.txt module to prevent search engines from indexing your site

An alternative way is to use the Robots.txt Drupal module. The Robots.txt module replaces the robots.txt file that ships with Drupal, and here we will show you how to alter its content for non-production environments such as dev/test/UAT.

  1. Install the Robots.txt module as you would any other Drupal module. If you have Drush installed, just execute `drush en robotstxt -y`. Keep in mind that the Robots.txt module will only work if you remove the physical robots.txt file from your docroot.
  2. In your non-production website (dev/test/UAT), add the following code to the bottom of your settings.php file:
    1. Drupal 8:
      $config['robotstxt.settings']['content'] = "User-agent: *
      Disallow: /";
      

      It is important to preserve the literal new line here, instead of using the “\n” delimiter. Once this is in place, you can verify the result as shown after this list.

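To confirm the override is being served, fetch robots.txt from the command line once Drupal's cache has been cleared (for example with `drush cr` on Drupal 8). The hostnames below are only placeholders for your own environments, and if Shield is also enabled on the test site you will need to pass its credentials with curl's -u option:

  # The non-production site should serve the blanket disallow rules from settings.php
  curl https://test.example.com/robots.txt
  # Expected output:
  #   User-agent: *
  #   Disallow: /

  # The production site should still serve the standard Drupal robots.txt rules
  curl https://www.example.com/robots.txt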