Shopify - Using The ‘Robots.txt’ File
The purpose of a ‘Robots.txt’ file is to tell search engines which parts of your website they may crawl. In Shopify, the ‘Robots.txt’ file previously could not be controlled, but in 2021 Shopify announced that store owners would be able to customise the ‘Robots.txt’ file for their websites.
Businesses use the ‘Robots.txt’ file to prevent search engines from crawling some parts of their website. Crawlers will generally not index pages that the file tells them not to fetch, so it is commonly used to keep low-quality pages that need not be crawled out of the search results.
Shopify uses the robots.txt.liquid template to generate the ‘Robots.txt’ file. Through this file you can control which of your web pages crawlers and search engines may access. A ‘Robots.txt’ file should be simple, readable and quick to load.
Reasons to use a ‘Robots.txt’ file
The main purpose of a robots.txt file is to stop robots from crawling certain web pages. This reduces the website's bandwidth load and server strain, which helps whether your website is high-volume or low-traffic.
Another purpose of this file is to prevent web spiders from downloading content from the website unnecessarily. If many web crawlers hit your server at the same time, they can put significant stress on it.
For example, if your website has a password-protected page containing sensitive information, you can use the ‘Robots.txt’ file to ask crawlers to skip that page while still crawling the rest of your site. Keep in mind that robots.txt is only a request, not a security measure, so sensitive pages should always be protected by other means as well.
Creating a ‘Robots.txt’ file
A ‘Robots.txt’ file can be easily created using a text editor such as Notepad or WordPad. The steps are as follows:
Use a text editor to create a file and save it with the name ‘robots’, all in lowercase letters, with the ‘.txt’ file extension.
Example: In your file, you can add code like
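A simple ‘allow everything’ configuration looks like this (the empty Disallow value is deliberate):

```txt
User-agent: *
Disallow:
```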
The robots or spiders are addressed with the ‘User-agent’ line, and the ‘*’ indicates that the rules apply to all of them. When the ‘Disallow’ command has no files or folders listed, crawlers can access any directory on your website.
This command is used to block the web crawlers from crawling your website completely.
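To block everything, the ‘Disallow’ line is given the root path:

```txt
User-agent: *
Disallow: /
```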
This is used to tell the bots that they cannot access the ‘About’ and ‘Database’ subdirectories (folders). Each ‘Disallow’ command can list only one folder or file, so add as many ‘Disallow’ lines as your requirements call for.
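Assuming those folders live at /about/ and /database/ (the exact paths depend on your site's structure), the rules would look like:

```txt
User-agent: *
Disallow: /about/
Disallow: /database/
```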
Once you create the file, you have to upload it to the root directory if you are using a custom-coded website or a WordPress website.
Creating a ‘Robots.txt’ file using Shopify
Shopify announced a new option that allows you to update the ‘Robots.txt’ file. Users are allowed to do the following:
- Block certain crawlers.
- Prevent or allow the crawling of certain URLs.
- Introduce ‘Crawl-delay’ rules for certain crawlers.
Most Shopify website owners need not adjust the ‘Robots.txt’ file; the default configuration covers most cases.
Example: In your file, you can add
Disallow: /search - To block internal site search
Disallow: /cart - To block the shopping cart page
Disallow: /checkout - To block the checkout page
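Put together, these rules form a block like the following (Shopify's default robots.txt already includes rules along these lines):

```txt
User-agent: *
Disallow: /search
Disallow: /cart
Disallow: /checkout
```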
Editing the ‘Robots.txt’ file using Shopify
If you want to add a rule to the ‘Robots.txt’ file, you can add the required code to the robots.txt.liquid template. For example, you can add code to block the internal site search function, multiple directories and more.
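As a sketch, the robots.txt.liquid template loops over Shopify's default rule groups and renders them; a custom rule can be appended inside the loop. The extra ‘Disallow: /search’ rule below is just an illustration and should be replaced with whatever rule you need:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Example custom rule, applied to all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /search' }}
  {%- endif -%}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```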
If you need help with your Shopify website you can approach a Shopify SEO company.