Top 10 On-Page SEO Tips for Beginners

Recently, Tobi Lutke, the CEO of Shopify, announced a new development: the robots exclusion protocol file (commonly called robots.txt) can now be edited on Shopify. According to his tweet, Shopify store owners now have complete control over how bots crawl their stores for SEO purposes, because they can edit their robots.txt file directly.

 

If you run a Shopify store, this is good news: it lets you target the pages and content that you want search engines to crawl and keep them away from the rest, which can help your pages or listings get crawled and ranked faster.

 

In this guide, we will discuss how you can edit your robots.txt file. But before getting into that, it is worth knowing what exactly a robots.txt file is, so let's start there.

 

What is the Robots Exclusion Protocol (robots.txt)?

 

The Robots Exclusion Protocol (also known as the robots exclusion standard, or simply robots.txt) is a plain-text file that a website uses to communicate with search engine crawlers. In general, it tells crawlers which URLs on your website they may crawl and which areas they should stay out of.

 

Its main purpose is to keep your website from being overloaded with requests from search engine crawlers.

 

Uses

 

Primarily, robots.txt is responsible for managing crawler traffic to your website: it keeps crawlers away from unimportant or duplicate URLs so they can spend their time on the pages that matter. What it can do depends on the type of file involved.

 

Web page: A robots.txt file can be used to manage crawler traffic to web pages (HTML, PDF, and other non-media formats that search engines can read). For example, if search engine crawlers are sending too many requests to your server, you can use the robots.txt file to keep them away from unimportant or similar pages.

 

Media file: You can use a robots.txt file to keep media files (such as images, videos, and audio clips) from appearing in search results. However, this will not prevent other users or web pages from linking to your media files.

 

Resource file: You can also block unimportant resource files, such as scripts and style files, with a robots.txt file. However, if search engine crawlers cannot read and understand your other pages without those scripts and styles, do not block them, as blocking them may stop crawlers from properly processing the pages that depend on them.

 

A robots.txt file sits at the root of your website. For example, if your website is www.exampleone.com, you can find the robots.txt file at www.exampleone.com/robots.txt. By default, crawlers are allowed to crawl every file on your website unless you specify otherwise.
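
 

To make this concrete, here is a minimal, hypothetical robots.txt for the example domain above. The paths and crawler groups are placeholders for illustration, not Shopify's actual defaults:

    # Hypothetical robots.txt for www.exampleone.com (placeholder paths)
    User-agent: *
    Disallow: /checkout
    Disallow: /cart

    # Keep a media folder out of Google Images (placeholder path)
    User-agent: Googlebot-Image
    Disallow: /images/

    Sitemap: https://www.exampleone.com/sitemap.xml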

 

Now that you know the basics of the robots.txt file and its uses, let's come to the point.

 

Editing A Robots.txt File

 

Shopify stores come with a default robots.txt file that is already optimized for crawling, so when you create a Shopify online store, you don't need to create a robots.txt file yourself. However, you can edit it through the robots.txt.liquid theme template if required.

 

Here are some of the edits you can make through the robots.txt.liquid template; a sketch of what such an edit looks like follows the list.

 

  • Give crawlers access to specific URLs
  • Block specific URLs from being crawled
  • Add extra sitemap URLs
  • Add crawl-delay rules for specific crawlers
  • Block specific crawlers from accessing your website or specific web pages
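
 

For instance, here is a rough sketch of what adding a Disallow rule might look like in the robots.txt.liquid template. It follows the structure of Shopify's default template and its Liquid objects (robots.default_groups, group.user_agent, group.rules, group.sitemap), but the rule and path below are placeholders, so check the Shopify Help Center for the current template before copying anything:

    {% for group in robots.default_groups %}
      {{- group.user_agent }}

      {%- for rule in group.rules -%}
        {{ rule }}
      {%- endfor -%}

      {%- comment -%} Placeholder: block an internal search path for all crawlers {%- endcomment -%}
      {%- if group.user_agent.value == '*' -%}
        {{ 'Disallow: /*?q=*' }}
      {%- endif -%}

      {%- if group.sitemap != blank -%}
        {{ group.sitemap }}
      {%- endif -%}
    {% endfor %}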

 

Shopify Support can't help you with robots.txt.liquid edits, so if you aren't comfortable editing code, seek help from a Shopify expert or an SEO agency in Kelowna with expertise in both code edits and search engine optimization. They can help you customize your robots.txt file.

 

Steps to Edit a Robots.txt File on Your Shopify Store:

 

  • Head over to your Shopify admin panel.
  • Go to Online Store > Themes.
  • Select Actions, and then click on the Edit Code option.
  • Next, click the Add a new template option, and then select robots.
  • Click Create template.
  • Make changes to the default template according to your requirements.
  • Then, save your changes to the robots.txt.liquid file in your theme.

 

With these steps, you can readily make changes to your robots.txt file as per your requirements.
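
 

As another sketch, adding an extra sitemap URL (one of the edits listed earlier) can be as simple as appending a line of plain text to the end of the robots.txt.liquid template, since anything outside Liquid tags is output verbatim. The URL below is a placeholder:

    {%- comment -%} Hypothetical extra sitemap appended after the default groups {%- endcomment -%}
    Sitemap: https://www.exampleone.com/extra-sitemap.xml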

 

Once you have finished editing, you should test the updated robots.txt file with Google's robots.txt Tester to make sure it is correct. However, if you find it challenging to customize your robots.txt file, search for an SEO company near you and get in touch. They will surely help you with this.

 

Steps to Restore the Default Robots.txt File:

 

If you want to remove all your customizations and restore the default robots.txt file, follow these steps:

 

First, make sure you save a copy of your robots.txt.liquid template customizations, because you will have to delete the template to restore the default file. Once the robots.txt.liquid template is deleted, its customizations cannot be recovered.

 

  • Head over to your Shopify admin panel.
  • Go to Online Store > Themes.
  • Click Actions, and then select Edit Code.
  • Next, click the robots.txt.liquid template, and then select Delete file.
  • When asked to confirm, click Delete file again.

 

You can edit your robots.txt file anytime you wish, allowing or disallowing crawlers to crawl certain pages of your website. This gives you complete control over crawler activity on your site.

 

To sum up, this development helps empower your website's SEO by putting crawler access in your hands. For detailed information, check out the comprehensive guide in the Shopify Help Center.