Creating robots files for cross-border stores

A robots.txt file tells search engine crawlers which folders or files of a website they may or may not access. Considering the effects this file has on a store's SEO, the following step by step explains how to create robots files for cross-border stores.

If your store is not a cross-border one, check this link for a step by step on how to create a robots file through your store's admin.

The basic structure of a robots.txt file contains the following directives:

  • User-agent: The search engine robot to which the following rules apply. If the rules are the same for all robots, you can specify the User-agent as *.
  • Disallow: The paths, relative to the root directory, of the files or folders that the search engine specified in the User-agent field must not crawl or index in the search results.
  • Allow: The paths, relative to the root directory, of files inside a Disallowed folder that the search engine is still allowed to crawl and index.

You must adjust these directives according to your scenario.
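For example, a minimal robots.txt combining the three directives could look like the following (the paths are hypothetical and only illustrate the syntax):

```txt
# Rules for all search engine robots
User-agent: *
# Do not crawl anything under this folder...
Disallow: /account/
# ...except for this specific path inside it
Allow: /account/help
```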

This feature is only available for stores using the vtex.edition-store@3.x Edition App. To check which Edition App is installed on your account, run vtex edition get. If a different Edition is installed, open a ticket with the VTEX Support team requesting the installation of the vtex.edition-store@3.x Edition App.
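For example, run the command below in a terminal authenticated with the VTEX IO CLI:

```sh
# Prints the Edition App currently installed on the logged-in account
vtex edition get
```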

Step by step

In this step by step, you'll learn how to develop and release your own Robots app, an app responsible for managing your cross-border store's robots.txt files.

  1. Using the command below, clone the store-theme-robots app boilerplate repository.

```sh
git clone https://github.com/vtex-apps/store-theme-robots
```

  2. Once successfully cloned, open the local app directory in your code editor.

  3. Open the manifest.json file and edit the vendor field with the name of the developing account.


```json
{
  "vendor": "myaccount",
  "name": "robots",
  "version": "0.0.1",
  "builders": {
    "sitemap": "0.x"
  },
  ...
}
```

  4. Inside the sitemap/robots folder, create a .txt robots file for each supported locale binding. The name of each file must be the ID value of its respective binding.

Follow this tutorial to check your stores' binding ids.
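As a sketch, assuming the two example binding IDs used in the folder structure below, you could create the empty files from the terminal:

```sh
# One robots file per locale binding (the IDs below are examples only)
mkdir -p sitemap/robots
touch sitemap/robots/706e9126-d0fc-47de-9o2d-5f9649e61877.txt
touch sitemap/robots/748aafcf-1674-456d-9ffc-7ddb3f26e43f.txt
```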

Your app's folder may end up with a structure similar to the following:


```txt
store-theme-robots
├── manifest.json
├── README.md
└─┬ sitemap
  └─┬ robots
    ├── 706e9126-d0fc-47de-9o2d-5f9649e61877.txt
    └── 748aafcf-1674-456d-9ffc-7ddb3f26e43f.txt
```

  5. Edit each file in the sitemap/robots folder with the desired content of its robots file, as in the example below.
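For example, each binding file could receive its own set of rules (the paths below are hypothetical and only illustrate that the content can differ per binding):

```txt
# 706e9126-d0fc-47de-9o2d-5f9649e61877.txt
User-agent: *
Disallow: /checkout/

# 748aafcf-1674-456d-9ffc-7ddb3f26e43f.txt
User-agent: *
Disallow: /checkout/
Disallow: /promotions/
```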

  6. Once everything is set up, use the terminal and the VTEX IO CLI to log in to the VTEX account you are working in.

  7. Run vtex use {workspace} to switch to a development workspace.

Remember to replace the values between the curly brackets according to your scenario.

  8. Run cd store-theme-robots to go to the local app directory.

  9. Run vtex link to link your new app to your development workspace.
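Putting steps 6 to 9 together, the terminal session might look like the following sketch, where {account} and {workspace} are placeholders for your own values:

```sh
# Log in to the account you are working in
vtex login {account}

# Switch to your development workspace
vtex use {workspace}

# Move into the app's folder and link it to the workspace
cd store-theme-robots
vtex link
```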

  10. Check the robots file generated for each store by accessing https://{workspace}--{account}.myvtex.com/{locale}/robots.txt in your browser.
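If you prefer the command line, you can also fetch each file with a tool such as curl (the en locale segment below is just an example):

```sh
curl https://{workspace}--{account}.myvtex.com/en/robots.txt
```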

  11. Once you are happy with the changes, follow our documentation on making your new app version publicly available to run your app on the master workspace.
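As a rough sketch of that release flow, assuming the standard VTEX IO CLI commands (refer to the linked documentation for the complete process):

```sh
# Publish a release candidate of the new app version
vtex publish

# After validating it, deploy the published version to make it publicly available
vtex deploy
```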

Now, you are ready to check your store's robots files by accessing https://{account}.myvtex.com/{locale}/robots.txt in your browser.

Once you finish the configuration process, your store will benefit from having a robots file, which can improve your store's SEO and help to avoid overloading your site with undesired requests.
