The Robots.txt Editor creates a robots.txt file that excludes certain search engine robots or disallows access to specified folders of your website. The file has to be placed in the root directory of your domain so that search engine robots can find it. You will find the Robots.txt Editor in the Tools section of the main menu.
A robots.txt file works as follows: a User-agent: line names the search engine robot for which the following rules apply. Disallow entries list folders that you do not want to be indexed, for example because they are not yet finished. You can also add comments for your own reference; these start with # and are ignored by robots.
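For example, a file created with the editor might look like this (the robot name and folder paths are purely illustrative):

```
# Keep the unfinished drafts folder out of Google's index
User-agent: Googlebot
Disallow: /drafts/
```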
To address all robots, use the * character.
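To check how a robot would interpret such rules, you can use Python's standard urllib.robotparser module; this is a minimal sketch, and the bot name and URLs are illustrative:

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: the * wildcard addresses all robots
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# /private/ is disallowed for every robot, the rest of the site is allowed
print(parser.can_fetch("MyBot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))         # True
```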
To create a completely new file, click New File in the toolbar; to edit an existing file, click Open File.
Click New to add robot exclusions and the path to your sitemap. Click Edit to change existing entries.
Tip: Use the Site Scanner to read the structure of your site and select directories to be locked for the selected agents.
Below the list of user agents and Disallow paths you can see the source text of your robots.txt file. Click Refresh to apply changes made in the upper part to the source text, or vice versa.
When you are finished, save the file or export its content to a text file.