The aim of this article is to answer the top five questions asked on the PrestaShop forum about robots.txt and sitemap.xml. These are among the most popular questions in the PrestaShop community and other communities.
The robots.txt file generated by PrestaShop's default features contains the line "Disallow: /*controller=authentication", while the auto-generated sitemap includes that same controller URL. This mismatch gives rise to a Google error ("Sitemap contains URLs which are blocked by robots.txt") that can only be removed by synchronizing both files manually.
Solution # 1 – Remove it from the sitemap and keep it blocked – Recommended
Manually edit the sitemap, remove every controller=authentication URL, and resubmit it. This solution is recommended because there are strong reasons why PrestaShop blocks this controller URL by default.
To remove it from the sitemap, follow these steps:
1. Log in to your site via FTP and download the sitemap.
2. Open it in an editor such as Dreamweaver and edit it manually.
3. Remove all URLs containing "controller=authentication".
4. Save the edited sitemap.
5. Upload the sitemap via FTP under the same file name and replace the existing sitemap if asked.
6. Do not forget to back up the old sitemap first.
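If your sitemap is large, editing it by hand is error-prone. The steps above can be sketched as a small script; this is a minimal illustration, not PrestaShop code, and the file names passed to the function are assumptions you should adjust to match your store.

```python
# Minimal sketch: strip every <url> entry whose <loc> contains
# "controller=authentication" from a PrestaShop sitemap.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

def clean_sitemap(path_in, path_out):
    """Copy the sitemap, dropping blocked authentication URLs.

    Returns the number of <url> entries removed.
    """
    tree = ET.parse(path_in)
    root = tree.getroot()
    removed = 0
    for url in list(root.findall(f"{{{NS}}}url")):
        loc = url.find(f"{{{NS}}}loc")
        if loc is not None and "controller=authentication" in (loc.text or ""):
            root.remove(url)
            removed += 1
    tree.write(path_out, encoding="utf-8", xml_declaration=True)
    return removed

# Example (hypothetical file names):
# clean_sitemap("sitemap.xml", "sitemap_clean.xml")
```

Remember to keep a backup of the original file before uploading the cleaned version, just as in the manual steps.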
Solution # 2 – Remove the disallow command from robots.txt
Follow these steps to remove the disallow command from robots.txt:
1. Log in to your site via FTP and download the robots.txt.
2. Open it in Notepad and edit it manually.
3. Remove the line "Disallow: /*controller=authentication".
4. Save the edited robots.txt.
5. Do not forget to back up the old robots.txt first.
6. Upload and replace the existing robots.txt file via FTP.
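The robots.txt edit above amounts to deleting one directive while preserving everything else. A minimal sketch, assuming the rule is written exactly as PrestaShop generates it (the helper name and default argument are my own):

```python
# Minimal sketch: drop the "Disallow: /*controller=authentication" rule
# from robots.txt text while leaving every other line untouched.
def remove_disallow(robots_text, rule="/*controller=authentication"):
    kept = []
    for line in robots_text.splitlines():
        stripped = line.strip()
        # Match the directive case-insensitively and ignore extra spaces
        if (stripped.lower().startswith("disallow:")
                and stripped.split(":", 1)[1].strip() == rule):
            continue  # skip the rule we want to remove
        kept.append(line)
    return "\n".join(kept) + "\n"

# Example: read, clean, and rewrite the file (hypothetical path)
# with open("robots.txt") as f:
#     cleaned = remove_disallow(f.read())
# with open("robots.txt", "w") as f:
#     f.write(cleaned)
```

As with the sitemap, keep a backup of the original robots.txt before replacing it on the server.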
If this article helped you, or if you have any queries, please follow us on:
- Facebook – https://www.facebook.com/FmeModules
- Twitter – https://www.twitter.com/FmeModules
- Visit our site – http://www.fmemodules.com/
We love to answer customer queries in line with Google's guidelines and advise accordingly, free of charge.