How can I block the Dataprovider’s crawler?

We respect the robots.txt on every website we visit.

We take copyright and privacy very seriously and do our best to comply with the laws in the countries where we index the web. If, however, you feel that we aren't living up to these promises, please contact us. We'll do our best to address any concerns you may have.

As mentioned above, Dataprovider’s crawler strictly follows the robots.txt file on your website, giving you full control over its access. If you want to prevent our crawler from visiting your website, add the following lines to the robots.txt file on your server:

    User-agent: Dataprovider.com
    Disallow: /
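If you want to check locally how a crawler that honours robots.txt would interpret these rules, a quick sketch using Python's standard urllib.robotparser is shown below (example.com is a placeholder for your own domain):

```python
from urllib.robotparser import RobotFileParser

# The rules from the snippet above.
rules = """User-agent: Dataprovider.com
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Dataprovider's crawler is blocked from the entire site...
print(parser.can_fetch("Dataprovider.com", "https://example.com/"))  # False

# ...while other user agents remain unaffected by this rule.
print(parser.can_fetch("SomeOtherBot", "https://example.com/"))  # True
```

This only simulates how a well-behaved parser reads the file; the live crawler still needs to re-fetch your robots.txt before the change takes effect.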

Please note that it may take some time before our crawler processes the changes in your robots.txt file.

Also note that if your robots.txt file contains errors and our crawler cannot recognise your directives, it will continue crawling your website as it did before.