Evergreen / EVG-6499

Add robots.txt file to prevent search engine crawling


    Details

    • Type: Improvement
    • Status: Closed
    • Priority: Major - P3
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: v2019.08.01
    • Component/s: legacy-ui
    • Labels: None

      Description

      Database read tickets were low for many hours on 7/25. There were a number of queries against the inefficient task history endpoints, which appeared to come from a range of IP addresses belonging to Google's web crawler. We should add a robots.txt file to prevent this from happening again.
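      A robots.txt file served at the site root tells compliant crawlers, including Googlebot, which paths they may fetch. A minimal sketch of such a file, assuming we simply disallow all crawling of the UI (the ticket does not specify the exact rules):

          User-agent: *
          Disallow: /

      If the file needs to be served from the application rather than as a static asset, a hypothetical Go handler (Evergreen is written in Go; the route wiring below is illustrative, not Evergreen's actual setup) might look like:

          package main

          import (
              "log"
              "net/http"
          )

          func main() {
              // Serve a static robots.txt that blocks all crawlers.
              http.HandleFunc("/robots.txt", func(w http.ResponseWriter, r *http.Request) {
                  w.Header().Set("Content-Type", "text/plain")
                  w.Write([]byte("User-agent: *\nDisallow: /\n"))
              })
              log.Fatal(http.ListenAndServe(":8080", nil))
          }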


              People

               Assignee: Sam Kleinman (Inactive)
               Reporter: Brian Samek
