TechLair


Google wants to make the 25-year-old robots.txt protocol an internet standard

Monday, July 1, 2019 by Piyush Suthar


Google’s main business has long been search, and now it wants to make a core part of it an internet standard. The internet giant has outlined plans to turn the Robots Exclusion Protocol (REP), better known as robots.txt, into an internet standard after 25 years. To that end, it has also made the C++ robots.txt parser that underpins the Googlebot web crawler available on GitHub for anyone to access. “We wanted to help website owners and developers create amazing experiences on the internet instead of worrying about how to control crawlers,” Google said. “Together with the original author of the protocol,…
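As a rough illustration of the rules the REP defines (Google's open-sourced parser itself is C++), Python's standard library ships `urllib.robotparser`, which evaluates the same `User-agent`/`Disallow`/`Allow` directives. The robots.txt content below is hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt illustrating common REP directives.
ROBOTS_TXT = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A path under /private/ is blocked for all crawlers...
print(parser.can_fetch("Googlebot", "https://example.com/private/secret.html"))
# ...while paths matched by no rule, or by the Allow line, may be fetched.
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))
```

This is only a sketch of the directive semantics, not Google's parser; edge cases (wildcards, precedence of longest match) are exactly what standardization aims to pin down.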

This story continues at The Next Web





    Copyright © 2019 TechLair. All rights reserved.