TechLair

Google wants to make the 25-year-old robots.txt protocol an internet standard

Monday, July 1, 2019 by Piyush Suthar


Google’s main business has always been search, and now it wants to make a core part of it an internet standard. The internet giant has outlined plans to turn the Robots Exclusion Protocol (REP) — better known as robots.txt — into an internet standard after 25 years. To that end, it has also made the C++ robots.txt parser that underpins the Googlebot web crawler available on GitHub for anyone to access. “We wanted to help website owners and developers create amazing experiences on the internet instead of worrying about how to control crawlers,” Google said. “Together with the original author of the protocol,…
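For readers unfamiliar with how the protocol works in practice: a robots.txt file groups rules under a `User-agent` line, and crawlers honor the `Disallow`/`Allow` paths for the group that matches them. A minimal sketch, checked with Python’s standard-library `urllib.robotparser` — the paths, rules, and user-agent names here are illustrative, not taken from Google’s parser or the article:

```python
from urllib.robotparser import RobotFileParser

# An illustrative robots.txt: block all crawlers from /private/,
# but give Googlebot its own group that allows /private/press/.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Allow: /private/press/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Googlebot's group lists the Allow rule first, so this URL is permitted.
print(parser.can_fetch("Googlebot", "https://example.com/private/press/"))   # True
# Any other bot falls back to the "*" group and is blocked from /private/.
print(parser.can_fetch("SomeBot", "https://example.com/private/page.html"))  # False
```

Note that `urllib.robotparser` applies rules in file order (first match wins), whereas Google’s own parser uses longest-match precedence — one of the ambiguities a formal standard would pin down.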

This story continues at The Next Web


    Copyright © 2019 TechLair. All rights reserved.