Google wants to make the 25-year-old robots.txt protocol an internet standard

Google's core business has long been search, and now it wants to make a key part of it an internet standard. The internet giant has outlined plans to turn the Robots Exclusion Protocol (REP), better known as robots.txt, into an internet standard after 25 years. To that end, it has also made the C++ robots.txt parser that underpins the Googlebot web crawler available on GitHub for anyone to access. "We wanted to help website owners and developers create amazing experiences on the internet instead of worrying about how to control…
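
For context, a robots.txt file under the REP is a plain-text list of crawl directives served from a site's root. A minimal illustrative sketch (the paths shown are hypothetical, not taken from any real site):

    User-agent: Googlebot
    Disallow: /private/

    User-agent: *
    Allow: /

Each "User-agent" group names a crawler, and the "Disallow"/"Allow" rules tell it which URL paths it may fetch. It is this parsing behavior that Google's open-sourced C++ library implements.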
