What is Robots? How to write a Robots file?

For SEO website optimizers, every small detail of a website matters and needs to be optimized, for example the sitemap, H1 tags, ALT attributes, 404 pages, 301 redirects, and so on. I have already covered the items listed above. Today, Dongguan SEO Feng Chao will explain robots to you: what is the robots protocol, and how do you write a robots file?

What is Robots?

The full name of the robots protocol (also known as the crawler protocol or robot protocol) is the "Robots Exclusion Protocol". A website uses this protocol to tell search engines which pages may be crawled and which may not.

How to write a Robots file?

First, create a plain-text file named robots.txt in the root directory of the site. When a search spider visits the site, it first checks whether robots.txt exists in that root directory. If it does, the spider reads the file and crawls the site according to the rules it contains; if it does not, the spider assumes the whole site may be crawled.
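
As a rough illustration of how a spider uses this file, here is a minimal sketch using Python's standard urllib.robotparser module (added here for illustration, not part of the original tutorial); the domain www.example.com is only a placeholder.

    import urllib.robotparser

    # A spider fetches robots.txt from the site root, then checks each URL
    # against the rules before deciding whether to crawl it.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
    rp.read()

    # can_fetch(user_agent, url) returns True if the rules allow crawling.
    print(rp.can_fetch("Googlebot", "https://www.example.com/admin/"))
    print(rp.can_fetch("Baiduspider", "https://www.example.com/index.html"))

Note that urllib.robotparser implements only the basic exclusion standard and does not understand the * and $ wildcards described below.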

Writing the rules (a combined example is given after this list)

User-agent: *    Here "*" is a wildcard that stands for all search engine spiders.

Disallow: /    Prohibits crawling any content on the site.

Disallow: /admin/    Prohibits crawling anything under the admin directory.

Disallow: /ABC/    Prohibits crawling anything under the ABC directory.

Disallow: /cgi-bin/*.htm    Prohibits access to URLs with the ".htm" suffix under the /cgi-bin/ directory (including its subdirectories).

Disallow: /*?*    Prohibits access to any URL on the site that contains a question mark (?).

Disallow: /*.jpg$    Prohibits crawling all images in .jpg format.

Disallow: /ab/adc.html    Prohibits crawling the adc.html file under the /ab/ directory.

Allow: /cgi-bin/    Allows crawling anything under the cgi-bin directory.

Allow: /tmp    Allows crawling the entire tmp directory.

Allow: /*.htm$    Allows access to URLs ending with ".htm" (combined with "Disallow: /", this permits only .htm pages).

Allow: /*.gif$    Allows crawling of web pages and images in .gif format.

Sitemap: <sitemap URL>    Tells the crawler where the site's sitemap file is located.
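
Putting several of the rules above together, a complete robots.txt might look like the sketch below; the directories and the sitemap URL are placeholders chosen for illustration, not recommendations.

    User-agent: *
    Disallow: /admin/
    Disallow: /cgi-bin/*.htm
    Disallow: /*?*
    Allow: /tmp
    Sitemap: https://www.example.com/sitemap.xml

Rules are grouped under a User-agent line; a spider follows the group that matches it most specifically and ignores the other groups.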

Listed below are the names of some well-known search engine spiders (an example of addressing them in robots.txt follows the list):

Google's spider: Googlebot

Baidu's spider: Baiduspider

360's spider: 360Spider

Sogou's spiders: Sogou web spider/4.0 and Sogou inst spider/4.0
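
For illustration only, the sketch below shows how these names can be used to give different spiders different rules; the paths /private/ and /temp/ are made-up placeholders.

    User-agent: Baiduspider
    Disallow: /private/

    User-agent: Googlebot
    Disallow: /temp/

    User-agent: *
    Disallow:

An empty Disallow: value means nothing is blocked for the matching spiders.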

Based on the instructions above, Dongguan SEO Feng Chao gives you a reference case: to prohibit 360's spider from crawling the pages goods.php and category.php, the robots.txt rules are written as follows:

User-agent: 360Spider

Disallow: /goods.php

Disallow: /category.php
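
If you want to sanity-check rules like these before uploading them, one possible approach (a sketch using Python's built-in parser, added for illustration and not part of the original case) is:

    import urllib.robotparser

    # The example rules above, parsed locally instead of being fetched from a site.
    rules = """
    User-agent: 360Spider
    Disallow: /goods.php
    Disallow: /category.php
    """

    rp = urllib.robotparser.RobotFileParser()
    rp.parse(rules.splitlines())

    print(rp.can_fetch("360Spider", "/goods.php"))    # False: blocked for 360Spider
    print(rp.can_fetch("360Spider", "/index.php"))    # True: not listed, so allowed
    print(rp.can_fetch("Googlebot", "/goods.php"))    # True: the group only targets 360Spider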
