SEO tutorial: five points on writing the robots file

Anyone learning or working in SEO knows about search engine spiders, but the protocol those spiders follow, robots, is heard of far less often. The robots file is the first file a spider reads when it crawls a site: it tells the spider which files in the server space may be fetched and which should not be crawled. Precisely because of this, a badly written robots file can easily get a site penalized, so it is worth knowing how to write one.

What is robots? It is a protocol, not a command.

1. Avoid duplicate pages

Many websites offer several browsing versions of the same content. This improves the user experience to a large extent, but it creates difficulty for the spider, which struggles to tell which version is primary. Once the spider decides you are duplicating content maliciously, the mild outcome is a ranking demotion and the severe one is being dropped from the index entirely. The fix is to block the duplicate version, for example:

Disallow: /sitemap/ ("forbid spiders from crawling the text version of the pages")

Extended reading: "The mystery of the Huizhou SEO blog being banned" (Ye Jianhui's blog was banned early on because of a similar problem; see the blog for details).

2. Website security protection

Maybe some readers will have a big question mark here: what does robots have to do with website security? This involves hackers: many low-level hackers find a site's default back-end login page through a search engine in order to break in, so it has to be kept out of the index:

Disallow: /admin/ ("forbid spiders from crawling all files under the admin directory")

Extended reading: "Tips for securing WordPress" (my own WordPress install was attacked, so even small security details cannot be ignored; see Ye Jianhui's blog for specifics).

3. Prevent links from being "stolen" (hotlinking)

For now, hotlinking is not widespread, but once a search engine's users start hotlinking your files, the bandwidth bill may become frightening. So if you do not want your site's images to be hotlinked, you can block image crawling in the same way.

4. Submit a site map

SEO friends all know about site maps, but some do not know how to submit one and assume that putting a link on a page is the end of it. In fact, the robots file itself can declare the site map's location to spiders.
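The Disallow rules above can be checked before deployment with Python's standard-library robots.txt parser. This is a minimal sketch; the domain, file paths, and the exact robots.txt content are hypothetical examples, not taken from the article.

```python
# Verify robots.txt rules with the standard library's urllib.robotparser.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt combining the article's example rules.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /sitemap/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The back-end and duplicate-content directories are blocked for all spiders:
blocked_admin = parser.can_fetch("*", "http://example.com/admin/login.php")
blocked_text = parser.can_fetch("*", "http://example.com/sitemap/page1.html")
# Ordinary content pages remain crawlable:
allowed_page = parser.can_fetch("*", "http://example.com/article/seo-tips.html")

print(blocked_admin, blocked_text, allowed_page)
```

Checking rules this way catches typos such as a missing leading slash before the file ever reaches a live spider.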