Copyright: Shanghai Dragon School www.lyylx贵族宝贝

Through Baidu Webmaster Tools I traced the cause back to the robots.txt file: once Baidu's spider discovers that a site has banned search engine crawling, it treats the site as off-limits and by default does not come back to crawl for a period of time. In Baidu Webmaster Tools, the site's robots.txt update time was still stuck on the day crawling was banned and had never been refreshed since. This confirms that when the spider hits a page it is forbidden to crawl, it does not retry by default but waits out a certain interval before crawling again. That explains why many webmasters find that, after lifting the restriction in robots.txt, it takes a while before new content is re-included; before Baidu Webmaster Tools surfaced this information, we simply had no way to see it.
After the site's bug was fully repaired, the webmaster modified robots.txt again to allow all search engines to crawl. On the surface this step should have let search engines crawl the content, but a week later the site showed no response; checking the site logs, not a single spider had come. So he deleted the robots.txt file altogether.
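The log check described above can be sketched in a few lines. The "Baiduspider" User-Agent token is Baidu's standard crawler identifier, but the log format and sample entries here are hypothetical stand-ins for a real access log:

```python
# Minimal sketch, assuming an Nginx/Apache-style access log whose last
# field is the User-Agent string; the entries below are hypothetical.
sample_log = [
    '1.2.3.4 - - [01/Jan/2024] "GET / HTTP/1.1" 200 "Mozilla/5.0 (compatible; Baiduspider/2.0)"',
    '5.6.7.8 - - [01/Jan/2024] "GET /about HTTP/1.1" 200 "Mozilla/5.0"',
]

# Count visits whose User-Agent identifies Baidu's spider; a count of
# zero would mean the spider never came, as the log check above found.
spider_hits = sum("Baiduspider" in line for line in sample_log)
print(spider_hits)  # → 1
```

On a real server you would read the lines from the access log file instead of a hardcoded list.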
Shortly after the new site went online, a significant internal bug (vulnerability) appeared. The webmaster first used a robots.txt file to block search engine crawling, then set about repairing the bug.
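The blocking step amounts to a robots.txt like the following (a minimal sketch; the exact file the webmaster used is not shown in the source):

```
# Block every crawler from the whole site while the bug is being fixed.
User-agent: *
Disallow: /
```

And the later change to re-allow crawling would look like this:

```
# After the fix: an empty Disallow allows every crawler everywhere.
User-agent: *
Disallow:
```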
Today I saw a webmaster friend complaining that his new site was not being included, then traced the cause of the problem and worked through fixing it. The case feels typical and is something many people may run into, so I am posting it here to share.
Once you understand the principle, the solution becomes much easier. Please see below:
Open Baidu Webmaster Tools, go to the robots.txt tool, and update your robots.txt file there. For a new site that has not yet been included, this step must not be ignored.
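As a sanity check before and after updating the file, Python's standard library can parse robots.txt rules the same way a spider would. This is a minimal sketch; the rules and URL below are hypothetical:

```python
from urllib import robotparser

# Parse a block-everything robots.txt, like the one the webmaster used
# while fixing the bug, and ask whether Baidu's spider may fetch a page.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])
print(rp.can_fetch("Baiduspider", "https://example.com/page.html"))  # → False
```

In practice you would call `rp.set_url(...)` and `rp.read()` against your live site's robots.txt to confirm the restriction really has been lifted.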