Peace be upon you. I received a message from my hosting provider, and from it I understood that the problem I'm having with system resource usage is caused by search engine crawlers. They advised me to go to the Google and Bing control panels and reduce the crawl rate, which I have done, but I don't know what else remains to be done.
I would appreciate any advice on further solutions. Here is the message:
It seems that most of the issue on the account is caused by crawlers.
Total number of unique IP addresses: 10109
Total number of requests: 419755
Total number of unique request strings: 60012
Total number of unique referers: 12700
Total number of unique user agents: 1829
Total bandwidth sent in responses: 5G [6197229300B]
Top 10 requesting IP Addresses based on count:
COUNT: 36081 / 08.60% IP: 66.249.73.183 HOST: crawl-66-249-73-183.googlebot.com
COUNT: 11974 / 02.85% IP: 157.55.33.17 HOST: msnbot-157-55-33-17.search.msn.com
COUNT: 8041 / 01.92% IP: 157.55.32.104 HOST: msnbot-157-55-32-104.search.msn.com
COUNT: 7065 / 01.68% IP: 157.55.32.154 HOST: msnbot-157-55-32-154.search.msn.com
COUNT: 5348 / 01.27% IP: 157.56.93.84 HOST: msnbot-157-56-93-84.search.msn.com
COUNT: 3952 / 00.94% IP: 109.205.118.6 HOST: Unknown Host
COUNT: 2309 / 00.55% IP: 86.111.146.66 HOST: Unknown Host
COUNT: 1741 / 00.41% IP: 37.238.16.170 HOST: Unknown Host
COUNT: 1729 / 00.41% IP: 5.10.226.85 HOST: Unknown Host
COUNT: 1543 / 00.37% IP: 66.249.81.4 HOST: google-proxy-66-249-81-4.google.com
Top 10 Web Crawlers:
Count: 33021 / 07.87% Crawler: bingbot
Count: 32493 / 07.74% Crawler: googlebot
Count: 5954 / 01.42% Crawler: baiduspider
Count: 944 / 00.22% Crawler: msnbot
Count: 6 / 00.00% Crawler: yandexbot
We recommend setting the crawl rate as slow as possible during peak traffic times (mid-day), and only slightly higher during off-peak hours. Changing the crawl rate does not affect how Bing views your site, how often it is crawled, or how deeply your site is crawled. Reducing the crawl rate will not affect your search engine results; it will simply slow down the rate at which Bing crawls your site, and as a result will cause less stress on the server.
NON-BING/GOOGLE CRAWLERS: We recommend that you limit the crawl rates for all other bots, or disable their crawl access entirely. To manage the crawl rates of bots other than Google and Bing, create a robots.txt file for each domain and insert the following lines:

User-agent: *
Crawl-delay: 15
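As a sketch, a robots.txt covering the crawlers listed in the report above might look like the following. The Crawl-delay value of 15 comes from the host's message; the per-bot groups and the choice to block Baiduspider rather than slow it are illustrative assumptions, not part of the host's instructions. Note that Googlebot ignores the Crawl-delay directive, so its rate can only be reduced in Google Search Console, and Bing's rate is best managed in Bing Webmaster Tools:

```
# robots.txt — place in the web root of each domain

# Catch-all group: applies to any crawler that honors Crawl-delay
# and is not matched by a more specific group below
User-agent: *
Crawl-delay: 15

# Baiduspider was the third-heaviest crawler in the report (~5954 requests).
# If its traffic is not valuable to you, blocking it entirely is an option:
User-agent: Baiduspider
Disallow: /
```

Crawlers re-read robots.txt periodically, so expect a delay of up to a day or two before the change takes effect, and keep in mind that abusive bots with "Unknown Host" entries often ignore robots.txt altogether; those are better handled with server-level IP blocks.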