Re: issue with search engines
- Subject: Re: issue with search engines
- From: Michael Engelhart <email@hidden>
- Date: Fri, 23 Jul 2004 13:04:38 -0500
Check here:
http://www.robotstxt.org/wc/exclusion.html
But I believe you can just do:
Disallow: /cgi-bin/
(Per the spec above, Disallow matches by URL prefix, so no trailing wildcard is needed.)
That will tell the crawler not to crawl any pages under that directory
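For Catalin's case, a minimal robots.txt served from the site root might look something like this. The paths are hypothetical placeholders; substitute whatever URLs your app actually serves:

    # Hypothetical example: adjust the paths to your own URLs
    User-agent: *
    # Keep crawlers out of the dynamic, session-creating entry point
    Disallow: /cgi-bin/WebObjects/
    # Anything not matched by a Disallow prefix stays crawlable,
    # so pages you want indexed can live outside that path

Note that the original exclusion standard defines only User-agent and Disallow (there is no standard Allow directive), so the usual approach is to serve the pages you do want indexed under a path that isn't disallowed, for example as static HTML outside /cgi-bin/.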
Michael
On Jul 23, 2004, at 12:36 PM, Cretu Catalin wrote:
Hi,
I have an issue with search engines (crawlers): when
they start crawling, they create a lot of sessions
and it kills the server.
How can I resolve it (using robots.txt to restrict
access for robots)? Blocking them entirely is not the
best idea, because I would like to allow access to
certain pages for searching.
Does anyone have any ideas?
Thanks,
Catalin
_______________________________________________
webobjects-dev mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/webobjects-dev
Do not post admin requests to the list. They will be ignored.