Re: D2W and Direct Actions
- Subject: Re: D2W and Direct Actions
- From: Denis Frolov <email@hidden>
- Date: Sun, 3 Jun 2007 12:29:07 +0400
On Jun 1, 2007, at 7:50 PM, Guido Neitzer wrote:
>> Consider an example (we don't own the project, so I hope you won't
>> consider this an advertisement):
>> http://www.legalsounds.com/InspectArtist?__key=3339
> At first glance, I have two questions: how well does the
> application handle search engines? I saw that you're storing
> session IDs in cookies to get bookmarkable URLs in combination
> with DirectActions. We had some trouble years ago with search
> engines not accepting the cookie and flooding the application with
> sessions (over 200 sessions created per minute; the app went
> down with about 3000 sessions spread over four instances).
We don't do any special handling of search spider requests. The
point is that you should be ready for peaks in user-generated
traffic anyway, which can lead to even more sessions being created.
So the proper solution is probably minimizing the session memory
footprint, adding more instances, and lowering the session timeout.
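The timeout part, for example, is a one-liner in the Session
constructor. A minimal sketch (the 10-minute value is just an
illustration; tune it to your traffic):

    import com.webobjects.appserver.WOSession;

    public class Session extends WOSession {
        public Session() {
            super();
            // Shorter timeout so abandoned sessions (including those
            // created by spiders) are reclaimed sooner. Value is in
            // seconds: 600 = 10 minutes.
            setTimeOut(600);
        }
    }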
Another idea is using one common session for all the spiders. It
should be pretty easy to add a method to the Browser object like
"isRobot()" which checks the user agent against a known list of
spiders, and to use this method to feed one session to all the
spiders.
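A minimal sketch of that check (the class shape and the robot list
are illustrative assumptions, not the project's actual Browser
object):

    import com.webobjects.appserver.WORequest;

    public class Browser {
        // Illustrative substrings; extend this from the user agents
        // you actually see in your access logs.
        private static final String[] KNOWN_ROBOTS = {
            "googlebot", "slurp", "msnbot", "teoma", "ia_archiver"
        };

        private final String userAgent;

        public Browser(WORequest request) {
            String ua = request.headerForKey("user-agent");
            userAgent = (ua == null) ? "" : ua.toLowerCase();
        }

        public boolean isRobot() {
            for (int i = 0; i < KNOWN_ROBOTS.length; i++) {
                if (userAgent.indexOf(KNOWN_ROBOTS[i]) != -1) {
                    return true;
                }
            }
            return false;
        }
    }

The wiring that actually hands every robot the same session (e.g.
restoring a stored session ID in your direct action) is
application-specific, so it is left out here.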
You would also probably want to add a "Disallow: /wo/" line to
robots.txt if you use the URL-shortening approach suggested in
Mike's email.
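For reference, the file itself is just two lines; /wo/ is the
default request-handler key for component-action URLs, which are
the session-bound ones you don't want indexed:

    User-agent: *
    Disallow: /wo/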
--
Denis Frolov
Design Maximum MA
Tel: +7 863 2648211
Fax: +7 863 2645229
Web: http://www.designmaximum.com