Re: WO, Direct Actions, Google, Sitemaps
- Subject: Re: WO, Direct Actions, Google, Sitemaps
- From: Arturo Perez <email@hidden>
- Date: Mon, 22 Aug 2005 12:01:41 -0400
Hi David,
Just thinking out loud here:
I think you previously said that your code creates/restores the
sessions itself. Why don't you do something like wosid=ROBOTSID and
handle that case specially?
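Something along these lines, maybe (just a rough sketch assuming a
plain WODirectAction subclass; the ROBOT_SESSION_ID constant and the
ShowItem page name are made up for illustration):

import com.webobjects.appserver.WOActionResults;
import com.webobjects.appserver.WODirectAction;
import com.webobjects.appserver.WORequest;

public class DirectAction extends WODirectAction {

    // Sentinel value to put in the wosid of crawler-facing URLs
    private static final String ROBOT_SESSION_ID = "ROBOTSID";

    public DirectAction(WORequest request) {
        super(request);
    }

    public WOActionResults showItemAction() {
        String wosid = request().stringFormValueForKey("wosid");

        if (ROBOT_SESSION_ID.equals(wosid)) {
            // Robot request: return the page without restoring or
            // creating a session, so bot hits stay sessionless.
            return pageWithName("ShowItem");
        }

        // Normal visitor: restore the session named in the URL if it
        // is still alive, otherwise fall back to creating a new one.
        if (existingSession() == null) {
            session();
        }
        return pageWithName("ShowItem");
    }
}

Then every URL in the sitemap could carry wosid=ROBOTSID and the bot
would never cause a session to be created.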
David Griffith wrote:
Hi all,
I have built a site which is completely dynamic and employs some of the
clever mechanisms described by Chuck and others to use Direct Actions
to allow good indexing by Google and other search engines.
So far it has been working very well. I have been passing the session
ID in the URL where required (this is the only way I can do it; cookies
etc. are not an option).
So a URL like this is generated:
http://www.ipodshoppers.com/app/WebObjects/iPodShoppers.woa/1/wa/showItem?itemID=1065&wosid=Gtjqxtuu4rSvffqcXSwSBg
This is fine: GoogleBot indexes it and records that URL, which is
basically a direct access to the page.
I need to use sessions, so the Direct Action Java class will try to
restore the session, and if it doesn't exist, it will create a new
one. This is great. Works lovely for normal users.
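For reference, the restore-or-create step in the Direct Action is
roughly this (just a sketch; the action and page names are
placeholders):

public WOActionResults showItemAction() {
    // Try to restore the session identified by the wosid in the URL;
    // existingSession() returns null if it has expired or never existed.
    if (existingSession() == null) {
        // No usable session, so a brand new one gets created here.
        session();
    }
    return pageWithName("ShowItem");
}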
However, we recently tried using a sitemap (which we submitted to
Google) to improve the indexing etc. It appeared to be working well,
but I have recently moved servers and I am now having serious problems.
They stem from this:
First of all, the sitemap that was created had a different 'wosid' for
each link. The sitemap was obviously created before I rectified that
problem. This means that every time Google hits our site (the bot, I
mean) it forces a new session to be created. You can imagine the havoc
that causes with 1000 hits in a few minutes.
So I thought, well, if every link in the sitemap had the same session
ID, that might help, but of course it won't, because that session will
always have expired, so new sessions get created each time: same
problem.
Again, if there is NO session ID at all in the URL, a new session will
be created each time.
I believe this is just a problem with sitemaps in conjunction with a
Direct Action approach to the website. I believe that if Google crawls
the site normally, it will use only one session (although I don't know
that for sure yet).
I am just wondering: has anyone else tried to do this, and if so, did
you have this problem and find any solution?
Kind regards,
David.
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Webobjects-dev mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden