Hi
This site does very well in Yahoo, etc. but not well in Google.
http://www.autoagency.com
The site's insurance quote system used to rely on cookies, but the site now uses a different system, which does not apparently use cookies.
I saw this article on the web regarding cookies/session id's etc.
One of the things the article mentions is that the Google spider will not accept cookies. So I am hoping that the next time Google arrives it will be able to properly index the site.
Can you tell by looking at the site if the new system DOES use cookies or session id's? And if so, how do I (according to the article) turn OFF session id's when a Google spider indexes the site?
Any other suggestions on how to make this site more GOOGLE search engine friendly would be appreciated.
BELOW is the article, including a link to the full version. (THANKS!)
http://www.boxuk.com/server/show/ConWebDoc.469/CMS-SEO-and-Accessibility.html
Session IDs. Many modern sites use sessions to allow the persistent tracking of a user throughout a site (so that the user remains logged in, or for user-path analysis, etc.). To allow this persistence across multiple pages of a site, the CMS will create a unique number (session id) for the user, and store it in a) a cookie, b) a per-session cookie, or c) the query string (URL) of each internal link. As many users/browsers will not allow cookies, a) and b) are often replaced by c) when the CMS cannot create a cookie for the user.

The Google spider, amongst others, will not accept cookies, and the site may therefore include the session id in URLs for the Google spider. As Google needs to uniquely identify each page (so that it doesn't re-index the same page multiple times), this session id will present Google with different URLs for each visit (a new session is started on each visit), and as Google cannot obtain a single unique URL for each page, it won't index the site. To prevent this, sessions (or at least URL-based session ids) should be switched off for any search-engine-spider visits. Search engine spiders can be detected (and sessions switched off accordingly) by detecting the robot's identifier in the HTTP headers.
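For what it's worth, the technique the article describes at the end (detect the spider from the User-Agent header and stop putting the session id in URLs) can be sketched roughly like this. This is just an illustration, not the site's actual code; the bot signatures and function names here are my own, and a real CMS would do this server-side in whatever language it runs on:

```python
# Illustrative sketch: skip URL-based session ids when the visitor
# looks like a search-engine spider. Signature list is not exhaustive.
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot", "msnbot")

def is_spider(user_agent):
    """Return True if the User-Agent header looks like a search-engine robot."""
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def build_link(path, session_id, user_agent):
    """Append the session id to internal links only for normal visitors,
    so spiders always see one stable URL per page."""
    if is_spider(user_agent):
        return path
    return f"{path}?sid={session_id}"

# A spider gets a clean, stable URL; a browser gets the session id.
print(build_link("/quote", "abc123", "Mozilla/5.0 (compatible; Googlebot/2.1)"))  # /quote
print(build_link("/quote", "abc123", "Mozilla/5.0 (Windows NT 10.0)"))  # /quote?sid=abc123
```

The key point is that every visit by the spider then yields the same URL for the same page, so Google can index it once instead of seeing a new URL per session.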