<snip>
On a related subject, I haven't heard any discussion recently on the
frequency of Google spider visits. I don't check every day but it
seems like every time I do the Googlebot has been there that day.
Could it be possible that a frequently updated and modified web site
gets visited more frequently than one that doesn't change? This
would seem to contradict advice I've read suggesting that you not
change what seems to be working.
</snip>
Googlebot, unlike many of the other search engines that also claim to learn
site update frequency, does indeed learn it. The rest seem to run
"scheduled" refreshes. Googlebot is definitely sensitive to pages which
are data driven, and it reflects changes quickly. I don't believe it is worth
"playing with fire", which is what you are doing if you regularly modify
a page **just** to induce indexing or to appear fresher in the hope of
raising ranking.
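If you want to see how often Googlebot actually visits rather than
spot-checking, your server logs will tell you. Below is a minimal Python
sketch, assuming a standard Apache combined access log and the usual
"Googlebot" substring in the crawler's user-agent string (the log path is
just a placeholder), that counts Googlebot hits per day:

    import re
    from collections import Counter
    from datetime import datetime

    # Assumes a standard Apache combined log and the usual
    # "Googlebot" substring in the crawler's user-agent string.
    visits = Counter()
    with open("access.log") as log:          # log path is a placeholder
        for line in log:
            if "Googlebot" not in line:
                continue
            match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)
            if match:
                day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                visits[day] += 1             # Googlebot hits per calendar day

    for day, hits in sorted(visits.items()):
        print(day, hits)

Run it against a few weeks of logs and you can judge for yourself whether
the visit frequency tracks how often you update.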
If it is an index page which leverages the internal linking structure, then it
may be worth inducing indexing of new content. In that case, though, the
page has this function built in, so the negatives are more easily
controlled. You are actually modifying the page for a reason other
than to induce indexing or raise relevancy: you are providing a more
user-friendly environment, which is always the best reason to do anything.
IMHO, the most important page in any directory is the default page.
If you want to know why, request a directory without a default page
and see what the result is. Unix servers return a listing of the directory
contents if they aren't set up properly, and NT will deliver either a page
configured in the setup or the woeful "Forbidden" error. None of these is
user friendly or of much use. I may be the exception, but requesting
directories this way is something I frequently do when I want to find
related material.
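If you want to check how your own server behaves, here is a minimal
Python sketch (the URL is just a placeholder, and the "Index of" check
simply looks for the title Apache puts on its auto-generated listings)
that requests a bare directory and reports whether you get a default
page, a raw listing, or an error:

    import urllib.request
    import urllib.error

    url = "http://www.example.com/articles/"   # placeholder directory URL

    try:
        with urllib.request.urlopen(url) as response:
            body = response.read().decode("utf-8", errors="replace")
            if "Index of /" in body:
                # Typical title of an auto-generated directory listing
                print("Raw directory listing exposed")
            else:
                print("Default page served (status %d)" % response.status)
    except urllib.error.HTTPError as err:
        # 403 Forbidden is the usual result when listings are disabled
        # and no default page exists
        print("Error returned: %d %s" % (err.code, err.reason))

Whatever the result, a hand-built default page that indexes the directory's
contents is the user-friendly alternative.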
IMHO, these are also favorites of sites linking to you, because such
"indexing pages" can be easily targeted by subject. This also encourages
better text in the links, because people can link to a topic-specific index,
and IMHO the best description is the subject. Many will also use titles
or descriptions from your page as comments alongside the link text,
providing another means of controlling the link text and the text in the
proximity of the link. When all of these come together you have got one
killer link, depending on the quality of the site linking to you.
Good websites regularly add new content or update existing content, so it
makes sense for any engine to be sensitive to update frequency.
It provides a fresher set of results and can indicate site quality.
The site in my sig wasn't updated for years; Google knew about it
but slowly dropped pages until eventually it was indexing only 2 pages.
I started updating the site and it is now indexing most of the old pages.
The problem is that Googlebot still doesn't index the new content, so there
is a lesson there as well.
As to the marble theory, I would concur with Detlev that it is a nice image,
but I agree with Lee because of fragmented link popularity and the leverage
you get from internal link structure. I would use indexes containing
"indexing pages" for each color of marble. I also believe "mini sites" are
just playing with fire because of the crosslinking fiascos they can cause.
In many cases you are just one crosslink away from a "bad neighborhood",
and from my experience it may not show up as a grey line in the Google
toolbar or as low PR. It may just penalize the site that mistakenly
crosslinked to sites it thought were related. IMHO, it doesn't matter what
you think is related; Googlebot has the last say, and its decision isn't
necessarily based on sound reasons. It is about what intent is indicated
by its analysis of the links: in other words, whether it looks like you were
manipulating link popularity rather than providing a path to more relevant
material.
Crosslinking sites, particularly those on the same network, definitely
raises a spam flag. The degree of the penalty is probably tied directly
to the degree of non-relevancy, or to the intent. If you have four or five
of these "mini sites", I'm sure the intent is pretty clear to Google!