[ic] search engine indexing
Kevin Walsh
kevin at cursor.biz
Sat Jun 17 08:03:21 EDT 2006
Jamie Neil <jamie at versado.net> wrote:
> Kevin Walsh wrote:
> > 1. Recognise spiders and don't paginate the list for them. Set
> > ml=999999, or whatever seems large enough.
> >
> This is the way we've been doing it and it seems to work very well.
>
> However, it's worth noting that if your pages end up too large (I'm not
> sure what the limit is, but I think it's around 200Kb) then Googlebot
> may ignore everything after a certain point. We just make sure that our
> categories aren't too large and the HTML on the search results pages is
> as compact as possible.
>
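For illustration, that per-request page-size switch could be sketched
roughly like this (Python rather than Interchange's own Perl/ITL; the
pattern list, function name, and default page size are all made up for
the example):

```python
import re

# Hypothetical list of spider User-Agent substrings; a real deployment
# would use a maintained list of known crawlers.
SPIDER_PATTERNS = re.compile(
    r"googlebot|slurp|msnbot|bingbot|crawler|spider", re.I
)

def matches_per_page(user_agent, default=25):
    """Return the ml (match limit) value to use for this request.

    Spiders get an effectively unpaginated list; normal browsers get
    the usual page size.
    """
    if user_agent and SPIDER_PATTERNS.search(user_agent):
        return 999999  # large enough to disable pagination entirely
    return default
```

The same idea works whatever the template engine: branch on the
User-Agent once, and feed the result into the search's page-size
parameter.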
Another way would be to recognise spiders and reduce the size of the
HTML output. Only showing a list of links to products (dropping all
the nice user-orientated stuff like prices and images) is one way to
do that.
Yet another way would be to have smaller categories, perhaps using
sub-categories, thereby keeping the individual listing length to a
minimum.
You could completely side-step the whole issue and produce some pages
that only list links to products. If you manually submit those pages
to the search engines with "noindex,follow,noarchive" in a
<meta name="robots"> tag, then the listing pages won't be indexed,
but the target pages will.
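As a rough sketch of such a links-only page (again in Python for
illustration; the function name and product tuples are invented, and a
real site would generate this from its catalogue):

```python
def links_only_page(title, products):
    """Render a minimal spider-oriented page: just product links,
    no prices or images, marked noindex,follow,noarchive.

    products is a list of (url, name) tuples.
    """
    links = "\n".join(
        f'<li><a href="{url}">{name}</a></li>' for url, name in products
    )
    return (
        "<html><head>"
        f"<title>{title}</title>"
        '<meta name="robots" content="noindex,follow,noarchive">'
        "</head><body>"
        f"<ul>\n{links}\n</ul>"
        "</body></html>"
    )
```

The meta tag tells the spider not to index or cache the listing page
itself, while still following the product links it contains.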
There's always more than one way to do it.
--
_/ _/ _/_/_/_/ _/ _/ _/_/_/ _/ _/
_/_/_/ _/_/ _/ _/ _/ _/_/ _/ K e v i n W a l s h
_/ _/ _/ _/ _/ _/ _/ _/_/ kevin at cursor.biz
_/ _/ _/_/_/_/ _/ _/_/_/ _/ _/
More information about the interchange-users mailing list