SharePoint Forum


Help Needed

  Asked By: Code    Date: Dec 25    Category: SharePoint    Views: 626

One of our publishing sites has around 1,500 pages. They all show up in the search crawl, and the index server sometimes throws a System.OutOfMemory error, usually during a full crawl. The index server has 4 GB of RAM. Is there any method for archiving the pages in the Pages document library? The important thing is that the data must stay handy, so the archival process should allow the archived content to be retrieved quickly. Please suggest an approach.



2 Answers Found

Answer #1    Answered By: Ravindra Salvi     Answered On: Dec 25

How often do you really need to do a full crawl? Usually an incremental crawl will do. Beyond that, RAM is cheap; do you have room to upgrade the box?
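
For what it's worth, an incremental crawl can also be kicked off programmatically rather than waiting for the schedule. Here is a minimal C# sketch against the MOSS search administration object model; the SSP name "SharedServices1" and the content source name are assumptions for illustration:

```csharp
using Microsoft.Office.Server.Search.Administration;

class CrawlStarter
{
    static void Main()
    {
        // "SharedServices1" is an assumed SSP name -- use your own.
        SearchContext context = SearchContext.GetContext("SharedServices1");
        Content content = new Content(context);

        foreach (ContentSource source in content.ContentSources)
        {
            // Default content source name in MOSS; adjust if yours differs.
            if (source.Name == "Local Office SharePoint Server sites" &&
                source.CrawlStatus == CrawlStatus.Idle)
            {
                source.StartIncrementalCrawl();
            }
        }
    }
}
```

The same ContentSource object also exposes StartFullCrawl() for the rare occasions when a full crawl is genuinely needed.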

Answer #2    Answered By: Rahul Sharma     Answered On: Dec 25

1,500 pages is nothing; that is very little effort for the crawler. We have one document library (!) with 350,000 documents in it, and a full crawl takes about 1.5 hours. So again, 1,500 pages should not be a problem. Of course, we are on four 64-bit quad-core machines, and one machine does both the indexing and the querying.

So perhaps you can limit the number of simultaneous documents for your content source. How long does your crawler run before it reports the error? Maybe it has problems with certain pages? I have noticed poor error handling in the crawler when it hits a problem with, for instance, a "strange" element in a custom site definition.
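
One way to see how long the crawler actually runs before it dies, without sitting in the admin UI, is to read the status and timestamps off the content sources. A small sketch, again assuming an SSP named "SharedServices1":

```csharp
using System;
using Microsoft.Office.Server.Search.Administration;

class CrawlWatcher
{
    static void Main()
    {
        // "SharedServices1" is an assumed SSP name -- use your own.
        SearchContext context = SearchContext.GetContext("SharedServices1");
        Content content = new Content(context);

        foreach (ContentSource source in content.ContentSources)
        {
            // CrawlStarted/CrawlCompleted show how long the last crawl ran
            // before it finished -- or before it failed with the error.
            Console.WriteLine("{0}: status={1}, started={2}, completed={3}",
                source.Name, source.CrawlStatus,
                source.CrawlStarted, source.CrawlCompleted);
        }
    }
}
```

Comparing the start time against when the out-of-memory error shows up in the event log can narrow down roughly which batch of pages triggers it.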

Of course you can split the pages over several libraries, but I doubt this will help you because, again, this volume should not be a problem.
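
That said, if you do want to archive, the simplest way to keep the archived content quickly retrievable is to move older pages into a separate archive library in the same site collection and exclude that library from the content source. Below is a minimal sketch using the SharePoint server object model; the site URL, library names, and the one-year cutoff are assumptions for illustration:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.SharePoint;

class PageArchiver
{
    static void Main()
    {
        // Hypothetical URL and library names -- adjust for your farm.
        using (SPSite site = new SPSite("http://portal/sites/publishing"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList pages = web.Lists["Pages"];
            SPList archive = web.Lists["PagesArchive"]; // pre-created archive library

            // Collect the files first: moving items while enumerating the
            // live collection would invalidate the enumerator.
            DateTime cutoff = DateTime.Now.AddYears(-1); // assumed archival cutoff
            List<SPFile> toArchive = new List<SPFile>();

            foreach (SPListItem item in pages.Items)
            {
                if (item.File != null && (DateTime)item["Modified"] < cutoff)
                {
                    toArchive.Add(item.File);
                }
            }

            foreach (SPFile file in toArchive)
            {
                // NOTE: pages that are checked out or pending approval may
                // need to be checked in before they can be moved.
                file.MoveTo(archive.RootFolder.Url + "/" + file.Name);
            }
        }
    }
}
```

Because SPFile.MoveTo keeps the file in the same content database, an archived page is served as quickly as any other; only its URL changes, so existing links to moved pages would need redirecting.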
