SharePoint Forum


AreaManager.SuggestDeepCrawl for managing site crawls

  Asked By: Ramon    Date: Jun 16    Category: Sharepoint    Views: 2905

I have used code similar to the following to programmatically add
SharePoint site URLs to be crawled for search. It works great:

AreaManager.SuggestDeepCrawl(
    /* ... preceding parameters elided ... */
    "" /* LargeIconUrl */,
    "" /* SmallIconUrl */,
    "$$$default$$$" /* Index Catalog */,
    "$$$default$$$" /* Index Scope */
);

I would like to find a way to programmatically check whether the URL
being added via this call already exists in the search/crawl system. The
API does not appear to support this: if you add the same entry twice,
the system simply allows it, and you end up with duplicate entries in
your crawled sites list.

Any ideas?



2 Answers Found

Answer #1    Answered By: Todd Hamilton     Answered On: Jun 16

I didn't check (I'm teaching in Denver this week), but often you can iterate
a collection and compare each existing entry to determine existence when
the API doesn't expose an existence check. It isn't a very efficient
approach, but you probably won't be adding new search crawls that often.
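The loop-and-compare check suggested above could be sketched like this. Note the assumptions: the helper name `AlreadySuggested` is hypothetical, and the `existingUrls` list must come from somewhere you control (for example, recording every URL your code submits), since the thread below notes that the object model does not obviously expose the collection of deep crawls.

```csharp
// Sketch only: a duplicate check for crawl URLs. "existingUrls" is a
// list you maintain yourself; the SharePoint object model is not known
// to expose the crawled-sites collection directly.
bool AlreadySuggested(string candidateUrl, System.Collections.IEnumerable existingUrls)
{
    foreach (string existing in existingUrls)
    {
        // Compare case-insensitively and ignore a trailing slash, so
        // http://portal/site and http://Portal/site/ count as duplicates.
        if (string.Compare(existing.TrimEnd('/'),
                           candidateUrl.TrimEnd('/'),
                           true) == 0)
            return true;
    }
    return false;
}
```

If the check returns false, call AreaManager.SuggestDeepCrawl as before and append the URL to your own list so subsequent runs see it.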

Answer #2    Answered By: Edgar Castillo     Answered On: Jun 16

Thanks for the response. I would do that, but I can't find a way to get a
reference to the collection containing the deep crawls. The only
property/method I can find that deals with them is
AreaManager.SuggestDeepCrawl, which adds a new crawl.
