Sharepoint Forum


List item limit could make performance crawl?

  Asked By: Sudhir    Date: Aug 02    Category: Sharepoint    Views: 1949

Am I going to run into a problem with the recommended 5000 (or
whatever the number is) item limit per list (or folder in a list),
using the Knowledge Base application template from MS? Over time it
will probably surpass that. Is it going to start to crawl once there
are more than a certain number of items?

(This template doesn't seem set up to use folders easily, if items
were to be placed inside folders to avoid having too many in any given one.)

I would really appreciate anyone's insight and experience on this issue.



8 Answers Found

Answer #1    Answered By: Sandra Alexander     Answered On: Aug 02

It's really about how many items are *viewed* at any given time, so if a
folder or view has more than 2000, you'll start to see performance
degradation. However, if you're searching or using some other interface
that just pulls items piecemeal, you shouldn't have a problem.

Answer #2    Answered By: Nalin Rao     Answered On: Aug 02

That's mostly true. The viewing of large lists is the main problem with
them. However, there are some Object Model performance issues as well.
These kinds of issues show up when the Object Model is walking through
the list, whether it's being presented in a web browser or not. Walking
through a large list to index it may very well incur the same penalties.

Answer #3    Answered By: Thomas Davis     Answered On: Aug 02

If you're going to walk a large list via the WSS API, you have one of two
options:

1) Don't. Use GetItemById instead (rough sketch below).

2) Pull the whole SPList object into memory and mess with it there, rather
than incurring the multiple-table-search overhead every time you need to
pick up an item. You'll see tremendous performance gains this way.
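
A rough sketch of option 1, assuming a reference to Microsoft.SharePoint; the
site URL, list name, and item ID here are just placeholders:

    using System;
    using Microsoft.SharePoint;

    class KbLookup
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://server/sites/kb"))   // placeholder URL
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists["Knowledge Base"];   // placeholder list name

                // Fetch one row directly by its ID instead of walking list.Items,
                // which would materialize every item in the list.
                SPListItem item = list.GetItemById(42);
                Console.WriteLine(item["Title"]);
            }
        }
    }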

Answer #4    Answered By: Dominic Davis     Answered On: Aug 02

Those probably work better than what I do. I'm by no means a programmer.
I only ever use foreach loops to walk through lists. I'm certain there
are better ways to do it. Unfortunately foreach loops are pretty easy
and not everyone knows not to use them.

Answer #5    Answered By: Indu Raj     Answered On: Aug 02

Yeah, foreach loops are slow. You might want to read this white paper from
Microsoft titled "Working with large lists in Office SharePoint Server 2007."
It covers different coding techniques to retrieve data from lists and the
corresponding performance.
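
One of the techniques papers like that cover is paging through a list with
SPQuery and a RowLimit instead of a foreach over list.Items. A minimal
sketch, with placeholder URL, list name, and page size:

    using System;
    using Microsoft.SharePoint;

    class PagedRead
    {
        static void Main()
        {
            using (SPSite site = new SPSite("http://server/sites/kb"))   // placeholder URL
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists["Knowledge Base"];   // placeholder list name

                SPQuery query = new SPQuery();
                query.RowLimit = 100;   // rows per round trip

                do
                {
                    SPListItemCollection page = list.GetItems(query);
                    foreach (SPListItem item in page)
                    {
                        Console.WriteLine(item["Title"]);
                    }
                    // Carry the paging cookie forward; null means no more pages.
                    query.ListItemCollectionPosition = page.ListItemCollectionPosition;
                }
                while (query.ListItemCollectionPosition != null);
            }
        }
    }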


Answer #6    Answered By: Khushi Srivastava     Answered On: Aug 02

I have also heard that pulling large list data via its
web service is more efficient than the OM.

So you think it is possible that the SP crawler could incur the
penalties when indexing a list with over 2000 items? I would hope MS
does it in a more efficient way. On the other hand, I would expect the
penalties to be on full crawls (if they occur on crawling at all).
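
For what it's worth, the web-service route usually means calling GetListItems
on Lists.asmx through a generated proxy. A hedged sketch - the proxy class
name, site URL, and list name are assumptions, so check them against your own
web reference:

    using System;
    using System.Net;
    using System.Xml;

    class WebServiceRead
    {
        static void Main()
        {
            // "ListsProxy.Lists" stands in for whatever proxy class you get by
            // adding a web reference to http://server/_vti_bin/Lists.asmx.
            ListsProxy.Lists svc = new ListsProxy.Lists();
            svc.Url = "http://server/sites/kb/_vti_bin/Lists.asmx";   // placeholder URL
            svc.Credentials = CredentialCache.DefaultCredentials;

            XmlDocument doc = new XmlDocument();
            XmlNode query = doc.CreateElement("Query");
            XmlNode viewFields = doc.CreateElement("ViewFields");
            XmlNode queryOptions = doc.CreateElement("QueryOptions");

            // rowLimit keeps each SOAP call small; rows come back as XML.
            XmlNode rows = svc.GetListItems(
                "Knowledge Base", null, query, viewFields, "100", queryOptions, null);

            Console.WriteLine(rows.OuterXml);
        }
    }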

Answer #7    Answered By: Arti Patel     Answered On: Aug 02

Check out blog.solanite.com/.../Post.aspx?ID=15 for
another really good overview including some perf numbers on different
list access methods.

Answer #8    Answered By: Claire May     Answered On: Aug 02

Just an FYI on the article linked in the previous answer. Be aware that
if you take their code as an example of how to program against the
object model, it is not "Best Practice".

If anyone is doing custom development through the object model, I
strongly suggest you read the article at:


Basically, whenever you create a new SPSite object without a using
statement and without calling Dispose, you will be leaking memory on
your server. So if you walk through every site in your site
collection you could run out of memory.

Just something to be aware of - we got caught with this when we first
started programming with the OM.
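
A small sketch of the pattern being described, with a placeholder site URL;
the point is just that every SPSite/SPWeb you create (or pick up from
AllWebs) gets disposed:

    using System;
    using Microsoft.SharePoint;

    class DisposeCorrectly
    {
        static void Main()
        {
            // The using block disposes the SPSite even if an exception is thrown.
            using (SPSite site = new SPSite("http://server/sites/kb"))   // placeholder URL
            {
                foreach (SPWeb web in site.AllWebs)
                {
                    try
                    {
                        Console.WriteLine(web.Title);
                    }
                    finally
                    {
                        // Each SPWeb handed out by AllWebs is a separate object
                        // that also needs to be disposed.
                        web.Dispose();
                    }
                }
            }
        }
    }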
