I regularly receive emails asking for my advice on how big a Lightroom catalog should be, and many people complain about how slow Lightroom is.

First some technical information:

  • Lightroom has always used an embedded database engine called SQLite. (SQLite is public domain software; it is not owned by Oracle or any other company.)
  • SQLite has very few limits. The ones that matter are:
    • How SQLite was compiled: the developers decide at build time how big the database can get. See: SQLite limits
    • The operating system determines how big the database file can grow, since the whole catalog is a single file.
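As a rough illustration, the compile-time ceiling can be inspected from any SQLite connection. A minimal sketch using Python's built-in sqlite3 module (the numbers you get depend on how your copy of SQLite was built):

```python
import sqlite3

# Open a throwaway in-memory database just to query this build's limits.
conn = sqlite3.connect(":memory:")

page_size = conn.execute("PRAGMA page_size").fetchone()[0]
max_pages = conn.execute("PRAGMA max_page_count").fetchone()[0]

# The theoretical maximum database size is page_size * max_page_count.
print(f"page size: {page_size} bytes")
print(f"max page count: {max_pages}")
print(f"theoretical maximum file size: {page_size * max_pages} bytes")

conn.close()
```

In practice a Lightroom catalog will hit disk-space or performance limits long before this theoretical ceiling.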

I decided to run my own test. I copied thirteen thousand photos three times, adding xxx, yyy and zzz to the filenames, and imported the resulting 50,489 photos with render previews set to Minimal. Importing all fifty thousand photos took slightly less than a full day on a Core 2 Duo with 3 GB of RAM.
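The copying step can be scripted. A minimal sketch (the folder names and exact suffix placement are my assumptions, not the precise procedure used for the test):

```python
import shutil
from pathlib import Path

def make_suffixed_copies(src: Path, dst: Path,
                         suffixes=("xxx", "yyy", "zzz")) -> int:
    """Copy every file in src to dst once per suffix, tagging the
    filename so each copy looks like a distinct photo to Lightroom.
    Returns the number of copies made."""
    dst.mkdir(parents=True, exist_ok=True)
    count = 0
    for photo in src.iterdir():
        if photo.is_file():
            for suffix in suffixes:
                # e.g. IMG_0001.jpg -> IMG_0001_xxx.jpg
                shutil.copy2(photo, dst / f"{photo.stem}_{suffix}{photo.suffix}")
                count += 1
    return count
```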

Lightroom       13,000 photos   50,000 photos
Disk size       179 MB          673 MB
Startup         identical       no visible difference
Grid speed      identical       no visible difference
Develop speed   identical       no visible difference
Queries         excellent       depends on the query

I have noticed a significant slowdown on very large smart collections. It happens only when I open certain smart collections, especially the negative ones. A negative query asks for what is missing, so it forces a full table scan instead of using an index.
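The effect is easy to reproduce outside Lightroom. With a toy schema (not Lightroom's actual one), SQLite's EXPLAIN QUERY PLAN shows the positive query being answered via an index search, while the negative version has to walk every row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE photo (id INTEGER PRIMARY KEY, keyword TEXT)")
conn.execute("CREATE INDEX idx_keyword ON photo (keyword)")

def plan(sql: str) -> str:
    # The last column of each EXPLAIN QUERY PLAN row is the plan text.
    rows = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return " ".join(row[-1] for row in rows)

# Positive query ("photos WITH this keyword"): an index SEARCH.
print(plan("SELECT id FROM photo WHERE keyword = 'sunset'"))

# Negative query ("photos WITHOUT this keyword"): a full SCAN,
# because every row must be examined to prove the keyword is absent.
print(plan("SELECT id FROM photo WHERE keyword IS NULL OR keyword <> 'sunset'"))
```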

  • Lightroom is quite dumb when it comes to detecting duplicate files: changing a photo's name or extension makes Lightroom treat it as a different photo.
  • The previews are NOT part of the SQLite database. They can always be deleted, and Lightroom will rebuild them on the fly when needed.
  • Notice that although there are almost four times as many photos, the catalog's disk size grew roughly in proportion (179 MB to 673 MB), and no worse. Please note that not all the photos have titles, captions and/or keywords.
  • See Upgrading Lightroom and how it affects speed.
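On the duplicate-detection point above: a content-based check would catch renamed copies, since the file bytes are identical no matter what the filename says. A minimal sketch of that idea (this is not how Lightroom actually works):

```python
import hashlib
from pathlib import Path

def content_hash(path: Path) -> str:
    """Hash the file's bytes, so renamed copies still compare equal."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(folder: Path) -> list[list[Path]]:
    """Group files by content hash; any group with two or more
    entries is a set of byte-identical duplicates."""
    groups: dict[str, list[Path]] = {}
    for p in sorted(folder.rglob("*")):
        if p.is_file():
            groups.setdefault(content_hash(p), []).append(p)
    return [paths for paths in groups.values() if len(paths) > 1]
```

This is exactly why my xxx/yyy/zzz copies sailed through import as "new" photos: Lightroom compared names, not contents.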