PC spec for Lightroom

GravelBen

Original Poster:

15,773 posts

233 months

Sunday 23rd June
This might be a question for the geeks among us...

Since the introduction of the various AI features to Lightroom (subject and object detection masking, denoise etc) I've found LR Classic increasingly slow, unstable and prone to crashing/freezing when using those functions. Having to close and re-open LR each time means a fair bit of frustration and wasted time - especially when I'm processing hundreds of rally photos and at times only get through a small handful between crashes.

I'm wondering if my PC is just getting too old for the demands of these new LR features. It does seem to happen a bit less frequently if I make a point of closing everything else when I'm using LR.

It was pretty high spec for its day, but that day was about 10 years ago now - i7-4790 (3.6-4.0GHz, 4 cores/8 threads), 32GB RAM (but only DDR3 speed) and a GTX 1050 Ti.

Upgrading this PC isn't a practical option due to the age and limitations of a small form factor case (Dell Optiplex), so I'd be starting from scratch.

Scouted a few options, and with a bit of man-maths to justify it I'm looking at an i7-13700, 32GB DDR5 RAM and an RTX 4060 or 4060 Ti, in a gaming PC case with good cooling etc. (It would be used for a bit of casual gaming too, but I'm not a serious gamer by any means.)

Obviously that would be a massive leap above my current setup, and hopefully fairly futureproof for long term use / upgradability when the time comes.

What sort of PC spec are my fellow PHers using for LR and how are you finding performance and stability etc? Does the spec I'm looking at seem sensible?

eltawater

3,131 posts

182 months

Sunday 23rd June
I have an i5-9500T, 32GB and crappy Intel onboard graphics in an HP desktop mini. It still chugs a fair bit with any of those AI features - enough that I don't want to use them - but it doesn't cause my LR Classic to crash. Indeed, my only crashes occur when LR Classic decides to sit and wait for several minutes after initial startup, or when trying to create a collection while importing.

I only have a use case for denoising, and I prefer to use DxO PureRAW 3 for that, leaving it to process a batch overnight rather than making it part of my editing workflow. It still takes 6 hours to denoise 100 images at the DeepPRIME setting, and that's with it auto-importing back into LR Classic. You may find similar or worse performance with Lightroom, as I get the feeling those features are still in their developmental infancy.

Your choice of upgrade should suit your needs reasonably well, but I can't help feeling that Adobe's slightly lacklustre attention to optimisation means Lightroom will still drain you of all available CPU.

Byker28i

62,115 posts

220 months

Thursday
Don't know the current Lightroom, but I built my PC for the older version: a Samsung 850 EVO SSD for the OS, another smaller one just for the cache files, another for this year's raw files, plus large storage for previous years' raws/exports.

The fastest i7 at the time, 32GB RAM. Works really well still.

mikeh501

730 posts

184 months

Thursday
7-year-old i7 with 32GB RAM, with raw files stored on a NAS. Copes fine - can take a bit to export JPGs and build previews etc, but otherwise fine tbh.

GravelBen

Original Poster:

15,773 posts

233 months

Interesting - maybe the LR freezing/crashing is just down to my PC being a few more years older, or a more specific compatibility issue. I was keeping an eye on system load while processing some rally photos this week, and the LR crashes generally seemed to be accompanied by a spike to 100% load on the GPU.

Even just with JPG files (I shoot raw for most things but JPG for motorsport) it was very laggy, often crashing every 2 or 3 photos. It has definitely got worse over the last 6 months or so, and seems worse with each LR update.

I guess at 10 years old with a 4th-gen i7 it was probably only a matter of time before something became an issue and triggered a need for replacement.
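For anyone wanting to catch GPU spikes like the ones described above in the act, here's a rough Python sketch that polls Nvidia's `nvidia-smi` tool once a second and flags 100% readings. It assumes an Nvidia card with the driver's `nvidia-smi` on the PATH, and there's nothing Lightroom-specific about it:

```python
import subprocess
import time
from datetime import datetime

def parse_gpu_csv(line: str) -> dict:
    """Parse one line of nvidia-smi CSV output, e.g. '87 %, 3412 MiB'."""
    util, mem = [field.strip() for field in line.split(",")]
    return {"util_pct": int(util.rstrip(" %")),
            "mem_mib": int(mem.rstrip(" MiB"))}

def log_gpu_load(samples: int = 60) -> None:
    """Poll GPU utilisation and memory once per second, flagging 100% spikes."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=utilization.gpu,memory.used",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        reading = parse_gpu_csv(out)
        flag = "  <-- spike" if reading["util_pct"] >= 100 else ""
        print(f"{datetime.now():%H:%M:%S}  {reading['util_pct']:3d}%  "
              f"{reading['mem_mib']} MiB{flag}")
        time.sleep(1)
```

Run it in a terminal while stepping through a batch of edits; a 100% reading right before a freeze points at the GPU (or its driver) rather than the CPU.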

Edited by GravelBen on Friday 28th June 05:59

JohnS

936 posts

287 months

Saturday
I've been in the same position for a few years, but have always managed to put it off for another year.

The new AI features in LR make very heavy use of the GPU, especially denoise. They're heavily optimised for Nvidia graphics cards, so choose one of those over an AMD equivalent. There does seem to be a point at which more GPU power doesn't make a big difference, e.g. a 4090 is only 10-20% faster than a 3080 despite being more than twice the price and consuming vast amounts of power.

I'm currently using an i7-6900K (8 cores), but with 64GB RAM and a 1070 Ti GPU.

I noticed everything was a lot smoother after jumping from 16GB RAM to 64GB (cheap eBay deal), and I also fitted the CPU with the most cores that would fit my X99-based motherboard. An M.2 NVMe SSD as the primary drive also made a big difference; generally I'll work on the latest album of my motorsport photos on the SSD before moving it to some large spinning-rust disks.

I create a new catalogue for each motorsport event in the same directory as the photos are stored (by date and event name), and on import, I build 1:1 previews and leave it running for an hour or so. After this, it's much quicker to navigate through, zoom into images and check if they are sharp enough etc.

I avoid AI Denoise as it takes about 15-30 seconds per image, and even copying settings including AI masks such as sky can take 5-10 seconds per image.

Whilst editing individual images, LR predominantly relies on single-core performance, so a recent processor with a fast clock speed performs better than a machine with masses of cores and a slower base clock. However, when importing images, building previews and exporting, LR makes use of all available cores, and upgrading from my older 4-core/8-thread processor to the 8-core one literally halved the time spent on these tasks.
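The split described above (mostly serial per-image edits versus highly parallel batch work) is just Amdahl's law in action. A tiny Python sketch, using illustrative parallel fractions rather than anything measured from Lightroom:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only `parallel_fraction` of the work scales with cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Near-fully parallel batch work (importing, previews, exporting):
# doubling the core count comes close to halving the time.
print(round(amdahl_speedup(0.95, 4), 2))   # ~3.48x over one core
print(round(amdahl_speedup(0.95, 8), 2))   # ~5.93x over one core

# Mostly serial interactive editing: extra cores barely help,
# which is why clock speed dominates there.
print(round(amdahl_speedup(0.2, 8), 2))    # ~1.21x
```

With the parallel fraction close to 1, going from 4 to 8 cores roughly halves batch times, matching the experience above; with a small parallel fraction, a faster clock buys far more than more cores.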

Have a look at Puget Systems, who publish a LR benchmark and you can view the results of various configurations: https://benchmarks.pugetsystems.com/benchmarks/?ag... They also have various blog articles comparing Intel and AMD processors. Intel are currently the fastest for most use cases, but can consume a lot more power and generate more heat than AMD.

AMD will be releasing their Zen 5 processors next month on the AM5 platform, and I'm likely to go down this route with the 9950X 16-core processor, at least 32GB of RAM, a fast M.2 NVMe drive and suitable cooling. Intel's latest processors are due later in the year as well, which means if you can hold off another few months you might be able to pick up a bargain, especially in the Black Friday sales.

One more thing that helps with recent releases of LR is to quit LR and reboot every so often. I think there may be memory leaks, as I notice a drop-off in performance after a while, and restarting seems to help a lot.

GravelBen

Original Poster:

15,773 posts

233 months

Yesterday (00:31)
From some further interweb research I've learnt that the AI functions (especially denoise) apparently lean heavily on the tensor cores in the GPU, which older cards like my GTX 1050 Ti don't even have. Reported results on other forums show a massive improvement in AI denoise processing time, from several minutes on older systems down to 15-20s on something recent with a modern RTX 30- or 40-series GPU.
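Tensor cores map onto Nvidia's CUDA compute capability: they first appeared at 7.0 (Volta) and reached GeForce cards with Turing at 7.5, so Pascal-era cards like the GTX 10-series miss out. A small lookup sketch (the card list is illustrative, not exhaustive):

```python
# CUDA compute capability per card; tensor cores first appeared at 7.0.
KNOWN_CARDS = {
    "GTX 1050 Ti": 6.1,  # Pascal - no tensor cores
    "RTX 2060":    7.5,  # Turing
    "RTX 3080":    8.6,  # Ampere
    "RTX 4060 Ti": 8.9,  # Ada Lovelace
}
TENSOR_CORE_MIN_CC = 7.0

def has_tensor_cores(card: str) -> bool:
    """True if the card's compute capability is new enough for tensor cores."""
    return KNOWN_CARDS[card] >= TENSOR_CORE_MIN_CC

for card in KNOWN_CARDS:
    print(f"{card}: {'yes' if has_tensor_cores(card) else 'no'}")
```

On a live system, reasonably recent drivers report the same number via `nvidia-smi --query-gpu=compute_cap --format=csv,noheader`, and PyTorch exposes it as `torch.cuda.get_device_capability()`.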

I use AI denoise sparingly (partly because it takes so long) but use AI subject and sky masks pretty frequently for editing motorsport photos. Shooting motorsport in JPG means AI denoise isn't even an option there, as it only works on raw files! Working through a batch I often copy and paste develop settings from one photo to another, and pasting develop settings that include AI masks seems to be the most common trigger of my LR crashes. So I've been forced to close and re-open LR pretty frequently anyway!

Anyway to cut a long story short I decided to bite the bullet and upgrade properly, 'buy once cry once' instead of a cheaper upgrade that would do ok for now but might run into similar issues getting out of date sooner.

So I've ordered a decent-spec gaming PC with an i7-13700KF, 32GB DDR5 and an RTX 4060 Ti. The 2TB NVMe drive will hopefully be big enough (the current PC has a 512GB SSD for the OS etc and a 4TB HDD for storage, but it's only half full even with a few years of photos on it), and I can always add another drive later if needed (or just be more disciplined about moving older stuff onto external drives).

Funny comparing performance benchmarks between my current i7-4790 (4 cores/8 threads, 4.0GHz turbo) and the i7-13700 (16 cores/24 threads, 5.4GHz turbo).
But truly hilarious comparing benchmarks between the GTX 1050 Ti and the RTX 4060 Ti!

Edited by GravelBen on Sunday 30th June 00:44