
Re: Lightroom 4.2 very poor CPU usage


Bob, you’ve discovered the key to success! Your technique is identical to the one used in mainframe computing environments: you’ve spread your high-access, smaller files across different physical pathways. In storage-industry tech-speak this is called storage management, and it is the only way LR users with performance issues will find steady, consistent relief. By adding multiple access points (3 separate SSDs) for your various working datasets, you have alleviated the response-time problem caused by multiple threads trying to access different files down the same physical pathway.
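
As an aside, a quick way to sanity-check whether your own working datasets share a physical pathway is to compare which drives their paths sit on. Here is a minimal Python sketch, assuming a Windows setup where each drive letter really is a separate physical disk; the paths are hypothetical placeholders, so substitute your own catalog, preview cache, and image folders:

import os

# Hypothetical locations -- replace with your own catalog, preview cache,
# and image folder paths.
paths = {
    "catalog":  r"C:\Users\me\Pictures\Lightroom\Catalog.lrcat",
    "previews": r"D:\LightroomCache\Catalog Previews.lrdata",
    "images":   r"E:\Photos\2012",
}

# On Windows, os.path.splitdrive() returns the drive letter for each path.
drives = {name: os.path.splitdrive(p)[0].upper() for name, p in paths.items()}
print(drives)

if len(set(drives.values())) < len(drives):
    print("Some of these datasets share a drive and will compete "
          "for the same physical pathway.")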

 

I promised a demonstration of the effect of multiple threads accessing related data on a single disk. I ran a program called RoadKill Disk Speed (I’m not making this up) to test the speed of my main OS HDD. Running this two-minute test on a completely quiet system, I got these results:

 

Access time 6.62 ms

Cache Speed 321.12 MB/second

Max Read Speed 148.64 MB/second

Overall Score 2464.2

 

I then started a second copy of the tester and ran both of them at the same time in order to judge the interference factor caused by two threads accessing the same drive concurrently. I got these results:

 

Access time 10.36 ms - increased by 56%

Cache Speed 211.77 MB/second - decreased by 34%

Max Read Speed 57.35 MB/second - decreased by 61%

Overall Score 742 - decreased by 70%

 

I won’t go into detail about each of these stats, but I think it’s pretty obvious that unless the data is strategically spread across several disks, there can be serious interference when multiple threads pull data from a single source.
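
For anyone who wants to reproduce the gist of this experiment without the RoadKill tool, here is a rough Python sketch of the same idea; it only measures sequential read throughput, not the tool’s full score, and the file paths are hypothetical placeholders. Point it at two large files (bigger than your RAM, so the OS cache doesn’t skew the numbers) on the drive you want to test:

import threading
import time

CHUNK = 4 * 1024 * 1024                     # read in 4 MB chunks
TEST_FILES = [r"D:\bench\file_a.bin",       # hypothetical test files on the
              r"D:\bench\file_b.bin"]       # drive being measured

def read_throughput(path, results, key):
    """Read the whole file sequentially and record MB/s under 'key'."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    results[key] = total / (1024 * 1024) / elapsed

results = {}

# Pass 1: a single reader has the drive to itself.
read_throughput(TEST_FILES[0], results, "solo")

# Pass 2: two readers hit the same drive at the same time.
threads = [threading.Thread(target=read_throughput,
                            args=(p, results, f"concurrent_{i}"))
           for i, p in enumerate(TEST_FILES)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(f"Solo read:      {results['solo']:.1f} MB/s")
for i in range(len(TEST_FILES)):
    key = f"concurrent_{i}"
    print(f"Concurrent #{i}: {results[key]:.1f} MB/s")

On a single HDD you should see each concurrent reader come in well below the solo number, because the heads are being dragged back and forth between the two files.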

 

Lightroom has begun to expand our capabilities to edit, find, and catalogue our photos once they are moved from our cameras to our computers. The volume of data that represents our photos is growing at a staggering rate, and so are the complexity and expense of the storage subsystem needed to store it all effectively and efficiently.

 

So what are we to do? For those of us who can’t afford multiple SSDs (I would if I could), there is at least the option of adding a second or third HDD and spreading the data as Bob has done. I would also strongly suggest investing $40 in the Perfect Disk product. At least that is a start toward a total storage-management approach to the vast amount of data created by digital photography. Just sayin’…

