## Why are Photoshop and Lightroom extremely slow on my computer?

For the last year, and with the latest updates, I’ve experienced awful behavior with both Photoshop and Lightroom: both programs are extremely slow. I experimented with the performance settings in both programs (enabling/disabling the GPU, increasing memory, etc.), but it had no effect. This is the laggiest, slowest experience I have ever had with these programs. I am thinking of suspending my Adobe subscription, because I am almost unable to work with this software.

Lightroom:

Panels are laggy. When switching across modules, the panels sometimes take several seconds to load, or freeze outright. All operations in Lightroom, such as filtering images, browsing images, and switching between modules and panels, are very slow. Luckily the sliders in Develop mode work well.

Photoshop:

The brush is insanely slow. When I draw with a big brush, it can take half a minute just to draw from one side of the canvas to the other. Opening files, switching between open canvases, or even closing files and closing Photoshop itself is very slow. Opening an image from Lightroom into Photoshop sometimes takes up to 15 seconds.

All of this happens on a PC in good health, running Windows with the latest updates, and I have confirmed that no background tasks are running that could consume resources or slow things down.

## computer architecture – Cache Blocks Direct Mapping

Suppose the main memory address has 18 bits (7 for the tag, 7 for the line, and 4 for the word),
and each word is 8 bits.
I found that the main memory capacity is 256 KB, the cache has 128 lines in total,
and the total number of cache words is 128 × 16 (16 words per block/line) = 2048 words.
Then what is the size of the cache in words?
I am very confused about this. I can’t find a definition of “cache words”.
Can anyone tell me what cache words are?
Thank you!
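If it helps, the quantities in the question can be sanity-checked in a few lines; I’m assuming “cache words” means the total number of data words the cache can hold, which is where the 128 × 16 = 2048 figure comes from:

```python
TAG_BITS, LINE_BITS, WORD_BITS = 7, 7, 4    # the 18-bit address split from the question
WORD_SIZE_BITS = 8                          # each word is one byte

# main memory: one word per address, 2^18 addresses
main_memory_words = 2 ** (TAG_BITS + LINE_BITS + WORD_BITS)
main_memory_kb = main_memory_words * WORD_SIZE_BITS // 8 // 1024

cache_lines = 2 ** LINE_BITS                # 7 line bits -> 128 lines
words_per_line = 2 ** WORD_BITS             # 4 word bits -> 16 words per block
cache_data_words = cache_lines * words_per_line

print(main_memory_kb)       # 256
print(cache_lines)          # 128
print(cache_data_words)     # 2048
```

This only counts the data store; a real cache also holds a 7-bit tag (plus a valid bit) per line.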

## cpu – What subfields in computer sciences may one study without learning Object Oriented Programming?

Object Oriented Programming is a type of programming paradigm.

A Computer Science degree is mostly theoretical (not just machine learning and applied statistics; believe me, there is much more), so you won’t see much OOP in most courses. In a Software Engineering degree, however, I suppose you do learn more about it.

Anyway, OOP is always good to know. It’s not as complicated as its fancy name suggests, it gives you a nice way to write organized code, and most programming languages support that style of programming.

However, some programming languages use a different programming paradigm called “functional programming”. I recommend you take a look at that too.

If you are wondering about what kinds of things there are in a CS degree, feel free to ask me!

BTW: this Stack Exchange site is for theoretical computer science, so questions about theoretical problems are seen here all the time.

## Improve Prim’s algorithm runtime – Computer Science Stack Exchange

Assume we run Prim’s algorithm knowing that all the weights are
integers in the range {1, …, W}, where W is logarithmic in |V|. Can you improve Prim’s running time?

By ‘improving’, I mean reaching at least:
$$O(|E|)$$

My question is: without using a priority queue, is it even possible?
Currently, we have learned that Prim’s runtime is $$O(|E| \log |E|)$$

And I proved I can get to O(|E|) when the weights are from {1, …, W} and W is constant, but when W is logarithmic in |V|, I can’t manage to prove or disprove it.

Thanks
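For what it’s worth, here is a sketch (the function name and adjacency-list format are my own choices) of the bucket-queue idea adapted to Prim: since every key is an edge weight in {1, …, W}, extract-min can scan an array of W + 1 buckets instead of using a heap.

```python
def prim_bucket(adj, W, source=0):
    """Total MST weight via Prim's algorithm with a bucket array over 1..W.

    adj[u] is a list of (v, weight) pairs with 1 <= weight <= W; the graph
    is assumed connected and undirected (each edge listed in both lists).
    Extract-min scans the buckets from 0 upward, so the whole run costs
    O(|V|*W + |E|); with W = O(log|V|) that is O(|E| + |V| log |V|).
    """
    n = len(adj)
    INF = W + 1
    key = [INF] * n          # cheapest edge weight connecting each vertex to the tree
    in_tree = [False] * n
    buckets = [[] for _ in range(W + 1)]  # buckets[k] holds vertices with key k
    key[source] = 0
    buckets[0].append(source)
    total = 0
    for _ in range(n):
        # extract-min: scan buckets upward, discarding stale entries lazily
        u = None
        for k in range(W + 1):
            while buckets[k]:
                cand = buckets[k].pop()
                if not in_tree[cand] and key[cand] == k:
                    u = cand
                    break
            if u is not None:
                break
        in_tree[u] = True
        total += key[u]
        for v, wt in adj[u]:
            if not in_tree[v] and wt < key[v]:
                key[v] = wt              # decrease-key: the old bucket entry goes stale
                buckets[wt].append(v)
    return total
```

Each decrease-key is an O(1) append; stale entries are skipped lazily during extraction, so the bucket scans cost O(|V|·W) overall plus O(|E|) for the appends. This doesn’t reach O(|E|) on sparse graphs, but it does remove the heap.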

## computer vision – Matching superimposed image

We are given two grayscale images, one of which contains a large, mostly contiguous patch from the other one. The patch can be altered with noise, its levels may be stretched, etc.

Here’s an example

We would like to determine the region of the image which was copied onto the other image.

My first instinct was to look at the local correlation. I first apply a little bit of blur to eliminate some of the noise. Then, around each point, I subtract a Gaussian average and look at the covariance weighted by that same Gaussian kernel. I normalize by the variances, measured in the same way, to get a correlation. If $$G$$ is the Gaussian blur operator, this is:

$$\frac{G(A \times B) - G(A)\,G(B)}{\sqrt{(G(A^2) - G(A)^2)(G(B^2) - G(B)^2)}}$$

The result is… not too bad, not great:

Playing with the width of the kernel can help a bit. I’ve also tried correlating Laplacians instead of the images themselves, but it seems to hurt more than it helps. I’ve also tried using the watershed algorithm on the correlation, and it just didn’t give very good results.

I’m thinking part of my problem is not having a strong enough prior for what the patch should look like; perhaps an MRF would help here? Besides MRFs, are there other, perhaps more lightweight, techniques that would apply? The other issue is that correlation doesn’t seem to be all that great a distance measure: there are places where the correlation is very high despite the images being visually very distinct. What other metrics could be of use?
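The local-correlation formula above can be sketched with SciPy’s Gaussian filter (the function name, the default sigma, and the eps guard are my own choices):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_correlation(a, b, sigma=5.0, eps=1e-8):
    """Gaussian-windowed local correlation of two same-shape grayscale images."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    g = lambda x: gaussian_filter(x, sigma)            # the operator G in the formula
    cov = g(a * b) - g(a) * g(b)                       # local covariance
    var_a = np.clip(g(a * a) - g(a) ** 2, 0.0, None)   # clip tiny negative round-off
    var_b = np.clip(g(b * b) - g(b) ** 2, 0.0, None)
    return cov / np.sqrt(var_a * var_b + eps)          # eps guards flat regions
```

The eps term also damps the correlation in near-constant regions, which partly addresses the “high correlation despite visually distinct content” issue: flat patches get pushed toward zero instead of blowing up.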

## computer architecture – Detecting Data and Control Hazards for a mips 5 stage pipeline

I’m practicing data and control dependencies, but I’m having trouble detecting them. For this example, I’m assuming the pipeline is fully bypassed (with forwarding). I think the only data dependency is i3 on i2. Is this correct? I also don’t know how to detect the control dependency. I know there will be one because of the bne, but I’m not sure how, or on which instruction…

i1: lw    $t0, 10($t1)
i2: lw    $t4, 3($t2)
i3: addiu $t0, $t4, 3
i4: addu  $t0, $t0, $t2
i5: addiu $t2, $t2, -8
i6: addiu $t1, $t1, -4
i7: bne   $t1, $0, i1
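As a hedged sketch (the read/write register sets below are hand-transcribed from the listing, so double-check them), RAW register dependencies can be enumerated mechanically: for each register an instruction reads, find its most recent earlier writer. Note this lists true dependencies only; whether each one actually stalls a fully bypassed pipeline depends on the forwarding paths and the load-use case.

```python
def raw_deps(instrs):
    """instrs: list of (name, registers_read, registers_written).
    Returns (consumer, producer, reg) triples, where producer is the
    most recent earlier instruction that writes reg."""
    deps = []
    for j, (nj, reads_j, _) in enumerate(instrs):
        for reg in reads_j:
            for i in range(j - 1, -1, -1):
                ni, _, writes_i = instrs[i]
                if reg in writes_i:
                    deps.append((nj, ni, reg))
                    break
    return deps

# (name, registers read, registers written), transcribed from the question
seq = [
    ("i1", ["$t1"], ["$t0"]),           # lw    $t0, 10($t1)
    ("i2", ["$t2"], ["$t4"]),           # lw    $t4, 3($t2)
    ("i3", ["$t4"], ["$t0"]),           # addiu $t0, $t4, 3
    ("i4", ["$t0", "$t2"], ["$t0"]),    # addu  $t0, $t0, $t2
    ("i5", ["$t2"], ["$t2"]),           # addiu $t2, $t2, -8
    ("i6", ["$t1"], ["$t1"]),           # addiu $t1, $t1, -4
    ("i7", ["$t1", "$0"], []),          # bne   $t1, $0, i1
]
print(raw_deps(seq))   # [('i3', 'i2', '$t4'), ('i4', 'i3', '$t0'), ('i7', 'i6', '$t1')]
```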


## lightroom – Is there a way to export the develop setting of multiple photos and use them on another computer which has the same photos?

I have a number of photos that need to be edited in Lightroom. I have 2 copies of each image (perfect copies, same filename) on 2 separate computers. I need to edit the photos on computer A in LR, and then move them to computer B (which also has LR installed) to perform further actions on them.

Exporting the edited photos and sending them from computer A to computer B is unfortunately not feasible because the photos are very large. Uploading them would take so much time it would defeat the purpose, so I’m looking for an efficient way to transfer the custom develop settings, but not the photos themselves.

## computer architecture – Big-endian systems and the smallest memory address

I read on Wikipedia, that “Big-endian systems store the most significant byte of a word at the smallest memory address and the least significant byte at the largest. A little-endian system, in contrast, stores the least-significant byte at the smallest address.”
But I don’t know what “the smallest memory address” is. Also, big-endianness is the dominant ordering in networking protocols, “such as in the internet protocol suite, where it is referred to as network order, transmitting the most significant byte first”. In protocols like these, is the byte transmitted first the one stored at the smallest memory address, and how would a little-endian system handle that ordering?
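As a small illustration (the byte values are arbitrary), Python’s struct module can show what the two orderings mean for the byte sequence as laid out in memory or sent on the wire; offset 0 of the packed bytes plays the role of “the smallest memory address”, and it is also the byte transmitted first:

```python
import struct

word = 0x11223344  # the most significant byte is 0x11

big = struct.pack(">I", word)     # big-endian: MSB at the lowest offset
little = struct.pack("<I", word)  # little-endian: LSB at the lowest offset

print(big.hex())     # 11223344  (this MSB-first layout is "network order")
print(little.hex())  # 44332211
```

A little-endian machine that wants to send this value in network order must swap the bytes first (e.g. with `htonl` in C), which is exactly what the pack format character does here.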
