I see that you're iterating over the desktop image's bitmap pixels to calculate luminosity. I wonder if it would be more efficient to downscale the image in graphics memory first, then analyze the tiny interpolated image instead of bringing the whole image into RAM every second.
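For example, on macOS this could be sketched with Core Image: a CIAreaAverage filter reduces the captured frame to a single averaged pixel on the GPU, so only a few bytes are read back per sample instead of the full bitmap. A rough, untested sketch (the function name and the Rec. 709 luma weighting are my own assumptions, not the app's actual code):

```swift
import CoreGraphics
import CoreImage

// Sketch of the GPU-downscale idea: capture the desktop, collapse it to one
// averaged pixel on the GPU, and read back only those few bytes.
// `averageScreenLuminance` is an illustrative name, not from the app's source.
func averageScreenLuminance() -> Double? {
    // Full-resolution capture of everything on screen.
    guard let screenshot = CGWindowListCreateImage(.infinite,
                                                   .optionOnScreenOnly,
                                                   kCGNullWindowID,
                                                   .bestResolution) else { return nil }

    // CIAreaAverage averages the whole extent down to 1x1 in GPU memory.
    let input = CIImage(cgImage: screenshot)
    guard let filter = CIFilter(name: "CIAreaAverage", parameters: [
        kCIInputImageKey: input,
        kCIInputExtentKey: CIVector(cgRect: input.extent)
    ]), let averaged = filter.outputImage else { return nil }

    // Only these 4 bytes cross back into RAM per sample.
    var pixel = [UInt8](repeating: 0, count: 4)
    CIContext().render(averaged,
                       toBitmap: &pixel,
                       rowBytes: 4,
                       bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                       format: .RGBA8,
                       colorSpace: CGColorSpaceCreateDeviceRGB())

    // Rec. 709 luma of the averaged color.
    let (r, g, b) = (Double(pixel[0]) / 255, Double(pixel[1]) / 255, Double(pixel[2]) / 255)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b
}
```

If this ran every second, reusing a single CIContext across samples (rather than creating one per call, as the sketch does) would avoid rebuilding GPU state each time.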
This is really clever and honestly something I've thought about a million times when I'm cmd-tabbing between apps that have dark themes and others that don't! Brilliant...
TensorFire
Iris
HazeOver
signalayer
Material Design Palette Deck