GNOME (Ubuntu, Linux) allows a custom shader to be applied to the whole desktop, including fullscreen apps.

Here is an up-to-date fork with some example shaders:

I use it to make sure a 'sensitive' row of pixels on my screen never lights up. If the brightness difference between that row and the pixels beside it exceeds a certain threshold, the whole screen fails, presumably due to a power supply fault in the column driver circuitry.
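For the curious, here's a minimal CPU-side sketch of the kind of clamp such a shader could apply. The function name and threshold are invented, and a real implementation would run per frame in the GLSL shader rather than in Python:

```python
def clamp_sensitive_column(row, col, max_delta):
    """Clamp the value at `col` so it stays within max_delta of its neighbours.

    `row` is a list of luminance values; assumes the neighbours themselves
    differ by less than 2 * max_delta, so the clamp range is non-empty.
    """
    neighbours = []
    if col > 0:
        neighbours.append(row[col - 1])
    if col < len(row) - 1:
        neighbours.append(row[col + 1])
    lo = max(n - max_delta for n in neighbours)
    hi = min(n + max_delta for n in neighbours)
    row = list(row)
    row[col] = min(max(row[col], lo), hi)
    return row
```

Applied per pixel of the sensitive row, this guarantees the dangerous brightness step never reaches the panel.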

I do this for a living. Sony digital cinema projectors use a type of LCD panel (SXRD) where the uniformity drifts over time. A special camera takes about 35 minutes to create a LUT to restore the projected image to a uniform white.
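This is not Sony's actual pipeline, but the core of such a camera-derived correction can be sketched as a per-pixel gain LUT (names invented; a real system works per channel and in linear light):

```python
def build_gain_lut(measured_white):
    """Derive per-pixel gains from a camera capture of a full-white test frame.

    The dimmest measured pixel sets the target; brighter areas get attenuated
    down to match it, which is why you can only correct by giving up light.
    """
    target = min(min(row) for row in measured_white)
    return [[target / p for p in row] for row in measured_white]

def apply_uniformity_lut(frame, gain_lut):
    """Multiply each pixel by its gain, clipping to the displayable range."""
    return [
        [min(255, round(p * g)) for p, g in zip(row, gain_row)]
        for row, gain_row in zip(frame, gain_lut)
    ]
```

For example, a capture measuring `[[200, 250], [250, 200]]` yields gains that pull the bright corners down to the 200 level, so a full-white input comes out uniform (at the cost of peak brightness).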
> I haven't yet watched a whole movie with the new color

Love the hacker mindset. Once the problem is solved, the underlying issue loses its appeal :)

This is almost identical to a problem I'm trying to solve: turning a potato-quality photo of a sheet of paper into a clean scan, so that whatever shades of gray make up the paper background become a uniform #ffffff white. The obvious solutions (equalizing, converting to bitmap, …) don't work because what's white in the top left (say #ccc) is wildly different from what's white in the bottom right (say #888), and the shift is non-uniform due to potato-quality lighting.

Glad I caught this post; I hope the solution can contribute to my problem (although I don't have a way to obtain a fixed ground truth, since the lighting changes for each picture).
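One common trick that might help here (my suggestion, not from the article): estimate the uneven background with a coarse local maximum and divide it out, which works whenever the ink is darker than the paper. A slow but dependency-free sketch with invented names:

```python
def flatten_background(gray, block=32):
    """Normalize an unevenly lit grayscale page toward uniform white.

    For each pixel, the brightest value in the surrounding block approximates
    the local paper color (assumes text is darker than paper); dividing by it
    cancels the lighting gradient.
    """
    h, w = len(gray), len(gray[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            y0, x0 = max(0, y - block), max(0, x - block)
            bg = max(
                gray[yy][xx]
                for yy in range(y0, min(h, y + block + 1))
                for xx in range(x0, min(w, x + block + 1))
            )
            out[y][x] = min(255, round(255 * gray[y][x] / bg))
    return out
```

This brute-force version is O(block²) per pixel; a real implementation would use a max filter or a heavy Gaussian blur for the background estimate instead.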

Wow, I'm very glad to see this!

This exact idea has been floating around in my head for ages, and I always wondered how well it actually worked - now I don't need to wonder. I started thinking about it as a potential solution to OLED burn-in. Thankfully, my OLED TV doesn't have any burn-in yet, so I never needed to investigate further.

I wouldn't call this over-engineering. It's a reasonable solution to an actual problem.
I wonder if this is so 'generic' that e.g. Apple TV could add support for it: take a photo of your TV with an iPhone while the Apple TV shows a test image, and the Apple TV's output is then calibrated to compensate for the uneven backlight.
Clever solution! It kind of reminds me of when I sometimes run sound through customized EQ and compression to compensate for crappy speakers. My approach is not quite as scientific as this though, just "tweak until it sounds good."

I didn't know there was a version of MPC with a live shader editor, that is also very cool. This is actually a pretty good use case for such a feature.

I have over a decade of experience in the design and manufacturing of advanced display systems and ran into precisely this problem around twenty years ago. At the time we experimented with and developed pretty much exactly this type of compensation, implemented on custom FPGA-based real-time image-processing boards.

Just looking at the pictures, this does not look like a backlight problem but rather degradation of the liquid crystal layer. Yes, sure, there's interaction between the two. The purple shift, however, is very much something that was happening twenty years ago with some liquid crystal chemistries. Back then you could definitely tell which panels were not using high quality LC fluid.

Apple had this issue with their second-generation HD Cinema Display (the first aluminum-enclosure, 23 in, 1920 x 1200 model). Some percentage of them would turn purple. I don't have Apple's stats on this; from my own experience the number fluctuated between 15% and as much as 50% of the panels in a batch going bad after moderate burn-in.

Having said all that, this type of compensation or fix might be OK for a TV at home or the computer monitor on the desk of a doctor or even a coder. Not good --at all-- for someone doing critical color work, such as a graphic artist. The reason is that you introduce spatial nonlinearities and differential errors.

The simplest way to put it is that you no longer have the full 256 (or 1024) steps per R, G, B channel between 0 and 100%. Hypothetically, you might have 256 for green and, say, 200 for red and 175 for blue. This means the path from black to white is no longer monotonic. You can have serious color rendering errors throughout the color space. For example, it might be impossible to make an accurate 50% gray because you just don't have the RGB values needed to accomplish that. Worse yet, everything between 47% and 53% gray might look exactly the same.
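To put rough numbers on the step-loss argument, here's a toy model (ignoring gamma, dithering, and temporal tricks) that counts how many distinct output codes survive a per-channel attenuation:

```python
def usable_levels(gain, bits=8):
    """Count the distinct output codes left on one channel after scaling by `gain`."""
    top = (1 << bits) - 1
    return len({round(v * gain) for v in range(top + 1)})

# Using the hypothetical gains above: green untouched, red and blue attenuated.
print(usable_levels(1.0))        # green: 256 levels
print(usable_levels(200 / 255))  # red:   201 levels
print(usable_levels(175 / 255))  # blue:  176 levels
```

With mismatched step counts per channel, many input grays collapse onto the same displayed value on one channel but not another, which is exactly where the color errors come from.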

You can also introduce serious gamma distortion. If, on top of that, you add a temporal element (video), well, it can be a real mess.

The real solution (for critical workflows) is to replace the panel.

BTW, this can apply to RGB OLED as well.

I damaged my laptop's screen by leaving it running at 100% CPU with the lid closed for too long (not my intent; that was supposed to be "sleep mode"). The adhesive holding the LCD layers together came off, creating a diagonal striped clouding pattern.

I'd been meaning to do something similar to what the author did [0], but couldn't make the time, and just used the opportunity to buy an external screen.

I'm glad someone put in the work so that now I may be able to use my original screen again.

[0] My most desperate idea was to run an RDP session locally and process the displayed image. It seemed simpler than trying to modify the contents of the screen directly.

This is such a cool idea and execution. I also like the DIY approach to patching MPC BE yourself, which shows how far OSS can take you.

I wonder: when applying a linear transformation like the one in the shader described, does the total available color space shrink? Simply put, if a one-dimensional color value on an arbitrary scale from 1 to 100 needs to be decreased by 20 for correction, the resulting maximum is 80. Does that mean fewer color values are available in total?
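A quick toy check on that 1-to-100 scale (modeling the correction as a 0.8x gain, which is my assumption) suggests the answer is yes: roughly one code value in five collapses into a neighbour.

```python
# Toy model: integer levels 0..100, corrected by a 0.8x gain so the max lands at 80.
corrected = {round(v * 0.8) for v in range(101)}
print(len(corrected))  # 81 distinct levels survive out of the original 101
```

The display still receives values in the full range, but multiple inputs now round to the same output, so gradients can show banding where the correction is strongest.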

I wish you could do this with phones. I know more than one person stuck with a crappy pOLED screen with that terrible green tint to the bottom third of the panel.
This reminds me of the World Cup in South Africa, where spectators used vuvuzelas to make a unique sound that I found hard to endure. I figured it would be easy to remove with a notch filter, but I could never find a way to implement it in real or near-real time, only as post-processing.
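For what it's worth, a notch is quite doable sample-by-sample. Here's a sketch using the standard biquad notch from the RBJ Audio EQ Cookbook; the centre frequency and Q are my guesses (the vuvuzela fundamental sits around 233 Hz, and each harmonic would need its own notch):

```python
import math

def notch_coeffs(f0, fs, q=30.0):
    """Biquad notch coefficients (RBJ Audio EQ Cookbook), normalized so a[0] == 1."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    a0 = 1 + alpha
    b = [1 / a0, -2 * math.cos(w0) / a0, 1 / a0]
    a = [1.0, -2 * math.cos(w0) / a0, (1 - alpha) / a0]
    return b, a

def biquad(samples, b, a):
    """Direct form I filter, one sample at a time, so it works on a live stream."""
    x1 = x2 = y1 = y2 = 0.0
    out = []
    for x in samples:
        y = b[0] * x + b[1] * x1 + b[2] * x2 - a[1] * y1 - a[2] * y2
        x1, x2 = x, x1
        y1, y2 = y, y1
        out.append(y)
    return out
```

A handful of these in series (233, 466, 699 Hz, ...) is trivially cheap at audio rates, so the hard part was presumably routing the broadcast audio through the filter, not the DSP itself.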
This technique is cool but it seems like it would cause poorly performing pixels to get worse over time.

Would it be possible to create an inverted image that would correct the backlight by burning in the bright areas? It would seem like a more difficult task to accomplish because pixel burn-in time is a variable that’s hard to measure.

I have the same idea each time I read something about backlights:

How easy would it be to place a cheap LCD at the back of the main screen and mirror the same output (horizontally inverted)?

Technically it might require synchronizing the frame-latency differences between the two devices, but would such a hack improve the perceived quality?

When I installed my DIY backlight (à la Ambilight) on my TV, I also had a red shift on some of my LEDs. But that was mainly because not enough voltage reached the LEDs farther along the strip to power the green and blue channels, which need a higher forward voltage than red.
If only my screen had issues with the picture. Instead, its internal clock lags behind the source, so the audio gets ever more delayed. It resets after a standby/wake-up cycle, but having to restart the TV every three hours or so is annoying.
I wonder if something similar could be used to keep my TV from ratcheting up the overall brightness/backlighting whenever a subtitle is on screen.

The constant ramping of brightness is rather distracting.

I need this for my Android phone to compensate for a burnt-in image.
How much does backlight drift over time? I have a few calibrated monitors with "uniformity compensation" that has gotten decidedly less uniform over the years.
Pretty cool, but you could instead replace the LEDs.
The quality of computer monitors is appalling - after buying and returning several monitors because of quality issues, I briefly considered doing this system-wide to fix colour uniformity problems. Eventually I chose to keep my old monitor instead. It has uniformity problems as well but it doesn't cost me $1000 extra to keep.

But it's a damn shame that you can't throw enough money at manufacturers to make them make monitors without glaring QA problems. No matter how much you spend they sell you shit.