How to encrypt pixel patches on screen


I am wondering (out of curiosity) how to encrypt a chunk of pixels (e.g. a captcha) in a server application, such that a client cannot use any kind of pattern recognition (neural networks, etc.) to decrypt the pixels, but will still see the correct pixels on their screen. I have heard of techniques such as HDCP and I am wondering whether there are any libraries to implement this. So my questions are:

  1. Is HDCP the droid I am looking for / are there other solutions?
  2. Are there any libraries that help me to implement this (in C++, Python, Go, Java, whatever)?
  3. Is it possible to use this technique for various (small) patches of the screen (not fullscreen)?
  4. Maybe it is even possible to encrypt/decrypt pixel patches with transparency?

Thank you for your help.

There is 1 answer

6
deceze

From your description I'm assuming you're talking about a server-client relationship across the internet here. In that case: No. Way.

In order to display anything on screen, something has to decrypt/decode the data on the client and then send it to the screen. That decryption/decoding would be happening in the browser, on the CPU/GPU, and the decoded image would then be stored in memory. From there it's available to any other process, including neural networks and whatnot.
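To make that concrete, here is a minimal sketch (illustrative only, with a toy XOR "cipher" standing in for any real encryption scheme; the function and variable names are my own): no matter how strong the cipher is, the client has to produce the decrypted pixel buffer in its own memory before it can draw anything, and at that moment any other process on the machine can read it.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: XOR with a repeating key.
    # Applying it twice with the same key restores the original bytes.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
pixels = bytes(range(64))  # pretend this is a 64-byte pixel patch (the captcha)

encrypted = xor_cipher(pixels, key)     # what the server sends over the network
decrypted = xor_cipher(encrypted, key)  # what the client MUST compute to draw it

# The decrypted plaintext now sits in ordinary client RAM,
# readable by a screen grabber, debugger, or neural network alike.
assert decrypted == pixels
```

Swapping the toy XOR for AES or anything else changes nothing about the last line: displaying the pixels requires the plaintext to exist on the client.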

What you would need for this is some way to send encrypted data over the internet directly to the monitor, where it would be decrypted and immediately displayed. You would also somehow need to keep the implementation details secret, so nobody could build a "fake monitor" that does the decryption elsewhere and gets at the data that way. That's fundamentally infeasible, and even more so given the open, standards-based protocols and file formats of the internet.