I have not seen anyone else trying to do this, so it is completely possible I am approaching this the wrong way. Basically, I have a computer with a DVI input. If nothing is attached to the DVI input, then a program on the computer loads some images on screen. If a video source is connected to the DVI port, then my program should stop writing images and use the DVI video feed instead.
What mechanisms exist to determine if a DVI input exists, and if there is currently a valid video signal present? How can I read the video stream?
Or am I going about this the completely wrong way?
At a hardware level, most video input subsystems, analog or digital, are capable of detecting the presence of an input signal, or at least of something that has a lot of the characteristics of one.
For a digital standard, you have actual clocking data, either on its own wire or encoded in a serial data stream. A first test would be whether there appears to be a clock at all, and whether its frequency is regular and reasonable (though for some standards, "reasonable" can cover a huge range of frequencies).
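As a concrete illustration, here is a minimal C sketch of that test. `read_tmds_clock_hz()` is a hypothetical wrapper around however your particular chipset reports its measured clock; the 25-165 MHz window is the single-link DVI pixel clock range.

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical helper: however your capture chipset reports its measured
 * TMDS/pixel clock (a status register, a driver call, ...), wrap it here. */
extern uint32_t read_tmds_clock_hz(void);

/* Single-link DVI allows pixel clocks of roughly 25 MHz to 165 MHz. */
#define DVI_CLK_MIN_HZ  25000000u
#define DVI_CLK_MAX_HZ 165000000u

/* "Is there a plausible, stable clock?" - sample it a few times and require
 * that it stays in range and drifts by no more than about 0.5%. */
static bool clock_looks_valid(void)
{
    uint32_t first = read_tmds_clock_hz();
    if (first < DVI_CLK_MIN_HZ || first > DVI_CLK_MAX_HZ)
        return false;

    for (int i = 0; i < 4; i++) {
        uint32_t f = read_tmds_clock_hz();
        uint32_t drift = (f > first) ? f - first : first - f;
        if (f < DVI_CLK_MIN_HZ || f > DVI_CLK_MAX_HZ || drift > first / 200)
            return false;
    }
    return true;
}
```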
Next, video (not just digital, even analog) has a repeating structure of lines and fields, so there should be two identifiable submultiples of the pixel clock, one corresponding to the start or end of each line, and the other to the start or end of each field (screen). Again, these might have their own wires, might have unique means of encoding (special voltages in the analog case), or might show up as time gaps in the pixel data. Even if there were no sync and no retrace times, statistical analysis of the pixel data would probably give clues to the X and Y dimensions, since many features in the picture repeat.
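If the chipset exposes counters for detected sync events (many do, though the helper names below are made up), checking that the line and field structure looks like real video might go something like this:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical helpers: many capture chips have free-running counters of
 * detected hsync and vsync events; wrap whatever yours actually provides. */
extern uint32_t hsync_count_over_one_second(void);
extern uint32_t vsync_count_over_one_second(void);

/* Check that the sync structure looks like real video: a line rate and a
 * field/frame rate in plausible ranges, with a sensible lines-per-frame. */
static bool sync_structure_looks_valid(void)
{
    uint32_t lines_per_sec  = hsync_count_over_one_second();
    uint32_t frames_per_sec = vsync_count_over_one_second();

    if (frames_per_sec < 24 || frames_per_sec > 120)      /* typical refresh rates */
        return false;
    if (lines_per_sec < 15000 || lines_per_sec > 150000)  /* ~15 kHz to 150 kHz */
        return false;

    /* Lines per frame (including blanking) should come out to a sane count. */
    uint32_t lines_per_frame = lines_per_sec / frames_per_sec;
    return lines_per_frame >= 200 && lines_per_frame <= 2500;
}
```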
Actual video input subsystems (think flat-panel monitor) can have even more complicated detection and auto-adapting circuits - they may, for example, resample the input in time to change the dots-per-line resolution, or they may even put it in a frame buffer and scale it in both X and Y.
What details of the inner workings of the video capture circuit are exposed to consumer-level, or even driver-level, software would depend a lot on the specifics of the chipset used - hopefully a data sheet is available. It's pretty likely, though, that somewhere there is a readable register bit that indicates whether the input is capturing something the circuit "thinks" is a video signal. You might even be able to read out parameters such as the X and Y resolution and scanning rates or pixel clock rate.
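As a sketch only - the register addresses, bit positions, and access mechanism below are placeholders you would replace from the actual data sheet - the polling logic tends to boil down to something like:

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical register access - the real mechanism (I2C, memory-mapped I/O,
 * a vendor ioctl) and the register map depend entirely on your chipset. */
extern uint8_t  capture_read_reg8(uint16_t addr);
extern uint16_t capture_read_reg16(uint16_t addr);

#define REG_STATUS        0x00   /* placeholder status register            */
#define STATUS_SIGNAL_DET 0x01   /* placeholder "signal detected" bit      */
#define REG_ACTIVE_WIDTH  0x10   /* placeholder active-width register      */
#define REG_ACTIVE_HEIGHT 0x12   /* placeholder active-height register     */

static bool dvi_input_active(void)
{
    return (capture_read_reg8(REG_STATUS) & STATUS_SIGNAL_DET) != 0;
}

static void report_input(void)
{
    if (!dvi_input_active()) {
        printf("No DVI signal detected\n");
        return;
    }
    printf("DVI signal: %u x %u active pixels\n",
           capture_read_reg16(REG_ACTIVE_WIDTH),
           capture_read_reg16(REG_ACTIVE_HEIGHT));
}
```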
Similarly, the ability to get data out of the port would be chipset-dependent, but if the port is going to be useful for anything, there is presumably an operating system driver for it which provides some sort of useful API to video-consuming applications.
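On Linux, for instance, capture hardware usually shows up as a Video4Linux2 device, and the driver reports per-input status flags that include "no signal" and "no sync". A minimal sketch, assuming the device appears as /dev/video0 and that its driver actually fills in those flags:

```c
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/video0");
        return 1;
    }

    /* Find which input is currently selected, then query its status. */
    int index = 0;
    if (ioctl(fd, VIDIOC_G_INPUT, &index) < 0)
        perror("VIDIOC_G_INPUT");

    struct v4l2_input input;
    memset(&input, 0, sizeof(input));
    input.index = index;
    if (ioctl(fd, VIDIOC_ENUMINPUT, &input) < 0) {
        perror("VIDIOC_ENUMINPUT");
        close(fd);
        return 1;
    }

    /* The driver reports per-input status bits; no signal / no sync means
     * nothing usable is plugged in, so keep showing your own images. */
    if (input.status & (V4L2_IN_ST_NO_SIGNAL | V4L2_IN_ST_NO_SYNC))
        printf("Input '%s': no video signal - show local images\n",
               (char *)input.name);
    else
        printf("Input '%s': signal present - switch to the DVI feed\n",
               (char *)input.name);

    close(fd);
    return 0;
}
```

Once a signal is present, the same device node is what you would use to actually capture frames (via read() or V4L2's memory-mapped streaming I/O) and display them instead of your stored images.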