I'm using the code below to take a screenshot of all the active displays combined.
using System.Drawing;
using System.Drawing.Imaging;
using System.Windows.Forms;

// Bounding rectangle that covers every display (the virtual desktop).
Rectangle totalSize = Rectangle.Empty;
foreach (Screen s in Screen.AllScreens)
    totalSize = Rectangle.Union(totalSize, s.Bounds);

using (Bitmap screenShotBMP = new Bitmap(
    totalSize.Width, totalSize.Height, PixelFormat.Format32bppArgb))
using (Graphics screenShotGraphics = Graphics.FromImage(screenShotBMP))
{
    // Copy the entire virtual desktop into the bitmap in one blit.
    screenShotGraphics.CopyFromScreen(
        totalSize.Location.X, totalSize.Location.Y,
        0, 0, totalSize.Size,
        CopyPixelOperation.SourceCopy);
}
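(As an aside, the union loop is equivalent to asking the framework for the virtual-screen bounds directly; a minimal sketch:)

using System.Drawing;
using System.Windows.Forms;

// SystemInformation.VirtualScreen is the bounding rectangle of all
// displays combined, i.e. the same value the Rectangle.Union loop computes.
Rectangle totalSize = SystemInformation.VirtualScreen;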
I've tested this on two different machines so far. When World of Warcraft is running full-screen, one machine captures the WoW screen and the other captures the desktop. I wouldn't have been surprised if this code never worked for WoW, because (I assume) the game uses DirectX to write directly to the video card. Since it does work in one case, however, I'd like to know whether there's something I can change in the code and/or the machine configuration to make it work in all cases. (For WoW, at least; I realize there are probably many other games that won't work.)

It doesn't seem to be a framework version issue: I have compiled my code against different versions and the behavior doesn't change.
Machine #1, which takes the WoW picture, is running 64-bit Win7 Professional and has .NET Framework versions 2.0 through 4.0 installed.
Machine #2, which takes the desktop picture, is running 32-bit Win7 Home Premium SP1 with .NET Framework versions 1.0 through 4.0 installed.
Edited to add: Another data point is that if I switch Machine #2 from a Win7 Aero desktop theme to the "classic" theme, I start getting WoW pictures instead of desktop pictures.
I added a call to disable desktop composition, as per the answer to Enabling/Disabling Aero from a Windows Service, and that fixes the problem. I don't really understand why, however.
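(Presumably the relevant call from that answer is DwmEnableComposition in dwmapi.dll; a minimal P/Invoke sketch, with the helper and method names being my own:)

using System.Runtime.InteropServices;

static class DwmHelper // hypothetical helper name
{
    // 0 disables DWM (Aero) composition, 1 re-enables it.
    private const uint DWM_EC_DISABLECOMPOSITION = 0;
    private const uint DWM_EC_ENABLECOMPOSITION = 1;

    [DllImport("dwmapi.dll")]
    private static extern int DwmEnableComposition(uint uCompositionAction);

    // Turn off composition before capturing the screen.
    public static void DisableAero()
    {
        DwmEnableComposition(DWM_EC_DISABLECOMPOSITION);
    }
}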