I'm kind of a PowerShell rookie and I'm trying to write a script that will run several WinMerge tasks in a row.

The WinMerge processes will need to access external disks, so I don't want them to run at the same time, but rather one after the other. BUT the UI of a finished comparison should stay open.

So what I would like to do in PowerShell is:

  1. In PS: start a WinMerge UI for "folderA" "folderB".
  2. Give the focus back to PS and wait for the UI to complete its task.
  3. Start a NEW WinMerge UI for "folderC" "folderD".
  4. Give the focus back to PS and wait for the UI to complete its task.
  ...

My plan is to let it run and to get back to it later, having 3 UIs open with the results. For now, I've only managed to start all UIs at once (but this slows down the process overall: I'm accessing an external HDD and a USB drive).

The calls to WinMerge that I'm trying to use look like this:

winmergeu.exe -r -wl "C:\xxx\Documents" "Y:\xxx\Documents"

So far I've tried with:

  • cmd /c ... or cmd /c start /wait ...
  • Start-Process ...
  • Invoke-Command ...
  • Start-Job | Wait-Job | Receive-Job
  • a PS wrapper for WinMerge (https://communary.net/2016/01/17/invoke-winmerge/)
  • appending | Out-Null (this mostly leads to an error)
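
For example, one of my Start-Process attempts looked roughly like this (same placeholder paths as above):

# Opens the WinMerge UI, but -Wait blocks until I close the window,
# so the next comparison can't start while this UI stays open.
Start-Process winmergeu.exe -ArgumentList '-r', '-wl', 'C:\xxx\Documents', 'Y:\xxx\Documents' -Wait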

As I said, I'm kind of a noob and I'm having a hard time figuring out what the correct call would be. In most cases the UI opens up and does its job, but the next PS call does not start until I manually close it (and, as I said, I'd like to have several UIs open in the end).

I guess I have two problems here: (1) getting back to PS after the UI opens, and (2) waiting for a result from the UI to trigger the next one (can I even get a result? I guess that depends on the WinMerge implementation).

EDIT

I'm actually just trying to check my files for integrity after a backup with robocopy. I thought it'd be nice / a good exercise to write my own backup scripts in PS, as long as I don't screw up my paths :D

The problem is that I have a lot of paths to check because of the two physical drives plus my cloud provider, each with encrypted/plain partitions. I tried FCIV and computing/comparing hashes on my own (with Get-FileHash), but I guess I must have done something wrong, because they were not detecting changes. WinMerge works fine out of the box, it can compare 3 paths at once, and I like its UI... but running n UIs at once is probably a bad idea, which is why I was trying to set them up sequentially. But I'm open to suggestions if you have better ideas.

Answer by Steven (accepted):

As a GUI application, WinMerge probably doesn't have any signaling to let PowerShell know the merge is completed; it's not an API. Any of the wait approaches described so far will probably fail, because they wait for the process to close, which, by your design, will never happen.

Two things to think about:

  1. If WinMerge has a log file, you could monitor it for some kind of job-completion message and write the wait routine to look for that line/text (see the sketch after this list).
  2. If there is log functionality, perhaps use the log to review the jobs after the fact. This may negate the need to leave the interface open, which would solve the wait problem.
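
A minimal sketch of option 1, assuming WinMerge (or some wrapper around it) writes a log file with a recognizable completion line; the log path and the marker text 'Comparison complete' are hypothetical placeholders, not a documented WinMerge feature:

# Polls a log file until the given pattern shows up.
function Wait-ForLogLine {
    param(
        [Parameter(Mandatory)] [string] $LogPath,
        [Parameter(Mandatory)] [string] $Pattern,
        [int] $PollSeconds = 5
    )
    while (-not ((Test-Path $LogPath) -and
                 (Select-String -Path $LogPath -Pattern $Pattern -Quiet))) {
        Start-Sleep -Seconds $PollSeconds
    }
}

# Launch the UI without waiting on the process, block on the log instead,
# then move on to the next comparison while the UI stays open.
Start-Process winmergeu.exe -ArgumentList '-r', 'C:\xxx\Documents', 'Y:\xxx\Documents'
Wait-ForLogLine -LogPath 'C:\Temp\winmerge.log' -Pattern 'Comparison complete'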

Update Per Comments & Edits:

I was wrong: I have used WinMerge before, as part of TortoiseSVN, though I prefer Beyond Compare.

I don't think you need to verify the files after RoboCopy. It's reliable enough that corruption is truly an outside risk. RoboCopy uses standard Windows APIs to copy files, and it's doubtful the same skepticism would accompany a GUI-based copy. Understandably, some skepticism may result from the confusing array of options. However, that complexity revolves around what to copy, logging, etc.; once RoboCopy decides to copy a file, it should do so reliably or error. The copy itself should be considered separate and apart from any uneasiness you may have over the tool's complexity. And crafting an appropriate RoboCopy command is usually a front-end effort inclusive of testing, not often followed by such verifications.

Note: You can also run RoboCopy a second time, either to refresh the data or, with the /L option, to see if anything was missed or has since changed.
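
A sketch of that second pass (placeholder paths): /L makes RoboCopy list what it would copy without touching anything, so empty output means source and destination already match by name, size, and date.

# List-only verification pass: reports would-be copies, copies nothing.
RoboCopy C:\Source Y:\Backup /E /L /NP /NJH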

Note: While a file copy is in progress, RoboCopy sets the new destination file's modified time to 01/01/1980 and only changes it to match the source when the copy is complete. If an existing file differs by size or modified time, its timestamp likewise isn't changed until the transfer is complete. This is how RoboCopy can be restarted and still find/overwrite incomplete copies.

There is at least one hypothetical situation that could cause corruption: when a device, for example a WAN accelerator, sits between the source and destination. Hypothetically, RoboCopy could think everything is fine based on errant or corrupt responses from that device.

So, dealing with the unlikely risk of file corruption: your original idea of using Get-FileHash sounds appropriate. However, note that the hash is computed only from file contents. Files with equivalent content but divergent metadata, like name and date, will return the same hash. That's quite different from how RoboCopy senses and reacts to file differences, which relies very much on name, date, and size.
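
A quick way to convince yourself of that, using two throwaway files (paths are arbitrary):

# Identical content, different names and dates: the hashes still match.
Set-Content C:\Temp\a.txt 'same content'
Set-Content C:\Temp\b.txt 'same content'
(Get-FileHash C:\Temp\a.txt).Hash -eq (Get-FileHash C:\Temp\b.txt).Hash  # True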

Note: Depending on options RoboCopy may consider other meta-data; attributes & ACLs come to mind.

Without writing a huge script to calculate and compare hashes across a large set of files, you can simply run something like the below on the source and destination:

# -File keeps directories out of the pipeline; Get-FileHash only handles files.
Get-ChildItem <Path> -Recurse -File |
    Get-FileHash |
    Select-Object Path, Hash |
    Out-File <SomeOutputPath>

Do this on each of the folders, then compare the resulting text files in WinMerge, Beyond Compare, etc.

Note: You may want to set the -Width parameter of Out-File to a very high number to accommodate very long paths.

Note: For Get-ChildItem, you may want to state the path like \\?\C:\FolderName; this allows traversing very long paths, though I can't be sure how Get-FileHash will react.

Note: This may take quite a while to run, and may generate huge text files.

This probably doesn't need to be a repeatable process. Once you've gained confidence that RoboCopy is copying with full fidelity, it's probably overkill to continue or try to refine.

Update:

If you want to use RoboCopy to do a listing suitable for comparison, you'll need to do a few things:

  1. You'll have to craft your RoboCopy command to get as close to a pure path line/string as possible.
  2. You'll have to pipe that output to a ForEach-Object {} loop to manipulate the output strings, removing the dissimilar leading portion of the path.

Example:

RoboCopy C:\temp\TestFiles3 "C:\Empty" /E /FP /NS /NC /NP /NJH /NJS /L |
ForEach-Object { $_.Trim() -Replace 'C:\\Temp\\' }

This RoboCopy command:

  • /E :: copy subdirectories, including Empty ones.
  • /FP :: include Full Pathname of files in the output.
  • /NS :: No Size - don't log file sizes.
  • /NC :: No Class - don't log file classes.
  • /NP :: No Progress - don't display percentage copied.
  • /NJH :: No Job Header.
  • /NJS :: No Job Summary.
  • /L :: List only - don't copy, timestamp or delete any files.

Note: These descriptions came directly from the help output (RoboCopy /?). It is worth reading!

These options would output something like:

        C:\temp\TestFiles3\
        C:\temp\TestFiles3\TestFiles2\
                        C:\temp\TestFiles3\TestFiles2\Test1000_KB.txt
                        C:\temp\TestFiles3\TestFiles2\Test100_KB.txt 
...

The ForEach-Object{} loop will then change the data to:

TestFiles3\
TestFiles3\TestFiles2\
TestFiles3\TestFiles2\Test1000_KB.txt
TestFiles3\TestFiles2\Test100_KB.txt

You can see it trims the whitespace and removes the leading portion of the path, as instructed. As previously mentioned, if you add Out-File and run this on both the source and destination, you should get files worthy of comparison.
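
Putting it together, a sketch of one side of that comparison (the output file name and -Width value are placeholders; run the equivalent against the destination, stripping its own leading portion):

RoboCopy C:\temp\TestFiles3 "C:\Empty" /E /FP /NS /NC /NP /NJH /NJS /L |
ForEach-Object { $_.Trim() -Replace 'C:\\Temp\\' } |
Out-File C:\temp\SourceList.txt -Width 500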

Note: I used -Replace in the example because .Replace() is case-sensitive, and the PowerShell community generally prefers the PowerShell-native approach.