I need to read a large number of JSON files spread across multiple folders on a network drive.
There are approximately 20 folders, each folder has 50 sub folders, and each sub folder contains roughly 400 JSON files. For each file I need to check one keyword, and if the data is good, parse the file fully. Out of all these files I may only need to parse about 300 a day, but I still have to go through every file to decide whether it's good or not.
I have tried reading the files from each folder using
Directory.EnumerateFiles
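Roughly like this (a simplified sketch; the root path and the keyword check are placeholders for my actual values):

```csharp
using System;
using System.IO;

class Scanner
{
    // Placeholder for the real network share root
    const string RootPath = @"\\server\share\data";

    static void Main()
    {
        // Recursively enumerate every .json file under the root
        foreach (var path in Directory.EnumerateFiles(
                     RootPath, "*.json", SearchOption.AllDirectories))
        {
            // Read the whole file and look for the keyword
            // ("good" is a placeholder for my actual check)
            var text = File.ReadAllText(path);
            if (text.Contains("good"))
            {
                // Candidate for full parsing
                Console.WriteLine(path);
            }
        }
    }
}
```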
But it takes a long time. What is the best way to handle this situation? Would it be better to use something like a PowerShell or Perl script to traverse all the folders, write the paths of all the good files to a text file, and then have my program read only those files?
My program is written in C#.