I was given a dump of logs in two folders (roughly 200 individual files from a server that I cannot access) and asked to find out what errors/exceptions were occurring and causing server downtime and other issues.
I have tried HTTPLogParser, which cannot read the files, and I have Microsoft's (old) LogParser 2.2, but there are too many files to look through individually.
I have no access to the server, so I cannot connect anything to it, which is a pain.
Any good ideas on how to load a folder of these 2022-11-xx.log files and run some analysis across all of them?
An example of the data in the files is:
G:\LogFiles\W9BVC3\u_ex20113046.log:204:2022-10-30 00:00:18 1112.23.3.12 GET /cool-website - 80 - 147.115.242.13 Pingdom.com_bot_version_1.4_ (http://www.pingdom.com/) - 200 0 0 0
G:\LogFiles\W9BVC3\u_ex20113046.log:982:2022-10-30 00:01:18 1112.23.3.12 GET /cool-website - 80 - 147.115.242.13 Pingdom.com_bot_version_1.4_ (http://www.pingdom.com/) - 200 0 0 0
G:\LogFiles\W9BVC3\u_ex20113046.log:1061:2022-10-30 00:01:31 1112.23.3.12 GET /cool-website/api/v2/status/day - 80 - 147.115.242.13 Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/105.0.0.0+Safari/517.36 https://examplewebsite.com/cool-website/ 200 0 0 0
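For what it's worth, one rough idea I had was to just script it: these look like standard IIS W3C extended logs, and in the default field layout the last four fields of each entry are sc-status, sc-substatus, sc-win32-status, and time-taken, so the HTTP status can be pulled from the end of the line even when the user-agent contains spaces. A minimal Python sketch along those lines (the `G:\LogFiles` path is just my folder; adjust as needed, and this assumes the default field order, which the `#Fields:` directive at the top of each file would confirm):

```python
from collections import Counter
from pathlib import Path

LOG_DIR = Path(r"G:\LogFiles")  # root folder containing the log folders (assumption)

def iter_log_lines(root):
    """Yield data lines from every *.log file under root, skipping '#' directives."""
    for path in root.rglob("*.log"):
        with open(path, encoding="utf-8", errors="replace") as fh:
            for line in fh:
                if line.strip() and not line.startswith("#"):
                    yield line.rstrip("\n")

def status_counts(root):
    """Count HTTP status codes across all files.

    Assumes the default IIS W3C layout, where the last four fields are
    sc-status sc-substatus sc-win32-status time-taken, so sc-status is
    the 4th field from the end regardless of spaces in the user-agent.
    """
    counts = Counter()
    for line in iter_log_lines(root):
        fields = line.split()
        if len(fields) >= 4 and fields[-4].isdigit():
            counts[fields[-4]] += 1
    return counts

if __name__ == "__main__":
    for status, n in status_counts(LOG_DIR).most_common():
        print(status, n)
```

From there it would be easy to filter for 4xx/5xx statuses or bucket counts by date to correlate with the downtime windows, but I don't know if there's a better off-the-shelf tool for this.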